
Generalizing MLPs With Dropouts, Batch Normalization, and Skip Connections

Kim Taewoon

Abstract

A multilayer perceptron (MLP) is typically made of multiple fully connected layers with nonlinear activation functions. Several approaches have been proposed to improve MLPs (e.g., faster convergence, a better convergence limit), but the research lacks structured ways of testing them. We test different MLP architectures by carrying out experiments on age and gender datasets. We empirically show that by whitening inputs before every linear layer and adding skip connections, our proposed MLP architecture achieves better performance. Since the whitening process includes dropouts, it can also be used to approximate Bayesian inference. We have open-sourced our code and released models and Docker images at https://github.com/tae898/age-gender/
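
The abstract states the recipe only at a high level, so the following PyTorch sketch shows one plausible reading of it. Everything below is an illustrative assumption rather than the authors' released implementation: the names ICBlock, ResMLP, and mc_dropout_predict, the layer widths, the dropout rate, and the exact placement of the activation relative to the skip connection are all hypothetical. The whitening ("IC") step is taken to be BatchNorm1d followed by Dropout applied before each linear layer, with an identity skip connection around each block.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class ICBlock(nn.Module):
    """One residual MLP block: an 'IC' whitening step (BatchNorm1d then
    Dropout) precedes the linear layer, and the input is added back to
    the output (skip connection). Names/hyperparameters are assumptions."""

    def __init__(self, dim: int, dropout: float = 0.05):
        super().__init__()
        self.bn = nn.BatchNorm1d(dim)       # normalizes (whitens) activations
        self.dropout = nn.Dropout(dropout)  # decorrelates units; enables MC dropout
        self.linear = nn.Linear(dim, dim)   # same width so the skip can be an identity

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        out = self.linear(self.dropout(self.bn(x)))
        return F.relu(out) + x              # skip connection around the block


class ResMLP(nn.Module):
    """A small MLP built from ICBlocks, ending in a plain classifier head."""

    def __init__(self, in_dim: int, hidden_dim: int, num_classes: int,
                 num_blocks: int = 4, dropout: float = 0.05):
        super().__init__()
        self.stem = nn.Linear(in_dim, hidden_dim)
        self.blocks = nn.Sequential(
            *[ICBlock(hidden_dim, dropout) for _ in range(num_blocks)])
        self.head = nn.Linear(hidden_dim, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.head(self.blocks(self.stem(x)))


@torch.no_grad()
def mc_dropout_predict(model: nn.Module, x: torch.Tensor, n_samples: int = 20):
    """Approximate Bayesian inference: keep dropout stochastic at test time
    and average class probabilities over several forward passes."""
    model.eval()
    for m in model.modules():               # re-enable only the dropout layers
        if isinstance(m, nn.Dropout):
            m.train()
    probs = torch.stack(
        [F.softmax(model(x), dim=-1) for _ in range(n_samples)])
    return probs.mean(dim=0), probs.std(dim=0)  # predictive mean and spread
```

Keeping the dropout layers stochastic at evaluation time, as mc_dropout_predict does, is the standard Monte Carlo dropout approximation to Bayesian inference that the abstract alludes to: the mean over passes serves as the prediction and the spread gives a rough per-class uncertainty estimate.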

Code Repositories

tae898/age-gender (official PyTorch implementation)

Benchmarks

Benchmark                                     | Methodology                                        | Metrics
age-and-gender-classification-on-adience      | RetinaFace + ArcFace + MLP + Skip connections      | Accuracy (5-fold): 90.66
age-and-gender-classification-on-adience-age  | RetinaFace + ArcFace + MLP + IC + Skip connections | Accuracy (5-fold): 60.86
