MNIST classification using a multi-layer perceptron (MLP) with two hidden layers. Several weight initializers and batch normalization are implemented.
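The repository's code is not shown on this page; below is a minimal PyTorch sketch of the described setup. The hidden width (256), the selectable `xavier`/`he` initializers, and the class name `MLP` are illustrative assumptions, not the repository's actual implementation.

```python
import torch
import torch.nn as nn

class MLP(nn.Module):
    """Two-hidden-layer MLP for 28x28 MNIST images, with batch normalization."""
    def __init__(self, hidden=256, init="xavier"):
        super().__init__()
        self.net = nn.Sequential(
            nn.Flatten(),
            nn.Linear(784, hidden), nn.BatchNorm1d(hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.BatchNorm1d(hidden), nn.ReLU(),
            nn.Linear(hidden, 10),
        )
        # Apply the chosen weight initializer to every linear layer.
        for m in self.net:
            if isinstance(m, nn.Linear):
                if init == "xavier":
                    nn.init.xavier_uniform_(m.weight)
                elif init == "he":
                    nn.init.kaiming_normal_(m.weight, nonlinearity="relu")
                nn.init.zeros_(m.bias)

model = MLP(init="he")
logits = model(torch.randn(32, 1, 28, 28))  # dummy batch of 32 images
print(logits.shape)  # torch.Size([32, 10])
```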
A module for making weight initialization easier in PyTorch.
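The module's API is not documented on this page; the following sketch shows what such a helper commonly looks like, built on the standard `torch.nn.init` functions. The name `init_weights` and its `scheme` argument are hypothetical.

```python
import torch.nn as nn

# Hypothetical mapping from an initializer name to a torch.nn.init function.
_INITIALIZERS = {
    "xavier_uniform": nn.init.xavier_uniform_,
    "kaiming_normal": nn.init.kaiming_normal_,
    "orthogonal": nn.init.orthogonal_,
}

def init_weights(model, scheme="xavier_uniform"):
    """Initialize the weights of all Linear/Conv2d layers in-place."""
    init_fn = _INITIALIZERS[scheme]

    def _apply(m):
        if isinstance(m, (nn.Linear, nn.Conv2d)):
            init_fn(m.weight)
            if m.bias is not None:
                nn.init.zeros_(m.bias)

    model.apply(_apply)
    return model

# Usage:
# model = init_weights(nn.Sequential(nn.Linear(784, 10)), scheme="kaiming_normal")
```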
RNN-LSTM: From Applications to Modeling Techniques and Beyond - Systematic Review
A program that implements a convolutional neural network for classifying MNIST digit images as either even or odd, using a GPU-accelerated framework.
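As a rough illustration of the even/odd task, here is a short PyTorch sketch that remaps the MNIST digit labels to parity labels and trains a small CNN on the GPU when one is available. The architecture and hyperparameters are assumptions, not the program's actual code.

```python
import torch
import torch.nn as nn
from torchvision import datasets, transforms

# Remap the 10 digit labels to 2 classes: 0 = even, 1 = odd.
train_set = datasets.MNIST(
    root="data", train=True, download=True,
    transform=transforms.ToTensor(),
    target_transform=lambda y: y % 2,
)
loader = torch.utils.data.DataLoader(train_set, batch_size=64, shuffle=True)

device = "cuda" if torch.cuda.is_available() else "cpu"
model = nn.Sequential(
    nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
    nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
    nn.Flatten(), nn.Linear(32 * 7 * 7, 2),
).to(device)

opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()
for images, parity in loader:  # one pass over the training set
    images, parity = images.to(device), parity.to(device)
    opt.zero_grad()
    loss_fn(model(images), parity).backward()
    opt.step()
```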