All the code files related to the deep learning course from PadhAI
Updated Apr 13, 2020 - Jupyter Notebook
MNIST classification using a Multi-Layer Perceptron (MLP) with two hidden layers. Several weight initializers and batch normalization are implemented (see the PyTorch sketch after this list).
Implementation of key neural network concepts using NumPy
A library for building feed-forward neural networks, convolutional nets, linear regression, and logistic regression models.
A repository showing how Xavier initialization helps initialize a neural network's weights with random values that are neither too small nor too large (see the NumPy sketch after this list).
Fully connected network on MNIST data using TensorFlow
My extensive work on multiclass image classification using the Intel Image Classification dataset from Kaggle, implemented in PyTorch 🔦
Experiments with TensorFlow to replicate the experiments behind Xavier initialization for deep learning optimization
Generic L-layer fully connected neural network implemented "straight in Python" using NumPy.
My beautiful neural network, made from scratch and with love. It learns to play Flappy Bird flawlessly in 3 to 9 generations!
Xavier Initialization in Deep Learning
A facial keypoint detection system.
Neural Network from Scratch with Python
A 3-layer linear neural network to classify the MNIST dataset using TensorFlow
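Since several of the repositories above center on Xavier (Glorot) initialization, here is a minimal NumPy sketch of the idea; the function name, the layer sizes, and the uniform variant are illustrative assumptions, not code taken from any listed repository.

```python
import numpy as np

def xavier_uniform(fan_in, fan_out, rng):
    """Xavier/Glorot uniform initializer (illustrative sketch).

    Draws weights from U(-limit, limit) with limit = sqrt(6 / (fan_in + fan_out)),
    which keeps the variance of activations and gradients roughly constant
    across layers, so the initial weights are neither too small nor too large.
    """
    limit = np.sqrt(6.0 / (fan_in + fan_out))
    return rng.uniform(-limit, limit, size=(fan_in, fan_out))

rng = np.random.default_rng(0)
W1 = xavier_uniform(784, 128, rng)  # weights for a 784 -> 128 hidden layer
W2 = xavier_uniform(128, 10, rng)   # weights for a 128 -> 10 output layer
```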
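As a companion to the MNIST MLP entry above, here is a minimal PyTorch sketch of a two-hidden-layer MLP with batch normalization and Xavier-initialized weights; the hidden width of 128 and the rest of the details are illustrative assumptions, not the course code.

```python
import torch
import torch.nn as nn

class MLP(nn.Module):
    """MLP with two hidden layers and batch normalization for MNIST (sketch)."""
    def __init__(self, in_dim=784, hidden=128, n_classes=10):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim, hidden),
            nn.BatchNorm1d(hidden),
            nn.ReLU(),
            nn.Linear(hidden, hidden),
            nn.BatchNorm1d(hidden),
            nn.ReLU(),
            nn.Linear(hidden, n_classes),
        )
        # Xavier-initialize the linear layers, zero the biases
        for m in self.net:
            if isinstance(m, nn.Linear):
                nn.init.xavier_uniform_(m.weight)
                nn.init.zeros_(m.bias)

    def forward(self, x):
        # Flatten 28x28 images to 784-dim vectors before the first layer
        return self.net(x.view(x.size(0), -1))

model = MLP()
logits = model(torch.randn(32, 1, 28, 28))  # batch of 32 dummy MNIST images
print(logits.shape)  # torch.Size([32, 10])
```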