Compare SELUs (scaled exponential linear units) with other activations on MNIST, CIFAR-10, etc. (see the SELU sketch after this list).
EddyNet: A Deep Neural Network For Pixel-Wise Classification of Oceanic Eddies
Interesting Python code for tackling simple machine learning and deep learning tasks.
A DCGAN implementing all the tricks from recent papers up to 2020 and from all over the internet. Trained on CelebA at 157x128. "GAN Hacks 2", if you will.
🤖 Implementation of Self-Normalizing Networks (SNNs) in PyTorch (a self-normalizing setup sketch follows this list).
"The 'Activation Functions' project repository contains implementations of various activation functions commonly used in neural networks. "
Practice with deep learning concepts using the CIFAR-10 dataset.
This project examines the effect of four different hyperparameter choices on a CNN using ReLU and SELU as activation functions.
This project focuses on using Convolutional Neural Networks (CNNs) to classify food images from the Food-11 dataset.
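Several of the repositories above compare SELU against other activations. For reference, here is a minimal sketch of the SELU function itself, using the fixed constants α ≈ 1.6733 and λ ≈ 1.0507 derived in Klambauer et al. (2017); the NumPy wrapper `selu` is just for illustration.

```python
import numpy as np

# Fixed constants from Klambauer et al. (2017), chosen so that
# activations are driven toward zero mean and unit variance.
ALPHA = 1.6732632423543772
SCALE = 1.0507009873554805

def selu(x: np.ndarray) -> np.ndarray:
    """SELU: scale * (x if x > 0 else alpha * (exp(x) - 1))."""
    return SCALE * np.where(x > 0, x, ALPHA * (np.exp(x) - 1.0))

# Example: negative inputs saturate toward -scale * alpha ≈ -1.7581.
print(selu(np.array([-5.0, -1.0, 0.0, 1.0])))
```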
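The PyTorch SNN repository above pairs SELU with the other ingredients of a self-normalizing network: LeCun-normal weight initialization and alpha dropout instead of regular dropout. A minimal sketch, assuming a plain fully connected classifier; the layer sizes and dropout rate are placeholders, not taken from that repository.

```python
import torch
import torch.nn as nn

def lecun_normal_(layer: nn.Linear) -> None:
    # LeCun normal init (std = 1/sqrt(fan_in)), which the SELU
    # derivation assumes; kaiming_normal_ with a 'linear' gain
    # computes exactly this standard deviation.
    nn.init.kaiming_normal_(layer.weight, mode="fan_in", nonlinearity="linear")
    nn.init.zeros_(layer.bias)

# Placeholder sizes: 784 inputs (e.g. flattened MNIST), 10 classes.
model = nn.Sequential(
    nn.Linear(784, 256), nn.SELU(), nn.AlphaDropout(p=0.05),
    nn.Linear(256, 256), nn.SELU(), nn.AlphaDropout(p=0.05),
    nn.Linear(256, 10),
)
for m in model:
    if isinstance(m, nn.Linear):
        lecun_normal_(m)

# Sanity check: pre-softmax outputs for a random batch.
print(model(torch.randn(32, 784)).shape)  # torch.Size([32, 10])
```

`nn.AlphaDropout` is used here because standard dropout breaks the zero-mean, unit-variance fixed point that makes SELU networks self-normalizing.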