# activation-functions-comparison-pytorch

Comparison of common activation functions on the MNIST dataset using PyTorch.

Activation functions:
- ReLU
- Sigmoid
- Tanh

Best result: ReLU
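Below is a minimal sketch of the kind of comparison this repo describes: a small fully connected classifier trained on MNIST once per activation function, then scored on the test set. The architecture, hyperparameters, and single training epoch are illustrative assumptions, not the repo's actual configuration.

```python
import torch
import torch.nn as nn
import torch.optim as optim
from torchvision import datasets, transforms

def make_model(activation: nn.Module) -> nn.Sequential:
    # Illustrative architecture (assumption): one hidden layer of 128 units.
    return nn.Sequential(
        nn.Flatten(),
        nn.Linear(28 * 28, 128),
        activation,
        nn.Linear(128, 10),
    )

def evaluate(model, loader, device):
    # Fraction of test images classified correctly.
    model.eval()
    correct = 0
    with torch.no_grad():
        for images, labels in loader:
            images, labels = images.to(device), labels.to(device)
            correct += (model(images).argmax(dim=1) == labels).sum().item()
    return correct / len(loader.dataset)

def main():
    device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
    transform = transforms.ToTensor()
    train_set = datasets.MNIST("data", train=True, download=True, transform=transform)
    test_set = datasets.MNIST("data", train=False, download=True, transform=transform)
    train_loader = torch.utils.data.DataLoader(train_set, batch_size=64, shuffle=True)
    test_loader = torch.utils.data.DataLoader(test_set, batch_size=256)

    # Train one model per activation function and compare test accuracy.
    activations = {"ReLU": nn.ReLU(), "Sigmoid": nn.Sigmoid(), "Tanh": nn.Tanh()}
    for name, act in activations.items():
        model = make_model(act).to(device)
        optimizer = optim.Adam(model.parameters(), lr=1e-3)  # lr is an assumption
        criterion = nn.CrossEntropyLoss()
        model.train()
        for images, labels in train_loader:  # single epoch for brevity
            images, labels = images.to(device), labels.to(device)
            optimizer.zero_grad()
            loss = criterion(model(images), labels)
            loss.backward()
            optimizer.step()
        print(f"{name}: test accuracy = {evaluate(model, test_loader, device):.4f}")

if __name__ == "__main__":
    main()
```

With this setup, ReLU typically edges out Sigmoid and Tanh on MNIST, consistent with the result reported above, since it avoids the saturating gradients of the other two.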