# Fashion MNIST Classifier

## Implementation

### Scoring

**With 1 dense layer**

- Overall accuracy: 0.7950

**With 2 dense layers**

- Baseline accuracy: 0.8484
- Best validation accuracy: 0.8640

## How is this neural network structured?

### Input, output and hidden layers

*(figure: diagram of the network's layers)*
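The structure described above can be sketched as a Keras `Sequential` model. This is a sketch, not the project's exact training script: the activations and the dropout rate (0.2) are assumptions, since the README only records the layer types and sizes.

```python
# Sketch of the network: Flatten -> Dense(256) -> Dense(128) -> Dropout -> Dense(10).
# Activations and the 0.2 dropout rate are assumptions not stated in the README.
from tensorflow import keras

model = keras.Sequential([
    keras.Input(shape=(28, 28)),                   # one 28x28 grayscale image
    keras.layers.Flatten(),                        # -> 784-element vector
    keras.layers.Dense(256, activation="relu"),    # hidden layer 1
    keras.layers.Dense(128, activation="relu"),    # hidden layer 2
    keras.layers.Dropout(0.2),                     # regularization, no parameters
    keras.layers.Dense(10, activation="softmax"),  # one output per clothing class
])

print(model.count_params())  # 235146, matching the summary below
```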

### Summary

```text
Model: "sequential"
_________________________________________________________________
Layer (type)                 Output Shape              Param #
=================================================================
flatten (Flatten)            (None, 784)               0
dense (Dense)                (None, 256)               200960
dense_1 (Dense)              (None, 128)               32896
dropout (Dropout)            (None, 128)               0
dense_2 (Dense)              (None, 10)                1290
=================================================================
Total params: 235,146
Trainable params: 235,146
Non-trainable params: 0
```
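The parameter counts in the summary follow from a simple rule: a dense layer with `n_in` inputs and `n_out` units has `n_in * n_out` weights plus `n_out` biases, while Flatten and Dropout add nothing. A quick check:

```python
def dense_params(n_in, n_out):
    # weight matrix (n_in * n_out) plus one bias per output unit
    return n_in * n_out + n_out

counts = [
    dense_params(784, 256),  # dense:   200,960
    dense_params(256, 128),  # dense_1:  32,896
    dense_params(128, 10),   # dense_2:   1,290
]
print(counts, sum(counts))   # Flatten and Dropout contribute 0 -> total 235,146
```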

## Optimizer

**Adam**, a stochastic gradient descent method based on adaptive estimation of first-order and second-order moments of the gradient.

## Loss Function

Sparse categorical crossentropy
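Sparse categorical crossentropy takes the label as an integer class index (rather than a one-hot vector) and penalizes the model by the negative log of the probability it assigned to the true class. A minimal sketch, with a hypothetical softmax output over the 10 Fashion MNIST classes:

```python
import math

def sparse_categorical_crossentropy(y_true, probs):
    # y_true is an integer class index; probs is the softmax output.
    # Loss is the negative log-probability assigned to the true class.
    return -math.log(probs[y_true])

# Hypothetical prediction: the model puts 60% probability on class 4
probs = [0.02, 0.01, 0.05, 0.02, 0.60, 0.05, 0.10, 0.05, 0.05, 0.05]
print(sparse_categorical_crossentropy(4, probs))  # -ln(0.60) ~= 0.51
```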

## References

- Xiao, H. et al. Fashion-MNIST Dataset. GitHub; 2017.
- Sampaio, C. Deep Learning. AOVS Sistemas de Informática S.A.; 2019.
- Chollet, F. et al. Keras. GitHub; 2015.