Building an MLP neural network from scratch, using only numpy. My main goal is to make neural networks less of a "black box" to me, specifically back-propagation.
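To give a rough idea of what that involves, here is a minimal sketch of a single dense layer's forward and backward pass in numpy. The names (`DenseLayer`, `sigmoid`, etc.) are illustrative assumptions, not the actual code in this repo:

```python
import numpy as np

# Minimal sketch of one dense layer with a sigmoid activation.
# All names here are illustrative, not taken from this repo.

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

class DenseLayer:
    def __init__(self, n_in, n_out, seed=0):
        rng = np.random.default_rng(seed)
        self.W = rng.normal(0.0, 0.1, size=(n_out, n_in))
        self.b = np.zeros((n_out, 1))

    def forward(self, a_prev):
        # z = W a + b, then apply the activation element-wise
        self.a_prev = a_prev
        self.z = self.W @ a_prev + self.b
        self.a = sigmoid(self.z)
        return self.a

    def backward(self, grad_a, lr=0.1):
        # Back-propagation: chain rule through the activation,
        # then through the affine transform z = W a_prev + b.
        grad_z = grad_a * self.a * (1.0 - self.a)      # dL/dz (sigmoid derivative)
        grad_W = grad_z @ self.a_prev.T                # dL/dW
        grad_b = grad_z.sum(axis=1, keepdims=True)     # dL/db
        grad_a_prev = self.W.T @ grad_z                # gradient passed to the previous layer
        # Simple gradient-descent update
        self.W -= lr * grad_W
        self.b -= lr * grad_b
        return grad_a_prev
```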
To run:
- Clone the repo
- Run `python testingEnvironment.py`
Epochs (iterations): 400
Average loss after training: ~14.65
Accuracy: // TODO: ADD ACCURACY TESTING
You can read my detailed thoughts here: Notion Page
The following is a visualization of the forward-propagation algorithm for each layer (the equation below summarizes the same step); note that the activation functions might change.
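In equation form, each layer computes roughly the following, where $f$ is whatever activation that layer uses:

$$a^{(l)} = f\left(W^{(l)} a^{(l-1)} + b^{(l)}\right)$$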