
Generic Feed Forward Neural Network

Developed a Python library implementing feed-forward neural networks with support for a configurable number of hidden layers and neurons per layer. Implemented backpropagation from scratch to update the model's weight parameters. The model is trained on the MNIST dataset, a large database of handwritten digits that is commonly used for training various image processing systems. Studied the effects of various hyperparameters, such as learning rate, batch size, activation function, number of hidden layers, and number of neurons per layer, on model performance.
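The source does not include the library's code, so the following is only a minimal sketch of the idea described above: a fully connected network whose layer sizes are passed as a list, with backpropagation written from scratch in NumPy. All names (`FeedForwardNN`, `backprop`) and choices (sigmoid activations, squared-error loss, per-sample updates) are illustrative assumptions, not the actual implementation.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def sigmoid_prime(z):
    s = sigmoid(z)
    return s * (1.0 - s)

class FeedForwardNN:
    """Illustrative sketch: `layer_sizes` gives neurons per layer,
    e.g. [784, 64, 32, 10] for MNIST with two hidden layers."""

    def __init__(self, layer_sizes, lr=0.1, seed=0):
        rng = np.random.default_rng(seed)
        self.lr = lr
        self.weights = [rng.normal(0.0, 0.1, (m, n))
                        for m, n in zip(layer_sizes[:-1], layer_sizes[1:])]
        self.biases = [np.zeros(n) for n in layer_sizes[1:]]

    def forward(self, x):
        """Return pre-activations (zs) and activations, kept for backprop."""
        activations, zs = [x], []
        for W, b in zip(self.weights, self.biases):
            z = activations[-1] @ W + b
            zs.append(z)
            activations.append(sigmoid(z))
        return zs, activations

    def backprop(self, x, y):
        """One gradient step on a single (x, y) pair; y is one-hot."""
        zs, acts = self.forward(x)
        # Output-layer error for squared-error loss with sigmoid output.
        delta = (acts[-1] - y) * sigmoid_prime(zs[-1])
        for l in range(len(self.weights) - 1, -1, -1):
            # Propagate the error through W *before* updating it.
            prev_delta = ((delta @ self.weights[l].T) * sigmoid_prime(zs[l - 1])
                          if l > 0 else None)
            self.weights[l] -= self.lr * np.outer(acts[l], delta)
            self.biases[l] -= self.lr * delta
            delta = prev_delta
```

Mini-batch training would average `np.outer(acts[l], delta)` over the batch before each update; the loop structure stays the same.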

Resources

Chapter 6, Neural Networks - A Classroom Approach by Satish Kumar

Additional References:

  1. Understanding backpropagation
  2. NNets and backpropagation

Alternative Approach (Using Computation Graph):

  1. Backpropagation and Neural Networks using Computation Graph

Extra topics studied for the assignment

  1. Understanding Overfitting and underfitting using a complete example
  2. Overfitting and Underfitting
  3. Regularization
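The source lists regularization as a studied topic without giving details, so here is only a small illustrative sketch of one common form, L2 regularization (weight decay), where the penalty gradient λW is added to the data gradient at each update. The function name and parameters are assumptions for illustration.

```python
import numpy as np

def sgd_step_l2(W, grad, lr=0.1, lam=1e-3):
    """One SGD step with an L2 penalty (lam/2)*||W||^2 added to the loss.

    The penalty contributes lam * W to the gradient, which continually
    shrinks the weights and discourages overfitting.
    """
    return W - lr * (grad + lam * W)
```

With a zero data gradient the update reduces to `W * (1 - lr * lam)`, i.e. the weights decay geometrically toward zero, which is why L2 regularization is often called weight decay.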