
# Neural-Network-from-scratch

## Summary of Training a Neural Network

  1. Randomly initialize the weights.
  2. Implement forward propagation to get h_θ(x⁽ⁱ⁾) for any input x⁽ⁱ⁾.
  3. Implement the cost function.
  4. Implement backpropagation to compute partial derivatives of weights and biases.
  5. Use gradient descent to minimize the cost function with respect to the weights θ.
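
The five steps above can be sketched in a short NumPy script. This is a minimal illustration, not the repository's actual code: it assumes a tiny 2-3-1 network with sigmoid activations, a mean-squared-error cost, and XOR-style training data, and all function names here are my own.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Training data: XOR inputs and targets (illustrative choice).
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
y = np.array([[0.], [1.], [1.], [0.]])

# Step 1: randomly initialize the weights (biases start at zero).
W1 = rng.normal(scale=0.5, size=(2, 3)); b1 = np.zeros(3)
W2 = rng.normal(scale=0.5, size=(3, 1)); b2 = np.zeros(1)

def forward(X):
    # Step 2: forward propagation to get h_theta(x) for each input.
    A1 = sigmoid(X @ W1 + b1)   # hidden-layer activations
    A2 = sigmoid(A1 @ W2 + b2)  # network output h_theta(x)
    return A1, A2

def cost(A2):
    # Step 3: mean-squared-error cost over the training set.
    return np.mean((A2 - y) ** 2)

lr = 1.0
losses = []
for _ in range(5000):
    A1, A2 = forward(X)
    losses.append(cost(A2))
    # Step 4: backpropagation -- chain rule, layer by layer.
    dA2 = 2 * (A2 - y) / len(X)
    dZ2 = dA2 * A2 * (1 - A2)           # sigmoid derivative at output
    dW2 = A1.T @ dZ2; db2 = dZ2.sum(axis=0)
    dZ1 = (dZ2 @ W2.T) * A1 * (1 - A1)  # propagate error to hidden layer
    dW1 = X.T @ dZ1; db1 = dZ1.sum(axis=0)
    # Step 5: gradient descent update on weights and biases.
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

print(f"loss: {losses[0]:.4f} -> {losses[-1]:.4f}")
```

Running this should show the cost decreasing as the loop repeats steps 2-5, which is exactly how the pieces listed above fit together.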

## My Words

I first learned the neural network algorithm from Andrew Ng's free online ML course series. Mr. Ng did a great job explaining every detail, and I came away with a general understanding of neural networks. Nonetheless, I still tried to build a neural network from scratch, and this exercise indeed gave me a much deeper comprehension of how different algorithms, such as backpropagation and gradient descent, come together to form a working neural network.