Simple model
In this model, the inputs play the role of dendrites, and the output is the result of our hypothesis function (the axon).
In other words, the input nodes (input layer) feed into further nodes (hidden layer), whose outputs give the value of the hypothesis function (output layer).
For a fully connected network with forward propagation, we need to understand backpropagation (反向传播法):
Suppose we have some input data $x$, a target $t$, an estimated output $o$, and initial weights $w$. The aim of the neural network is to make the estimated output close to the true output.
Input Layer to Hidden Layer
Note that the weight $w^{(1)}$ sits between the input layer and the hidden layer.
The output of the hidden layer, using the sigmoid activation function, is

$$h = \sigma\left(w^{(1)} x\right), \qquad \sigma(z) = \frac{1}{1 + e^{-z}}.$$
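As a minimal sketch of this step in Python with NumPy (the 2-input, 2-hidden-node shape and all numeric values are illustrative assumptions, not taken from the text):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

x = np.array([0.05, 0.10])       # input data (illustrative values)
W1 = np.array([[0.15, 0.20],     # weights between the input and hidden layers
               [0.25, 0.30]])

net_h = W1 @ x                   # net input of the hidden layer
h = sigmoid(net_h)               # hidden-layer output, roughly [0.507, 0.511]
print(h)
```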
Hidden Layer to Output Layer
Similarly to the above, we get

$$o = \sigma\left(w^{(2)} h\right).$$
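Continuing the sketch under the same assumptions, the hidden-layer output is fed through a second illustrative weight matrix `W2` to produce the network output:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

h = np.array([0.507, 0.511])     # hidden-layer output from the previous sketch
W2 = np.array([[0.40, 0.45],     # weights between the hidden and output layers
               [0.50, 0.55]])

net_o = W2 @ h                   # net input of the output layer
o = sigmoid(net_o)               # network output (the hypothesis)
print(o)
```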
With the initial weights, the estimated output is usually far from the true output, so we need to use backpropagation to update the weights.
BackPropagation
Find the total error of the output:

$$E_{total} = \sum \frac{1}{2}(target - output)^2$$

For example, for the two output nodes in the figure above, $E_{total}$ can also be written as

$$E_{total} = E_{o_1} + E_{o_2} = \frac{1}{2}(t_1 - o_1)^2 + \frac{1}{2}(t_2 - o_2)^2.$$
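As a quick numeric sketch of this formula (the target and output values below are made up for illustration):

```python
import numpy as np

t = np.array([0.01, 0.99])        # targets t1, t2 (illustrative values)
o = np.array([0.61, 0.63])        # network outputs o1, o2 (illustrative values)

E = 0.5 * (t - o) ** 2            # per-node errors E_o1 and E_o2
E_total = E.sum()                 # E_total = E_o1 + E_o2
print(E, E_total)
```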
To update a weight, we want to know how much influence that weight has on the total error. In mathematical terms, this is the partial derivative, which the chain rule expands into local factors, for example for the output-layer weight $w^{(2)}$:

$$\frac{\partial E_{total}}{\partial w^{(2)}} = \frac{\partial E_{total}}{\partial o} \cdot \frac{\partial o}{\partial net_o} \cdot \frac{\partial net_o}{\partial w^{(2)}},$$

where $net_o = w^{(2)} h$ is the net input of the output node. The weight is then updated by gradient descent: $w \leftarrow w - \eta \, \frac{\partial E_{total}}{\partial w}$, where $\eta$ is the learning rate.
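Here is a minimal sketch of computing one such partial derivative for an output-layer weight, assuming the sigmoid output and the squared-error term above; the single-node setup, the numeric values, and the learning rate are illustrative assumptions:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# One output node: net_o = w * h, o = sigmoid(net_o).
h, w, t = 0.51, 0.40, 0.01        # hidden output, weight, target (illustrative)
net_o = w * h
o = sigmoid(net_o)

# Chain rule: dE/dw = (dE/do) * (do/dnet_o) * (dnet_o/dw)
dE_do = o - t                     # derivative of 0.5 * (t - o)**2 w.r.t. o
do_dnet = o * (1.0 - o)           # derivative of the sigmoid
dnet_dw = h                       # derivative of net_o = w * h w.r.t. w
dE_dw = dE_do * do_dnet * dnet_dw

eta = 0.5                         # learning rate (illustrative)
w_new = w - eta * dE_dw           # gradient-descent update of the weight
print(dE_dw, w_new)
```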