In a multi-layered neural network, the weights and neural connections can be treated as matrices: the neurons of one layer form the columns, and the neurons of the adjacent layer form the rows of the matrix. The figure below shows a network and its parameter matrices.

The meanings of vectors and matrices above:

*n^{in}_{l}*: the input of the *l*-th layer.

*n^{out}_{l}*: the output of the *l*-th layer. The input vector of the neural network is *n^{out}_{0}* and the output vector is *n^{out}_{L}*.

*b_{l}* (l=1…L): the bias (threshold) vector of the *l*-th layer.

*W_{l}*: the weight parameter matrix between layers *l* and *(l-1)*.

*n^{out}_{l}=f(n^{in}_{l})*: the activation function *f* of the neurons.

For example, the weight matrix of the 3^{rd} layer can be expressed as below:
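As a concrete sketch of that layout (the layer sizes here are hypothetical, not from the post): *W_{3}* has one row per neuron of layer 3 and one column per neuron of layer 2, so entry *w_{ij}* weights the connection from neuron *j* of layer 2 to neuron *i* of layer 3.

```python
import numpy as np

# Hypothetical layer sizes: layer 2 has 4 neurons, layer 3 has 3 neurons.
n2, n3 = 4, 3

# W3[i, j] is the weight from neuron j of layer 2 to neuron i of layer 3,
# so the matrix has shape (neurons in layer 3, neurons in layer 2).
rng = np.random.default_rng(0)
W3 = rng.standard_normal((n3, n2))

print(W3.shape)  # (3, 4)
```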

Using matrices for forward propagation:
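A minimal sketch of the matrix-form forward pass, using the notation above: *n^{in}_{l} = W_{l} n^{out}_{l-1} + b_{l}* and *n^{out}_{l} = f(n^{in}_{l})*. The sigmoid activation and the layer sizes are assumptions for illustration; the post does not fix a particular *f*.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Hypothetical 2-layer network: 2 inputs, 3 hidden neurons, 1 output neuron.
# Index 0 is unused so W[l], b[l] match the layer numbering l = 1..L.
rng = np.random.default_rng(1)
W = [None, rng.standard_normal((3, 2)), rng.standard_normal((1, 3))]
b = [None, rng.standard_normal(3), rng.standard_normal(1)]

def forward(x):
    """n_in_l = W_l @ n_out_{l-1} + b_l;  n_out_l = f(n_in_l)."""
    n_out = x                       # n_out_0 is the network input
    for l in range(1, len(W)):
        n_in = W[l] @ n_out + b[l]
        n_out = sigmoid(n_in)
    return n_out                    # n_out_L is the network output

y = forward(np.array([0.5, -0.2]))
print(y.shape)  # (1,)
```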

The backpropagation algorithm:

The vector *e* denotes the error of the current layer, and *t* is the current target vector. After determining the errors on all layers, the gradients can be computed in one single forward-propagation step:
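The two phases can be sketched for a single pattern as below. This assumes sigmoid activations and a squared-error cost (so *e_L = (n^{out}_{L} − t) ⊙ f′(n^{in}_{L})*), neither of which the post fixes; the layer sizes are also hypothetical.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

rng = np.random.default_rng(2)
sizes = [2, 3, 1]                   # hypothetical layer sizes, l = 0..L
L = len(sizes) - 1
W = [None] + [rng.standard_normal((sizes[l], sizes[l - 1]))
              for l in range(1, L + 1)]
b = [None] + [rng.standard_normal(sizes[l]) for l in range(1, L + 1)]

def backprop(x, t):
    # Forward pass, storing every layer output n_out_l.
    n_out = [x]
    for l in range(1, L + 1):
        n_out.append(sigmoid(W[l] @ n_out[-1] + b[l]))
    # Output-layer error; sigmoid' = n_out * (1 - n_out).
    e = [None] * (L + 1)
    e[L] = (n_out[L] - t) * n_out[L] * (1 - n_out[L])
    # Propagate the error backwards: e_l = (W_{l+1}^T e_{l+1}) * f'(n_in_l).
    for l in range(L - 1, 0, -1):
        e[l] = (W[l + 1].T @ e[l + 1]) * n_out[l] * (1 - n_out[l])
    # Gradients from the stored outputs: dW_l = e_l n_out_{l-1}^T, db_l = e_l.
    dW = [None] + [np.outer(e[l], n_out[l - 1]) for l in range(1, L + 1)]
    db = [None] + [e[l] for l in range(1, L + 1)]
    return dW, db

dW, db = backprop(np.array([0.5, -0.2]), np.array([1.0]))
print(dW[1].shape, dW[2].shape)  # (3, 2) (1, 3)
```

Note that each gradient matrix *dW_{l}* has the same shape as *W_{l}*, which is what makes the update step a plain matrix subtraction.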

where *P* denotes the number of training patterns. The algorithm above must be executed for all patterns.
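The per-pattern loop and the 1/*P* averaging can be sketched as below. The single linear layer with squared error stands in for the full network purely to keep the example short; `pattern_gradient` is a hypothetical helper, not from the post.

```python
import numpy as np

# Hypothetical per-pattern gradient: one linear layer, squared error,
# dE/dW = (W x - t) x^T.  Stands in for the full backprop step.
def pattern_gradient(x, t, W):
    return np.outer(W @ x - t, x)

rng = np.random.default_rng(3)
P = 5                                   # number of training patterns
X = rng.standard_normal((P, 2))         # inputs, one pattern per row
T = rng.standard_normal((P, 1))         # targets
W = rng.standard_normal((1, 2))

# Batch gradient: run the algorithm for all P patterns and average.
dW = sum(pattern_gradient(X[p], T[p], W) for p in range(P)) / P
W = W - 0.1 * dW                        # one gradient-descent update

print(dW.shape)  # (1, 2)
```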

