The matrix form of the Backpropagation algorithm

In a multi-layered neural network the weights and connections can be treated as matrices: the neurons of one layer index the columns, and the neurons of the next layer index the rows of the weight matrix. The figure below shows a network and its parameter matrices.

[Figure: Matrices used in modelling a multi-layered neural network]

The meanings of the vectors and matrices above:

n_in_l: the input vector of the l-th layer.
n_out_l: the output vector of the l-th layer. The input vector of the neural network is n_out_0, and its output vector is n_out_L (l = 1…L).
b_l: the bias (threshold) vector of the l-th layer.
W_l: the weight matrix between layers l−1 and l.
n_out_l = f(n_in_l): the activation function of the neurons, applied element-wise.

For example, the weight matrix of the 3rd layer, W_3, has one row per neuron of layer 3 and one column per neuron of layer 2; its element w_ij is the weight of the connection from neuron j of layer 2 to neuron i of layer 3.
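As a concrete sketch of the shapes involved (the layer sizes here, 4 neurons in layer 2 and 3 in layer 3, are chosen only for illustration, using NumPy):

```python
import numpy as np

# Hypothetical sizes: layer 2 has 4 neurons, layer 3 has 3 neurons.
rng = np.random.default_rng(42)
W3 = rng.standard_normal((3, 4))   # row i holds the weights into neuron i of layer 3
b3 = rng.standard_normal(3)        # one bias per neuron of layer 3
n_out_2 = rng.standard_normal(4)   # output of layer 2

n_in_3 = W3 @ n_out_2 + b3         # matrix-vector product gives the layer 3 input
print(n_in_3.shape)                # (3,)
```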

Using matrices, forward propagation for l = 1…L is:

n_in_l = W_l · n_out_{l−1} + b_l
n_out_l = f(n_in_l)
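The forward pass above can be sketched in NumPy as follows (the 3-4-2 layer sizes and the sigmoid activation are assumptions for illustration, not fixed by the text):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Assumed architecture: 3 inputs, 4 hidden neurons, 2 outputs.
rng = np.random.default_rng(0)
W = [rng.standard_normal((4, 3)), rng.standard_normal((2, 4))]  # W_1, W_2
b = [rng.standard_normal(4), rng.standard_normal(2)]            # b_1, b_2

def forward(x):
    """Propagate input x through every layer; return all n_in and n_out."""
    n_in, n_out = [], [x]          # n_out[0] is the network input
    for W_l, b_l in zip(W, b):
        z = W_l @ n_out[-1] + b_l  # n_in_l = W_l * n_out_{l-1} + b_l
        n_in.append(z)
        n_out.append(sigmoid(z))   # n_out_l = f(n_in_l)
    return n_in, n_out

n_in, n_out = forward(np.array([0.5, -0.2, 0.1]))
print(n_out[-1].shape)             # (2,)
```

Keeping every n_in_l and n_out_l around matters: the backward pass below reuses them.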

The backpropagation algorithm:

e_L = (n_out_L − t) ⊙ f'(n_in_L)
e_l = (W_{l+1}^T · e_{l+1}) ⊙ f'(n_in_l),  l = L−1, …, 1

The vector e_l is the error of the l-th layer, t is the current target vector, ⊙ denotes element-wise multiplication, and f' is the derivative of the activation function. After determining the errors on all layers, the gradients can be computed in a single sweep over the layers:

∂E/∂W_l = (1/P) · Σ_{p=1…P} e_l · (n_out_{l−1})^T
∂E/∂b_l = (1/P) · Σ_{p=1…P} e_l

where P is the number of training patterns and the sums run over the errors and outputs obtained for each pattern. The algorithm above must be executed for all patterns.
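Putting the two passes together, a hedged NumPy sketch of the whole batch gradient computation (the sigmoid activation, the 3-4-2 architecture, and the squared-error cost are assumptions made for illustration):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def sigmoid_prime(x):
    s = sigmoid(x)
    return s * (1.0 - s)

def gradients(W, b, patterns, targets):
    """Average the gradients dE/dW_l and dE/db_l over P training patterns."""
    P = len(patterns)
    dW = [np.zeros_like(W_l) for W_l in W]
    db = [np.zeros_like(b_l) for b_l in b]
    for x, t in zip(patterns, targets):
        # Forward pass, storing n_in and n_out for every layer.
        n_in, n_out = [], [x]
        for W_l, b_l in zip(W, b):
            z = W_l @ n_out[-1] + b_l
            n_in.append(z)
            n_out.append(sigmoid(z))
        # Output-layer error, then propagate it backwards layer by layer.
        e = [(n_out[-1] - t) * sigmoid_prime(n_in[-1])]
        for l in range(len(W) - 1, 0, -1):
            e.insert(0, (W[l].T @ e[0]) * sigmoid_prime(n_in[l - 1]))
        # Accumulate gradients: outer product of each layer error with the
        # previous layer's output (e_l * n_out_{l-1}^T), averaged over P.
        for l in range(len(W)):
            dW[l] += np.outer(e[l], n_out[l]) / P
            db[l] += e[l] / P
    return dW, db

# Hypothetical 3-4-2 network and two random training patterns.
rng = np.random.default_rng(0)
W = [rng.standard_normal((4, 3)), rng.standard_normal((2, 4))]
b = [rng.standard_normal(4), rng.standard_normal(2)]
xs = [rng.standard_normal(3) for _ in range(2)]
ts = [rng.standard_normal(2) for _ in range(2)]
dW, db = gradients(W, b, xs, ts)
print([g.shape for g in dW])  # [(4, 3), (2, 4)]
```

With this sign convention (error e_L = n_out_L − t), a gradient-descent update subtracts the gradients: W_l ← W_l − η · dW_l, b_l ← b_l − η · db_l for some learning rate η.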
