Artificial Intelligence Fight II. – introducing parallel processing

Dear Reader, I have been working on a multithreaded implementation of the Backpropagation algorithm. The most computationally intensive parts of a learning iteration are the forward-propagation and the backpropagation steps, which are used in all algorithms to determine the gradients. The main difference between these algorithms is how these gradients are used to update the […]
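
To make that split concrete, below is a minimal C++11 sketch of this kind of data parallelism: each thread computes the gradients for its own slice of the batch, and only the final update rule (plain gradient descent here, purely for illustration) is algorithm-specific. The single-weight model, thread count, and accumulation step are made up for the example and are not taken from the library.

// Illustrative only: data-parallel gradient accumulation with std::thread.
// Each worker computes partial gradients for its slice of the batch; the
// partial results are summed, then a single update is applied.
// (Compile with e.g. g++ -std=c++11 -pthread on gcc/clang.)
#include <vector>
#include <thread>
#include <numeric>
#include <cstddef>

// Hypothetical per-sample gradient of a one-weight model, for illustration:
// E = 0.5*(w*x - t)^2  =>  dE/dw = (w*x - t)*x
static double sampleGradient(double w, double x, double t)
{
    return (w * x - t) * x;
}

int main()
{
    const std::vector<double> inputs  = {0.1, 0.4, 0.35, 0.8, 0.6, 0.2, 0.9, 0.5};
    const std::vector<double> targets = {0.2, 0.8, 0.70, 1.6, 1.2, 0.4, 1.8, 1.0};
    double w = 0.0;                        // single weight, kept scalar for brevity
    const double eta = 0.1;                // learning rate
    const std::size_t nThreads = 4;

    for (int epoch = 0; epoch < 100; ++epoch)
    {
        std::vector<double> partial(nThreads, 0.0);
        std::vector<std::thread> workers;

        // Forward + backward pass on disjoint slices of the batch.
        for (std::size_t k = 0; k < nThreads; ++k)
        {
            workers.emplace_back([&, k]()
            {
                for (std::size_t i = k; i < inputs.size(); i += nThreads)
                    partial[k] += sampleGradient(w, inputs[i], targets[i]);
            });
        }
        for (auto& th : workers) th.join();

        // The update rule (plain gradient descent here) is the part that
        // differs between training algorithms; the gradient computation above does not.
        const double grad = std::accumulate(partial.begin(), partial.end(), 0.0);
        w -= eta * grad / inputs.size();
    }
    return 0;
}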

C++11 DLL library for the Matrix-RProp algorithm

Dear Reader, This project is currently in progress, but I thought I would publish it anyway. I have created a modern C++ DLL from the code extracted from the Borland C++ Builder project, as my 15-year-old code is hardly going to be useful for anyone these days. https://github.com/bulyaki/NNTrainerLib What this library currently does […]

The matrix form of the RProp algorithm

Since the RProp algorithm uses if/else conditional statements while determining update values, some special helper matrix functions and helper matrices must be introduced. These functions will allow us to express the conditional statements more elegantly, using only matrix operations. Some matrices needed to make the above functions work: D: decision matrix, M: meta-gradient matrix, U: update […]
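
As a rough sketch of that idea, the if/else of standard RProp can be folded into elementwise arithmetic: a decision value d = sign(gPrev * gCur) yields indicator values that select the growth or shrink factor without branching. The names and constants below only loosely mirror the post's D and U matrices, and the sketch covers just the step-size part of the rule, not the weight update itself.

// Illustrative sketch of replacing RProp's if/else with elementwise arithmetic.
// D is the sign of the product of consecutive gradients; indicator values
// built from D select the growth/shrink factor without any branch.
#include <vector>
#include <algorithm>
#include <cstdio>

int main()
{
    // Previous and current gradients for three weights (made-up numbers).
    std::vector<double> gPrev = { 0.30, -0.20, 0.10 };
    std::vector<double> gCur  = { 0.25,  0.15, 0.00 };
    std::vector<double> step  = { 0.05,  0.05, 0.05 };   // per-weight update values

    const double etaPlus = 1.2, etaMinus = 0.5;
    const double stepMax = 50.0, stepMin = 1e-6;

    for (std::size_t i = 0; i < step.size(); ++i)
    {
        // Decision value: +1 same sign, -1 sign change, 0 if either gradient is 0.
        const double prod = gPrev[i] * gCur[i];
        const double d = (prod > 0.0) - (prod < 0.0);     // sign(prod)

        // Indicator values derived from d (no if/else needed):
        const double pos  = d * (d + 1.0) * 0.5;          // 1 when d == +1
        const double neg  = d * (d - 1.0) * 0.5;          // 1 when d == -1
        const double zero = 1.0 - d * d;                  // 1 when d ==  0

        // Elementwise growth/shrink factor, applied as a single expression.
        const double factor = etaPlus * pos + etaMinus * neg + 1.0 * zero;
        step[i] = std::min(stepMax, std::max(stepMin, step[i] * factor));

        std::printf("w%zu: d=%+.0f  new step=%.6f\n", i, d, step[i]);
    }
    return 0;
}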

The matrix form of the Backpropagation algorithm

In a multi-layered neural network, the weights and neural connections can be treated as matrices: the neurons of one layer can form the columns, and the neurons of the other layer can form the rows of the matrix. The figure below shows a network and its parameter matrices. The meanings of vectors and matrices above: ninl […]
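
A minimal sketch of that matrix view, assuming each row of W holds the incoming weights of one neuron in the next layer, so a forward pass through one layer is simply y = f(W*x + b). The dimensions and the sigmoid activation are illustrative, not the post's actual network.

// Illustrative: one layer of a feed-forward network as a matrix-vector product.
// Rows of W correspond to the neurons of the next layer, columns to the
// neurons of the previous layer, so the forward pass is y = f(W*x + b).
#include <vector>
#include <cmath>
#include <cstdio>

static double sigmoid(double z) { return 1.0 / (1.0 + std::exp(-z)); }

int main()
{
    // 2 inputs -> 3 hidden neurons: W is 3x2, b has 3 entries.
    const std::vector<std::vector<double>> W = {
        {  0.5, -0.2 },
        {  0.1,  0.8 },
        { -0.3,  0.4 }
    };
    const std::vector<double> b = { 0.0, 0.1, -0.1 };
    const std::vector<double> x = { 1.0, 0.5 };

    std::vector<double> y(W.size());
    for (std::size_t row = 0; row < W.size(); ++row)
    {
        double z = b[row];
        for (std::size_t col = 0; col < W[row].size(); ++col)
            z += W[row][col] * x[col];                    // one row = one neuron
        y[row] = sigmoid(z);
    }

    for (std::size_t row = 0; row < y.size(); ++row)
        std::printf("neuron %zu: %.4f\n", row, y[row]);
    return 0;
}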