Universal Bidirectional Activation-based Learning (UBAL) is a novel learning algorithm for artificial neural networks, inspired by the workings of biological neurons. Instead of propagating error derivatives, it updates its weights using local activation values. It extends contrastive Hebbian learning, which uses presynaptic and postsynaptic activations for the updates. It is also bidirectional, using two separate weight matrices, one for each direction of activation flow. The ideas behind UBAL come from its predecessor algorithms, recirculation and GeneRec. UBAL had never before been implemented in a convolutional version, which was the main aim of this master's thesis. Convolutional neural networks are usually better suited for processing images, so our hypothesis was that a convolutional UBAL would also yield better results on the image classification task than the current, fully connected version. We implemented convolutional UBAL and ran experiments in the programming language Python, namely with the PyTorch library. We tested the implementation on the well-known MNIST dataset. The results show that we were reasonably successful, with our best classification accuracy of 89.678% compared to 91.05% for a backpropagation-trained neural network. Apart from convolutional UBAL, we explored the influence of target encoding on the performance of the UBAL network: instead of one-hot encoding in the output neurons, we used simplified binary images of digits. This approach did not improve the results.
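To illustrate the core idea of activation-based, bidirectional learning with two weight matrices, the following is a minimal sketch in Python/NumPy. It is an illustrative assumption, not the actual UBAL rule (which involves additional prediction and echo terms): a single layer pair where both weight updates use only local pre- and postsynaptic activations, with no error derivatives.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy setup: two separate weight matrices, one per direction.
n_in, n_out = 4, 3
W_fwd = rng.normal(scale=0.1, size=(n_in, n_out))   # input -> output
W_bwd = rng.normal(scale=0.1, size=(n_out, n_in))   # output -> input

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def local_update(x, t, lr=0.1):
    """One contrastive-Hebbian-style step using only local activations.

    Simplified for illustration; the real UBAL update differs.
    """
    global W_fwd, W_bwd
    y = sigmoid(x @ W_fwd)        # forward prediction of the output
    x_back = sigmoid(t @ W_bwd)   # backward prediction of the input
    # Updates are outer products of local activations (Hebbian-like),
    # contrasting the clamped target/input with the network's prediction.
    W_fwd += lr * np.outer(x, t - y)
    W_bwd += lr * np.outer(t, x - x_back)
    return y

x = np.array([1.0, 0.0, 1.0, 0.0])   # toy input pattern
t = np.array([0.0, 1.0, 0.0])        # toy one-hot target
for _ in range(200):
    y = local_update(x, t)
```

After repeated presentations, the forward prediction moves toward the clamped target even though no gradient of a global loss is ever computed, which is the property that distinguishes this family of algorithms from backpropagation.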