Download link – Click Here
COPYRIGHT AND DISCLAIMER – I impose no restrictions on copying, modifying or using this code in any way or for any purpose. Having said that, I also disclaim any liability arising from its use.
A perceptron-based multilayer neural network is one of the closest emulations we have of biological learning. We use samples to ‘train’ the network to recognize features of the input that can be used to classify that input to a certain output. Each time input is fed to the network during a learning cycle, the network moves its weight vectors closer and closer to the solution space. After a good amount of training and verification, the network can classify inputs that were not even used to train it. In other words, the network can ‘generalize’, for lack of a better word. These networks and their variations have been used to classify many non-linear problems with good accuracy, particularly in the Optical Character Recognition domain.
Personally, I find the biggest learning curve on this topic to be the mathematics behind it. It is frustrating to have to decode mathematical notation that seems to vary from author to author. Especially if you are a newcomer to mathematicians’ circles and the Greek symbols keep confusing you (so that you have to keep looking back at what they meant), you can expect to spend a fair amount of time decoding the process. Books on neural networks seem to focus heavily on providing proofs and then laying out these mathematical equations rather than providing a step-by-step worked-out guide.
So, I wrote a simple MATLAB class file that can be used to implement a three-layer (input, hidden and output layers) gradient descent back-propagation neural network. It makes heavy use of matrix multiplications (the reason we love MATLAB so much), avoiding ‘for’ loops wherever possible. Sure, MATLAB’s Neural Network Toolbox has a good GUI and is probably GPU-accelerated, but my aim here is to help you understand how things work under the hood and cut down the learning curve. With this class you can save trained weights, calculate cycle errors, change the activation function and, of course, do the usual: train the network and generate output from input.
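To make the matrix-multiplication idea concrete, here is a minimal sketch of one forward and backward pass for a three-layer network with a bipolar sigmoid activation. This is an illustration of the general technique, not the class’s actual code; the variable names, the learning rate and the omission of bias terms are my assumptions.

```matlab
% Sketch of one gradient-descent back-propagation step in matrix form.
% Bipolar sigmoid activation with output in (-1, 1), and its derivative
% expressed in terms of the activation output y.
f2      = @(v) 2 ./ (1 + exp(-v)) - 1;
f2_dash = @(y) 0.5 * (1 - y.^2);

eta = 0.1;                 % learning rate (assumed value)
x   = [0.5; -0.3];         % one input sample as a column vector
t   = 1;                   % target output for this sample
W1  = 0.1 * randn(4, 2);   % input-to-hidden weights (4 hidden units)
W2  = 0.1 * randn(1, 4);   % hidden-to-output weights (1 output unit)

% Forward pass: plain matrix multiplications, no 'for' loops
h = f2(W1 * x);            % hidden-layer output
y = f2(W2 * h);            % network output

% Backward pass: gradient descent on the squared error (t - y)^2 / 2
delta2 = (t - y) .* f2_dash(y);           % output-layer error term
delta1 = (W2' * delta2) .* f2_dash(h);    % error back-propagated to hidden layer
W2 = W2 + eta * delta2 * h';              % weight updates
W1 = W1 + eta * delta1 * x';
```

Because the deltas and updates are whole-matrix operations, the same code handles any layer sizes without inner loops; biases can be folded in by appending a constant 1 to each layer’s output.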
Here are some things you might want to know about the class:
- Class name is ‘NeuralNet’
- To calculate the cycle error, be sure to call the ‘StartNewCycle’ function before starting a new training cycle. At the end of each cycle, be sure to call the ‘EndCycle’ function to get the calculated error for that cycle.
- The class in its original form uses a bipolar activation function. So, if you decide to use any other activation function, change the ‘f2’ private function handle and its derivative, represented by ‘f2_dash’.
- To save the learned weights, just use MATLAB’s ‘save’ function with the object created from this class. To load them back, just call ‘load’. Refer to the MATLAB documentation for more information.
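Putting the points above together, a training loop might look like the following sketch. Only ‘NeuralNet’, ‘StartNewCycle’, ‘EndCycle’, ‘save’ and ‘load’ come from the description above; the constructor arguments and the ‘Train’ method name are my assumptions about how such a class would typically be called.

```matlab
% Hypothetical usage of the NeuralNet class; constructor and Train
% signatures are assumptions, not the class's documented interface.
net = NeuralNet(2, 4, 1);            % assumed: input, hidden, output sizes

for cycle = 1:1000
    net.StartNewCycle();             % reset the cycle-error accumulator
    for k = 1:size(inputs, 2)        % present each training sample
        net.Train(inputs(:, k), targets(:, k));   % assumed method name
    end
    err = net.EndCycle();            % calculated error for this cycle
end

save('trained_net.mat', 'net');      % persist the learned weights
load('trained_net.mat');             % restore the trained object later
```

Saving the whole object with ‘save’ works because MATLAB serializes class properties, including the weight matrices, into the MAT-file.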