Have you ever wondered how neural networks work? The best way to understand them is to program one yourself. Mine is written in Python and is inspired by Andrew Ng's Machine Learning course on Coursera.

The architecture of the network is flexible (number of layers, input and output units). The neural network classifies using regularized logistic regression. The gradients are computed with backpropagation and are checked numerically. The network is optimized with the SciPy nonlinear conjugate gradient algorithm. When several regularization parameters are used, the optimization is parallelized. Finally, learning curves are computed to evaluate the performance of the neural network.
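To make the gradient-checking and optimization steps concrete, here is a minimal sketch of the two ideas: a central-difference numerical gradient compared against an analytic gradient, and SciPy's `fmin_cg` minimizing the cost. The quadratic cost below is a stand-in with a known gradient, not the regularized logistic-regression cost from my code, so the function names here are illustrative only.

```python
import numpy as np
from scipy.optimize import fmin_cg

def cost(theta):
    # Illustrative cost with a known minimum at theta = 0
    # (a stand-in for the neural network's regularized cost).
    return np.sum(theta ** 2)

def analytic_grad(theta):
    # Exact gradient of the cost above (backprop plays this role
    # for the real network).
    return 2 * theta

def numerical_gradient(cost_fn, theta, eps=1e-4):
    # Central differences: perturb each parameter by +/- eps and
    # take the slope. Slow, but a good sanity check for backprop.
    grad = np.zeros_like(theta)
    for i in range(theta.size):
        step = np.zeros_like(theta)
        step[i] = eps
        grad[i] = (cost_fn(theta + step) - cost_fn(theta - step)) / (2 * eps)
    return grad

theta0 = np.array([1.0, -2.0, 3.0])

# 1) Check the analytic gradient numerically.
print(np.allclose(analytic_grad(theta0),
                  numerical_gradient(cost, theta0)))  # True

# 2) Optimize with nonlinear conjugate gradients.
theta_opt = fmin_cg(cost, theta0, fprime=analytic_grad, disp=False)
print(np.allclose(theta_opt, 0.0, atol=1e-4))  # converges to the minimum
```

In the real network, `cost` and `analytic_grad` are the regularized cost and the backpropagation gradients, flattened into a single parameter vector so SciPy can handle them.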

The code is available on GitHub. It's not thoroughly commented, but you can at least get an idea of how things work by taking a look at it.

If you are interested in using more reliable neural networks, I recommend Lasagne (use the wrapper nolearn, or follow the instructions here).

For a nice introduction to (deep) neural nets, check this thorough deep learning tutorial.
