The network architecture is flexible: the number of layers and of input and output units can be configured. Classification uses regularized logistic regression. Gradients are computed with backpropagation and verified against a numerical approximation. The network is optimized with SciPy's nonlinear conjugate gradient algorithm, and when several regularization parameters are tried, the optimizations run in parallel. Finally, learning curves are computed to evaluate the network's performance.
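The core ideas above (regularized logistic cost, backpropagation gradients checked numerically, then SciPy conjugate gradient optimization) can be sketched roughly as follows. This is a minimal illustrative example, not the project's actual code: the network here has a single hidden layer, and all names, shapes, and hyperparameters are assumptions for demonstration.

```python
import numpy as np
from scipy.optimize import minimize

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def unroll(theta, n_in, n_hid, n_out):
    # Split the flat parameter vector into the two weight matrices.
    cut = n_hid * (n_in + 1)
    W1 = theta[:cut].reshape(n_hid, n_in + 1)
    W2 = theta[cut:].reshape(n_out, n_hid + 1)
    return W1, W2

def cost_grad(theta, X, Y, lam, n_in, n_hid, n_out):
    m = X.shape[0]
    W1, W2 = unroll(theta, n_in, n_hid, n_out)
    # Forward pass with bias units prepended to each layer's activations.
    A1 = np.hstack([np.ones((m, 1)), X])
    Z2 = A1 @ W1.T
    A2 = np.hstack([np.ones((m, 1)), sigmoid(Z2)])
    H = sigmoid(A2 @ W2.T)
    # Regularized logistic (cross-entropy) cost; bias columns are not penalized.
    J = -np.sum(Y * np.log(H) + (1 - Y) * np.log(1 - H)) / m
    J += lam / (2 * m) * (np.sum(W1[:, 1:] ** 2) + np.sum(W2[:, 1:] ** 2))
    # Backpropagation of the output error through the hidden layer.
    D3 = H - Y
    D2 = (D3 @ W2[:, 1:]) * sigmoid(Z2) * (1 - sigmoid(Z2))
    G1 = D2.T @ A1 / m
    G2 = D3.T @ A2 / m
    G1[:, 1:] += lam / m * W1[:, 1:]
    G2[:, 1:] += lam / m * W2[:, 1:]
    return J, np.concatenate([G1.ravel(), G2.ravel()])

def numerical_check(theta, args, eps=1e-5):
    # Compare backprop gradients against central finite differences.
    _, grad = cost_grad(theta, *args)
    num = np.zeros_like(theta)
    for i in range(theta.size):
        e = np.zeros_like(theta)
        e[i] = eps
        num[i] = (cost_grad(theta + e, *args)[0]
                  - cost_grad(theta - e, *args)[0]) / (2 * eps)
    return np.max(np.abs(num - grad))

# Toy data, purely for illustration.
rng = np.random.default_rng(0)
n_in, n_hid, n_out, m = 3, 4, 2, 20
X = rng.normal(size=(m, n_in))
Y = np.eye(n_out)[rng.integers(n_out, size=m)]
theta0 = rng.normal(scale=0.1, size=n_hid * (n_in + 1) + n_out * (n_hid + 1))
args = (X, Y, 1.0, n_in, n_hid, n_out)

print("max gradient error:", numerical_check(theta0, args))
# Optimize with SciPy's nonlinear conjugate gradient method.
res = minimize(cost_grad, theta0, args=args, jac=True, method="CG",
               options={"maxiter": 50})
print("final cost:", res.fun)
```

To parallelize over several regularization parameters, one could map this optimization over a list of `lam` values with `multiprocessing.Pool`; learning curves would then repeat the fit on growing subsets of the training data and plot training versus validation cost.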
If you are interested in a more robust neural network library, I recommend Lasagne (use the nolearn wrapper, or follow the instructions here).
For a nice introduction to (deep) neural nets, check this thorough deep learning tutorial.