It implements stochastic gradient descent training for a network built from a convolution layer, a perceptron layer (fully connected layer), and a softmax output layer; a sketch of this stack follows below. The convolution layer consists of sublayers for linear convolution, nonlinear activation, and max pooling. The network can be composed with more layers. Demo cases are provided for MNIST and CIFAR-10.
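A minimal sketch, not the repo's actual code, of the forward pass through the layer stack described above: linear convolution, nonlinear activation, max pooling, a fully connected layer, and a softmax output. All names, shapes, and the choice of ReLU as the activation are illustrative assumptions.

```python
import numpy as np
from scipy.signal import correlate2d

def relu(x):
    return np.maximum(0.0, x)

def max_pool(x, size=2):
    """Non-overlapping size x size max pooling over (channels, H, W)."""
    c, h, w = x.shape
    x = x[:, :h - h % size, :w - w % size]
    return x.reshape(c, h // size, size, w // size, size).max(axis=(2, 4))

def softmax(z):
    e = np.exp(z - z.max())            # shift for numerical stability
    return e / e.sum()

def forward(image, kernels, W, b):
    """image: (in_ch, H, W); kernels: (out_ch, in_ch, kH, kW)."""
    out_ch = kernels.shape[0]
    # Convolution sublayer: sum correlations over input channels.
    conv = np.stack([
        sum(correlate2d(image[i], kernels[o, i], mode="valid")
            for i in range(image.shape[0]))
        for o in range(out_ch)
    ])
    hidden = max_pool(relu(conv))                 # activation + pooling sublayers
    return softmax(W @ hidden.reshape(-1) + b)    # perceptron + softmax layers

# Example: one MNIST-sized grayscale image through a tiny network.
rng = np.random.default_rng(0)
img = rng.standard_normal((1, 28, 28))
K = rng.standard_normal((8, 1, 5, 5)) * 0.1       # 8 filters of size 5x5
# valid conv: 28 - 5 + 1 = 24; pooled: 12; flattened: 8 * 12 * 12 = 1152
W = rng.standard_normal((10, 8 * 12 * 12)) * 0.01
b = np.zeros(10)
print(forward(img, K, W, b))                      # 10 class probabilities
```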
The convolution layer uses a four-level nested for-loop for the convolution and gradient computations, as illustrated below. It could be further optimized, but that is probably unnecessary for a simple tutorial.
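An illustrative reimplementation, not the repo's code, of a four-level loop convolution of the kind described above, together with the matching kernel-gradient computation. The four levels are assumed to be output channel, input channel, and the two spatial output coordinates.

```python
import numpy as np

def conv2d_loops(x, kernels):
    """x: (in_ch, H, W); kernels: (out_ch, in_ch, kH, kW) -> (out_ch, H', W')."""
    in_ch, h, w = x.shape
    out_ch, _, kh, kw = kernels.shape
    out = np.zeros((out_ch, h - kh + 1, w - kw + 1))
    for o in range(out_ch):                    # level 1: output channel
        for i in range(in_ch):                 # level 2: input channel
            for r in range(out.shape[1]):      # level 3: output row
                for c in range(out.shape[2]):  # level 4: output column
                    out[o, r, c] += np.sum(
                        x[i, r:r + kh, c:c + kw] * kernels[o, i]
                    )
    return out

def conv2d_kernel_grad(x, grad_out, kh, kw):
    """Gradient w.r.t. the kernels, using the same four-level loop structure.
    grad_out: upstream gradient of shape (out_ch, H', W')."""
    in_ch = x.shape[0]
    out_ch, oh, ow = grad_out.shape
    grad_k = np.zeros((out_ch, in_ch, kh, kw))
    for o in range(out_ch):
        for i in range(in_ch):
            for r in range(oh):
                for c in range(ow):
                    grad_k[o, i] += grad_out[o, r, c] * x[i, r:r + kh, c:c + kw]
    return grad_k
```

The loops make the index arithmetic explicit, which is the point of a tutorial; a vectorized or FFT-based implementation would be much faster but harder to read.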