An R-based implementation of a feed-forward neural network with one hidden layer of width 50, trained with stochastic gradient descent (SGD). Parameters are updated via forward-backward propagation on minibatches of size 50. The aim is to estimate the model's parameters from a training dataset sampled from a one-dimensional function.
LeeoBianchi/SDG_NeuralNetwork
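The training loop described above (one hidden layer of width 50, forward-backward propagation, minibatches of 50, SGD updates, 1-D regression target) can be sketched in R as follows. This is an illustrative sketch based on the description only, not the repository's actual code; the target function, activation, learning rate, and epoch count are assumptions.

```r
# Illustrative sketch: one-hidden-layer network (width 50) trained with SGD
# on minibatches of size 50 to fit a 1-D function. All hyperparameters and
# the sin() target below are assumptions, not taken from the repository.

set.seed(42)

# Synthetic 1-D training data (assumed target function for illustration)
n <- 1000
x <- matrix(runif(n, -2, 2), ncol = 1)
y <- matrix(sin(pi * x) + rnorm(n, sd = 0.1), ncol = 1)

# Network: 1 input -> 50 hidden units (tanh) -> 1 output
H  <- 50
W1 <- matrix(rnorm(1 * H, sd = 0.5), 1, H); b1 <- rep(0, H)
W2 <- matrix(rnorm(H * 1, sd = 0.5), H, 1); b2 <- 0

lr <- 0.05; batch <- 50; epochs <- 200

for (epoch in seq_len(epochs)) {
  idx <- sample(n)  # shuffle before drawing minibatches
  for (start in seq(1, n, by = batch)) {
    b  <- idx[start:min(start + batch - 1, n)]
    xb <- x[b, , drop = FALSE]; yb <- y[b, , drop = FALSE]
    m  <- nrow(xb)

    # Forward pass
    z1   <- sweep(xb %*% W1, 2, b1, `+`)  # m x H pre-activations
    a1   <- tanh(z1)
    yhat <- a1 %*% W2 + b2                # m x 1 predictions

    # Backward pass (mean squared error loss)
    d2  <- 2 * (yhat - yb) / m            # gradient at the output
    gW2 <- t(a1) %*% d2; gb2 <- sum(d2)
    d1  <- (d2 %*% t(W2)) * (1 - a1^2)    # tanh'(z) = 1 - tanh(z)^2
    gW1 <- t(xb) %*% d1; gb1 <- colSums(d1)

    # SGD parameter update
    W1 <- W1 - lr * gW1; b1 <- b1 - lr * gb1
    W2 <- W2 - lr * gW2; b2 <- b2 - lr * gb2
  }
}

# Training-set mean squared error after fitting
mse <- mean((tanh(sweep(x %*% W1, 2, b1, `+`)) %*% W2 + b2 - y)^2)
```

The per-minibatch gradient is averaged over the batch (the `/ m` in `d2`), so the learning rate does not need rescaling when the batch size changes.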