A from-scratch implementation of matrix autograd, along with a simple proof-of-concept deep neural network that achieves 82% accuracy on MNIST handwritten-digit classification.
Clone the repo:
```sh
git clone git@github.com:jagdeepsb/autograd.git
```

Either install Python dependencies with conda:
```sh
conda env create -f environment.yml
conda activate autograd
```

or with pip:
```sh
pip install -r requirements.txt
```

To see a simple example of matrix autograd, you can run:
```sh
python simple_computation.py
```

To train the MNIST digit classifier with a simple example script, run:
```sh
python train_example.py
```

The autograd Tensor class supports many common operations. Let a and b be two Tensors of the same shape, and let c be a scalar constant (float). This implementation supports the following:
```python
result = a + b
result = a + c
result = c + a

result = a - b
result = a - c
result = c - a
result = -a

result = a * b
result = a * c
result = c * a

result = a / c

result = a ** c
```
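For illustration, here is a minimal sketch of the elementwise arithmetic above. The import path and the NumPy-array constructor for Tensor are assumptions, not confirmed by this README; see simple_computation.py for the actual interface:

```python
import numpy as np

from autograd import Tensor  # assumed import path; check simple_computation.py

# Assumed: Tensor can be constructed from a NumPy array.
a = Tensor(np.ones((3, 1)))
b = Tensor(np.full((3, 1), 2.0))
c = 0.5

# Tensor-tensor and tensor-constant elementwise arithmetic.
result = (a + b) * c     # elementwise add, then scale by a constant
result = (c - a) / 2.0   # constant minus tensor, divided by a constant
result = a ** 2.0        # elementwise power
```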
Let W be a Tensor with shape (m x n) and x be a Tensor with shape (n x 1):
```python
result = W @ x
```
- Transpose:

```python
result = a.T
```

- Sum:

```python
result = a.sum
```

- Sigmoid:

```python
result = a.sigmoid
```

- ReLU:

```python
result = a.relu
```
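Putting these together, below is a hedged sketch of a one-layer forward pass followed by a backward pass. The Tensor constructor, the backward() method, and the grad attribute are assumptions about this repo's API rather than documented facts; train_example.py shows the real usage:

```python
import numpy as np

from autograd import Tensor  # assumed import path

# A tiny one-layer "network": loss = sum(sigmoid(W @ x)).
W = Tensor(np.random.randn(4, 3))  # shape (m x n); NumPy constructor is an assumption
x = Tensor(np.random.randn(3, 1))  # shape (n x 1)

y = (W @ x).sigmoid  # matrix multiply followed by the sigmoid listed above
loss = y.sum         # reduce to a scalar so a gradient is well defined

# Assumed autograd entry points: a backward()/grad pair is typical of autograd
# libraries, but the exact names here are guesses; see train_example.py.
loss.backward()
print(W.grad)        # gradient of the loss with respect to W
```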
Feel free to contribute by making a pull request to the repo! You will need to install the following pip packages: mypy, pytest, pytest-pylint, docformatter, and yapf. Make sure your contributions pass the following style checks:
```sh
mypy . --config-file mypy.ini
python -m py.test --pylint -m pylint --pylint-rcfile=.cbt_pylintrc
```

Additionally, run the following to format your code:
```sh
docformatter -i -r .
yapf -r --style .style.yapf .
```