A minimalist neural networks library built on a tiny autograd engine, heavily inspired by the micrograd library created by Andrej Karpathy.
This project aims to:
- demonstrate automatic differentiation, a core concept of modern Deep Learning frameworks like PyTorch and TensorFlow (see the sketch after this list);
- define a simple API for training neural nets, loosely mimicking Keras and PyTorch Ignite;
- follow good coding practices, including type annotations and unit tests.
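To give a flavour of what an autograd engine does, here is a minimal sketch of reverse-mode automatic differentiation on scalars, in the spirit of micrograd. The `Scalar` class and its methods are illustrative assumptions for this README, not pyfit's actual API.

```python
from __future__ import annotations

class Scalar:
    """A scalar value that records the operations producing it."""

    def __init__(self, data: float, parents: tuple = ()) -> None:
        self.data = data
        self.grad = 0.0
        self._parents = parents
        self._backward = lambda: None  # how to send the gradient to parents

    def __add__(self, other: Scalar) -> Scalar:
        out = Scalar(self.data + other.data, (self, other))

        def _backward() -> None:
            # d(a + b)/da = d(a + b)/db = 1
            self.grad += out.grad
            other.grad += out.grad

        out._backward = _backward
        return out

    def __mul__(self, other: Scalar) -> Scalar:
        out = Scalar(self.data * other.data, (self, other))

        def _backward() -> None:
            # d(a * b)/da = b and d(a * b)/db = a
            self.grad += other.data * out.grad
            other.grad += self.data * out.grad

        out._backward = _backward
        return out

    def backward(self) -> None:
        """Backpropagate gradients through the recorded computation graph."""
        topo: list[Scalar] = []
        visited: set[int] = set()

        def visit(node: Scalar) -> None:
            # Build a topological ordering of the graph
            if id(node) not in visited:
                visited.add(id(node))
                for parent in node._parents:
                    visit(parent)
                topo.append(node)

        visit(self)
        self.grad = 1.0
        for node in reversed(topo):
            node._backward()

# z = x * y + x, so dz/dx = y + 1 and dz/dy = x
x, y = Scalar(3.0), Scalar(2.0)
z = x * y + x
z.backward()
print(z.data, x.grad, y.grad)  # 9.0 3.0 3.0
```

Each operation records how to route the gradient back to its inputs, and `backward()` replays those rules in reverse topological order; this is the mechanism that frameworks like PyTorch automate at scale.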
The demo notebook showcases what pyfit is all about.
- Autograd engine [ source | tests ]
- Neural networks API [ source | tests ]
- Metrics [ source | tests ]
- Optimizers [ source | tests ]
- Data utilities [ source | tests ]
- Training API [ source | tests ]
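To illustrate how such components typically fit together, the snippet below hand-rolls a tiny Keras-style training loop (model, metric, SGD update, toy data) in plain Python. All names are hypothetical and the gradients are derived by hand; pyfit's autograd engine, optimizers and training API are meant to take care of those parts.

```python
import random

def model(w: float, b: float, x: float) -> float:
    """Linear model y = w * x + b."""
    return w * x + b

def mse(y_pred: list, y_true: list) -> float:
    """Mean squared error over a batch (a typical metric)."""
    return sum((p - t) ** 2 for p, t in zip(y_pred, y_true)) / len(y_true)

def fit(data: list, epochs: int = 100, lr: float = 0.05) -> tuple:
    """Training loop: stochastic gradient descent on (x, y) pairs.
    Gradients are hand-derived here; an autograd engine would compute them."""
    w, b = random.uniform(-1.0, 1.0), random.uniform(-1.0, 1.0)
    for _ in range(epochs):
        for x, y in data:
            error = model(w, b, x) - y
            # dLoss/dw and dLoss/db for the squared error on this sample
            grad_w = 2 * error * x
            grad_b = 2 * error
            # SGD update step (what an optimizer object would encapsulate)
            w -= lr * grad_w
            b -= lr * grad_b
    return w, b

# Toy dataset sampled from y = 2x + 1
data = [(x / 2, 2 * (x / 2) + 1) for x in range(5)]
w, b = fit(data)
predictions = [model(w, b, x) for x, _ in data]
targets = [y for _, y in data]
print(f"w={w:.2f}, b={b:.2f}, mse={mse(predictions, targets):.4f}")  # w, b close to 2 and 1
```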
pyfit uses the following tools to keep the codebase healthy: pylint for linting, mypy for type checking and pytest for the test suite.
Run the following commands in the project root folder to check the codebase.
```bash
> pylint pyfit/* tests/*  # linting (including type checks)
> mypy .                  # type checks only
> pytest                  # test suite
```