Efficient reverse-mode automatic differentiation for gradient computation from scratch, with a touch of retro flair.
Retrograd is a lightweight autograd library inspired by the early days of automatic differentiation (AD) frameworks such as PyTorch and TensorFlow. It provides a custom Tensor class that supports reverse-mode automatic differentiation for tensor operations. With Retrograd, you can compute gradients of complex mathematical functions, making it useful for gradient-based optimization and for training neural networks and other machine learning models.
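A minimal usage sketch is shown below. The exact names here are assumptions modeled on PyTorch conventions (a Tensor constructor, a backward() method, and a .grad attribute), not a confirmed Retrograd API:

```python
# Hypothetical usage sketch: the Tensor constructor, backward(), and
# .grad attribute are assumed PyTorch-style names, not confirmed API.
from retrograd import Tensor

x = Tensor(3.0)
y = Tensor(4.0)

z = x * y + x  # each operation on Tensors is recorded as it runs
z.backward()   # reverse-mode AD walks the recorded graph backward

print(x.grad)  # dz/dx = y + 1 = 5.0
print(y.grad)  # dz/dy = x     = 3.0
```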
- Custom Tensor class: Retrograd introduces a specialized Tensor class that enables automatic differentiation by recording the operations applied to tensors (illustrated in the sketch after this list).
- Reverse-mode AD: Retrograd implements reverse-mode automatic differentiation, which computes gradients efficiently for a wide range of functions and operations.
- Lightweight and easy to use: Retrograd is designed to be a minimalistic autograd library, making it simple to understand and incorporate into your projects.
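Since Retrograd's internals aren't reproduced here, the following is a self-contained sketch of the general technique behind such a Tensor class, using a hypothetical scalar Value class rather than Retrograd's actual implementation: each operation records its parent nodes and a local backward rule, and backward() replays those rules in reverse topological order.

```python
class Value:
    """Scalar node in a computation graph. Records its parents and a
    local backward rule so gradients can be propagated in reverse.
    (Illustrative only; not Retrograd's actual Tensor class.)"""

    def __init__(self, data, _parents=()):
        self.data = data
        self.grad = 0.0
        self._parents = _parents
        self._backward = lambda: None  # no-op for leaf nodes

    def __add__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data + other.data, (self, other))

        def _backward():
            # d(a+b)/da = d(a+b)/db = 1; accumulate upstream gradient
            self.grad += out.grad
            other.grad += out.grad

        out._backward = _backward
        return out

    def __mul__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data * other.data, (self, other))

        def _backward():
            # chain rule: d(a*b)/da = b, d(a*b)/db = a
            self.grad += other.data * out.grad
            other.grad += self.data * out.grad

        out._backward = _backward
        return out

    def backward(self):
        # Topologically order the graph, then apply each node's local
        # backward rule in reverse, starting from this output node.
        order, visited = [], set()

        def build(v):
            if v not in visited:
                visited.add(v)
                for p in v._parents:
                    build(p)
                order.append(v)

        build(self)
        self.grad = 1.0
        for v in reversed(order):
            v._backward()

# z = x*y + x  ->  dz/dx = y + 1, dz/dy = x
x, y = Value(3.0), Value(4.0)
z = x * y + x
z.backward()
print(x.grad, y.grad)  # 5.0 3.0
```

Note the `+=` in the backward rules: a node used more than once (like x above) accumulates gradient contributions from every path, which is what makes reverse-mode AD correct on general graphs.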