Retrograd

Unlocking efficient reverse mode automatic differentiation for seamless gradient computation from scratch with a touch of retro flair.

Retrograd is a lightweight autograd library inspired by the early autograd engines behind automatic differentiation (AD) frameworks like PyTorch and TensorFlow. It provides a custom Tensor class that supports reverse mode automatic differentiation over tensor operations. With Retrograd, you can easily compute gradients of complex mathematical functions, which is useful for gradient-based optimization, training neural networks, and other machine learning tasks.
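A minimal usage sketch is shown below. The import path, the Tensor constructor, the backward() method, and the .grad attribute are assumptions based on common autograd APIs; the actual Retrograd interface may differ:

```python
# Hypothetical usage sketch: the module name `retrograd` and the
# Tensor / backward / grad names are assumed, not confirmed API.
from retrograd import Tensor

# Build a small computation: z = x * y + x
x = Tensor(2.0)
y = Tensor(3.0)
z = x * y + x

# Reverse-mode AD: propagate dz/dz = 1 back through the recorded ops.
z.backward()

print(x.grad)  # dz/dx = y + 1 = 4.0
print(y.grad)  # dz/dy = x     = 2.0
```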

Features

  • Custom Tensor class: Retrograd introduces a specialized Tensor class that enables automatic differentiation by tracking the operations applied to tensors (see the sketch after this list).
  • Reverse mode AD: Retrograd implements reverse mode automatic differentiation, which allows efficient computation of gradients for a wide range of functions and operations.
  • Lightweight and easy-to-use: Retrograd is designed to be a minimalistic autograd library, making it simple to understand and incorporate into your projects.
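
To make the first two points concrete, here is a self-contained, simplified sketch of how an operation-tracking Tensor and a reverse mode backward pass typically fit together. This illustrates the general technique, not Retrograd's actual implementation:

```python
# Simplified reverse-mode autograd sketch (scalar values only).
# This mirrors the general technique; it is NOT Retrograd's code.
class Tensor:
    def __init__(self, data, parents=()):
        self.data = data            # underlying value (a scalar here)
        self.grad = 0.0             # accumulated gradient d(output)/d(self)
        self._parents = parents     # tensors this one was computed from
        self._backward_fn = None    # pushes out.grad back to the parents

    def __add__(self, other):
        out = Tensor(self.data + other.data, parents=(self, other))
        def _backward():
            # d(out)/d(self) = d(out)/d(other) = 1
            self.grad += out.grad
            other.grad += out.grad
        out._backward_fn = _backward
        return out

    def __mul__(self, other):
        out = Tensor(self.data * other.data, parents=(self, other))
        def _backward():
            # d(out)/d(self) = other.data; d(out)/d(other) = self.data
            self.grad += other.data * out.grad
            other.grad += self.data * out.grad
        out._backward_fn = _backward
        return out

    def backward(self):
        # Topologically order the recorded graph, then run each node's
        # backward function in reverse order: reverse-mode AD.
        order, visited = [], set()
        def visit(t):
            if t not in visited:
                visited.add(t)
                for p in t._parents:
                    visit(p)
                order.append(t)
        visit(self)
        self.grad = 1.0             # seed: d(self)/d(self) = 1
        for t in reversed(order):
            if t._backward_fn is not None:
                t._backward_fn()
```

Recording each result's parents and a local backward function builds the computation graph on the fly; the topological sort in backward() ensures every node's gradient is fully accumulated before it is propagated to its parents.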
