Commit
[FEA] distributed autograd primitives (for tensors of variable size on each rank) (#105)

* initial commit
* add tests and fix bugs
* format code
* proper pytest markups
* add a few docstrings
* add test remark
* update changelog
* remove usage of all_to_all_single, address feedback
* update docstrings to be more precise about bwd ops
* format code
* fix typo

---------

Co-authored-by: Maximilian Stadler <mstadler.nvidia.com>
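For context on what a "distributed autograd primitive for tensors of variable size on each rank" can look like, below is a minimal illustrative sketch, not the code added by this commit: a PyTorch `autograd.Function` that all-gathers along dim 0 using per-rank sizes, with a backward pass that performs the adjoint reduce-then-slice. The class name `AllGatherV`, the wrapper `all_gather_v`, and the `sizes` argument are assumptions for illustration; it also assumes a recent PyTorch where list-based `torch.distributed.all_gather` accepts uneven tensor sizes.

```python
# Illustrative sketch only -- not the primitive introduced by this commit.
# Assumes torch.distributed is initialized and that list-based all_gather
# supports uneven tensor sizes (true in recent PyTorch releases).
import torch
import torch.distributed as dist


class AllGatherV(torch.autograd.Function):
    """All-gather along dim 0 for tensors whose dim-0 size differs per rank."""

    @staticmethod
    def forward(ctx, tensor, sizes):
        # sizes[i] is the dim-0 length of the tensor held by rank i.
        ctx.sizes = list(sizes)
        ctx.rank = dist.get_rank()
        # Pre-allocate per-rank receive buffers of the correct shapes.
        gathered = [
            tensor.new_empty((s,) + tuple(tensor.shape[1:])) for s in ctx.sizes
        ]
        dist.all_gather(gathered, tensor.contiguous())
        return torch.cat(gathered, dim=0)

    @staticmethod
    def backward(ctx, grad_output):
        # The adjoint of an all-gather is a reduce-scatter: sum the incoming
        # gradients across ranks, then keep only the slice corresponding to
        # this rank's input. With variable sizes this is expressed as an
        # all_reduce followed by a local slice.
        grad = grad_output.contiguous().clone()
        dist.all_reduce(grad, op=dist.ReduceOp.SUM)
        offset = sum(ctx.sizes[: ctx.rank])
        return grad[offset : offset + ctx.sizes[ctx.rank]], None


def all_gather_v(tensor, sizes):
    """Differentiable variable-size all-gather (illustrative wrapper)."""
    return AllGatherV.apply(tensor, sizes)
```

A call site would look like `y = all_gather_v(x, sizes)`, with `sizes` exchanged beforehand (for example via `dist.all_gather_object` on `x.shape[0]`); the backward pass then routes the correct gradient slice back to each rank, which is the kind of "bwd op" behavior the commit message refers to documenting.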