A gradient learner (well, hopefully some day).
`gradstudent` is a learning project. The goal is to build a simple library for manipulating tensors (multidimensional arrays) and computing the derivatives of these manipulations via (reverse-mode) autograd. Emphasis is placed on accomplishing these goals with modern C++; performance alone is not the main objective.
See the `examples` directory.
Presently, `gradstudent` implements a `Tensor` class that acts as a container for (strided) multidimensional arrays.
Several simple operations are supported:
`gradstudent` also contains the following utilities:
The next main hurdle:
- automatic differentiation.
Other things that could be interesting to explore:
- improvements to kernels/optimizations;
- lazy evaluation;
- support for different data types.
Enhancements that would be desirable for the development process:
- Test coverage: gcov/lcov.
An example Dockerfile containing the requirements used by this project is provided. For the convenience of VSCode users, a devcontainer configuration is included as well.
Briefly, the requirements are as follows:
- Build: CMake, clang (recommended);
- Documentation: Doxygen;
- Testing: GoogleTest;
- Benchmarking (limited for now): Google Benchmark;
- Code quality: clang-format, clang-tidy, cppcheck.
```shell
git clone https://github.com/bencwallace/gradstudent.git
cd gradstudent
cmake -B build
cmake --build build -j
cmake --build build --target test
```
Git hooks can be found in the `./tools/git` directory. From the repository root, they can be installed as follows:

```shell
git config core.hooksPath tools/git
```
Scripts for linting and generating documentation can be found in the `./tools` directory. They can be used as follows:

```shell
./tools/format.sh    # run formatter
./tools/makedocs.sh  # build docs
./tools/lint.sh      # run linter
```