Vector Symbolic Architectures (VSAs) are one approach to developing Neuro-symbolic AI, where two vectors in $\mathbb{R}^d$ are "bound" together to produce a new vector in the same space.
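For readers new to VSAs, below is a minimal sketch of the bind/unbind pattern using the classic HRR binding (circular convolution). It is for illustration only and is not the HLB operator proposed in the paper:

```python
import torch

def hrr_bind(x, y):
    """HRR binding: circular convolution, computed in the Fourier domain."""
    return torch.fft.irfft(torch.fft.rfft(x) * torch.fft.rfft(y), n=x.shape[-1])

def hrr_unbind(z, y):
    """Approximate unbinding: circular correlation with y (conjugate spectrum)."""
    return torch.fft.irfft(torch.fft.rfft(z) * torch.fft.rfft(y).conj(), n=z.shape[-1])

d = 1024
x = torch.randn(d) / d ** 0.5
y = torch.randn(d) / d ** 0.5
z = hrr_bind(x, y)        # the bound vector stays in the same d-dimensional space
x_hat = hrr_unbind(z, y)  # noisy reconstruction of x
print(torch.cosine_similarity(x_hat, x, dim=0))  # well above chance for large d
```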
## CSPS
conda create --name csps python=3.9 -y && conda activate csps
- PyTorch v1.13.1+cu116
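A matching PyTorch build can be installed, for example, with the pip command from the PyTorch previous-versions page (adjust the CUDA suffix to your setup):

pip install torch==1.13.1+cu116 torchvision==0.14.1+cu116 torchaudio==0.13.1 --extra-index-url https://download.pytorch.org/whl/cu116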
## XML
conda create --name xml python=3.9 -y && conda activate xml
## Classical VSA Tasks
- Torchhd
pip install torch-hd
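As a quick sanity check of the environment, here is a minimal Torchhd sketch of a classical key-value binding task (the symbol names are illustrative, and the exact function names follow the Torchhd documentation):

```python
import torchhd

d = 10000  # hypervector dimensionality

# random bipolar (MAP) hypervectors for two roles and two fillers
name, capital = torchhd.random(2, d)
fillers = torchhd.random(2, d)
sweden, stockholm = fillers

# bind each role to its filler, then bundle the pairs into one record
record = torchhd.bundle(torchhd.bind(name, sweden),
                        torchhd.bind(capital, stockholm))

# query: unbind with the inverse of a role, then clean up by similarity
noisy = torchhd.bind(record, torchhd.inverse(capital))
print(torchhd.cosine_similarity(noisy, fillers))  # largest score at stockholm
```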
## Datasets
- CSPS
  - MNIST, SVHN, CIFAR10, and CIFAR100 are standard datasets that ship with the PyTorch (torchvision) library; a download sketch follows this list. MiniImageNet can be downloaded from Kaggle.
- XML
  - The XML experiments use pre-processed features. All of the datasets can be downloaded from the benchmark website.
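A minimal torchvision sketch for obtaining the standard datasets (the `./data` root is a placeholder):

```python
from torchvision import datasets, transforms

# CIFAR10 (and MNIST, SVHN, CIFAR100 analogously) downloads on first use
transform = transforms.ToTensor()
train_set = datasets.CIFAR10(root='./data', train=True, download=True, transform=transform)
test_set = datasets.CIFAR10(root='./data', train=False, download=True, transform=transform)
```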
The code is organized into two folders: the CSPS code is in the `CSPS Exp/` folder and the XML code is in the `XML Exp/` folder. Within each, there is a subfolder per dataset; for example, the network and training files for the CIFAR10 dataset are in the `cifar10/` subfolder, and so on. We compare the proposed HLB method with the HRR, VTB, and MAP vector symbolic architectures; the HRR results are taken from previous papers. The training code for the VTB, MAP, and HLB methods is in separate files named after the respective method.

Similarly, the `XML Exp/` folder contains a subfolder for each dataset with its training files, which follow the same naming convention. `dataset_fast.py` is a fast dataloader for the smaller datasets (Bibtex, Mediamill, Delicious), and `dataset_sparse.py` is a dataloader for the larger data files. The code for the classical VSA tasks is in the `Classical VSA Tasks/` folder.
For more details about the proposed method and the experiments, please refer to the paper. If you use this work or find it useful, please cite the paper as:
@article{alam2024walsh,
title={A Walsh Hadamard Derived Linear Vector Symbolic Architecture},
author={Alam, Mohammad Mahmudul and Oberle, Alexander and Raff, Edward and Biderman, Stella and Oates, Tim and Holt, James},
journal={arXiv preprint arXiv:2410.22669},
year={2024}
}