SL-animals-DVS training with DECOLLE (improved)

This repository contains a custom DECOLLE (Deep Continuous Local Learning) implementation on the SL-Animals-DVS dataset, using Pytorch and the DECOLLE package software. The first results reported in this repository were an initial attempt to reproduce the published results. Additionally, improvements were made to the original implementation, optimizing the input data and the spiking neural network in order to enhance the gesture recognition performance. Details of the techniques applied can be found in the following conference paper, published in LASCAS 2024.

A BRIEF INTRODUCTION:
DECOLLE is an online training method that trains a Spiking Neural Network (SNN) directly, updating the network weights at every single time step. This makes it a suitable method for the online training of SNNs, which are (in short) biologically plausible networks.
The SL-Animals-DVS is a dataset of sign language (SL) gestures representing animals, performed by different people and recorded with a Dynamic Vision Sensor (DVS).
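To make the idea of per-time-step local learning concrete, here is a minimal, self-contained sketch in the spirit of DECOLLE. This is not the decolle package API: the layer sizes, the leak factor, the fixed random readout and the straight-through surrogate gradient are illustrative assumptions.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

# Toy dimensions (assumptions, not taken from the repository)
T, batch, n_in, n_hid, n_out = 50, 4, 128, 64, 19

layer   = nn.Linear(n_in, n_hid)        # trainable feed-forward weights
readout = nn.Linear(n_hid, n_out)       # fixed random local readout
for p in readout.parameters():
    p.requires_grad_(False)

opt     = torch.optim.Adam(layer.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

x = (torch.rand(T, batch, n_in) < 0.1).float()   # dummy input spike trains
y = torch.randint(0, n_out, (batch,))            # dummy class labels

mem = torch.zeros(batch, n_hid)                  # membrane-like state
for t in range(T):
    # leaky integration; detaching the state truncates backprop to one step
    mem = 0.9 * mem.detach() + layer(x[t])
    # hard threshold with a straight-through surrogate gradient
    spikes = (mem > 0).float().detach() + (mem - mem.detach())
    # local loss on the fixed readout, and a weight update at every time step
    local_loss = loss_fn(readout(spikes), y)
    opt.zero_grad()
    local_loss.backward()
    opt.step()
```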

The results reported in the SL-animals paper cover two settings: the full dataset and a reduced dataset that excludes group S3. The results achieved with the implementation published here fall short of the published ones, but come fairly close, considering that no code is available to reproduce the published results.

The implementation published in this repository is the first publicly available DECOLLE implementation on the SL-animals dataset (and, as far as I know, the only one as of May 2023). The results are summarized below:

                        Full Dataset        Reduced Dataset
Reported Results        70.6 +- 7.8 %       77.6 +- 6.5 %
This Implementation     64.14 +- 3.61 %     65.16 +- 2.86 %
Optimized Version       75.12 +- 0.91 %     N/A

Requirements

I am not sure the list below reflects the actual minimum versions, but the code will definitely run if you have the following:

  • Python 3.0+
  • Pytorch 1.11+
  • CUDA 11.3+
  • decolle (installation instructions here)
  • python libraries: os, numpy, matplotlib, pandas, sklearn, datetime, tonic, pyyaml, h5py, tensorboardX

README FIRST

This package contains the necessary python files to train a Spiking Neural Network with a custom DECOLLE method (a slightly modified version) on the Sign Language Animals DVS dataset.

IMPLEMENTATION
Package Contents:

  • dataset.py
  • decolle_tools.py
  • sl_animals_decolle.py
  • train_test_only.py
  • parameters/params_slanimals.yml
  • decolle1 (folder with the custom decolle method)

The SL-Animals-DVS dataset implementation code is in dataset.py, and it is basically a Pytorch Dataset object. The Tonic library is used to read and process the DVS recordings.
Auxiliary functions to slice and split the dataset, plot and animate dataset samples, and a few getters are in decolle_tools.py.
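For illustration, below is a minimal sketch of the general pattern behind dataset.py: a Pytorch Dataset that turns DVS event streams into binned frame tensors. The class name, sample layout, sensor size and bin count are assumptions made for this example; the actual implementation relies on Tonic for reading and transforming the recordings.

```python
import numpy as np
import torch
from torch.utils.data import Dataset

class AnimalsDVSSketch(Dataset):
    """Hypothetical example: wraps (events, label) pairs and bins the events
    into frames. The real dataset.py uses Tonic to read the DVS recordings."""

    def __init__(self, samples, sensor_size=(128, 128), n_bins=100):
        # 'samples' is assumed to be a list of (events, label) pairs, where
        # 'events' is a structured array with fields t (us), x, y and p (0/1)
        self.samples = samples
        self.sensor_size = sensor_size
        self.n_bins = n_bins

    def __len__(self):
        return len(self.samples)

    def __getitem__(self, idx):
        events, label = self.samples[idx]
        H, W = self.sensor_size
        frames = np.zeros((self.n_bins, 2, H, W), dtype=np.float32)
        t = (events["t"] - events["t"].min()).astype(np.int64)  # start at zero
        bins = (t * self.n_bins // (t.max() + 1)).astype(int)   # time bin index
        np.add.at(frames, (bins, events["p"], events["y"], events["x"]), 1.0)
        return torch.from_numpy(frames), label                  # (T, 2, H, W)
```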

The main program is in sl_animals_decolle.py, which follows the proper experimental procedure: the dataset is divided into training, validation and test sets, and the network is trained with cross validation. A simpler version of the main program is in train_test_only.py, which is basically the same except that the dataset is divided only into training and test sets, in an effort to replicate the published results; apparently, the benchmark results were reported with this simpler dataset split configuration.
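As a rough illustration of that split strategy, here is a short sketch using scikit-learn's KFold over recording indices. The number of recordings, the fold count and the validation fraction are placeholders, not values taken from this repository.

```python
import numpy as np
from sklearn.model_selection import KFold

recordings = np.arange(59)      # placeholder: one index per DVS recording
kfold = KFold(n_splits=4, shuffle=True, random_state=0)

for fold, (train_val_idx, test_idx) in enumerate(kfold.split(recordings)):
    # hold out part of the non-test portion as a validation set
    n_val = len(train_val_idx) // 5
    val_idx, train_idx = train_val_idx[:n_val], train_val_idx[n_val:]
    print(f"fold {fold}: {len(train_idx)} train, "
          f"{len(val_idx)} val, {len(test_idx)} test recordings")
```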

The file params_slanimals.yml contains the main customizable parameters, such as batch size, sampling time, sample length, neuron type, data path, and many others.
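These parameters can be loaded with pyyaml, for example as below. The key names shown are hypothetical; check parameters/params_slanimals.yml for the actual keys used by the scripts.

```python
import yaml  # provided by the pyyaml package

with open("parameters/params_slanimals.yml", "r") as f:
    params = yaml.safe_load(f)

# 'batch_size' and 'data_path' are hypothetical key names for illustration
batch_size = params.get("batch_size", 32)
data_path = params.get("data_path", "data/recordings")
print(f"training with batch_size={batch_size}, reading data from {data_path}")
```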

Finally, the decolle1 folder contains the original decolle software implementation, slightly modified to output plots, improve the display during training, and make a few other minor changes.

Use

  1. Clone this repository:
git clone https://github.com/ronichester/SL-animals-DVS-decolle
  2. Download the dataset from this link;
  3. Save the DVS recordings in the data/recordings folder and the file tags in the data/tags folder;
  4. Edit the custom parameters according to your preferences in parameters/params_slanimals.yml. The default parameter settings are functional and were tailored according to the information provided in the relevant papers, the reference codes used as a basis, and mostly by trial and error (lots of it!). You are encouraged to edit the main parameters and let me know if you get better results.
  5. Run train_test_only.py to start the SNN training:
python train_test_only.py
  6. The network weights, training curves and tensorboard logs will be saved in src/results. To visualize the training with Tensorboard:
  • open a terminal (I use Anaconda Prompt), go to the src directory and type:
tensorboard --logdir=results
  • open your browser and go to http://localhost:6006/ or any other address shown in the terminal.

References

Copyright

Copyright 2023 Schechter Roni. This software is free to use, copy, modify and distribute for personal, academic, or research use. Its terms are described in the GNU General Public License v3.0.
