Code Repository for Liquid Time-Constant Networks (LTCs)
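As an illustration of the mechanism behind LTCs, here is a minimal NumPy sketch of one fused explicit/implicit Euler step over the liquid time-constant ODE described in the LTC paper. The function and parameter names (`ltc_step`, `W`, `U`, `tau`, `A`) are assumptions made for this sketch, not this repository's API.

```python
# Minimal sketch of one LTC cell update (fused explicit/implicit Euler step),
# following the ODE dx/dt = -[1/tau + f(x, I)] * x + f(x, I) * A from the paper.
# Names and shapes are illustrative, not the repository's own interface.
import numpy as np

def ltc_step(x, I, W, U, b, tau, A, dt=0.1):
    """x: (hidden,) state, I: (inputs,) input, tau/A: (hidden,) parameters."""
    f = np.tanh(W @ x + U @ I + b)              # state- and input-dependent gate
    # Numerator is the explicit part, denominator the implicit part of the step.
    return (x + dt * f * A) / (1.0 + dt * (1.0 / tau + f))

rng = np.random.default_rng(0)
hidden, inputs = 8, 3
x = np.zeros(hidden)
params = dict(W=rng.normal(size=(hidden, hidden)) * 0.1,
              U=rng.normal(size=(hidden, inputs)) * 0.1,
              b=np.zeros(hidden),
              tau=np.ones(hidden),
              A=np.ones(hidden))
for _ in range(20):                             # unroll over a toy input sequence
    x = ltc_step(x, rng.normal(size=inputs), **params)
print(x.shape)  # (8,)
```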
Repository for the tutorial on Sequence-Aware Recommender Systems held at TheWebConf 2019 and ACM RecSys 2018
Liquid Structural State-Space Models
Efficient Python library for Extended LSTM with exponential gating, memory mixing, and matrix memory for superior sequence modeling.
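Since the description names exponential gating as the core mechanism, the sketch below illustrates the general idea of exponential input/forget gates kept finite by a running-max stabilizer and a normalizer state, as described for the sLSTM cell in the xLSTM paper. The function name, state layout, and initialization are assumptions for this sketch, not the library's API.

```python
# Minimal sketch of stabilized exponential gating (xLSTM-style sLSTM cell idea).
# Names and shapes are illustrative, not the library's own interface.
import numpy as np

def exp_gated_step(c, n, m, z, i_pre, f_pre):
    """c: cell state, n: normalizer, m: stabilizer (running log-max),
    z: candidate value, i_pre/f_pre: gate pre-activations."""
    m_new = np.maximum(f_pre + m, i_pre)     # running max keeps the exponentials bounded
    i = np.exp(i_pre - m_new)                # stabilized exponential input gate
    f = np.exp(f_pre + m - m_new)            # stabilized exponential forget gate
    c_new = f * c + i * z                    # gated cell update
    n_new = f * n + i                        # normalizer accumulates gate mass
    h = c_new / n_new                        # normalized readout (before any output gate)
    return c_new, n_new, m_new, h

rng = np.random.default_rng(0)
hidden = 8
c = n = m = np.zeros(hidden)
for _ in range(10):                          # unroll over a toy sequence
    z = np.tanh(rng.normal(size=hidden))
    c, n, m, h = exp_gated_step(c, n, m, z,
                                i_pre=rng.normal(size=hidden),
                                f_pre=rng.normal(size=hidden))
print(h.shape)  # (8,)
```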
Implementation of GateLoop Transformer in Pytorch and Jax
Pytorch implementation of Simplified Structured State-Spaces for Sequence Modeling (S5)
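For context, S5-style layers run a discretized linear state-space recurrence over the sequence. The sketch below shows that recurrence for a diagonal state matrix as a plain loop (the actual model uses a parallel scan); the function and variable names (`diag_ssm`, `lam`, `B`, `C`) are assumptions for this sketch, not the repository's API.

```python
# Minimal sketch of a discretized diagonal state-space recurrence:
#   x_k = A_bar * x_{k-1} + B_bar @ u_k,   y_k = real(C @ x_k)
# Shown as a sequential loop for clarity; names are illustrative only.
import numpy as np

def diag_ssm(u, lam, B, C, dt=0.01):
    """u: (T, in_dim) inputs, lam: (state,) complex diagonal of the state matrix."""
    A_bar = np.exp(dt * lam)                      # zero-order-hold discretization
    B_bar = ((A_bar - 1.0) / lam)[:, None] * B    # valid for a diagonal state matrix
    x = np.zeros_like(lam)
    ys = []
    for u_k in u:
        x = A_bar * x + B_bar @ u_k               # linear recurrence over time
        ys.append((C @ x).real)
    return np.stack(ys)

rng = np.random.default_rng(0)
T, in_dim, state, out_dim = 32, 4, 16, 4
lam = -0.5 + 1j * rng.normal(size=state)          # stable complex poles
y = diag_ssm(rng.normal(size=(T, in_dim)), lam,
             B=rng.normal(size=(state, in_dim)) * 0.1,
             C=rng.normal(size=(out_dim, state)) * 0.1)
print(y.shape)  # (32, 4)
```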
Sequential model for polyphonic music
Contains various architectures and novel paper implementations for Natural Language Processing tasks like Sequence Modelling and Neural Machine Translation.
The Reinforcement-Learning-Related Papers of ICLR 2019
Repo to reproduce the First-Explore paper results
Source code for "A Lightweight Recurrent Network for Sequence Modeling"
Python package for Arabic natural language processing
An implementation of the AWD-LSTM in PyTorch
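The defining trick of AWD-LSTM is DropConnect on the hidden-to-hidden weights. The sketch below re-implements that idea as an explicit single-layer LSTM with one weight-dropout mask per forward pass; it is an illustrative re-implementation, not the repository's own `WeightDrop` module, and the class and parameter names are assumptions.

```python
# Minimal sketch of a weight-dropped LSTM: DropConnect is applied to the
# recurrent (hidden-to-hidden) weight matrix, sampled once per forward pass.
# Illustrative only; not the repository's own module or API.
import torch
import torch.nn as nn
import torch.nn.functional as F

class WeightDropLSTM(nn.Module):
    def __init__(self, input_size, hidden_size, weight_dropout=0.5):
        super().__init__()
        self.hidden_size = hidden_size
        self.w_ih = nn.Parameter(torch.randn(4 * hidden_size, input_size) * 0.1)
        self.w_hh = nn.Parameter(torch.randn(4 * hidden_size, hidden_size) * 0.1)
        self.bias = nn.Parameter(torch.zeros(4 * hidden_size))
        self.weight_dropout = weight_dropout

    def forward(self, x):                      # x: (seq_len, batch, input_size)
        batch = x.shape[1]
        h = c = x.new_zeros(batch, self.hidden_size)
        # One DropConnect mask per forward pass, shared across time steps.
        w_hh = F.dropout(self.w_hh, p=self.weight_dropout, training=self.training)
        outputs = []
        for x_t in x:
            gates = F.linear(x_t, self.w_ih) + F.linear(h, w_hh) + self.bias
            i, f, g, o = gates.chunk(4, dim=-1)
            c = torch.sigmoid(f) * c + torch.sigmoid(i) * torch.tanh(g)
            h = torch.sigmoid(o) * torch.tanh(c)
            outputs.append(h)
        return torch.stack(outputs), (h, c)

model = WeightDropLSTM(input_size=10, hidden_size=20)
out, _ = model(torch.randn(5, 3, 10))
print(out.shape)  # torch.Size([5, 3, 20])
```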
Audio and Music Synthesis with Machine Learning
Human Activity Recognition using Deep Learning on Spatio-Temporal Graphs
Tensorflow implementation of a Long Short-Term Memory model for audio synthesis, used for a thesis
Deep, sequential, transductive divergence metric and domain adaptation for time-series classifiers
VOGUE: Variable Order HMM with Duration
The course covers the fundamentals of distributed machine learning algorithms and of deep learning, and introduces techniques and systems that allow machine learning algorithms to be parallelized efficiently.
An unofficial implementation of "TransAct: Transformer-based Realtime User Action Model for Recommendation at Pinterest" in Tensorflow