This repository contains various types of attention mechanisms, such as Bahdanau, soft attention, additive attention, and hierarchical attention, implemented in PyTorch, TensorFlow, and Keras (see the additive-attention sketch after this list).
Code for the WWW 2019 paper "A Hierarchical Attention Retrieval Model for Healthcare Question Answering"
[ACL 2023] Official resources of "HAHE: Hierarchical Attention for Hyper-Relational Knowledge Graphs in Global and Local Level".
Scalable Hierarchical Self-Attention with Learnable Hierarchy for Long-Range Interactions
Exploring Linguistic Signal for Suicidality in Social Media
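For reference, below is a minimal PyTorch sketch of additive (Bahdanau-style) attention, one of the mechanisms named in the first repository above. The class name, dimensions, and example shapes are illustrative assumptions and do not reproduce any specific repository's implementation.

```python
import torch
import torch.nn as nn


class AdditiveAttention(nn.Module):
    """Bahdanau-style additive attention: score(q, k) = v^T tanh(W_q q + W_k k)."""

    def __init__(self, query_dim, key_dim, hidden_dim):
        super().__init__()
        self.w_query = nn.Linear(query_dim, hidden_dim, bias=False)
        self.w_key = nn.Linear(key_dim, hidden_dim, bias=False)
        self.v = nn.Linear(hidden_dim, 1, bias=False)

    def forward(self, query, keys):
        # query: (batch, query_dim); keys: (batch, seq_len, key_dim)
        scores = self.v(torch.tanh(self.w_query(query).unsqueeze(1) + self.w_key(keys)))  # (batch, seq_len, 1)
        weights = torch.softmax(scores, dim=1)   # attention distribution over the sequence
        context = (weights * keys).sum(dim=1)    # (batch, key_dim) weighted sum of keys
        return context, weights.squeeze(-1)


# Example usage with random inputs (shapes are hypothetical)
attn = AdditiveAttention(query_dim=128, key_dim=256, hidden_dim=64)
q = torch.randn(4, 128)       # e.g. a decoder hidden state
k = torch.randn(4, 10, 256)   # e.g. encoder outputs over 10 time steps
context, weights = attn(q, k)
print(context.shape, weights.shape)  # torch.Size([4, 256]) torch.Size([4, 10])
```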