# attentions_pytorch

A repository of attention mechanism implementations in PyTorch.

## Introduction

This repository contains implementations of attention mechanisms in PyTorch.

## Implementations

- Dot-Product Attention
- Scaled Dot-Product Attention
- General Attention
- Multi-Head Attention

## Author

SeungHyun Lee @whsqkaak

Contacts: whsqkaak@naver.com
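As a reference point for the mechanisms listed above, here is a minimal sketch of scaled dot-product attention in PyTorch. The function name and signature are illustrative, not necessarily those used in this repository:

```python
import torch
import torch.nn.functional as F


def scaled_dot_product_attention(query, key, value, mask=None):
    """Compute scaled dot-product attention.

    query, key, value: tensors of shape (batch, seq_len, d_k).
    mask: optional boolean/0-1 tensor broadcastable to (batch, seq_len, seq_len),
          where 0 marks positions to ignore.
    """
    d_k = query.size(-1)
    # Similarity scores between queries and keys, scaled by sqrt(d_k)
    scores = torch.matmul(query, key.transpose(-2, -1)) / (d_k ** 0.5)
    if mask is not None:
        scores = scores.masked_fill(mask == 0, float("-inf"))
    # Normalize scores into attention weights along the key dimension
    attn_weights = F.softmax(scores, dim=-1)
    # Weighted sum of values
    output = torch.matmul(attn_weights, value)
    return output, attn_weights
```

Dropping the `1 / sqrt(d_k)` scaling factor yields plain dot-product attention; the scaling keeps the softmax inputs in a numerically stable range as `d_k` grows.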