Transformer playground

This is my playground for a Transformer seq2seq model implementation in PyTorch. The implementation is largely based on the excellent The Annotated Transformer guide, then refactored, updated for PyTorch 0.4, and commented more thoroughly.

Update:

  • BLEU evaluation has been added.
  • A Beam Search implementation, inspired by the one in OpenNMT, has been added.
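To illustrate the idea behind the beam search update, here is a minimal, framework-agnostic sketch (not the repository's actual code): at each step, every live hypothesis is expanded with its possible next tokens, and only the `beam_size` highest-scoring sequences (by cumulative log-probability) are kept. The `step_fn` callback and token names below are hypothetical stand-ins for a model's next-token distribution.

```python
import math

def beam_search(step_fn, start_token, eos_token, beam_size=3, max_len=10):
    """Generic beam search.

    step_fn(prefix) -> list of (token, log_prob) continuations of prefix.
    Returns the highest-scoring sequence ending in eos_token (or the best
    unfinished beam if none finished within max_len steps).
    """
    beams = [([start_token], 0.0)]  # (sequence, cumulative log-probability)
    finished = []
    for _ in range(max_len):
        candidates = []
        for seq, score in beams:
            if seq[-1] == eos_token:        # hypothesis already complete
                finished.append((seq, score))
                continue
            for token, logp in step_fn(seq):
                candidates.append((seq + [token], score + logp))
        if not candidates:
            break
        # Keep only the beam_size best partial hypotheses.
        beams = sorted(candidates, key=lambda b: b[1], reverse=True)[:beam_size]
    finished.extend(b for b in beams if b[0][-1] == eos_token)
    if not finished:
        finished = beams
    return max(finished, key=lambda b: b[1])[0]

# Toy next-token distribution standing in for a trained model.
def toy_step(prefix):
    if prefix[-1] == "<s>":
        return [("a", math.log(0.6)), ("b", math.log(0.4))]
    if prefix[-1] == "a":
        return [("b", math.log(0.9)), ("</s>", math.log(0.1))]
    return [("</s>", math.log(1.0))]

print(beam_search(toy_step, "<s>", "</s>"))  # → ['<s>', 'a', 'b', '</s>']
```

Greedy decoding would follow only the single best token at each step; keeping several beams lets a lower-probability first token win when its continuation scores better overall.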