Attention is all you need

A Keras implementation of the Transformer from "Attention Is All You Need" (Vaswani et al.).

Written after implementing the Compressive Transformer (introduced by Rae et al.), as all of the building blocks were already in place. Furthermore, the original Transformer is much better suited to a Keras implementation than the Compressive Transformer: it keeps no internal state outside its own model's computational graph, nor does it combine multiple different losses for its respective sub-models.
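This statelessness is what makes the model fit naturally into a single Keras computational graph: attention is a pure function of its inputs. As a minimal sketch (using tf.keras; the layer name and signature here are illustrative, not this repository's API), the scaled dot-product attention at the core of the paper can be written as:

```python
import tensorflow as tf
from tensorflow import keras

class ScaledDotProductAttention(keras.layers.Layer):
    """Computes softmax(Q K^T / sqrt(d_k)) V, with no internal state."""

    def call(self, q, k, v, mask=None):
        d_k = tf.cast(tf.shape(k)[-1], tf.float32)
        # Similarity scores between queries and keys, scaled by sqrt(d_k).
        scores = tf.matmul(q, k, transpose_b=True) / tf.sqrt(d_k)
        if mask is not None:
            # Push masked positions towards -inf before the softmax.
            scores += (1.0 - mask) * -1e9
        weights = tf.nn.softmax(scores, axis=-1)
        return tf.matmul(weights, v)

# Example: self-attention over a batch of 2 sequences, length 5, dim 8.
x = tf.random.normal((2, 5, 8))
out = ScaledDotProductAttention()(x, x, x)  # shape (2, 5, 8)
```

Because the layer's output depends only on its inputs, Keras can trace and train the whole model with a single loss, unlike the Compressive Transformer's external memories.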
