Symbolic Transformers #37
LukasZahradnik
announced in
Announcements
Replies: 3 comments 9 replies
-
This will be huge, we really need to find ways to propagate it...
-
I was just checking that nothing had changed since October, and then I found this! Amazing :) Do you have any results or benchmarks for this new feature?
-
@joaquincabezas We only have benchmarks for GNN models, although those need to be updated, as we have made a lot of performance improvements. But I'm considering adding benchmarks for transformers.
Additions:
- Transformer, TransformerEncoder, and TransformerDecoder modules.
- Attention and MultiheadAttention modules.
- PositionalEncoding module.
- Addition (add), subtraction (sub), and modulo (mod) operations.

Changes:
- Variable/V factory - only the first letter is capitalized.

This discussion was created from the release Symbolic Transformers.
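The release notes don't include usage snippets, so as a library-agnostic illustration of what an Attention module computes, here is the standard scaled dot-product attention in plain Python. This is a generic sketch of the well-known formula, not PyNeuraLogic's actual implementation; the function and argument names are my own.

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of scores."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def scaled_dot_product_attention(Q, K, V):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V.

    Q, K, V are lists of row vectors (lists of floats).
    """
    d_k = len(K[0])
    out = []
    for q in Q:
        # Similarity of this query to every key, scaled by sqrt(d_k).
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d_k) for k in K]
        weights = softmax(scores)
        # Weighted average of the value vectors.
        out.append([sum(w * v[j] for w, v in zip(weights, V)) for j in range(len(V[0]))])
    return out

# Two identical keys -> uniform weights -> output is the mean of the values.
result = scaled_dot_product_attention(
    [[1.0, 0.0]],                  # one query
    [[1.0, 0.0], [1.0, 0.0]],      # two identical keys
    [[1.0, 2.0], [3.0, 4.0]],      # two values
)
# result == [[2.0, 3.0]]
```

A MultiheadAttention module runs several of these in parallel on learned projections of Q, K, and V and concatenates the results.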
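Likewise, a PositionalEncoding module typically injects position information via the sinusoidal scheme from the original transformer paper. The sketch below shows that scheme in plain Python; it is an assumption that PyNeuraLogic's module follows the same formula, and the function name is hypothetical.

```python
import math

def positional_encoding(seq_len: int, dim: int) -> list:
    """Sinusoidal positional encoding:

    PE[pos, 2i]   = sin(pos / 10000^(2i / dim))
    PE[pos, 2i+1] = cos(pos / 10000^(2i / dim))
    """
    pe = [[0.0] * dim for _ in range(seq_len)]
    for pos in range(seq_len):
        for i in range(0, dim, 2):
            angle = pos / (10000 ** (i / dim))
            pe[pos][i] = math.sin(angle)       # even dimensions: sine
            if i + 1 < dim:
                pe[pos][i + 1] = math.cos(angle)  # odd dimensions: cosine
    return pe

enc = positional_encoding(seq_len=4, dim=8)
# Position 0 alternates sin(0) = 0.0 and cos(0) = 1.0 across dimensions.
```

Because each dimension uses a different wavelength, every position gets a distinct vector that the attention layers can use to recover token order.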