diff --git a/README.md b/README.md
index 02d760e..57ddb20 100644
--- a/README.md
+++ b/README.md
@@ -1,6 +1,6 @@
 # KANBert
-KANBert combines a transformer-based autoencoder architecture with Kolmogorov-Arnold Networks (KANs) for an innovative approach to language understanding tasks.
+KANBert combines a transformer-based encoder-only architecture with Kolmogorov-Arnold Networks (KANs) for an innovative approach to language understanding tasks.

@@ -8,9 +8,9 @@ KANBert combines a transformer-based autoencoder architecture with Kolmogorov-Ar

-## Transformer Auto-Encoders
+## Transformer Encoders
-Transformer auto-encoders are relatively smaller models compared to their generative counterparts but are highly effective in performing language understanding tasks such as classification, named entity recognition, and embedding generation. The most popular transformer auto-encoder is BERT, and the parameters here are inspired by this model.
+Transformer encoders are relatively smaller models compared to their generative counterparts but are highly effective in performing language understanding tasks such as classification, named entity recognition, and embedding generation. The most popular transformer encoder is BERT, and the parameters here are inspired by this model.