Update README.md
ilanaliouchouche authored Nov 25, 2024
1 parent 980335d commit 064a4ec
Showing 1 changed file with 3 additions and 3 deletions.
6 changes: 3 additions & 3 deletions README.md
@@ -1,16 +1,16 @@
# KANBert

- KANBert combines a transformer-based autoencoder architecture with Kolmogorov-Arnold Networks (KANs) for an innovative approach to language understanding tasks.
+ KANBert combines a transformer-based encoder-only architecture with Kolmogorov-Arnold Networks (KANs) for an innovative approach to language understanding tasks.

<p align="center">
<img src="./images/kalmogorov.webp" alt="Kolmogorov" width="100" height="133">
<img src="./images/arnold.jpeg" alt="Arnold" width="100" height="133">
<img src="./images/bert.jpeg" alt="BERT" width="100" height="133">
</p>

- ## Transformer Auto-Encoders
+ ## Transformer Encoders

- Transformer auto-encoders are relatively smaller models compared to their generative counterparts but are highly effective in performing language understanding tasks such as classification, named entity recognition, and embedding generation. The most popular transformer auto-encoder is BERT, and the parameters here are inspired by this model.
+ Transformer encoders are relatively small models compared to their generative counterparts but are highly effective at language understanding tasks such as classification, named entity recognition, and embedding generation. The most popular transformer encoder is BERT, and the parameters here are inspired by this model.
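The core KAN idea the README alludes to is replacing fixed activations with learnable univariate functions on each edge of the layer, whose outputs are summed over the input dimension. A minimal NumPy sketch of one such layer follows; the Gaussian-basis parameterization and all names (`kan_layer`, `coeffs`, `centers`, `width`) are illustrative assumptions, not the repository's actual implementation:

```python
import numpy as np

def kan_layer(x, coeffs, centers, width=1.0):
    """One KAN-style layer (hypothetical sketch, not KANBert's code).

    Each edge (i, j) applies its own learnable univariate function,
    here a weighted sum of fixed Gaussian basis functions; the layer
    output sums these edge functions over the input dimension.

    x:       (batch, d_in) inputs
    coeffs:  (d_in, d_out, n_basis) learnable coefficients
    centers: (n_basis,) fixed basis-function centers
    """
    # basis[b, i, k] = exp(-((x[b, i] - centers[k]) / width)^2)
    basis = np.exp(-((x[..., None] - centers) / width) ** 2)
    # phi[b, i, j] = sum_k coeffs[i, j, k] * basis[b, i, k]
    phi = np.einsum("bik,ijk->bij", basis, coeffs)
    # Kolmogorov-Arnold form: sum the edge functions over input dims
    return phi.sum(axis=1)

rng = np.random.default_rng(0)
x = rng.normal(size=(4, 8))                 # batch of 4, d_in = 8
centers = np.linspace(-2.0, 2.0, 5)         # 5 basis functions
coeffs = rng.normal(size=(8, 16, 5)) * 0.1  # d_out = 16
y = kan_layer(x, coeffs, centers)
print(y.shape)  # (4, 16)
```

In a KAN-augmented encoder, a layer like this would stand in for the usual linear-plus-activation feed-forward sublayer of each transformer block.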

<p align="center">
<img src="images/bert_archi.png" alt="BERT Architecture" width="300">
