PHP BPE Text Encoder / Decoder for GPT-2 / GPT-3
Implementation of GPT from scratch. Designed to be lightweight and easy to modify.
Easy-to-use scripts to fine-tune GPT-2-JA with your own texts, to generate sentences, and to tweet them automatically.
This project generates blog posts using Natural Language Processing, Hugging Face Transformers, and the GPT-2 model.
ChatKeke - a simple GPT-2 web chat (web frontend & Flask+TensorFlow backend)
Scripts to easily generate tweets with a GPT-2 Japanese model fine-tuned on your timeline.
A simple CLI chat-mode framework for local GPT-2 TensorFlow models
Cricket commentary generation using Video Vision Transformer and GPT-2
Programming assignments covering fundamentals of machine learning and deep learning. These were completed as part of the Plaksha Tech Leaders Fellowship program.
GPT-2 Language Model for Thai Poem Generation in Phra-Aphai-Manee Style
Saint Valentine's GPT-2 letters.
Train your own Hugging Face language model
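Several of the projects above build on byte-pair encoding (BPE), the subword tokenization scheme GPT-2 uses before any text reaches the model. As a rough illustration, here is a minimal sketch of how BPE merge rules are applied to a word; the merge table and the word are toy examples, not the actual GPT-2 vocabulary:

```python
# Minimal sketch of byte-pair encoding (BPE) merges, the tokenization
# scheme GPT-2 uses. The merge table below is a hypothetical toy example,
# not the real GPT-2 vocabulary.

def get_pairs(symbols):
    """Return the set of adjacent symbol pairs in a token."""
    return set(zip(symbols, symbols[1:]))

def bpe_encode(word, merges):
    """Repeatedly merge the highest-priority adjacent pair in `word`.

    `merges` maps a pair of symbols to its rank; lower rank merges first.
    """
    symbols = list(word)
    while len(symbols) > 1:
        pairs = get_pairs(symbols)
        # pick the mergeable pair with the best (lowest) rank
        candidate = min(pairs, key=lambda p: merges.get(p, float("inf")))
        if candidate not in merges:
            break  # no more applicable merge rules
        merged, i = [], 0
        while i < len(symbols):
            if i < len(symbols) - 1 and (symbols[i], symbols[i + 1]) == candidate:
                merged.append(symbols[i] + symbols[i + 1])
                i += 2
            else:
                merged.append(symbols[i])
                i += 1
        symbols = merged
    return symbols

# toy merge table: rank = priority (lower merges first)
merges = {("l", "o"): 0, ("lo", "w"): 1, ("e", "r"): 2}
print(bpe_encode("lower", merges))  # ['low', 'er']
```

The real GPT-2 tokenizer works the same way but first maps raw bytes to printable characters and uses a learned table of roughly 50,000 merges.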