A replication of Diffusion-LM Improves Controllable Text Generation
This repository is a replication of Diffusion-LM Improves Controllable Text Generation (Li et al., 2022).
To run this codebase, you need PyTorch and Transformers.
You also need to prepare pretrained weights for BERT models if you want to save some GPU hours. I recommend the following models:
https://huggingface.co/prajjwal1/bert-mini
https://huggingface.co/bert-base-uncased
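
Below is a minimal sketch of how such pretrained weights can be downloaded and loaded with the Transformers library. The model name and usage shown here are illustrative; the actual entry points of this codebase may consume the weights differently.

```python
# Minimal sketch: load one of the recommended pretrained BERT checkpoints
# with the transformers library (illustrative only; this codebase's own
# scripts may load the weights differently).
from transformers import AutoModel, AutoTokenizer

model_name = "prajjwal1/bert-mini"  # or "bert-base-uncased"

tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModel.from_pretrained(model_name)

# Quick check: encode a sentence with the pretrained encoder.
inputs = tokenizer("Diffusion-LM improves controllable text generation.",
                   return_tensors="pt")
outputs = model(**inputs)
print(outputs.last_hidden_state.shape)  # (batch, sequence_length, hidden_size)
```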
Enjoy!