
Flux2-from-scratch

This repository implements the Flux2 model from scratch, specifically focusing on training the Flux2 Transformer. To simplify the process, I'm leveraging the existing AutoEncoder and Text Encoder.

The base implementation is taken from the official black-forest-labs/flux2 repository.

Note: I'll explain the entire implementation in detail on my blog once the project is complete.
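The training setup described above — reusing the pretrained AutoEncoder and Text Encoder while training only the Flux2 Transformer — can be sketched roughly as follows. Note that the class names, layer shapes, and module internals here are hypothetical placeholders for illustration, not the repository's actual implementations:

```python
import torch
from torch import nn

# Hypothetical stand-ins: in the real project, the AutoEncoder and Text
# Encoder are loaded from existing pretrained checkpoints, and only the
# Flux2 Transformer is trained from scratch.
class AutoEncoder(nn.Module):
    def __init__(self):
        super().__init__()
        self.proj = nn.Linear(8, 4)

    def encode(self, x):
        return self.proj(x)

class TextEncoder(nn.Module):
    def __init__(self):
        super().__init__()
        self.embed = nn.Linear(16, 4)

    def forward(self, tokens):
        return self.embed(tokens)

class Flux2Transformer(nn.Module):
    def __init__(self):
        super().__init__()
        self.block = nn.TransformerEncoderLayer(
            d_model=4, nhead=2, batch_first=True
        )

    def forward(self, latents, text):
        # Joint attention over image latents and text embeddings.
        return self.block(torch.cat([latents, text], dim=1))

ae, te, model = AutoEncoder(), TextEncoder(), Flux2Transformer()

# Freeze the pretrained components; only the transformer gets gradients.
for frozen in (ae, te):
    frozen.requires_grad_(False)
    frozen.eval()

optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)
```

Only `model.parameters()` is handed to the optimizer, so the frozen encoder weights stay untouched even if gradients were accidentally enabled elsewhere.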

Datasets

The following datasets will be used for this project:

Installation

uv pip install torch==2.9 transformers==4.57.6 \
  https://github.com/mjun0812/flash-attention-prebuild-wheels/releases/download/v0.7.0/flash_attn-2.8.3+cu128torch2.9-cp312-cp312-linux_x86_64.whl \
  flashinfer-python \
  git+https://github.com/FredyRivera-dev/Flux2-from-scratch.git

Note: The command above pulls a pre-compiled Flash Attention wheel for PyTorch 2.9, so you don't have to compile it from source.
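After running the install command, you can quickly confirm the packages are importable. This is a hypothetical sanity check (the import names `flash_attn` and `flashinfer` are assumed from the package names above), using `importlib.util.find_spec` so nothing heavy is actually imported:

```python
import importlib.util

def check_packages(names):
    """Return a dict mapping each package name to whether it can be found."""
    return {name: importlib.util.find_spec(name) is not None for name in names}

# Import names assumed from the install command; adjust if they differ.
status = check_packages(["torch", "transformers", "flash_attn", "flashinfer"])
for name, ok in status.items():
    print(f"{name}: {'found' if ok else 'MISSING'}")
```

`find_spec` returns `None` for packages that aren't installed instead of raising, so the check reports every package in one pass.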
