EDiT

About The Project

This repository provides code for training Diffusion Transformer (DiT) models.

  • Improved DiT training script built on Hugging Face accelerate (see the launch sketch below)
  • Support for multiple models
  • Code that is easy to modify for your own research
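
For example, multi-GPU training is typically launched through the accelerate CLI. A minimal sketch, assuming the repository exposes a train.py entry point with DiT-style --model and --data-path arguments (hypothetical names, not confirmed by this README):

accelerate launch --multi_gpu train.py --model DiT-XL/2 --data-path /path/to/imagenet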

Supported Models

The following models are supported for training and inference:

  • DiT
  • SiT
  • REPA
  • Large-DiT

Getting Started

Installation

# Create and activate the conda environment, then install dependencies
conda create -n edit python=3.11 -y
conda activate edit
pip install -r requirements.txt
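
As a quick sanity check that the environment is usable (assuming PyTorch is pulled in by requirements.txt, which DiT training requires):

python -c "import torch; print(torch.__version__, torch.cuda.is_available())"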

Acknowledgments

  • DiT: Scalable Diffusion Models with Transformers
  • fast-DiT: Scalable Diffusion Models with Transformers
  • SiT: Exploring Flow and Diffusion-based Generative Models with Scalable Interpolant Transformers
  • REPA: Representation Alignment for Generation: Training Diffusion Transformers Is Easier Than You Think
  • Large-DiT: Large Diffusion Transformer
  • PixArt-α: Fast Training of Diffusion Transformer for Photorealistic Text-to-Image Synthesis
  • LlamaGen: Autoregressive Model Beats Diffusion: Llama for Scalable Image Generation
