A simple example of using Epochraft to train Hugging Face Transformers models with PyTorch FSDP.
🌟 News: We are thrilled to announce the release of two new models: Japanese Stable LM Gamma 7B and Japanese StableLM 3B, both trained using our codebase.
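At its core, the example wires an Epochraft streaming dataset into a Hugging Face causal-LM wrapped in PyTorch FSDP. The sketch below illustrates that pattern only; the Epochraft method names (from_files, tokenize, concat_chunk, batch), the data path, and the hyperparameters are assumptions for illustration, not the repo's actual train.py.

```python
# Illustrative sketch only: the Epochraft method names and batch format are
# assumptions recalled from the Epochraft README; consult the pinned version
# of the library for the real API.
import os

import torch
import torch.distributed as dist
from torch.distributed.fsdp import FullyShardedDataParallel as FSDP
from transformers import AutoModelForCausalLM, AutoTokenizer

from epochraft import CheckpointableDataset

dist.init_process_group(backend="nccl")  # torchrun provides the rendezvous env vars
local_rank = int(os.environ.get("LOCAL_RANK", "0"))
torch.cuda.set_device(local_rank)

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2").cuda()
model = FSDP(model)  # shard parameters, gradients, and optimizer state across ranks

dataset = (
    CheckpointableDataset.from_files("data/*.jsonl")  # hypothetical data location
    .tokenize(tokenizer)   # text -> token ids
    .concat_chunk(1024)    # pack tokens into fixed-length sequences
    .batch(8)              # collate into batches (assumed to be dicts of tensors)
)

optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)
for step, batch in enumerate(dataset):
    input_ids = batch["input_ids"].cuda()
    loss = model(input_ids=input_ids, labels=input_ids).loss
    loss.backward()
    optimizer.step()
    optimizer.zero_grad()
    if step >= 1000:  # arbitrary stopping point for this sketch
        break
```

Epochraft datasets are checkpointable (they expose a resumable iteration state), which is the main reason to pair them with FSDP checkpoints for long pretraining runs.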
Install the package, then launch train.py with a YAML config:

pip install -e .
python train.py gpt2_testrun.yaml                        # 1 GPU
torchrun --nproc-per-node=8 train.py gpt2_testrun.yaml   # 8 GPUs
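Both commands run the same entry point: torchrun starts one worker per GPU and exports RANK, WORLD_SIZE, LOCAL_RANK, MASTER_ADDR, and MASTER_PORT for each of them. A script can support both launch modes with a guard like the following (a sketch assuming train.py sets up distributed state itself; the helper name is hypothetical):

```python
import os

import torch
import torch.distributed as dist


def init_distributed() -> int:
    """Initialize the default process group and return the local rank.

    Under torchrun the rendezvous env vars are already set. When launched with
    plain `python`, fill in single-process defaults so a 1-GPU "world" still
    works with FSDP.
    """
    os.environ.setdefault("RANK", "0")
    os.environ.setdefault("WORLD_SIZE", "1")
    os.environ.setdefault("LOCAL_RANK", "0")
    os.environ.setdefault("MASTER_ADDR", "127.0.0.1")
    os.environ.setdefault("MASTER_PORT", "29500")
    dist.init_process_group(backend="nccl")
    local_rank = int(os.environ["LOCAL_RANK"])
    torch.cuda.set_device(local_rank)
    return local_rank
```

With a guard like this, the single-GPU and 8-GPU commands go through the same code path, differing only in world size.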
For development, install the extra dependencies and run the type checker, formatters, and linter:

pip install -e .[development]
mypy .; black .; flake8 .; isort .