[Feature Request] Mixtral training support #31
Labels
feature request
good first issue
on roadmap
For reference, LLaMA-Factory claims that their toolkit can QLoRA fine-tune Mixtral with 28 GB of VRAM.
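
For context, a minimal QLoRA setup for Mixtral using the standard transformers + peft + bitsandbytes stack might look like the sketch below (not this repo's API; the model id, LoRA rank, and target modules are illustrative assumptions):

```python
# Hedged sketch: QLoRA fine-tuning setup for Mixtral with the
# transformers + peft + bitsandbytes stack. Hyperparameters are
# illustrative, not a recommendation from this project.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig
from peft import LoraConfig, get_peft_model, prepare_model_for_kbit_training

model_id = "mistralai/Mixtral-8x7B-v0.1"  # assumed base model

# 4-bit NF4 quantization is what keeps the base weights small enough
# to approach the ~28 GB VRAM figure cited above.
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_use_double_quant=True,
    bnb_4bit_compute_dtype=torch.bfloat16,
)

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    quantization_config=bnb_config,
    device_map="auto",
)
model = prepare_model_for_kbit_training(model)

# LoRA adapters on the attention projections; target_modules is an
# assumption and may need adjusting for Mixtral's MoE layers.
lora_config = LoraConfig(
    r=16,
    lora_alpha=32,
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],
    lora_dropout=0.05,
    bias="none",
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora_config)
model.print_trainable_parameters()  # only the adapters are trainable
```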