Issue Running Fine-Tuning Script in Chapter 5 #36

Open
adamscarlat opened this issue Jan 15, 2025 · 0 comments
First of all, I want to thank you for writing such a great book. I’ve really been enjoying it so far, and the explanations are clear and insightful. It’s been a pleasure working through the material, and I’ve already learned a lot.

I’m currently working through Chapter 5 and tried to run the fine-tuning script at the end of the chapter. However, I ran into some issues:

  1. I couldn’t run the script locally, since my machine (macOS, Apple M2) has no CUDA-capable GPU.
  2. I tried running it on both Kaggle and Colab with GPU-enabled environments, but in both cases I couldn’t install the required dependencies. Specifically, the installation of flash-attn got stuck at this step:

Building wheels for collected packages: flash-attn

I waited about an hour and gave up.
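For context, the stall happens because flash-attn compiles its CUDA kernels from source when no prebuilt wheel matches the environment, and the parallel build can exhaust the RAM on a Colab/Kaggle instance. The flash-attn README documents a workaround using `MAX_JOBS` and `--no-build-isolation`; a sketch of it (the `MAX_JOBS` value of 4 is an illustrative choice, not something from the book):

```shell
# Install ninja first: flash-attn's source build is much faster with it.
pip install ninja

# Cap the number of parallel compile jobs so the build doesn't exhaust RAM,
# and skip build isolation so the already-installed torch is used.
# MAX_JOBS=4 is an illustrative value; tune it to the machine's RAM.
MAX_JOBS=4 pip install flash-attn --no-build-isolation
```

Even with this, the build can take a long time on a free-tier GPU instance, so a prebuilt wheel matching the environment's CUDA/torch/Python versions (when one exists) is usually the faster path.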

I’m wondering if there’s a better way to run the script, or if you could share how you got it running during your own testing. Any guidance would be greatly appreciated.

Thanks again for the great book and all the effort you’ve put into it—I’m really looking forward to the remaining chapters.
