Is your feature request related to a problem? Please describe.
InstructLab 0.19.3 and its dependencies cap PyTorch at <2.4.0. Intel Gaudi software 1.18.0 was just released and ships with a custom build of PyTorch 2.4.0. We need the new release for its performance improvements and better Gaudi 3 support.
Describe the solution you'd like
Please lift the restrictions to either torch>=2.3.0,<2.5.0 or torch>=2.4.0,<2.5.0.
Additional context
I'm aware this is not a trivial change. IIRC, some instructlab subpackages carry an upper version limit as well, and several third-party packages such as vLLM and flash-attn may need to be updated too. It might be easiest to start with torch>=2.3.0,<2.5.0 everywhere, then set torch>=2.4.0,<2.5.0 in the instructlab CLI last; a rough sketch of that staged relaxation is below.
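For illustration only, the staged change could look roughly like the following. The file names and existing pins here are assumptions for the sake of the example, not the actual contents of the InstructLab repositories:

    # Hypothetical step 1: relax the pin in the subpackages first
    # (e.g. training/eval requirements files), keeping 2.3.x working
    torch>=2.3.0,<2.5.0

    # Hypothetical step 2: once vLLM, flash-attn, and the subpackages
    # are verified against 2.4.x, raise the floor in the CLI so the
    # Gaudi 1.18.0 Torch 2.4.0 build is actually picked up
    torch>=2.4.0,<2.5.0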