A Triton inference server might be useful for the open-source models
https://github.com/triton-inference-server
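For reference, a minimal sketch of what a client call could look like with Triton's HTTP client, assuming a deployed text model named `starcoder` with a `text` input and a `generated_text` output (the model and tensor names are placeholders, not an actual config):

```python
# Sketch: query a Triton server over its HTTP API.
# Model name and tensor names are placeholders.
import numpy as np
import tritonclient.http as httpclient

client = httpclient.InferenceServerClient(url="localhost:8000")

# Check that the server is reachable before sending requests.
assert client.is_server_live()

prompt = np.array([b"def fibonacci(n):"], dtype=np.object_)
inp = httpclient.InferInput("text", prompt.shape, "BYTES")
inp.set_data_from_numpy(prompt)

result = client.infer(model_name="starcoder", inputs=[inp])
print(result.as_numpy("generated_text"))
```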
There is also an inference server by Hugging Face: https://github.com/huggingface/text-generation-inference. We may get an instance soon with StarCoder.
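If we do get an instance, querying it should only need a POST to its `/generate` endpoint. A minimal sketch, with the instance URL as a placeholder:

```python
# Sketch: call a text-generation-inference instance via /generate.
import requests

TGI_URL = "http://localhost:8080"  # placeholder for the actual instance

payload = {
    "inputs": "def fibonacci(n):",
    "parameters": {"max_new_tokens": 64, "temperature": 0.2},
}
resp = requests.post(f"{TGI_URL}/generate", json=payload, timeout=60)
resp.raise_for_status()
print(resp.json()["generated_text"])
```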
@GGmorello has set up RepairLLama on Hugging Face Spaces, thanks to our ZeroGPU account.
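If the Space exposes a Gradio API, it can be called programmatically with `gradio_client`. A sketch, where the Space id and the argument are placeholders for the real interface:

```python
# Sketch: call a Gradio Space programmatically.
# Space id and API signature are placeholders.
from gradio_client import Client

client = Client("user/repairllama-space")  # placeholder Space id
result = client.predict("buggy code snippet here", api_name="/predict")
print(result)
```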
@FredBonux is able to use Mixtral and Llama via Groq for free; see https://www.groq.com
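Groq exposes an OpenAI-compatible API, so the standard `openai` client works with a different base URL. A sketch, with the model id as a placeholder for whichever Mixtral/Llama variant Groq currently serves and `GROQ_API_KEY` set in the environment:

```python
# Sketch: call Groq through its OpenAI-compatible endpoint.
import os
from openai import OpenAI

client = OpenAI(
    api_key=os.environ["GROQ_API_KEY"],
    base_url="https://api.groq.com/openai/v1",
)

completion = client.chat.completions.create(
    model="mixtral-8x7b-32768",  # placeholder; check Groq's model list
    messages=[{"role": "user", "content": "Explain what a patch diff is."}],
)
print(completion.choices[0].message.content)
```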