Support for Mixtral 8x7B #146
Comments
Hi, this code may meet your demand.

```python
# Get the available models (not hardcoded)
models: list = chatbot.get_available_llm_models()

# Switch model by index
chatbot.switch_llm(0)  # Switch to the first model
chatbot.switch_llm(1)  # Switch to the second model
```
['meta-llama/Llama-2-70b-chat-hf', 'OpenAssistant/oasst-sft-6-llama-30b-xor', 'codellama/CodeLlama-34b-Instruct-hf', 'tiiuae/falcon-180B-chat']

Those are the only ones available from chatbot.get_available_llm_models(). Is there a way to have Mixtral 8x7B show up here?
What version of hugchat are you using? Check with `pip3 show hugchat`.
So sorry, I just updated and it's option 5 on the list.
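Since the thread shows the model's position in the list shifting between hugchat versions, looking the index up by name is more robust than hard-coding it. The sketch below is a hypothetical helper (not part of hugchat's API); it assumes, as in the list quoted above, that the available models can be represented as name strings:

```python
# Hypothetical helper: pick a model index by (partial) name instead of a
# hard-coded position, since the ordering can change between hugchat versions.
def find_model_index(model_names, target):
    """Return the index of the first model whose name contains `target`."""
    for i, name in enumerate(model_names):
        if target.lower() in name.lower():
            return i
    raise ValueError(f"no model matching {target!r}")

# Example using the model list quoted above, with Mixtral appended
# as reported after updating hugchat:
names = [
    'meta-llama/Llama-2-70b-chat-hf',
    'OpenAssistant/oasst-sft-6-llama-30b-xor',
    'codellama/CodeLlama-34b-Instruct-hf',
    'tiiuae/falcon-180B-chat',
    'mistralai/Mixtral-8x7B-Instruct-v0.1',
]
idx = find_model_index(names, "Mixtral")
# chatbot.switch_llm(idx)  # then switch using the looked-up index
```

If hugchat's model objects are not plain strings, you would first map them to their name attribute before searching.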
Not sure how easy it is to add models, but this one is proving the best so far, and is available on Hugging Chat.
Model: mistralai/Mixtral-8x7B-Instruct-v0.1