Invalid models show up in get_remote_llms function #148

Closed
Whitelisted1 opened this issue Dec 27, 2023 · 0 comments

@Whitelisted1 (Contributor)

When we use the function `chatbot.get_remote_llms()`, it returns the model `mistralai/Mistral-7B-Instruct-v0.1`, which is not available for use and causes an error when selected.

This is shown with the code below:

from src.hugchat import hugchat, login, cli

import logging
logging.basicConfig(level=logging.DEBUG)

EMAIL = "..."

chatbot = hugchat.ChatBot(cookie_path=f"usercookies/{EMAIL}.json")
print([m.id for m in chatbot.get_remote_llms()])
# ['meta-llama/Llama-2-70b-chat-hf', 'codellama/CodeLlama-34b-Instruct-hf', 'tiiuae/falcon-180B-chat', 'mistralai/Mistral-7B-Instruct-v0.1', 'mistralai/Mistral-7B-Instruct-v0.2', 'openchat/openchat-3.5-1210', 'mistralai/Mixtral-8x7B-Instruct-v0.1']

The model does not appear in the HuggingChat GUI, confirming that it should not be included in the function's output.

[Screenshot: HuggingChat GUI model list, which does not include mistralai/Mistral-7B-Instruct-v0.1]
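Until the library is fixed, one workaround is to filter the ids returned by `get_remote_llms()` against the models actually shown in the HuggingChat GUI. The sketch below is a minimal, hypothetical example: the `AVAILABLE` set is illustrative (copied from the GUI at the time of writing), not an official or stable list.

```python
# Hypothetical workaround sketch: keep only model ids that are known to be
# selectable in the HuggingChat GUI. AVAILABLE is an assumed, manually
# maintained allowlist, not data provided by the hugchat library.
AVAILABLE = {
    "meta-llama/Llama-2-70b-chat-hf",
    "codellama/CodeLlama-34b-Instruct-hf",
    "tiiuae/falcon-180B-chat",
    "mistralai/Mistral-7B-Instruct-v0.2",
    "openchat/openchat-3.5-1210",
    "mistralai/Mixtral-8x7B-Instruct-v0.1",
}

def filter_available(model_ids):
    """Drop any model id that is not in the allowlist."""
    return [m for m in model_ids if m in AVAILABLE]
```

With the reproduction code above, this would be used as `filter_available([m.id for m in chatbot.get_remote_llms()])`, which drops the broken `mistralai/Mistral-7B-Instruct-v0.1` entry.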
