
[Feature Request]: Background sync of models #54

Open
sdmorrey opened this issue Aug 2, 2024 · 4 comments
Assignees: Neet-Nestor
Labels: enhancement (New feature or request)

Comments

@sdmorrey

sdmorrey commented Aug 2, 2024

Problem Description

When I start a new chat, I often want to talk to a different model than the one I used last. At present this is handled via a dropdown with an inscrutable list of names I have to mentally translate. Worse, I have to remember which one(s) I've already downloaded. This is a huge amount of cognitive load before my morning coffee kicks in.

While that's not so bad, what makes it truly problematic is that at present we don't start downloading a model until it's needed for the first time.

On my end, at least, the download process doesn't seem to detect when it has been interrupted.

The end result is that if I want to chat with a new model, I have to sit and stare while I wait for it to finish loading, and I often have to manually stop and restart the download. Frequently this happens because I selected the wrong variant of a model I had already downloaded.

Solution Description

A better solution would be a configuration option with a list of models available for download, their sizes in GB, and a description or at least a link to their HF page. A simple checkbox next to each would do. From there, the selected model(s) would download in the background, and it would be lovely if the app could restart failed downloads automatically.

This would replace the current model selector in chat with a list of only the available, already-downloaded models. That way I don't start chatting with a model I don't actually have and then wait ages for it to download.
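
For what it's worth, here is a rough sketch of the kind of model registry and background sync loop I have in mind. All names and types below are hypothetical illustrations, not taken from the actual codebase:

```ts
// Hypothetical sketch of the proposed model registry and background downloader.
// None of these names come from the real project; they only illustrate the idea.

interface ModelEntry {
  id: string;           // e.g. a model variant identifier shown in the dropdown
  sizeGB: number;       // approximate download size shown in the settings UI
  description: string;  // short human-readable summary
  hfUrl: string;        // link to the model's Hugging Face page
  selected: boolean;    // checkbox state: user wants this model kept locally
  downloaded: boolean;  // true once the full model is cached
}

// Background sync loop: retry failed or interrupted downloads automatically.
async function syncModels(
  registry: ModelEntry[],
  download: (id: string) => Promise<void>, // resumes or restarts one model download
  maxRetries = 3,
) {
  for (const model of registry.filter((m) => m.selected && !m.downloaded)) {
    for (let attempt = 1; attempt <= maxRetries; attempt++) {
      try {
        await download(model.id);
        model.downloaded = true;
        break;
      } catch (err) {
        console.warn(`Download of ${model.id} failed (attempt ${attempt}):`, err);
      }
    }
  }
}

// The chat screen's model dropdown would then only list models that are ready:
const availableForChat = (registry: ModelEntry[]) =>
  registry.filter((m) => m.downloaded).map((m) => m.id);
```

The settings checkbox would simply toggle `selected`, and the sync loop could run on startup or whenever the network comes back, so models are ready before I ever open a chat.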

Alternatives Considered

No response

Additional Context

Great job so far, by the way!

@sdmorrey sdmorrey added the enhancement New feature or request label Aug 2, 2024
@Neet-Nestor Neet-Nestor self-assigned this Aug 3, 2024
@Neet-Nestor
Collaborator

Neet-Nestor commented Aug 3, 2024

Thanks for the suggestion. This is very similar to something already on our roadmap, and thank you for adding some valuable points to it.

Due to the considerable amount of work involved, it won't land very soon, but it will be there sometime in the future :)

@sdmorrey
Author

sdmorrey commented Aug 3, 2024

Maybe I can pitch in soon. I've been planning to build something like a BitTorrent for AI for a while now.

@Marinacarro

parla in italiano (speak in Italian)

@Issues-translate-bot

Bot detected that the issue body's language is not English and translated it automatically.


speaks in Italian
