Have a parameter to backup or move chats from chat_cache #121
Closed
tan-wei-xin-alez started this conversation in Ideas
Replies: 1 comment
-
I'm getting an error after using the `--chat` parameter for a while, and I think it's because of the token limit described at https://help.openai.com/en/articles/4936856-what-are-tokens-and-how-to-count-them

Correct me if I'm wrong, but I think the entire history is being sent every time `--chat` is used, to preserve context? The error goes away when I remove all of the previous messages (except the 1st prompt and response).

Would it make sense to have a `--backup-chat` or `--move-chat` parameter to perform the above actions automatically? Or is something similar already being worked on to address this issue?
-
Currently OpenAI has several GPT models with increased token limits that would partially solve the issue. Beyond that, this seems to be a uniquely personal concern, so I'm closing this issue to keep the focus on broader user needs. Please feel free to reopen it if additional relevant information arises.
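For illustration, the backup-and-trim behaviour the question describes could be sketched roughly as below. Everything here is an assumption rather than the tool's actual implementation: the cache path `/tmp/chat_cache`, the on-disk format (one JSON list of messages per chat file), and the function name `backup_and_trim` are all hypothetical.

```python
import json
import shutil
from pathlib import Path

# Hypothetical locations: the real tool's cache directory may differ.
CHAT_CACHE = Path("/tmp/chat_cache")
BACKUP_DIR = Path("/tmp/chat_cache_backup")

def backup_and_trim(chat_id: str, keep: int = 2) -> None:
    """Copy a chat file to a backup directory, then keep only the first
    `keep` messages (e.g. the 1st prompt and its response) in the live
    file, so subsequent --chat calls send far fewer tokens."""
    src = CHAT_CACHE / chat_id
    BACKUP_DIR.mkdir(parents=True, exist_ok=True)
    shutil.copy2(src, BACKUP_DIR / chat_id)       # full history preserved
    messages = json.loads(src.read_text())        # assumed: JSON list of messages
    src.write_text(json.dumps(messages[:keep]))   # trim the live chat
```

A `--backup-chat` flag could call something like this before each request, while `--move-chat` would use `shutil.move` instead of `copy2` to archive the chat entirely.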