Piping ( | ) to yai #53
Comments
Hello 👋 Thanks for the feedback. Piping is already supported; see the second CLI example in the docs: https://ekkinox.github.io/yai/examples/#cli-mode
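For reference, piping in CLI mode looks roughly like this (the prompts below are just illustrative; the linked docs have the canonical example):

```shell
# Pipe a file into yai and ask about it (illustrative prompt)
cat error.log | yai "explain what is going wrong in this log"

# Pipe another command's output into yai (illustrative prompt)
git diff | yai "summarize these changes"
```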
As for the previous-context feature request, it's something I need to think about: sending the accumulated previous context to OpenAI could very quickly hit the max-token limit and generate additional OpenAI API usage costs for the user. Maybe we could make it configurable, with a sensible limit by default, in the yai config file.
Thanks. I tried piping but it didn't work; I'll make sure I have the latest release and try again. I'm not worried about the context getting too big, because most sessions are pretty short: I may get in and out of yai 10 times, and each time it's usually no more than a command line or a short paragraph, so all together it's about a page. I would let the user set the rolling window size, for example a minimum of 3 sessions back, or more up to a maximum number of tokens. A token counter would also be nice.
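In the meantime, since piping works, I could probably approximate carrying context between sessions with something like this (the `~/.yai_context` path and the prompts are just a sketch, and it assumes CLI mode prints its answer to stdout):

```shell
# Save one session's answer to a scratch file (hypothetical path)
yai "draft a short deployment checklist" | tee ~/.yai_context

# Feed that saved context back into the next session
cat ~/.yai_context | yai "using the checklist above, write the rollback steps"
```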
Piping has been available since 0.2.0. OK for your suggestion, flagged as a feature request.
Thanks :)
@ekkinox thoughts?
Hi there, I really like your project :)
It is the one I use most often, alongside ai-shell, which I also like.
I wanted to request a feature to allow data piping in and out of the AI.
Please draw inspiration from openai_pipe 👈
There are really good examples showcased in its main README.
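Something along these lines is what I have in mind (the commands below are just illustrative, borrowing the spirit of openai_pipe, and assume the CLI output is plain text suitable for redirection):

```shell
# Pipe data in, get the answer out, and redirect it onward (illustrative)
cat main.go | yai "add doc comments to this code" > main_commented.go

# Chain with other tools in a pipeline
curl -s https://example.com/api/status | yai "summarize this JSON" | less
```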
Also, please add context from the previous session (I can imagine hotkeys to save the context, load it from a file, and reset it).
It's really annoying to exit and re-enter, only to find it has forgotten the last thing we talked about.
Thanks a lot and all the best!