Issues: BerriAI/litellm

Issues list

test
#7701 opened Jan 12, 2025 by mess14
[Bug]: Azure OpenAI O1 not working due to Stream error (label: bug)
#7693 opened Jan 11, 2025 by niklasfink
[Feature]: Support wildcard route tab on UI (label: enhancement)
#7692 opened Jan 11, 2025 by krrishdholakia
[Bug]: Can't get proxy working with Ollama (label: bug)
#7686 opened Jan 11, 2025 by jruokola
[Bug]: Files missing in docker Litellm (label: bug)
#7649 opened Jan 9, 2025 by daniyalsaif200
Failed to use deepseek
#7646 opened Jan 9, 2025 by sdw777
[Bug]: How to use Local downloaded Hugging face model (label: bug)
#7645 opened Jan 9, 2025 by mustangs0786
v1.56.3-stable is missing from ghcr (label: bug)
#7642 opened Jan 9, 2025 by trashhalo
[Bug]: #7594 broke typing on Router.acompletion (label: bug)
#7641 opened Jan 9, 2025 by jamesbraza
[Bug]: bedrock fallback not working (label: bug)
#7637 opened Jan 8, 2025 by zegerius
[Feature]: gemini-1.5-flash supporting empty tool args (label: enhancement)
#7634 opened Jan 8, 2025 by jamesbraza