
Degraded performance for proxying requests responding with Server Sent Events #918

Closed
noyoshi opened this issue Feb 13, 2024 · 3 comments
Labels: bug (Something isn't working), stale (All issues that are marked as stale due to inactivity)

Comments


noyoshi commented Feb 13, 2024

Report

When the connection that the HTTP add-on is proxying carries server-sent events, we see odd behavior: the events are all buffered and returned in several large chunks rather than as a continuous stream.

Expected Behavior

The events come back just as they would without the proxy sitting between the client and the server: as a continuous stream.

Actual Behavior

The events come back in several large chunks, all at once, rather than as a stream.

Steps to Reproduce the Problem

We are doing the following:

The server is a model running with this: https://github.com/predibase/lorax

The server has an endpoint, /generate_stream, that returns a stream of events using SSE.

If you put the KEDA proxy in front of a pod running a lorax model and call the /generate_stream endpoint, you get the described behavior.
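The buffering described above is the classic failure mode of a reverse proxy sitting in front of an SSE endpoint. As an illustration only (this is not the add-on's actual code), here is a minimal Go sketch showing the configuration an httputil.ReverseProxy needs to pass events through unbuffered; the backend handler is a stand-in for lorax's /generate_stream.

```go
package main

import (
	"bufio"
	"fmt"
	"net/http"
	"net/http/httptest"
	"net/http/httputil"
	"net/url"
	"time"
)

// streamThroughProxy stands up a toy SSE backend (a stand-in for
// /generate_stream), fronts it with a Go reverse proxy, and returns
// the event lines read back through the proxy.
func streamThroughProxy() []string {
	backend := httptest.NewServer(http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
		w.Header().Set("Content-Type", "text/event-stream")
		flusher := w.(http.Flusher)
		for i := 0; i < 3; i++ {
			fmt.Fprintf(w, "data: event %d\n\n", i)
			flusher.Flush() // the backend pushes each event out immediately
			time.Sleep(10 * time.Millisecond)
		}
	}))
	defer backend.Close()

	target, _ := url.Parse(backend.URL)
	proxy := httputil.NewSingleHostReverseProxy(target)
	// A negative FlushInterval tells the proxy to flush after every
	// write to the client. If a proxy instead buffers the body (the
	// symptom reported here), events pile up and arrive in large chunks.
	proxy.FlushInterval = -1

	front := httptest.NewServer(proxy)
	defer front.Close()

	resp, err := http.Get(front.URL)
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()

	var events []string
	scanner := bufio.NewScanner(resp.Body)
	for scanner.Scan() {
		if line := scanner.Text(); line != "" {
			events = append(events, line)
		}
	}
	return events
}

func main() {
	for _, e := range streamThroughProxy() {
		fmt.Println(e)
	}
}
```

Whether the add-on's interceptor flushes per write is exactly the kind of behavior that could differ between 0.5.0 and later versions.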

Logs from KEDA HTTP operator

example

HTTP Add-on Version

0.5.0

Kubernetes Version

1.28

Platform

Other

Anything else?

No response

@noyoshi added the bug label on Feb 13, 2024
@JorTurFer (Member) commented Feb 14, 2024

Hello,
Could you confirm that you are using version 0.5.0? I'm checking the diff between 0.5.0 and 0.7.0, and there are several improvements related to the routing and the streaming:


stale bot commented Apr 14, 2024

This issue has been automatically marked as stale because it has not had recent activity. It will be closed in 7 days if no further activity occurs. Thank you for your contributions.

The stale bot added the stale label on Apr 14, 2024

stale bot commented Apr 22, 2024

This issue has been automatically closed due to inactivity.

The stale bot closed this as completed on Apr 22, 2024