fix: Embed ResponseModel converter middleware in service API #1804
Conversation
I think returning a pydantic instance only from the functions decorated with check_api_params_v2 would be enough. I'm not sure we really need a middleware for this issue. How about just reverting the response type and value of the delete function instead of adding a middleware for converting responses?
I reverted and tried deleting the service model, and it worked fine.
https://github.com/lablup/backend.ai/pull/1752/files#diff-d21fbe306a9b579a2491f344fc4de5eff8407147aac0c18568f4ca95880b27c2L409-R577
Reverted the return type to web_response.StreamResponse as is.

This reverts commit 1dd2c33.
Thank you for reviewing. But I think a middleware is a good fit here, because all service APIs should be filtered by a certain function, and this is not a global middleware.
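The comment above argues for a scoped (non-global) converter middleware. As a minimal sketch of the idea, not the actual backend.ai implementation, such an aiohttp middleware could convert pydantic model return values into JSON responses so that handlers without check_api_params_v2 can still return a ResponseModel directly (all names here are illustrative):

```python
# Hypothetical sketch: an aiohttp middleware that converts pydantic
# model return values into JSON responses, so handlers that are not
# decorated with check_api_params_v2 can still return a model directly.
from aiohttp import web
from pydantic import BaseModel


@web.middleware
async def pydantic_response_middleware(request, handler):
    result = await handler(request)
    if isinstance(result, BaseModel):
        # model_dump() returns a dict, which json_response can serialize;
        # model_dump_json() would return a JSON *string* instead.
        return web.json_response(result.model_dump())
    return result  # already a StreamResponse; pass through unchanged
```

Because it is attached only to the service API sub-application (not the root app), it filters exactly the handlers it needs to, which is the point made above.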
Co-authored-by: Kyujin Cho <kyujin.cho@lablup.com>
follow-up #1752

Currently we have the check_api_params_v2 decorator, which checks a request's parameters and serializes the response before returning it. But some API functions are not decorated with it and just return a ResponseModel, which raises an error. So, this PR embeds a ResponseModel converter middleware in the service API.

Plus, this fixes some pydantic API usage when converting a ResponseModel into a web response:
- TypeAdapter.dump_json() to TypeAdapter.dump_python(), because dump_json() returns serialized bytes.
- BaseModel.model_dump_json() to BaseModel.model_dump(), because model_dump_json() returns a JSON string, not a dict.
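The two substitutions above can be demonstrated directly. This quick check of the pydantic v2 return types shows why the *_json variants were the wrong choice when a plain Python object is needed for the web response:

```python
# Why the PR swaps the pydantic dump calls: the *_json variants return
# serialized text/bytes, while dump_python()/model_dump() return plain
# Python objects suitable for a JSON response encoder.
from pydantic import BaseModel, TypeAdapter


class Item(BaseModel):
    name: str
    count: int


item = Item(name="gpu", count=2)

# BaseModel.model_dump_json() -> JSON string; model_dump() -> dict
assert isinstance(item.model_dump_json(), str)
assert item.model_dump() == {"name": "gpu", "count": 2}

# TypeAdapter.dump_json() -> serialized bytes; dump_python() -> dict
adapter = TypeAdapter(Item)
assert isinstance(adapter.dump_json(item), bytes)
assert adapter.dump_python(item) == {"name": "gpu", "count": 2}
```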
Checklist: (if applicable)

- docs directory