Expose GTO's model version in FastAPI's interface.json
#665
Labels
- customer: Request from customer
- serialization: Dumping and loading Python objects
- serve: Serving models
Follow-up to #664. Sometimes it is desirable to store predictions along with the specific model version that produced them. There are at least two ways to support that in MLEM:

1. Return the version in every response, e.g. `{"prediction": [0.4, 0.6], "version": "0.1.3"}`. I've seen some generic ML frameworks doing this IIRC.
2. Expose it in `interface.json` - we already have the MLEM version there, so adding the model version looks logical.
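For illustration, a rough sketch of the two shapes as Python dicts (the `version` / `model_version` field names are hypothetical, not an existing MLEM schema):

```python
# Option 1: version embedded in every prediction response (hypothetical shape).
prediction_response = {
    "prediction": [0.4, 0.6],
    "version": "0.1.3",  # GTO model version that produced this prediction
}

# Option 2: version exposed once in interface.json, next to the MLEM version
# that is already there (`model_version` is a hypothetical field name).
interface_json = {
    "mlem_version": "<current MLEM version>",
    "model_version": "0.1.3",
    # ...existing method/argument descriptions...
}
```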
Regarding how we get this info into the service, there are again two approaches:
1. Via `mlem.api.save` (at save time).
2. Via the `server` (at serve time).
The first seems more reasonable to me. Since this will require some under-the-hood integration with GTO, I'd do it after #664, which has the same decision to make.
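A minimal sketch of that save-time approach, assuming a hypothetical `version` argument on `mlem.api.save` (it does not exist today; the server would read the stored value and expose it in `interface.json`):

```python
from mlem.api import save

def classify(x: float) -> int:
    """Toy stand-in for a real model."""
    return int(x > 0.5)

# Hypothetical `version` argument: store the GTO model version alongside the
# model's metadata at save time, so the FastAPI server can surface it later.
save(classify, "models/classify", version="0.1.3")
```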
fyi @omesser