This repository has been archived by the owner on Sep 13, 2023. It is now read-only.

Inferring types at save time is not always feasible #691

Open
aguschin opened this issue Jul 11, 2023 · 0 comments
Labels: feedback (User's feedback), serialization (Dumping and loading Python objects)

Comments

@aguschin (Contributor)

Type inference is nice, but it requires running the model on the machine that packages it. This is a problem if I want to package a model that requires a GPU on a machine that does not have one (e.g. in CI/CD), or whose GPU is too small to run the model with the parameters we want in production.
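A minimal sketch of the failure mode, assuming MLEM's `mlem.api.save` accepts a `sample_data` argument that triggers a forward pass for type inference; the model, shapes, and device below are illustrative assumptions, not taken from this issue:

```python
import torch
from mlem.api import save

# A model that only runs on a GPU (illustrative; any CUDA-bound model
# with production-sized parameters hits the same problem).
model = torch.nn.Linear(512, 10).to("cuda")
sample_data = torch.randn(1, 512).to("cuda")

# save() runs the model on sample_data to infer input/output types.
# On a CPU-only CI runner (or one whose GPU is too small), the
# .to("cuda") calls above and the inference pass inside save() fail,
# so the model cannot be packaged on that machine.
save(model, "model", sample_data=sample_data)
```

If `sample_data` is optional in `save`, omitting it would skip the inference pass, but the saved model would then lack a typed signature, which is presumably what the inference exists to provide.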

@aguschin added the serialization (Dumping and loading Python objects) and feedback (User's feedback) labels on Jul 11, 2023