Set up remote ML logging to different workspaces for AzureML (dev/prod) #853
Doing CI integrations (at least on a PR basis) seems quite difficult, as secrets would be part of such testing. In any case, part of me is becoming more convinced that making this project dependent on any specific logging backend is wrong. Maybe it's better to introduce some sort of "forwarder" option, where the model builder POSTs the metadata to an endpoint if one is provided in the config. Users of Gordo can then decide their logic entirely within their own implementation. For example, in infrastructure we could contain all our AzureML-specific code in one service which accepts such POST requests and logs them using AzureML. That would also give us the opportunity to do integration testing, since those secrets should be available to PRs, given that all forks are also private (I assume, but could be wrong).
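The forwarder idea above could look something like the following minimal sketch. Everything here is an assumption for illustration: the function names (`build_payload`, `forward_metadata`), the payload envelope, and the config key providing the endpoint are all hypothetical, not part of Gordo's actual API.

```python
# Hypothetical sketch of the "forwarder" option: the model builder POSTs its
# build metadata to an endpoint if one is configured, and stays entirely
# agnostic about what the receiver does with it (AzureML, mlflow, etc.).
import json
import urllib.request


def build_payload(model_name, metadata):
    """Wrap the build metadata in a small envelope the receiver can route on."""
    return {"model": model_name, "metadata": metadata}


def forward_metadata(payload, endpoint=None):
    """POST the payload to the configured endpoint; no-op when unset."""
    if endpoint is None:
        return None  # no forwarder configured in the config; skip silently
    body = json.dumps(payload).encode("utf-8")
    request = urllib.request.Request(
        endpoint,
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(request) as response:
        return response.status
```

With this shape, the AzureML-specific service mentioned above is just one possible receiver of these POSTs, and can live (and be tested) outside this repository.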
I like the sound of that. It goes in line with @epa095's idea of having a metadata ingestion service in the deployment.
Status update: Igor is setting up two new workspaces, prod and dev, to set mlflow up with.
Suggested approach: Create two (?) new workspaces, dev and prod. Make prod and dev overlays in the gordo-infrastructure app, so whether a cluster is dev or prod is chosen on a cluster-to-cluster basis.
Create workspaces for the dev and test environments, and perhaps an analysis environment as well.
Along with this, we could perhaps have a CI integration test that sends some metadata to the test environment, to make sure alpha AzureML updates don't break the logging.
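The per-cluster dev/prod choice described above could be surfaced to the logging code roughly as follows. This is only a sketch under stated assumptions: the environment variable name `GORDO_ENVIRONMENT` and the workspace names are invented for illustration; in practice the overlay in gordo-infrastructure would set whatever flag the deployment actually uses.

```python
# Hypothetical sketch: resolve which ML-logging workspace to use based on an
# environment flag set per cluster by the dev/prod overlay. The variable name
# and workspace names below are assumptions, not Gordo's real configuration.
import os

# One workspace per environment, chosen by the overlay on a per-cluster basis.
WORKSPACES = {
    "dev": "gordo-ml-dev",
    "prod": "gordo-ml-prod",
}


def resolve_workspace(env=None):
    """Map the cluster's environment name to its logging workspace."""
    env = env or os.environ.get("GORDO_ENVIRONMENT", "dev")
    if env not in WORKSPACES:
        raise ValueError(f"Unknown environment: {env!r}")
    return WORKSPACES[env]
```

A CI integration test along the lines suggested above could then call this with the test environment's flag and send throwaway metadata to the resulting workspace, catching alpha AzureML breakages before they hit prod.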