Run Inference on Selected Modelling Objective in DevOps Deployment

I have created a series of models and a modelling objective, and I have selected a model and deployed it using batch inference. However, I am also using DevOps, and am not able to package the modelling objective or the batch model in DevOps.

Ideally, a batch deployment could be packaged in DevOps. In the meantime, is it possible to run transform inference with the selected model from a DevOps deployment? I can wire up directly to the backing model, but that completely bypasses the modelling objective's controls.

Hi @devonyates,

Modeling Objectives are indeed not supported in DevOps, and using a ModelInput in a transform is the preferred option when the use case needs to be packaged via DevOps.

The ModelInput class accepts a model_version parameter, which can be copy-pasted from the left sidebar on the model page and looks something like ri.models.main.model-version.efec6a27-149c-4bdc-b6f0-99754f4812f0. If set, the inference job will use that version instead of the latest on the current branch. That way, the exact model version used to run inference can still be controlled, although not with the full Objectives feature set.
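For reference, here is a minimal sketch of what such a transform could look like, assuming the standard palantir_models transforms API. The dataset and model paths are placeholders, and the df_out output name is an assumption that depends on how the model adapter's API is defined.

```python
from transforms.api import Input, Output, transform
from palantir_models.transforms import ModelInput


@transform(
    predictions_output=Output("/Project/datasets/predictions"),  # placeholder path
    features_input=Input("/Project/datasets/features"),          # placeholder path
    model_input=ModelInput(
        "/Project/models/my_model",                              # placeholder path
        # Pinning the version means the DevOps package always runs this exact
        # model, rather than the latest version on the current branch
        model_version="ri.models.main.model-version.efec6a27-149c-4bdc-b6f0-99754f4812f0",
    ),
)
def compute(predictions_output, features_input, model_input):
    # Run the model adapter's inference over the input dataframe
    inference_outputs = model_input.transform(features_input)
    predictions_output.write_dataframe(inference_outputs.df_out)
```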

Hope this helps!
Julien

Thanks for that clarity. It's a bit of a bummer, because I used the Jupyter notebooks to develop the model, and will now have to do a fair amount of work to migrate to a repo.

If you’d prefer, the model training code can remain in the Jupyter notebook; only the job that uses the model for inference would need to live in a Code Repository (rather than run as an Objective batch deployment).

I would also recommend setting the use_sidecar parameter of the ModelInput class to True if you are struggling to reconcile the Python dependencies from the Jupyter notebook with those of the repository, as mentioned in the documentation. This runs the model in a separate container, so no Python dependencies would need to be brought into the repository.
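Building on the sketch above, only the ModelInput declaration would change. This assumes use_sidecar is passed as a keyword argument alongside model_version, and the model path is again a placeholder:

```python
from palantir_models.transforms import ModelInput

model_input = ModelInput(
    "/Project/models/my_model",  # placeholder path
    model_version="ri.models.main.model-version.efec6a27-149c-4bdc-b6f0-99754f4812f0",
    # Runs the model in its own sidecar container, so the notebook's Python
    # dependencies never have to be reconciled with the repository's environment
    use_sidecar=True,
)
```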
