Pinning package versions in code workspaces

Hello,

I am finding it impossible to pin versions of packages in Code Workspaces. Inevitably (on refresh/restart) they always seem to revert to the latest versions, even with version numbers pinned in meta.yaml.

I’m finding this to be particularly true for palantir_models and palantir_models_serializers, which currently have a latest version that produces environment errors for me.

Any guidance on how to do this? I would prefer not to pin at all, but the updated versions of palantir_models have created problems.

Latest version of palantir_models results in:

ERROR ERROR: No matching distribution found for palantir-models==0.1484.0
INFO Installing pip environment
ERROR × Failed to install environment, exit status: "1"

Best,
Tom

Also, it looks like you are using a “-” instead of a “_” in the package name; I think that might actually be your problem…

Hey Tom,

You might want to try using the visual UI under “Libraries” to install it with your desired version. It should use Conda to install it and the “models-assets” dependencies that it requires.

I have found this, in general, to work pretty well for me and my projects.

-Guy

Hi Tom,

Installing in the UI may only pin a lower bound (e.g. palantir_models>=0.1484.0), but you can get an exact pin using a single “=” in meta.yaml or by running the appropriate command. For example, for Jupyter® notebooks with managed environments, run this command in the Terminal:

maestro env conda install palantir_models=0.1483.0 palantir_models_serializers=0.1483.0
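For reference, the equivalent exact pin directly in the requirements block of meta.yaml would look something like this (a sketch using the package names from this thread; the version number is illustrative):

```yaml
requirements:
  run:
  # single "=" is conda's exact-version match; pip would use "=="
  - palantir_models=0.1483.0
  - palantir_models_serializers=0.1483.0
```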

Some versions of palantir_models, like 0.1484.0, might not be available in your environment, but you can see which are available to choose from in the Packages UI, as described above.

Also, would you mind sharing the details of the environment error you’re getting with the latest version of palantir_models and palantir_models_serializers so we can investigate it?

Thank you
Julien

Thanks all. A couple of additional pieces of info:

Regarding “-” vs “_”: this seems to be a difference between how the packages are named in conda vs pip. I currently have these installed only via conda, but they also appear in the pip block in hawk.yml. Sometimes the packages have transiently appeared in the PyPI tab when searching, but they always appear in the conda tab. To be honest, what appears in these search results feels very inconsistent from search to search, and I don't trust it. I don't know why these packages would be installed via both conda and pip, but perhaps there is a reason for it.
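On the “-” vs “_” question specifically: pip treats hyphens, underscores, and dots in package names as interchangeable (PEP 503 name normalization), so palantir-models and palantir_models refer to the same pip package, whereas conda package names are matched literally. A minimal sketch of pip's normalization rule:

```python
import re

def normalize(name: str) -> str:
    """PEP 503 name normalization: runs of -, _, . collapse to a single
    hyphen, and the name is lowercased."""
    return re.sub(r"[-_.]+", "-", name).lower()

# Both spellings resolve to the same canonical pip name.
print(normalize("palantir_models"))   # palantir-models
print(normalize("palantir-models"))   # palantir-models
```

This is why the “-”/“_” spelling alone should not cause a pip resolution failure; the failure here comes from the package not being published to the pip index at all.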

Regarding the pinning in meta.yaml: I have done this, because I noticed that the install button in the Libraries panel always uses a >= constraint. However, even with the version pinned in meta.yaml, what is resolved/installed is always the latest package version, and I see this in the build failure.

The environment: nothing too custom here. Basically a default Jupyter workspace with the addition of cplex, docplex, and scikit-learn.

package:
  name: '{{ PACKAGE_NAME }}'
  version: '{{ PACKAGE_VERSION }}'
source:
  path: ../src
requirements:
  run:
  - jupyter_core>=5.7.1
  - palantir_models
  - palantir_models_serializers
  - numpy
  - ipykernel
  - pip
  - foundry-transforms-lib-python
  - pandas
  pip:
  - scikit-learn
  - docplex
  - cplex

Update:

There is weirdness here: the palantir_models libraries, though added through conda (and existing in conda), are being added to the pip portion of the meta.yaml and lock files.

I was able to get around my issue by:

  1. pinning the versions of palantir_models and palantir_models_serializers in the conda section of meta.yaml
  2. deleting the references to any palantir_models packages from the pip section of the hawk.lock and hawk.lockbak files
  3. deleting the packages from the pip block of the meta.yaml.bak file
  4. reinitializing the environment / restarting the workspace

These packages don't exist on the pip index, which was the source of my build error. However, they were never installed with pip, so they should never have been put in any pip blocks in the first place.
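Putting the steps above together, the working requirements block ended up looking roughly like this (a sketch; only the two palantir_models lines changed from the meta.yaml posted earlier, and 0.1483.0 is the pinned version from this thread):

```yaml
requirements:
  run:
  # step 1: exact conda pins (single "=")
  - palantir_models=0.1483.0
  - palantir_models_serializers=0.1483.0
  pip:
  # steps 2-3: no palantir_models entries anywhere in the pip blocks
  - scikit-learn
  - docplex
  - cplex
```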
