API error when trying to dynamically define transform profiles

Even if you solve the networking issue, your desired setup can’t work with your suggested logic.

What you’re expecting is that the code you have before the transform will be evaluated at runtime with each build, but that is not the case!

Foundry datasets always have a jobSpec that contains all the necessary information for a build, e.g. the resources that should be attached to it, and the jobSpec is only updated when a commit is made to your transformation logic in a code repo :frowning:
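To make that concrete, here is a minimal sketch of the pattern you’re attempting (dataset paths and profile names are placeholders, not your actual setup): the `pick_profile()` call sits outside the transform, so it is evaluated when the commit is checked, and whatever it returns is frozen into the jobSpec instead of being re-evaluated on every scheduled build.

```python
import datetime

from transforms.api import configure, transform_df, Input, Output


def pick_profile():
    # Runs at CI/commit time, NOT at job runtime: the result is baked
    # into the jobSpec until the next commit to the repo.
    if datetime.date.today().weekday() == 0:  # "Monday" at commit time, not at build time
        return ["EXECUTOR_MEMORY_LARGE"]
    return ["EXECUTOR_MEMORY_MEDIUM"]


@configure(profile=pick_profile())  # resolved once per commit, then fixed
@transform_df(
    Output("/Project/folder/output_dataset"),
    source=Input("/Project/folder/input_dataset"),
)
def compute(source):
    return source
```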

You can pull dynamic information at runtime, but only from within the transform itself.
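For contrast, a sketch of what *is* re-evaluated on every build (again with placeholder paths): anything inside the compute function runs at job runtime, so you can branch on the current date or the input data, but you can no longer change the Spark profiles or attached resources from there, since those are already fixed in the jobSpec.

```python
import datetime

from transforms.api import transform_df, Input, Output


@transform_df(
    Output("/Project/folder/output_dataset"),
    source=Input("/Project/folder/input_dataset"),
)
def compute(source):
    # Executed with every build, so this condition is genuinely dynamic.
    if datetime.date.today().weekday() == 0:
        return source            # placeholder for the heavier Monday branch
    return source.limit(0)       # placeholder for the cheap branch
```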

There is a way around it, though. It’s a bit of an ugly solution, but it works; I applied it successfully to a different project with different requirements, but it still makes sense to share.

You need a way to automatically commit and merge code to the repo. I used Logic flows with an API call that creates a file in a folder whenever I need to update the jobSpec.

Potential scenario: let’s say you need more resources every Monday because you want to retrain an ML model.

An API call from a different process creates a file → a logic flow creates a PR that is automatically merged → your jobSpec is updated, because the function that sets the parameters is re-evaluated. A sketch of that last step follows below.
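Here is a hedged sketch of that re-evaluation step, assuming the automatically merged PR changes a small flag file (`resource_flag.txt`) committed next to the transform; the file name, folder layout, and profile names are all illustrative. Because the merge lands a new commit on the branch, the module-level function runs again and the jobSpec picks up the new profiles.

```python
from pathlib import Path

from transforms.api import configure, transform_df, Input, Output

# Hypothetical flag file that the automated PR rewrites.
_FLAG_PATH = Path(__file__).parent / "resource_flag.txt"


def profiles_from_flag():
    # Re-evaluated only when a commit lands on the branch,
    # e.g. via the auto-merged PR created by the logic flow.
    if _FLAG_PATH.exists() and _FLAG_PATH.read_text().strip() == "large":
        return ["EXECUTOR_MEMORY_LARGE"]
    return ["EXECUTOR_MEMORY_MEDIUM"]


@configure(profile=profiles_from_flag())
@transform_df(
    Output("/Project/folder/output_dataset"),
    source=Input("/Project/folder/input_dataset"),
)
def compute(source):
    return source
```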

Keep in mind this will only work with very loose merge policies on your main branch, since you want the merge to happen automatically. So again: technically possible, but it only makes sense for specific use cases and is not the generic solution you’re probably searching for. Maybe future automations could be more useful for your case, and hopefully at some point we’ll be able to define the automations ourselves :crossed_fingers: :pray:

In your case I would go with @arochat’s suggestion and use dynamic allocation, and not over-engineer it too much :see_no_evil:
