Does functions v2 support importing LLMs from the model catalog?

I just bootstrapped a new functions v2 repo (the recommended option), and I don't see an option to import a model like gpt5-mini. How can I use models from the model catalog in functions v2?

Looking at the available libraries, this doesn't appear to be fully implemented yet.

A workaround is to register the bare LLM calls as TSv1 functions and call those; in practice you'll run out of input context long before you hit an OOM in TSv1 anyway. It's annoying, though, since it splits development across multiple repos.
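To illustrate the shape of that workaround, here is a minimal sketch of a bare LLM call you could expose as a TSv1 function. Everything here is an assumption: the endpoint and environment-variable names (`MODEL_ENDPOINT_URL`, `MODEL_API_KEY`), the chat-completion-style request/response schema, and the helper names are all placeholders, not the platform's real API.

```typescript
// Hypothetical sketch: a bare LLM HTTP call suitable for registering
// as a TSv1 function. Endpoint, env vars, and response shape are
// placeholders, not a documented API.

type ChatMessage = { role: "system" | "user" | "assistant"; content: string };

// Build the request body for a chat-completion-style endpoint.
export function buildChatRequest(model: string, messages: ChatMessage[]) {
  return { model, messages };
}

// The bare call: POST directly to an assumed model-serving endpoint.
export async function callModel(
  model: string,
  messages: ChatMessage[]
): Promise<string> {
  const res = await fetch(process.env.MODEL_ENDPOINT_URL!, {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${process.env.MODEL_API_KEY}`,
    },
    body: JSON.stringify(buildChatRequest(model, messages)),
  });
  if (!res.ok) throw new Error(`model call failed: ${res.status}`);
  const data = await res.json();
  // Assumes an OpenAI-style response shape.
  return data.choices[0].message.content;
}
```

You'd then register `callModel` through whatever TSv1 function-registration mechanism your repo uses, and invoke it from the v2 side.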
