LLMs in the Model Catalog do not support all options

I’ve noticed that some options described in the public documentation, such as structured outputs, are not available for the models found in the Model Catalog. For example: https://ai.google.dev/gemini-api/docs/structured-output#javascript. The only parameters defined for the Gemini family of models in AIP are:

type Parameters = {
    "stopSequences"?: Array<string> | undefined;
    "temperature"?: FunctionsApi.Double | undefined;
    "maxTokens"?: FunctionsApi.Integer | undefined;
    "topP"?: FunctionsApi.Double | undefined;
};

Ideally, we could use the SDK from Google, OpenAI, or a similar package like Vercel’s AI SDK. Is there a reason we are forced to use the proxy API in AIP? Is there a workaround?
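For context, this is roughly what the structured-output request from the linked Google docs looks like. The field names (`responseMimeType`, `responseSchema`) come from Google’s public Gemini REST API, not from AIP, and the schema itself is a made-up example; the point is that the AIP `Parameters` type above has no equivalent for the fields in the middle:

```typescript
// Sketch of a Gemini structured-output request body (shape per Google's
// public docs; the recipe schema is an invented example).
const body = {
  contents: [{ parts: [{ text: "List two popular cookie recipes." }] }],
  generationConfig: {
    // No equivalent for these in the AIP Parameters type:
    responseMimeType: "application/json",
    responseSchema: {
      type: "ARRAY",
      items: {
        type: "OBJECT",
        properties: {
          recipeName: { type: "STRING" },
          ingredients: { type: "ARRAY", items: { type: "STRING" } },
        },
        required: ["recipeName"],
      },
    },
    // The proxy-supported knobs map onto fields like these:
    temperature: 0.2,
    maxOutputTokens: 1024,
  },
};
```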

You can either register the model via Bring Your Own Model, or you can call it as an external system from your Functions to get the full suite of functionality.
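The external-system route could look something like the following hypothetical sketch. The endpoint shape and query-param auth follow Google’s public REST docs; the helper name and everything around it are assumptions, and the request is only built here, not actually sent:

```typescript
// Hypothetical sketch: building a direct Gemini call for use from a
// Function configured with egress to Google's API. Endpoint shape per
// Google's public REST docs; nothing here is an AIP-specific API.
const GEMINI_ENDPOINT = "https://generativelanguage.googleapis.com/v1beta/models";

function buildGeminiRequest(model: string, apiKey: string, body: unknown) {
  return {
    url: `${GEMINI_ENDPOINT}/${model}:generateContent?key=${apiKey}`,
    init: {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify(body),
    },
  };
}

// Usage (not executed here): fetch(req.url, req.init)
const req = buildGeminiRequest("gemini-1.5-flash", "<API_KEY>", {
  contents: [{ parts: [{ text: "hello" }] }],
});
```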

The issue with the one-size-fits-all approach of AIP is that what one model doesn’t support, none can expose, so some API- or model-specific functionality must be sacrificed on the altar of broad support.

This shouldn’t stop you, however.

Have you tried defining this schema as the output of e.g. an AIP Logic function? I’m not fully sure what you are looking for in the Gemini API that this output format can’t deliver, so feel free to elaborate if there’s more to your use case than structured output.
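If you do stay on the proxy API, a common workaround is to ask for JSON in the prompt and validate it yourself on the way out. A minimal, hypothetical sketch (the `Recipe` shape and helper name are invented; the fence-stripping handles models that wrap JSON in markdown):

```typescript
// Hypothetical workaround: request JSON via the prompt, then strip any
// markdown code fences the model adds and validate the parsed result.
interface Recipe {
  recipeName: string;
  ingredients: string[];
}

function parseRecipes(raw: string): Recipe[] {
  // Models often wrap JSON in ```json fences; remove them first.
  const cleaned = raw
    .replace(/^```(?:json)?\s*/m, "")
    .replace(/```\s*$/m, "")
    .trim();
  const data = JSON.parse(cleaned);
  if (!Array.isArray(data)) throw new Error("expected a JSON array");
  return data.map((r) => {
    if (typeof r.recipeName !== "string" || !Array.isArray(r.ingredients)) {
      throw new Error("response does not match the expected schema");
    }
    return { recipeName: r.recipeName, ingredients: r.ingredients };
  });
}

const sample =
  '```json\n[{"recipeName":"Shortbread","ingredients":["flour","butter","sugar"]}]\n```';
const recipes = parseRecipes(sample);
```

This loses the server-side guarantee that real structured-output support gives you, but it keeps the schema enforcement in your own code.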