The Gemini series of models is proxied by Functions' generic chat completions:
import { Function } from "@foundry/functions-api";
// NOTE: Model imports must be manually added through "Resource Imports" in the left-hand panel
import { Gemini_2_5_Flash } from "@foundry/models-api/language-models";

export class MyFunctions {
    /**
     * Sends a text completion request to the model based on user input.
     * @param {string} userInput - Text input to send to the model
     */
    @Function()
    public async createGenericChatCompletion(userInput: string): Promise<string | undefined> {
        const response = await Gemini_2_5_Flash.createGenericChatCompletion({
            params: {
                "temperature": 0,
                "maxTokens": 1000,
            },
            messages: [
                { role: "SYSTEM", contents: [{ text: "This is the system prompt for the completion" }] },
                { role: "USER", contents: [{ text: userInput }] },
            ],
        });
        return response.completion;
    }
}
I examined the generated types for this model and there does not appear to be support for JSON schemas:
import * as FunctionsApi from "@foundry/functions-api";
import * as FunctionsExperimentalApi from "@foundry/functions-experimental-api";
import * as FunctionsInternalApi from "@foundry/functions-typescript-runtime-internal-api";

export type GenericChatCompletionResponseTypeRef = {
    "completion": string;
    "tokenUsage": TokenUsageTypeRef;
};
export type TokenUsageTypeRef = {
    "maxTokens": FunctionsApi.Long;
    "promptTokens": FunctionsApi.Long;
    "completionTokens": FunctionsApi.Long;
};
export type GenericMediaContentTypeRef = {
    "mimeType": string;
    "content": string;
};
export type GenericMessageContentTypeRef = {
    "text"?: string | undefined;
    "genericMedia"?: GenericMediaContentTypeRef | undefined;
};
export type GenericMessageTypeRef = {
    "role": string;
    "contents": Array<GenericMessageContentTypeRef>;
};
export type ParametersTypeRef = {
    "stopSequences"?: Array<string> | undefined;
    "temperature"?: FunctionsApi.Double | undefined;
    "maxTokens"?: FunctionsApi.Integer | undefined;
    "topP"?: FunctionsApi.Double | undefined;
};

declare const functions: FunctionsInternalApi.FunctionsGlobal;
export async function createGenericChatCompletion(parameters: {
    "params"?: ParametersTypeRef | undefined;
    "messages": Array<GenericMessageTypeRef>;
}): Promise<GenericChatCompletionResponseTypeRef> {
    return functions.functionsQueryProvider.execute("ri.function-registry.main.function.80524dd6-4c41-4486-92f4-79ddc0a53135", "1.107.0", parameters);
}
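Without schema support, the only workaround is to ask for JSON in the system prompt and validate the parsed completion by hand on the client side. A minimal sketch of that validation layer, assuming a hypothetical EditOp shape (the type guard and parser below are illustrative, not part of the Foundry API):

```typescript
// Hypothetical target shape we want the model to emit.
interface EditOp {
    path: string;
    op: "insert" | "replace" | "delete";
    value?: string;
}

// Hand-rolled runtime guard -- this is the work a server-side
// JSON schema (structured output) would otherwise do for us.
function isEditOp(x: unknown): x is EditOp {
    if (typeof x !== "object" || x === null) return false;
    const o = x as Record<string, unknown>;
    return (
        typeof o.path === "string" &&
        (o.op === "insert" || o.op === "replace" || o.op === "delete") &&
        (o.value === undefined || typeof o.value === "string")
    );
}

// Parse the raw completion, tolerating markdown fences the model may add,
// and fail loudly if the result does not match the expected shape.
function parseEditOps(completion: string): EditOp[] {
    const stripped = completion.replace(/^```(?:json)?\s*|\s*```$/g, "").trim();
    const parsed = JSON.parse(stripped);
    const ops = Array.isArray(parsed) ? parsed : parsed?.ops;
    if (!Array.isArray(ops) || !ops.every(isEditOp)) {
        throw new Error("Model returned JSON that does not match the EditOp shape");
    }
    return ops;
}
```

This is exactly the fragile glue code that structured outputs eliminate: every caller has to maintain a guard like this, and a single malformed completion still costs a failed request and a retry.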
Below is an example of using JSON schema with Google’s SDK:
import { GoogleGenAI } from '@google/genai';
import { getTokenomics } from './utils';
import type {
    EditOp,
    Tokenomics,
} from '@codestrap/developer-foundations-types';
import { EditOpsJsonSchema } from '@codestrap/developer-foundations-types';

export async function generateEditOps(
    user: string,
    system: string,
): Promise<{ ops: EditOp[]; tokenomics: Tokenomics }> {
    // Configure the client
    const ai = new GoogleGenAI({});
    // Configure generation settings
    const config = {
        // tool use with JSON output is unsupported
        responseMimeType: 'application/json',
        responseJsonSchema: EditOpsJsonSchema,
        temperature: 0,
    };
    // Make the request
    const response = await ai.models.generateContent({
        model: 'gemini-2.5-flash',
        contents: `${system}
${user}`,
        config,
    });
    const ops = JSON.parse(response.text || '{"ops": []}').ops as EditOp[];
    const tokenomics = getTokenomics(response, 'gemini-2.5-flash');
    return {
        ops,
        tokenomics,
    };
}
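For reference, the responseJsonSchema field accepts a standard JSON Schema object. A hypothetical sketch of what a schema like EditOpsJsonSchema could look like (the actual export from @codestrap/developer-foundations-types may differ):

```typescript
// Hypothetical JSON Schema constraining the model to an { ops: EditOp[] } object.
// The real EditOpsJsonSchema may differ; this only illustrates the shape.
const EditOpsJsonSchemaSketch = {
    type: "object",
    properties: {
        ops: {
            type: "array",
            items: {
                type: "object",
                properties: {
                    path: { type: "string" },
                    op: { type: "string", enum: ["insert", "replace", "delete"] },
                    value: { type: "string" },
                },
                required: ["path", "op"],
            },
        },
    },
    required: ["ops"],
};
```

With a schema like this attached to the request, the model is constrained to emit conforming JSON, so the JSON.parse call above can be trusted without any client-side guard code.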
Why this matters: without the ability to use JSON schemas, you are asking developers to take a significant hit in the stability of their solutions. Structured outputs drive out almost all errors related to JSON output. The problem is so severe that in many cases I have to remove the use of AIP altogether.
LLM proxies for the Gemini family of models would unlock a lot of value here. Trying to keep pace with the APIs the labs create is going to be an almost impossible task: you are on the hook to support all the features and to provide the documentation. Providing proxies is a better DevEx and likely much lower cost to maintain. If an LLM proxy is not possible, when will JSON schemas be supported for the Gemini family?