INVALID_ARGUMENT when passing JSON Schema via responseFormat to GPT_4o.createChatCompletion

I’m trying to enforce a strict JSON schema on my LLM responses by using the responseFormat.jsonSchema parameter in the Palantir Foundry wrapper. I built my schema as a plain object, serialized it to a string, and wrapped it in a FunctionsMap<string,string>, but the call fails with INVALID_ARGUMENT.

I’ve confirmed that my FunctionsMap import is correct and that the map serializes under a "values" key, but the wrapper still rejects the argument. Has anyone successfully passed a custom JSON schema into responseFormat.jsonSchema? What is the exact map shape that the Foundry LLM wrapper expects? Any guidance or a working example would be greatly appreciated.
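For reference, here is roughly how I'm building the payload before handing it to the wrapper. The schema fields (title, score) and the variable names are just illustrative, and the "values"-keyed map shape is only my guess based on what I see in the serialized output, not something I've confirmed the wrapper accepts:

```typescript
// A plain JSON Schema object describing the structured output I want.
// (Field names here are placeholders for my real schema.)
const schema = {
    type: "object",
    properties: {
        title: { type: "string" },
        score: { type: "number" },
    },
    required: ["title", "score"],
};

// Serialize to a string, since FunctionsMap<string, string> only holds strings.
const schemaString = JSON.stringify(schema);

// The shape I *think* the wrapper wants: a map that serializes under a
// "values" key, with the stringified schema as an entry. Whether this is
// what responseFormat.jsonSchema actually expects is exactly my question.
const mapPayload = { values: { schema: schemaString } };

console.log(JSON.stringify(mapPayload));
```

If the wrapper expects entries under a different key, or the raw schema object instead of a string, that would explain the INVALID_ARGUMENT.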

Code snippet of what I want to achieve:

        const response = await GPT_4o.createChatCompletion({
            messages: [
                {
                    role: "user",
                    contents: [{ text: prompt }],
                }
            ],
            params: {
                responseFormat: {
                    type: "json_object",
                    // TODO: Add schema
                    // jsonSchema: {...}
                }
            }
        });