Hello All,
I am reaching out to inquire about the capabilities of the LLM libraries in Code Repository, specifically regarding the use of GptChatCompletionRequest. I would like to know whether it is possible to pass structured-output or JSON-mode arguments, similar to those available in the OpenAI Chat Completions API (documented here: https://platform.openai.com/docs/guides/structured-outputs ).
While I have observed that structured outputs can be used within Pipeline Builder and AIP Logic, I have not been able to locate a corresponding option or argument in the palantir_models, language_model_service_api, or transforms-aip libraries.
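For concreteness, the kind of argument I am asking about is the `response_format` field from the OpenAI Chat Completions API linked above. The sketch below shows the request-body shapes for JSON mode and for schema-constrained structured outputs; these names follow the public OpenAI API only, and I am not assuming any equivalent exists in the Palantir libraries (that is my question):

```python
# JSON mode: constrains the model to emit syntactically valid JSON.
json_mode_request = {
    "model": "gpt-4o",
    "messages": [{"role": "user", "content": "List three colors as JSON."}],
    "response_format": {"type": "json_object"},
}

# Structured outputs: additionally pins the reply to a JSON Schema.
structured_request = {
    "model": "gpt-4o",
    "messages": [{"role": "user", "content": "List three colors."}],
    "response_format": {
        "type": "json_schema",
        "json_schema": {
            "name": "colors",
            "schema": {
                "type": "object",
                "properties": {
                    "colors": {"type": "array", "items": {"type": "string"}},
                },
                "required": ["colors"],
                "additionalProperties": False,
            },
        },
    },
}
```

I am essentially asking whether GptChatCompletionRequest (or a related request type) accepts anything comparable to these `response_format` fields.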
Thank you for your assistance.
Best regards,
Shyam