[k-LLM] How to enable LLMs not provided?

Foundry / AIP makes it quick to enable models from a multitude of sources. However, some models are regionally restricted, which may require you to use alternative solutions or your own custom model(s). How can users use models beyond those made available via Control Panel: AIP Settings?

Hi @MSL0727 - welcome to the Developer Community!

Assuming you want to use an LLM that is hosted outside the Palantir Foundry platform, you could integrate it as an External Model. You can then call that model from either a Python transform or a Function, depending on your use case.
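To make that a bit more concrete, here is a minimal sketch of what calling a configured external model from a Python transform could look like. The dataset paths, the model asset path (MichaelsLLM), and the df_out output name are placeholders; the exact call depends on the API the model adapter defines for your external model.

```python
from transforms.api import transform, Input, Output
from palantir_models.transforms import ModelInput


@transform(
    inference_output=Output("/MyProject/datasets/llm_responses"),  # placeholder output dataset
    prompts_input=Input("/MyProject/datasets/prompts"),            # placeholder input dataset
    model=ModelInput("/MyProject/models/MichaelsLLM"),             # your configured external model asset
)
def compute(inference_output, prompts_input, model):
    # Run inference through the external model's adapter over the input dataset.
    results = model.transform(prompts_input)
    # The available outputs (here assumed to be `df_out`) depend on how the
    # model adapter's API was defined when the external model was set up.
    inference_output.write_pandas(results.df_out)
```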


Thanks!

Will this also enable the use of the external model in point-and-click tools (e.g. Pipeline Builder)?

Edie: Hey @MSL0727, yes, this should be the case for models like Llama3 and Claude. (Some other models may not automatically show up if they aren’t supported yet or are in the process of being deprecated.)

This was in reference to an external model not hosted and made available by the Foundry platform. Once I manually configure an external model (e.g. MichaelsLLM), where exactly will it be available for use? Will I have access to it in Ontology-based apps as well? Foundry typically manages which LLMs are accessible in which internal apps, so I’m trying to understand how that all works and how it applies to configured external models.

This is not currently supported in Pipeline Builder, but we’re working on bringing this functionality to the tool.