Local inference connector for AIP Logic — DDIL + classified data use cases

Howdy -

I built an open-source connector that routes AIP Logic webhook calls to a local LLM via TARX (an OpenAI-compatible API served at localhost:11435). It uses @palantir/lohi-ts for Ontology sync on the OSDK side. Zero cloud exposure: it's designed for DDIL environments and data that can't leave the device.

Repo: https://github.com/tarx-ai/tarx-palantir-connector
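For anyone curious before clicking through, here's a minimal sketch of the routing idea. The names, payload shape, and default model here are illustrative assumptions for the post, not the connector's actual API:

```typescript
// Hypothetical sketch: forward an AIP Logic webhook payload to a
// local OpenAI-compatible endpoint. Payload shape and model name
// are assumptions for illustration, not the connector's real types.

const TARX_URL = "http://localhost:11435/v1/chat/completions";

interface LogicWebhookPayload {
  prompt: string;
  model?: string; // optional override; falls back to a local default
}

// Translate a webhook payload into an OpenAI-style chat completion body.
function toChatRequest(payload: LogicWebhookPayload) {
  return {
    model: payload.model ?? "tarx-default",
    messages: [{ role: "user", content: payload.prompt }],
    stream: false,
  };
}

// Forward to the local endpoint; nothing leaves the device.
async function routeToLocal(payload: LogicWebhookPayload): Promise<string> {
  const res = await fetch(TARX_URL, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(toChatRequest(payload)),
  });
  if (!res.ok) throw new Error(`local LLM error: HTTP ${res.status}`);
  const data = await res.json();
  return data.choices[0].message.content;
}
```

Because the endpoint speaks the OpenAI wire format, any client that can point at a custom base URL should work against it unchanged.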

Two questions for the community:

  1. Is there any prior art or example config for network egress policies on localhost Data Connection sources?

  2. What's the right path for a Community Registry submission?

JOHN | TARX