Can I stream outputs from AIP LLMs without using Agent Studio?

I am well aware of the sessions API for streaming responses from AIP Agent Studio agents, and that is not what I want. Agents built in Agent Studio proxy the model responses, almost always summarize or alter them, and are designed around tool calling, which I do not need. Please do not tell me to use AIP Agent Studio; I do not want it. What I want is the ability to stream outputs from models in the model catalog, via a TypeScript function, since that is still the only way to use them (why, just why?). Is this possible? And when will we be able to use open-source SDKs against the inference endpoints in AIP, the way we can when working directly with the model labs?
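For context, here is roughly what I am doing today in a TypeScript function. This is a non-streaming sketch written from memory, so treat the import paths (`@foundry/functions-api`, `@foundry/models-api/language-models`), the `GPT_4o` export, and the exact `createChatCompletion` shape as assumptions rather than gospel:

```typescript
import { Function } from "@foundry/functions-api";
// Assumed import path for model catalog models in TypeScript functions
import { GPT_4o } from "@foundry/models-api/language-models";

export class LlmFunctions {
    @Function()
    public async completePrompt(prompt: string): Promise<string> {
        // Non-streaming call: the entire completion arrives in one response,
        // so the caller sees nothing until the model has finished generating.
        const response = await GPT_4o.createChatCompletion({
            messages: [{ role: "USER", contents: [{ text: prompt }] }],
            params: { temperature: 0 },
        });
        return response.choices[0]?.message?.content ?? "";
    }
}
```

What I am after is the equivalent of a `stream: true` option or an async iterator of token chunks (like the streaming mode in the open-source OpenAI/Anthropic SDKs), so the frontend can render partial output as it is generated instead of waiting on the full completion.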