Are static class properties shared across concurrent function invocations in Foundry Functions?

Hello,

I’m working on optimizing performance in my Foundry Functions TypeScript V1 code by implementing caching to avoid redundant database calls within a single request. I have a question about the execution model and memory isolation between function invocations.

Context:

I have a service class that frequently calls Objects.search().MyOntologyObject().allAsync() and always looks at the first row to get some config data. Within a single function execution, this same data might be retrieved multiple times by different services, so I want to cache it.
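For concreteness, this is roughly the pattern I have in mind (the service name is made up and the import path mirrors my repo's generated ontology bindings, so treat it as illustrative):

```typescript
import { Objects, MyOntologyObject } from "@foundry/ontology-api";

export class ConfigService {
    // Proposed cache: a static property, so every service that needs the
    // config within one execution reuses the first lookup instead of
    // issuing another search.
    private static cachedConfig: MyOntologyObject | undefined;

    public async getConfig(): Promise<MyOntologyObject | undefined> {
        if (ConfigService.cachedConfig === undefined) {
            const rows = await Objects.search().MyOntologyObject().allAsync();
            ConfigService.cachedConfig = rows[0];
        }
        return ConfigService.cachedConfig;
    }
}
```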

My Concern:

I’m worried that static properties might be shared across different function invocations/requests, which could cause:

  1. Request A caches a value

  2. Request B gets the stale cached value from Request A instead of fresh data

  3. Race conditions if Request A modifies the cache while Request B is reading it

Questions:

  1. Are static class properties shared across concurrent function invocations of the same version in Foundry Functions?

  2. What is the isolation model between different function executions of the same version? Does it make a difference whether Request A and Request B hit the same published function or different ones?

  3. Does the behavior differ between regular Functions and OntologyEditFunctions?

  4. Is each function invocation guaranteed to run in its own isolated context, or might they share memory/static variables?

  5. What’s the recommended pattern for request-scoped caching in Foundry Functions?

To answer your questions in order:

  1. Shared state is only a concern for deployed Python functions. TypeScript functions (which are always serverless) and serverless Python functions share no state between invocations.

  2. It doesn't make a difference; for serverless functions, executions are fully isolated whether the two requests hit the same published function or different ones.

  3. No. With respect to cross-invocation shared state, there is no difference between regular Functions and OntologyEditFunctions.

  4. Yes, serverless function invocations are guaranteed to run in their own isolated context.

  5. Since each invocation runs in its own isolated context, you can use whatever you like here: the Lodash memoize function, lru-cache, etc.
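For example, something along these lines gives you request-scoped caching. This is just a sketch, assuming lodash is available as a dependency in your repo and reusing the object type from your question:

```typescript
import { Objects } from "@foundry/ontology-api";
import { memoize } from "lodash";

// The first call runs the search; subsequent calls within the same invocation
// reuse the cached promise. Because each invocation runs in its own isolated
// context, this module-level cache can never leak data across requests.
export const getConfig = memoize(async () => {
    const rows = await Objects.search().MyOntologyObject().allAsync();
    return rows[0];
});
```

lru-cache or a hand-rolled cached promise behaves the same way, since nothing outlives the invocation.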

Thanks for the quick and extensive answer!