Best Practices for Configuration Management for Applications

I’m interested in best practices for handling configuration management in applications that use and combine many ontology-based Foundry tools.

For example, in a large data-transformation-focused pipeline we store important pipeline/application parameters in a central dataset that is then loaded by each transform that can be parameterized. Within the transform, the necessary parameters are extracted from the config dataset and used. This keeps all configuration parameters in one place, allows versioning of configs, and makes it easy to provision configs for different environments (DEV, TEST, PROD). Editing configs is also easy, because all parameters are in one place.
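The pattern described above can be sketched in a few lines. This is a minimal illustration, not Foundry API code: the config store (which in Foundry would be the central dataset), the environment names, and all field names are invented for the example.

```typescript
// Sketch of the central-config pattern: one config store keyed by
// environment, from which each transform extracts only the parameters
// it is parameterized by. All names here are illustrative.

type Environment = "DEV" | "TEST" | "PROD";

interface PipelineConfig {
  inputPath: string;
  batchSize: number;
  retryLimit: number;
}

// In Foundry this would be loaded from the central config dataset;
// here it is inlined to keep the sketch self-contained.
const CONFIGS: Record<Environment, PipelineConfig> = {
  DEV:  { inputPath: "/dev/raw",  batchSize: 100,  retryLimit: 1 },
  TEST: { inputPath: "/test/raw", batchSize: 500,  retryLimit: 3 },
  PROD: { inputPath: "/prod/raw", batchSize: 5000, retryLimit: 5 },
};

// Each transform pulls only the keys it needs from the shared config.
function getConfig<K extends keyof PipelineConfig>(
  env: Environment,
  keys: K[]
): Pick<PipelineConfig, K> {
  const full = CONFIGS[env];
  const out = {} as Pick<PipelineConfig, K>;
  for (const k of keys) out[k] = full[k];
  return out;
}
```

Keeping the extraction behind one function means each transform declares exactly which parameters it depends on, which is what makes per-environment provisioning cheap.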

What would be a good approach in an ontology-based application where I need to configure a variety of building blocks (functions, actions, monitors, UIs, etc.)?


Check out these resources on Solution Designer • AIP Architect • Palantir

Hi,

thanks for the response. Maybe I’m missing something, but I do not see how the documentation page you gave links to my question on configuration management. Can you please explain your thoughts?

Best

Hi @robroe-tsi

You don’t mention specifics about the building blocks or what exactly you want to configure, so please excuse the generalised approach:

I generally use Marketplace for application versioning. It lets you split application development without having to split into different namespaces, and it also provides a good way to deploy updates to production workflows.

You will likely need several config files, but it can be done this way:

  • All of your TypeScript functions will have one configuration file that loads the parameters they need (e.g. which variables, models, etc. to use).
  • If you are looking to deploy many similar applications with different input data, you can create a pipeline that creates objects from datasets, and then have your input config set which datasets should be used as inputs, alongside any other variables. The only manual configuration here would be creating a prefix for the Ontology objects per installation.
  • The above objects can be used in Workshop without any changes (as long as the schema stays the same).
  • This pipeline can also create a configuration Object type in the Ontology, which can be loaded into Workshop and used to provide default values for e.g. active tabs, visibility of widgets, etc.
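The configuration-object idea in the last bullet can be sketched as follows. This is only an illustration of the shape of the approach: the interface, field names, and fallback logic are all hypothetical, not an actual Ontology or Workshop API.

```typescript
// Hypothetical shape of a per-installation configuration object and a
// small helper that derives Workshop UI defaults from it, with
// fallbacks for fields the installation has not set.

interface AppConfigObject {
  installationId: string;
  objectPrefix: string;        // per-installation prefix mentioned above
  defaultTab?: string;         // e.g. which tab is active on load
  showAdvancedWidgets?: boolean;
}

const FALLBACKS = {
  defaultTab: "overview",
  showAdvancedWidgets: false,
};

// A function (or Workshop variable) reads the config object and fills
// in defaults, so widgets never see an unset value.
function uiDefaults(cfg: AppConfigObject) {
  return {
    defaultTab: cfg.defaultTab ?? FALLBACKS.defaultTab,
    showAdvancedWidgets:
      cfg.showAdvancedWidgets ?? FALLBACKS.showAdvancedWidgets,
  };
}
```

The point of the helper is that Workshop only ever consumes the resolved defaults, so adding a new configurable field is a change in one place.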

While this is more complicated than the equivalent in your data transforms, remember that it is a more complicated problem to solve. And I promise that if you need to do this many times over, it can save you a lot of time.

Foundry branching might change this for some of my use cases, but not all, and perhaps not for yours either, so now you have a couple of different options.


Sounds like some combination of “Release Management” and “Workflow Builder” may help?

https://www.palantir.com/docs/foundry/devops-release-management/overview/

https://www.palantir.com/docs/foundry/app-building/overview/


Hey Robroe,
Best practices start with a good plan. The Solutions Designer app can help get your (connectable, linkable, live) pieces on the table. If you are working with a team, this can also give you a single pane of glass to collaborate.

As I cannot see the data, these are generalized solutions & ideas to try out.

Workshop • Publishing and versioning • Palantir

https://www.palantir.com/docs/foundry/administration

Hi @jakehop

many thanks for the explanations.

I like the idea of a configuration object type that stores either one or multiple configurations and can be loaded by functions or UI elements.

Would it be possible to explain the first bullet point, the “one configuration file” for TypeScript, in a bit more detail? That might be new to me.

I also have some trouble understanding your bullet point 2. My case meets your requirements, as we want to implement multiple similar (but not identical) workflows. Do you mean the usage of DevOps/Marketplace?

Best

Hi,

thanks for the two links.

The description of Workflow Builder (and the name) sounds promising, but for now the functionality seems to be restricted to visualizing and describing existing applications.

DevOps/Marketplace: Yes, we use that partially for configuring resources (like schedules, inputs, etc.), which often cannot be done from within a pipeline. But in my experience it is difficult to configure the behavior/logic within building blocks (within a transform or within a function).

Thanks for the context; here are my responses on how I’d proceed:

  1. Similarly to how you use config files in Python transform code, you can do the same in TypeScript. There are multiple ways of doing this, but a pretty simple and fool-proof implementation is this: have a config.ts file where you keep your configuration setup and export one function that returns the configuration variables. Use this function to import the configuration in other files in your repository.
  2. Use Marketplace (if the deployment model fits your use case) and either use it to select which datasets function as inputs for your Object types, or build a pipeline around this that requires manual configuration of input datasets. Pipeline Builder can deploy objects. The inputs to the pipelines deploying these objects can be configured on a per-use-case basis, giving you similar workflows with different data inputs.
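The config.ts pattern in point 1 might look something like this. It is a minimal sketch; the interface and the values in it are made up for the example, not taken from a real repository.

```typescript
// config.ts — keep all configuration in one file and export a single
// accessor function; other files import only that function.

export interface FunctionConfig {
  modelName: string;
  maxResults: number;
  featureFlags: { useCache: boolean };
}

const CONFIG: FunctionConfig = {
  modelName: "example-model-v2",
  maxResults: 50,
  featureFlags: { useCache: true },
};

export function getConfig(): FunctionConfig {
  // Return a copy so callers cannot mutate the shared configuration.
  return { ...CONFIG, featureFlags: { ...CONFIG.featureFlags } };
}
```

A consumer then does `import { getConfig } from "./config";` and reads only the fields it needs, so the whole configuration surface of the repository stays in one file.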

Hope this explains my thinking better.


Here are some additional notes regarding option 2 on jakehop’s list.

To promote resources between environments using the Marketplace in Palantir Foundry, follow these steps:

1. Understand Environment Separation

  • Foundry uses spaces to represent different environments, such as development, testing, and production. This separation ensures that changes can be tested and validated before being promoted to production.
  • Each environment can have specific configurations, such as environment-specific data, behavior, and integrations.

2. Export and Import Products

  • Marketplace products can be exported from one environment and imported into another. This allows you to share resources between environments or enrollments.
  • Ensure you have the required permissions for exporting and importing products. Permissions are managed in the Control Panel under Marketplace store settings.

3. Steps to Promote Products

  • Export a Product:
    • Navigate to the Marketplace store in the source environment.
    • Select the product you want to promote and export it.
  • Import the Product:
    • Go to the Marketplace store in the target environment.
    • Import the product into the new environment.
  • During the import process, ensure that dependencies (inputs and linked products) are correctly mapped to the target environment.

4. Automatic Upgrades and Versioning

  • Enable automatic upgrades for installations to accept new product versions without manual intervention. This ensures that the latest updates are applied across environments.

5. Linked Products

  • Modularize workflows into linked products to simplify promotion. For example, separate data-cleaning pipelines and Ontology objects into distinct products, and link them for automatic mapping during installation.

6. API Name Consistency

  • When promoting Ontology-based products, ensure API names are consistent across environments to avoid deployment issues.

7. Release Management

  • Use Foundry DevOps for release management to view and manage environments. This includes configuring maintenance windows and release channels for controlled promotion.

By following these steps, you can effectively promote resources between environments using the Marketplace. Let me know if you need further clarification or assistance with specific steps!
