Palantir_models inference issue with null values

With palantir_models, I added a validation step to the model adapter so that it returns null whenever an input parameter falls outside the validity domain of the training data (to avoid model extrapolation). Unfortunately, this does not work with the DistributedInferenceWrapper: Spark refuses to write the resulting dataframe because it contains null values.

Any idea how to solve this problem?

Problem solved.

To accept null values, set the `required` field to `False` in the `ModelApiColumn` definition for the affected column.
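For anyone hitting the same issue, here is a rough sketch of what the adapter's API definition could look like. The exact class and parameter names (`ModelApi`, `ModelApiColumn`, the column names, and the `predict` logic) follow the palantir_models adapter pattern as I understand it, so treat this as an illustration rather than copy-paste code:

```python
import palantir_models as pm

class MyModelAdapter(pm.ModelAdapter):

    @classmethod
    def api(cls):
        # Sketch only: column names and types are placeholders for your model.
        inputs = {
            "df_in": pm.Pandas(columns=[
                pm.ModelApiColumn(name="feature", type=float, required=True),
            ])
        }
        outputs = {
            "df_out": pm.Pandas(columns=[
                pm.ModelApiColumn(name="feature", type=float, required=True),
                # required=False is the key change: it lets the output column
                # contain nulls, so Spark can write rows where the input was
                # outside the training validity domain.
                pm.ModelApiColumn(name="prediction", type=float, required=False),
            ])
        }
        return inputs, outputs
```

With `required=False` on the prediction column, the DistributedInferenceWrapper accepts rows where the validation step returned null instead of failing on write.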
