Schedule not working after migrating project through marketplace

Hi, my pipeline has a schedule that builds a few target datasets, but the schedule stopped working after migrating the project through marketplace.

For whatever reason, after updating the marketplace product, the target datasets are considered up to date and the scheduler won’t run. Every triggered run is ignored because the target dataset is considered up to date. Occasionally the scheduler does run, but only every once in a while, and it seems random. Some upstream datasets of the targets are considered out of date, yet the scheduler still doesn’t run. I tried manually including the upstream datasets, but that didn’t work either. I also have to check the schedule manually, since marketplace doesn’t seem to natively/easily support schedule health checks.

The next thing I’m trying is not building the datasets on marketplace product updates, but I’m not sure what other options I have after that. I could force the build, but I’d rather save the compute where possible, so that’s only a temporary fix. Do y’all have any suggestions on what I could try or troubleshoot to resolve the issue? Maybe I just need to delete and reinstall the entire product or something? (hopefully not)

The main reason a scheduled build is skipped is that the target is considered “up to date” (i.e., there is no new data in its immediate upstream datasets since the last build, and there is no new build logic). I’m assuming this is not the case here. [1]
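As a conceptual sketch of that rule (hypothetical names and logic for illustration only, not Foundry's actual implementation), the skip decision boils down to a simple check on the target's immediate upstreams:

```python
# Conceptual sketch (not Foundry's actual API): a scheduled target is
# skipped when it is "up to date" -- no new upstream data since its last
# successful build and no change to its build logic.
from dataclasses import dataclass

@dataclass
class Dataset:
    name: str
    last_built: int        # timestamp of the target's last successful build
    upstream_updated: int  # latest data update among immediate upstreams
    logic_changed: bool    # has the transform logic changed since last build?

def is_up_to_date(ds: Dataset) -> bool:
    # Up to date: no upstream data newer than the last build, no new logic.
    return ds.upstream_updated <= ds.last_built and not ds.logic_changed

def should_build(ds: Dataset) -> bool:
    return not is_up_to_date(ds)

stale = Dataset("backing", last_built=100, upstream_updated=150, logic_changed=False)
fresh = Dataset("backing", last_built=200, upstream_updated=150, logic_changed=False)
print(should_build(stale), should_build(fresh))  # True False
```

If a target passes this check and still isn't built, the cause is usually elsewhere, such as the build scope below.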

The other reason could be related to the “build scope” of the schedule. Schedules in project-scoped mode will only build the datasets that are inside their projects; any other dataset (even if it is explicitly set as a target of the schedule) will be skipped in the builds. [2]

To check this, navigate to your schedule and check the projects listed under the “Build scope” section. See if all the targets that you’re trying to build are in those projects. If a target is in another project that is not listed under “Build scope” projects, that would explain the situation.
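The filtering described above can be sketched as follows (a minimal illustration with hypothetical names, not Foundry's real behavior or API):

```python
# Conceptual sketch: a project-scoped schedule only builds targets whose
# project is in the schedule's build scope; other targets are silently skipped.
def resolve_buildable_targets(targets, target_projects, build_scope):
    """targets: dataset names; target_projects: dataset -> owning project;
    build_scope: set of projects the schedule is allowed to build in."""
    return [t for t in targets if target_projects[t] in build_scope]

target_projects = {"clean": "Project A", "backing": "Project B"}
build_scope = {"Project A"}  # "Project B" missing, e.g. due to the scope bug below

built = resolve_buildable_targets(["clean", "backing"], target_projects, build_scope)
print(built)  # ['clean'] -- "backing" is skipped despite being an explicit target
```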

As for why a project wouldn’t be listed in the build scope: this could be due to a recent bug where the build scope of Marketplace schedules was calculated incorrectly. This bug was fixed roughly a month ago, so if your Marketplace product was created before then, it might be affected. If you think this is the case for you, I’d recommend simply re-creating the Marketplace product as the fastest way to unblock here (we have plans to auto-fix the affected products, but that could take some time).

If none of these apply to your case, helping you further would require debugging with information specific to your enrollment and setup, so I’d recommend filing a Foundry Issue with our support team. [3]

[1] https://www.palantir.com/docs/foundry/optimizing-pipelines/troubleshoot-schedules/#scheduled-builds-were-ignored

[2] https://www.palantir.com/docs/foundry/data-integration/schedules#project-scope

[3] https://www.palantir.com/docs/foundry/getting-help/file-support-ticket/#3-report-the-issue

Hi, wanted to share an update on our troubleshooting and fixes.

After double-checking, the datasets and their upstream resources were all in the build scope, but we did find some curious things. For context, one of our pipelines (I’ll use it as a model for every pipeline that didn’t run its schedule) is structured as: data sync → raw → clean → transform → backing, with the input/trigger being the data sync that creates the raw dataset and backing being the target. We found that because marketplace directly updated the transform and backing datasets, they were considered up to date, even though other resources such as clean were not. We weren’t exactly sure why the schedule wouldn’t run in this case, since there were upstream resources that were out of date, but we figured it was some weird interaction between the scheduler and marketplace.
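The situation above can be sketched like this (hypothetical logic for illustration only): a check that only looks at the target's immediate upstream can report "up to date" even while a dataset further upstream is stale.

```python
# Conceptual sketch of what we observed: marketplace updated `transform`
# and `backing` directly, so a check on the target's *immediate* upstream
# sees fresh data, while `clean` further upstream is actually stale.
pipeline = {  # dataset -> its immediate upstream (None = pipeline input)
    "raw": None, "clean": "raw", "transform": "clean", "backing": "transform",
}
last_updated = {"raw": 300, "clean": 100, "transform": 400, "backing": 400}

def immediate_upstream_fresh(target):
    up = pipeline[target]
    return up is None or last_updated[up] <= last_updated[target]

def fully_fresh(target):
    # Recursively require every ancestor to be fresh, not just the parent.
    up = pipeline[target]
    return up is None or (immediate_upstream_fresh(target) and fully_fresh(up))

print(immediate_upstream_fresh("backing"))  # True  -> looks "up to date"
print(fully_fresh("backing"))               # False -> clean is stale vs raw
```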

What ended up fixing most of the datasets was swapping from a full-build schedule to a connecting build. As long as the inputs were set correctly, the scheduler took over as normal and ran everything as expected. Additionally, we disabled the ‘build datasets on marketplace upgrade’ setting in Dev Ops to prevent any potential marketplace interference/interaction (hopefully).

I’m not sure if this is the most ideal solution, but this managed to get things working with relative ease (even if troubleshooting was a pain!).