How to bulk-configure Spark profiles

I want to add Spark profile settings to a particular project (specifically, to increase the number of executors and the memory per executor).

I know that I can add settings for each job using @configure in the source code repository, but it's tedious to configure each job one by one in a repository with a large number of jobs (datasets), so I'm looking for a way to set this for the whole project. Simply importing the Spark profile into the project did not make it take effect, so does anyone know of a way to apply it all at once, e.g. from a settings screen?


Indeed, importing the profile into the project only makes it available for transforms to use; it does not apply the profile everywhere on its own.

To do what you want, you could write a new decorator, something like @configure_default. You'd have to replace all usages of @configure with @configure_default, but you could then manage the default profiles in a single location (see the sketch below).
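Roughly, it could look like this. This is a minimal sketch assuming the Python transforms API (`transforms.api`); the profile names and dataset paths are placeholders, and whichever profiles you use still have to be imported into the project:

```python
from transforms.api import configure, transform_df, Input, Output

# Single place to manage the project-wide default Spark profiles.
# These names are placeholders; use profiles imported into your project.
DEFAULT_PROFILES = [
    "NUM_EXECUTORS_8",
    "EXECUTOR_MEMORY_MEDIUM",
]

def configure_default(transform):
    """Apply the project-wide default Spark profiles to a transform."""
    return configure(profile=DEFAULT_PROFILES)(transform)

# Usage: replaces a per-job @configure(profile=[...]) annotation.
@configure_default
@transform_df(
    Output("/Project/datasets/cleaned"),    # placeholder path
    source=Input("/Project/datasets/raw"),  # placeholder path
)
def clean(source):
    return source
```

Changing DEFAULT_PROFILES then updates every transform that uses the wrapper in a single commit, rather than editing each @configure call individually.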

Do you know about dynamic resource allocation? It should help larger jobs get access to more resources without you having to maintain per-job profiles. It does not affect the memory per executor, however.
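For context, dynamic allocation maps to standard Spark properties like the ones below (values are illustrative; in Foundry you would normally enable it via the corresponding imported Spark profiles rather than setting the properties directly):

```python
# Standard Spark properties behind dynamic resource allocation.
# Illustrative values only.
spark_conf = {
    "spark.dynamicAllocation.enabled": "true",
    "spark.dynamicAllocation.minExecutors": "2",
    "spark.dynamicAllocation.maxExecutors": "16",
}
# Note: memory per executor is a separate knob (spark.executor.memory)
# and is not changed by dynamic allocation.
```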


I was looking for a way to set or change the profile without modifying the code (there are many programs, so the cost of modifying them is high), but it seems there is no such method, so I will go ahead and modify the programs. Thank you for checking.