Hi,
I tried running the following Python transformation in a Code Workbook. The only error I get is “Failed to execute transformation”, with no further details.
If anyone can review the sample code and tell me why it fails, I would really appreciate it.
I also see that Code Workbooks are flagged as legacy. Is this application going away in the future?
from pyspark.sql import functions as F

def LETS_GO_SPARKY(CODE_VALUE, VISIT):
    # Filter visit as needed (note: the parameter is VISIT, so the body
    # must use VISIT, not lowercase visit, or it raises a NameError)
    visit_df = VISIT.filter(F.col("VISIT_ID") == 12345678)

    # Get all columns ending with _CD
    cd_columns = [col for col in visit_df.columns if col.endswith('_CD')]

    # Start with the filtered DataFrame
    result_df = visit_df

    # For each _CD column, join CODE_VALUE to get the display value
    for c in cd_columns:
        display_col = f"{c}_DISPLAY"
        code_value_display = CODE_VALUE.select(
            F.col("CODE_VALUE").alias(c),
            F.col("DISPLAY").alias(display_col)
        ).dropDuplicates([c])
        result_df = result_df.join(code_value_display, on=c, how="left")

    return result_df
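For anyone reviewing the logic rather than the Spark plumbing: the loop above is a repeated left join of a deduplicated code-to-display lookup, once per *_CD column. Here is a minimal pure-Python sketch of that same logic (no Spark needed), with made-up column names and sample rows purely for illustration:

```python
# Pure-Python analogue of the per-column lookup-join loop:
# for each *_CD column, left-join a code -> display lookup
# and add a <col>_DISPLAY column. Sample data is invented.

def add_display_columns(rows, code_value):
    """rows: list of dicts; code_value: list of {"CODE_VALUE", "DISPLAY"} dicts."""
    # Deduplicate the lookup, keeping the first DISPLAY per code
    # (mirrors dropDuplicates on the aliased code column).
    lookup = {}
    for cv in code_value:
        lookup.setdefault(cv["CODE_VALUE"], cv["DISPLAY"])

    if not rows:
        return rows
    cd_columns = [c for c in rows[0] if c.endswith("_CD")]

    out = []
    for row in rows:
        new_row = dict(row)
        for c in cd_columns:
            # Left join semantics: unmatched codes yield None,
            # like Spark's how="left".
            new_row[f"{c}_DISPLAY"] = lookup.get(row[c])
        out.append(new_row)
    return out

visit_rows = [{"VISIT_ID": 12345678, "TYPE_CD": 10, "STATUS_CD": 99}]
codes = [{"CODE_VALUE": 10, "DISPLAY": "Inpatient"}]
result = add_display_columns(visit_rows, codes)
# result[0]["TYPE_CD_DISPLAY"] == "Inpatient"; STATUS_CD_DISPLAY is None
```

This is only a sketch to sanity-check the join logic; in the Spark version, each iteration of the loop adds another join stage to the plan, which is worth keeping in mind if the table has many *_CD columns.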
Error screenshot:
