Hi all, small question.
How do I access the SparkContext within the nodes during execution? I want to set names for the Spark jobs from within the node code.
I'm doing the following, but I'm wondering if there is another way to access it directly:
• I'm using one of the input DataFrames in my node function to retrieve the SparkSession and then access the context through it
(pays_df is just an example DataFrame passed into the function)
spark = pays_df.sparkSession
spark.sparkContext.setJobDescription("test")
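For context, here's a minimal sketch of what that looks like as a full node function. The function name, the "test" description, and treating pays_df as the first input are just placeholders for illustration:

```python
from pyspark.sql import DataFrame

def my_node(pays_df: DataFrame) -> DataFrame:
    # Grab the SparkSession off one of the input DataFrames,
    # then reach the underlying SparkContext through it.
    spark = pays_df.sparkSession
    spark.sparkContext.setJobDescription("test")  # shows up in the Spark UI

    # ... actual node logic here ...
    return pays_df
```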