# questions
f
Hi all, small question: how do I access the Spark context within the nodes during execution? I want to add names to the Spark jobs from within the node code. I'm doing the following, but I'm wondering if there is another way to access it directly. • I'm using one of the input DataFrames in my node function to retrieve the SparkSession and then access the context
```python
# pays_df is just an example DataFrame passed into the node function
spark = pays_df.sparkSession
spark.sparkContext.setJobDescription("test")
```
d
So I think lifecycle hooks are your friend here: https://docs.kedro.org/en/stable/hooks/introduction.html
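For example, here is a minimal sketch of a hook that names Spark jobs after the node being run, so you don't have to fish the session out of an input DataFrame in every node. It assumes a standard Kedro + PySpark project; the class name is illustrative, and `SparkSession.getActiveSession()` is used on the assumption that a session has already been created by the time nodes run.
```python
from kedro.framework.hooks import hook_impl
from pyspark.sql import SparkSession


class SparkJobDescriptionHooks:
    @hook_impl
    def before_node_run(self, node):
        # Look up the already-created SparkSession instead of threading it
        # through the node's inputs.
        spark = SparkSession.getActiveSession()
        if spark is not None:
            # Jobs triggered by this node now appear in the Spark UI
            # under the node's name.
            spark.sparkContext.setJobDescription(node.name)
```
You would then register it in your project's `settings.py`, e.g. `HOOKS = (SparkJobDescriptionHooks(),)`.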
f
thanks