# questions
f
Hi fellows, I am running a `kedro` project on Databricks (and have good hope of convincing my team to go for `kedro`). The documentation is very well written, thanks for that. Scrolling through the messages in Slack, I did not find a way to directly use the `spark` object, the `SparkSession` provided directly in the Databricks notebooks. Is there any way to do so?
ā¤ļø 5
j
hi @Flavien, thanks for the shoutout! props to @Jannic Holzer and @Jo Stichbury for those pages, they took a lot of effort to write 🙌🏼
about using the `SparkSession` directly, let me check
were you thinking of accessing it from within a node?
f
No, I had in mind its "implicit" usage in the data sets. I am using a `SparkHiveDataSet`, which gets the session through `SparkSession.builder.getOrCreate()`. I am not familiar at all with the inner workings of Spark and Databricks. Since Databricks provides a dedicated session that is (properly) set up, I would have liked it to be used when running:
```python
with KedroSession.create() as session:
    session.run(pipeline_name="mon_petit_pipeline")
```
as per the documentation.
👀 1
I tried to "trick" the data set by loading the object with `locals().get("spark")`, without success.
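For reference, the invocation from the Databricks notebook looks roughly like this (a minimal sketch, assuming a Kedro 0.18-style project; the project path below is just a placeholder):

```python
from pathlib import Path

from kedro.framework.session import KedroSession
from kedro.framework.startup import bootstrap_project

# placeholder path to where the Kedro project lives in the Databricks workspace
project_path = Path("/dbfs/path/to/my-kedro-project")

# make the project's settings and pipelines known to the framework
bootstrap_project(project_path)

with KedroSession.create(project_path=project_path) as session:
    session.run(pipeline_name="mon_petit_pipeline")
```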
j
@Flavien I see two ways to access the `SparkSession`:
• the `SparkHiveDataSet` has a static `_get_spark` method that, as you spotted, gets an existing session or creates one. if I understand correctly how Databricks works, this should return the appropriate `SparkSession`, even if it was created by Databricks.
• when passing a `DataFrame` around, you can access its `.sparkSession` property, which yields the session that created it.
I'm mindful these answers are somewhat generic (I promise I generated them with my own brain and not ChatGPT 😬) but, if none of them is quite what you were looking for, would you mind sharing a bit more detail?
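to make that concrete, here is a quick sketch one could run in a Databricks notebook to check both points (assuming `spark` is the session the notebook pre-creates; the DataFrame is just a throwaway example):

```python
from pyspark.sql import SparkSession

# 1. what the dataset's getOrCreate() call should do on Databricks: return the
#    session the notebook already started rather than building a fresh one
session = SparkSession.builder.getOrCreate()
print(session is spark)  # expected: True, i.e. the same session object

# 2. a DataFrame keeps a reference to the session that created it
df = spark.range(5)  # throwaway example DataFrame
print(df.sparkSession is spark)  # expected: True (recent PySpark exposes .sparkSession)
```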
f
Thanks @Juan Luis. Checking the sessions, they indeed seem to be identical. One step closer to using `kedro` in my team (I have been mentioning `kedro` every week for a year now 😅).
🎉 2
j
awesome! feel free to keep dropping #questions 💪🏼