Hello team, happy Friday!
Does anyone have experience running Spark + Kedro locally?
We're running into issues trying to run a spark-master, a spark-worker, and a kedro container in the same Docker Compose setup to test everything locally (if somebody has a better way to run it all locally, please let us know).
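For context, this is roughly the layout we mean, just a sketch with placeholder images, ports, and paths, not our exact files:

```yaml
# Rough sketch only; image tags, ports, and mount paths are illustrative.
services:
  spark-master:
    image: bitnami/spark:3.5            # assumed image, ours may differ
    environment:
      - SPARK_MODE=master
    ports:
      - "7077:7077"                     # Spark master RPC
      - "8080:8080"                     # Spark master web UI

  spark-worker:
    image: bitnami/spark:3.5
    environment:
      - SPARK_MODE=worker
      - SPARK_MASTER_URL=spark://spark-master:7077
    depends_on:
      - spark-master

  kedro:
    build: .                            # our Kedro project image
    depends_on:
      - spark-master
    volumes:
      - ./data:/home/kedro/data         # bind mount where the parquet output lands
```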
We are trying to write a PySpark dataframe to parquet, partitioned by year/month/day/hour.
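Minimal sketch of the write we mean (dataframe, column, and path names are just examples, not our actual ones):

```python
# Minimal PySpark sketch; input/output paths and column names are illustrative.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("partitioned-parquet-write")
    .getOrCreate()
)

# Assume the dataframe already has year/month/day/hour columns.
df = spark.read.parquet("/home/kedro/data/raw/events")  # example input

(
    df.write
    .mode("overwrite")
    .partitionBy("year", "month", "day", "hour")
    .parquet("/home/kedro/data/processed/events")        # example output path
)
```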
We are able to do this running just Spark in Docker, but when we add Kedro we hit path permission issues, even though the kedro user has the right access to the corresponding bind mounts.
Thank you in advance!