Juan David Patiño Guerra (12/13/2023, 9:48 AM):
[screenshot; not captured in export]

datajoely (12/13/2023, 10:06 AM):
"filepath does not exist or not accessible" at the very top of your screenshot

datajoely (12/13/2023, 10:06 AM):
[message not captured in export]

datajoely (12/13/2023, 10:07 AM):
[message not captured in export]

Juan David Patiño Guerra (12/13/2023, 10:47 AM):
[message not captured in export]

Michał Madej (12/13/2023, 11:20 AM):
[message not captured in export]

Michał Madej (12/13/2023, 11:23 AM):
[message not captured in export]

Juan David Patiño Guerra (12/13/2023, 12:53 PM):
spark_version: 13.3.x-cpu-ml-scala2.12
and I get the (new) error shown attached. I made screenshots of the full trace (even more cryptic). Any ideas? Thanks for your quick help so far, guys!
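For context, a Spark runtime version like the one above is normally pinned in the job-cluster section of a Databricks Asset Bundle definition. A minimal sketch, where the job name and node type are placeholders, not details from this thread:

```yaml
# databricks.yml (Databricks Asset Bundles) -- hypothetical job resource
resources:
  jobs:
    wine_model_kedro_job:          # placeholder job name
      name: wine-model-kedro
      tasks:
        - task_key: run_kedro
          job_cluster_key: main
      job_clusters:
        - job_cluster_key: main
          new_cluster:
            spark_version: 13.3.x-cpu-ml-scala2.12  # ML runtime from the thread
            node_type_id: Standard_DS3_v2           # placeholder node type
            num_workers: 1
```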
datajoely (12/13/2023, 12:59 PM):
spark.yaml?

Juan David Patiño Guerra (12/13/2023, 1:08 PM):
databricks.yaml from the DABs? If so, I think that one needs to stay in the root of the repo.

datajoely (12/13/2023, 1:09 PM):
"value cannot be null for spark.app.name"
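The spark.app.name error above usually means the Spark session is being configured without an application name. In a typical Kedro PySpark setup, Spark options live in conf/base/spark.yml and are applied by a project hook; a minimal sketch, with the file location and values assumed from the standard starter layout rather than confirmed in this thread:

```yaml
# conf/base/spark.yml -- assumed standard Kedro location for Spark options
spark.app.name: wine_model_kedro   # placeholder name; must not be null
spark.driver.memory: 4g            # illustrative extra option
```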
datajoely (12/13/2023, 1:10 PM):
[message not captured in export]

Michał Madej (12/13/2023, 2:40 PM):
/Workspace/Users/${workspace.current_user.userName}/.bundle/${bundle.target}/${bundle.name}
try using it in parameters: ["--conf-source", "here", ...]
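Written out as a bundle task, Michał's suggestion might look like the sketch below. The wheel name, entry point, and the trailing conf segment appended to the workspace path are assumptions for illustration:

```yaml
# databricks.yml task sketch -- hypothetical names
tasks:
  - task_key: run_kedro
    python_wheel_task:
      package_name: wine_model_kedro   # placeholder wheel name
      entry_point: wine-model-kedro    # placeholder entry point
      parameters:
        - "--conf-source"
        - /Workspace/Users/${workspace.current_user.userName}/.bundle/${bundle.target}/${bundle.name}/conf
```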
Juan David Patiño Guerra (12/18/2023, 10:46 AM):
hooks.py. Because in Databricks jobs the project path is the databricks/driver folder, it was failing there. The solution was to point the ConfigLoader at the conf folder on DBFS that I had copied it to when I had to run the pipeline.
After that it works! I hope that deploying Kedro as Databricks jobs gets better and smoother as the framework develops and grows!
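One way to express Juan's fix without hard-coding the environment check everywhere is a small helper that picks the conf source: the DBFS copy when running inside a Databricks job, the local conf/ folder otherwise. This is a sketch under assumptions: it uses the DATABRICKS_RUNTIME_VERSION environment variable (set by Databricks clusters) as the detection signal and the /dbfs/FileStore/... path mentioned later in the thread; the resolved string would then be handed to Kedro, e.g. as the conf_source when bootstrapping the session or via kedro run --conf-source.

```python
import os

# Assumed DBFS location where conf/ was copied (path from later in the thread).
DBFS_CONF = "/dbfs/FileStore/wine_model_kedro/conf"


def resolve_conf_source(env=None):
    """Pick the Kedro conf source for the current runtime.

    Databricks sets DATABRICKS_RUNTIME_VERSION inside jobs/clusters, so its
    presence is used here as a heuristic signal that the working directory is
    /databricks/driver and the project's local conf/ folder is not available.
    """
    env = os.environ if env is None else env
    if "DATABRICKS_RUNTIME_VERSION" in env:
        return DBFS_CONF
    return "conf"  # default relative conf folder in a Kedro project


# On Databricks the DBFS copy is chosen; locally the project folder is used.
print(resolve_conf_source({"DATABRICKS_RUNTIME_VERSION": "13.3"}))  # /dbfs/FileStore/wine_model_kedro/conf
print(resolve_conf_source({}))                                      # conf
```

Passing an explicit dict, as in the example calls, keeps the helper testable without touching the real environment.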
datajoely (12/18/2023, 10:52 AM):
[message not captured in export]

datajoely (12/18/2023, 10:52 AM):
[message not captured in export]
Juan David Patiño Guerra (12/18/2023, 11:14 AM):
ValueError: Given configuration path either does not exist or is not a valid directory: /databricks/driver/conf/base
It's just that seeing it fail in that directory "felt" complex, but in the trace you could see that it was going through the hooks. It's a bit of Databricks extra complexity that didn't help.
datajoely (12/18/2023, 12:12 PM):
[message not captured in export]

Juan David Patiño Guerra (12/18/2023, 12:23 PM):
/dbfs/FileStore/wine_model_kedro/conf/