# questions
Hi team, we are using Kedro and have two questions, and wonder if we can get any help here:

1) While Kedro is processing one year's worth of data (reading from SQL Server and writing back to Blob Storage), it fails with the following error:

   `DatasetError: Failed while saving data to data set CSVDataSet(filepath=<path>/history.csv, protocol={'sep': |}, save_args={'index': False, 'sep': |}). Failed to upload block` / `Timeout on reading data from socket!`

2) When we deployed Kedro to Databricks, Kedro by default looks for the `local` folder for configuration. Since the `local` folder holds sensitive information (our credentials), we are not shipping it to Databricks. We expected Kedro to fall back to the `base` folder when no `local` folder is found, but instead it errors out on Databricks with the following message:

   `Given configuration path either does not exist or is not a valid directory: /project/conf/local`

Any help would be greatly appreciated 🙂
For the config issue, point Kedro at your configuration explicitly: `kedro run --conf-source <path>`
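A rough sketch of how that can look on Databricks (the project name `my_project` and the DBFS path are assumptions; in recent Kedro versions `--conf-source` accepts either a directory or a packaged conf archive, so check your installed version):

```shell
# Package the project locally; recent Kedro versions also emit a conf
# archive (dist/conf-my_project.tar.gz) alongside the wheel.
kedro package

# On Databricks, point Kedro at a conf source containing only the
# non-sensitive folders (e.g. conf/base), so conf/local never ships.
kedro run --conf-source /dbfs/FileStore/my_project/conf

# Or run directly from the packaged conf archive:
kedro run --conf-source dist/conf-my_project.tar.gz
```

Credentials can then be injected on the Databricks side (e.g. via secret scopes) instead of shipping `conf/local`.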
For the first one, how big is the data? Is it Databricks raising the error? Can you include a more detailed stack trace?
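In the meantime, since the error looks like a socket read timeout during the block upload, one thing worth trying is raising the filesystem timeouts via `fs_args` in the catalog entry. This is only a sketch: `fs_args` is a standard Kedro dataset option passed to the fsspec filesystem, but whether `connection_timeout`/`read_timeout` are forwarded to the Azure SDK client depends on your adlfs and azure-storage-blob versions, so verify the exact keyword names against your installed versions:

```yaml
history:
  type: pandas.CSVDataSet
  filepath: "abfs://<container>/<path>/history.csv"  # placeholder path
  save_args:
    index: False
    sep: "|"
  fs_args:
    # Assumption: adlfs forwards these to the underlying Azure client,
    # raising the connect/read timeouts for large uploads.
    connection_timeout: 600
    read_timeout: 600
```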