Anton Nikishin
09/26/2025, 9:46 AM
conf/dev/databricks.yml with the following code:

    default:
      tasks:
        - existing_cluster_id: 0924-121047-3jcdtqh1

But running kedro databricks bundle --overwrite raises an error:

    AssertionError: lookup_key task_key not found in updates: [{'existing_cluster_id': '0924-121047-3jcdtqh1'}]

Nok Lam Chan
09/26/2025, 9:50 AM
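The assertion itself hints at the cause: the overrides under `tasks` appear to be matched against the generated job tasks by a `task_key` lookup key, so a list entry without one cannot be looked up. A minimal override carrying the lookup key might look like this (the task name `default` is an assumption; check the generated bundle resources for the actual task keys):

```yaml
default:
  tasks:
    # task_key is the lookup key the update is matched on; the name "default" is assumed
    - task_key: default
      existing_cluster_id: 0924-121047-3jcdtqh1
```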
Anton Nikishin
09/26/2025, 12:50 PM
databricks.yml looks like this:

    default:
      tasks:
        - task_key: default
          existing_cluster_id: 0924-121047-3jcdtqh1

Nok Lam Chan
09/26/2025, 12:53 PM

Anton Nikishin
09/26/2025, 12:53 PM

Anton Nikishin
09/26/2025, 12:54 PM

    default:
      job_clusters:
        - job_cluster_key: default
          new_cluster:
            node_type_id: Standard_DS3_v2
            num_workers: 1
            spark_env_vars:
              KEDRO_LOGGING_CONFIG: /\${workspace.file_path}/conf/logging.yml
            spark_version: 15.4.x-scala2.12
      tasks:
        - job_cluster_key: default
          task_key: default

And it tries to create a cluster like that.
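The `job_clusters` entry with `new_cluster` asks Databricks to provision a fresh job cluster when the job runs. If the goal is instead to reuse the already-running cluster, one option (a sketch, with the task name `default` assumed as above) is to drop `job_clusters` and point the task at the cluster directly:

```yaml
default:
  tasks:
    # assumed task name; existing_cluster_id reuses the running cluster
    # instead of provisioning a new job cluster
    - task_key: default
      existing_cluster_id: 0924-121047-3jcdtqh1
```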
Nok Lam Chan
09/26/2025, 12:55 PM

Nok Lam Chan
09/26/2025, 12:55 PM

Anton Nikishin
09/26/2025, 1:02 PM

Nok Lam Chan
09/26/2025, 1:04 PM

Anton Nikishin
09/26/2025, 1:10 PM

Anton Nikishin
09/26/2025, 1:10 PM

Jens Peder Meldgaard
09/29/2025, 6:16 AM