# questions
d
Hi team, how can we define the same dataset in a catalog file across multiple environments? I have two
catalog.yml
files saved like so:
conf/base/catalog.yml
conf/test/catalog.yml
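For illustration, the usual pattern looks something like this (the dataset name, types, and filepaths below are hypothetical, and the `---` separator just marks the two files):

```yaml
# conf/base/catalog.yml -- hypothetical base entry
my_dataset:
  type: pandas.CSVDataset
  filepath: data/01_raw/my_dataset.csv
---
# conf/test/catalog.yml -- same key, different settings; meant to
# override the base entry when running with --env=test
my_dataset:
  type: pandas.ParquetDataset
  filepath: data/01_raw/my_dataset.parquet
```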
In each file I have the same datasets but with different types, filepaths, etc., and when I try to run my pipelines with
kedro run --env=test
I get the following error:
ValueError: Duplicate keys found in [...]/conf/test/catalog.yml and [...]/conf/base/catalog.yml: input_files@{type}
The docs here say that duplicate keys should be resolved by OmegaConfigLoader, but am I missing something? Thanks 🙂 (BTW I am running Kedro 0.19.2)
d
within one env you have duplicate keys
I think
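That distinction matters here: in Kedro 0.19, OmegaConfigLoader soft-merges each config category across environments, so the same key in conf/base and conf/test is fine (the run environment wins), while the same key appearing twice within one environment raises. A rough stdlib sketch of that rule (an approximation with hypothetical entries, not Kedro's actual implementation):

```python
# Approximate model of OmegaConfigLoader's merge behaviour:
# duplicates ACROSS environments are overridden, duplicates
# WITHIN one environment are an error.

def merge_envs(base: dict, env: dict) -> dict:
    """Keys may repeat across environments; the run env wins."""
    merged = dict(base)
    merged.update(env)  # env entries override base entries
    return merged

def load_env(files: list[dict]) -> dict:
    """Within a single environment, duplicate keys raise."""
    result: dict = {}
    for catalog in files:
        dupes = result.keys() & catalog.keys()
        if dupes:
            raise ValueError(f"Duplicate keys found: {sorted(dupes)}")
        result.update(catalog)
    return result

# Hypothetical catalog entries
base = {"input_files": {"type": "pandas.CSVDataset", "filepath": "data/raw.csv"}}
test = {"input_files": {"type": "pandas.CSVDataset", "filepath": "data/test.csv"}}

merged = merge_envs(base, test)
print(merged["input_files"]["filepath"])  # the test env's filepath wins
```

So the error in the original message, which names files in two different environments, is surprising: that pair should merge cleanly.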
d
Even when I updated the two
catalog.yml
files to only have one dataset entry each, the error still occurs
d
hmm that doesn’t sound right 🤔
d
The error is even comparing the files across environments
d
can you share more of the stack trace?
d
ah looking at the stack trace helped me figure it out...the error was caused by a hook trying to load in the catalog incorrectly - thanks!
d
🙏