# questions
a
What can be the cause of parameters not being loaded in?
```
ValueError: Pipeline input(s) {'params:features', 'params:sources', 'params:general', 'params:feature_extraction'} not found in the DataCatalog
```
e
1. You're loading from a different env. 2. You namespace your pipeline but not the parameters (see the sketch below).
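For the second cause, a minimal sketch of how it bites (the pipeline and parameter names below are made up for illustration, not taken from this project): when you wrap a pipeline with a namespace, Kedro also prefixes its `params:` inputs, so a flat key in parameters.yml no longer resolves.
```python
from kedro.pipeline import node, pipeline


def extract(features):
    """Hypothetical node function used only for this sketch."""
    return features


base = pipeline(
    [node(extract, inputs="params:features", outputs="selected_features")]
)

# Namespacing rewrites "params:features" into "params:feature_extraction.features",
# so a top-level `features:` key in parameters.yml is no longer found.
namespaced = pipeline(base, namespace="feature_extraction")

# Fix: either nest the parameters under `feature_extraction:` in parameters.yml,
# or map the parameter back to its un-namespaced name:
namespaced_shared = pipeline(
    base,
    namespace="feature_extraction",
    parameters={"params:features": "params:features"},
)
```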
a
It's possible it's the first one, I suppose.
```
Parameters not found in your Kedro project config.
No files of YAML or JSON format found in
/dbfs/Shared/<REMOVED>/conf or
/dbfs/Shared/<REMOVED>/conf/databricks
matching the glob pattern(s): ['parameters*', 'parameters*/**']
```
Ah, it's not finding the params in conf.
The parameters.yml is inside conf/base, but for some reason it doesn't search there.
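One way to narrow it down is to point an OmegaConfigLoader at that conf directory by hand and see whether it finds anything; a quick sketch, assuming a recent Kedro version and reusing the path from the error above:
```python
from kedro.config import OmegaConfigLoader

# Path copied from the error message above; adjust to the real conf directory.
loader = OmegaConfigLoader(
    conf_source="/dbfs/Shared/<REMOVED>/conf",
    base_env="base",
    default_run_env="local",
)

# If this complains that no parameter files were found, the loader really is
# looking in the wrong place (or the files in conf/base don't match the globs).
print(loader["parameters"])
```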
```python
from kedro.config import OmegaConfigLoader  # noqa: E402

CONFIG_LOADER_CLASS = OmegaConfigLoader
CONFIG_LOADER_ARGS = {
    "base_env": "base",
    "default_run_env": "local",
    "config_patterns": {
        "spark": ["spark*/"],
        "parameters": ["parameters*", "parameters*/**", "**/parameters*"],
    },
}
```
Looks fine to me. I'll check if we're doing something weird in the hooks, maybe.
e
To quickly test you can use an interactive environment:
```
kedro ipython

parameters = context.config_loader.get("parameters")
parameters
```
Or print the whole catalog to see what’s inside
```
kedro ipython

catalog
```
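Assuming a Kedro version where the `kedro ipython` session exposes `context` and `catalog` (the default behaviour), two more quick checks:
```python
# Parameters as the context resolved them; your keys should show up here.
context.params

# Registered catalog entries; parameters appear as "params:..." datasets.
catalog.list()
```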
n
/dbfs/Shared/<REMOVED>/conf
Is this where your project lives? If you are on Databricks, you most likely need to use
CONF_SOURCE
to customise where you put your configuration.
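A rough sketch of the two usual ways to do that on Databricks (the paths below just reuse the redacted one from this thread, and the session-based variant assumes a recent Kedro where KedroSession.create accepts conf_source):
```python
# Option A: in settings.py, point CONF_SOURCE at the conf directory on DBFS, e.g.
#   CONF_SOURCE = "/dbfs/Shared/<REMOVED>/conf"

# Option B: when driving Kedro from a notebook or job, pass conf_source explicitly.
from kedro.framework.session import KedroSession
from kedro.framework.startup import bootstrap_project

project_root = "/dbfs/Shared/<REMOVED>"  # illustrative project root on DBFS
bootstrap_project(project_root)

with KedroSession.create(
    project_path=project_root,
    conf_source="/dbfs/Shared/<REMOVED>/conf",
) as session:
    session.run()
```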