# questions
ł
I have a dumb question - in the example here: https://docs.kedro.org/en/stable/configuration/parameters.html, how do I access the step_size parameter in the code?
```yaml
step_size: 1
model_params:
    learning_rate: 0.01
    test_data_ratio: 0.2
    number_of_train_iterations: 10000
```
The example shows only the nested ones - I have some problems getting the non-nested parameters:
```python
def train_model(data, model):
    lr = model["learning_rate"]
    test_data_ratio = model["test_data_ratio"]
    iterations = model["number_of_train_iterations"]

# in pipeline definition
node(
    func=train_model,
    inputs=["input_data", "params:model_params"],
    outputs="output_data",
)
```
Something like `parameters["step_size"]` gives me a KeyError. Is there an option to have non-nested global parameters? I would like something like this:
```python
def load_train_data(parameters: dict[str, str | int | bool], data_ingestion_parameters: dict[str, str | bool]):
    ...
    step_size = parameters["step_size"]
```
Currently I have to nest it and access it like `parameters["global_parameters"]["step_size"]`, but it is a bit ugly.
h
Someone will reply to you shortly. In the meantime, this might help:
m
Hi @Łukasz Janiec, not a dumb question at all 🙂 If you want to access top-level parameters, you need to make sure you pass the `parameters` argument in your pipeline, rather than specifying the lower-level key `params:model_params`, so in your case you'll have to change the code to:
```python
def train_model(data, parameters):
    model_params = parameters["model_params"]
    lr = model_params["learning_rate"]
    test_data_ratio = model_params["test_data_ratio"]
    iterations = model_params["number_of_train_iterations"]
    step_size = parameters["step_size"]  # <-- now this works

# in pipeline definition
node(
    func=train_model,
    inputs=["input_data", "parameters"],
    outputs="output_data",
)
```
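As an aside, Kedro also accepts `params:step_size` as a node input to pass just one top-level parameter, so you don't have to take the whole dictionary if you only need one value. Here is a minimal plain-Python sketch (an illustration, not Kedro's actual internals) of how the three input styles relate to the parameters dictionary:

```python
# Simplified illustration of how node inputs map to parameters.yml values.
# This is NOT Kedro's real implementation - just a sketch of the behavior.
parameters = {
    "step_size": 1,
    "model_params": {
        "learning_rate": 0.01,
        "test_data_ratio": 0.2,
        "number_of_train_iterations": 10000,
    },
}

def resolve_input(name: str, parameters: dict):
    """Map a node input name to the object the node function receives."""
    if name == "parameters":
        return parameters             # the whole dictionary
    if name.startswith("params:"):
        key = name[len("params:"):]
        return parameters[key]        # one top-level key, nested or not
    raise KeyError(name)

# "params:step_size" hands the node just the scalar value:
assert resolve_input("params:step_size", parameters) == 1
# "params:model_params" hands it the nested dict:
assert resolve_input("params:model_params", parameters)["learning_rate"] == 0.01
# "parameters" hands it everything:
assert resolve_input("parameters", parameters)["step_size"] == 1
```

So if `train_model` only ever needs `step_size` and `model_params`, passing `["input_data", "params:model_params", "params:step_size"]` keeps the function signature explicit about what it depends on.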
ł
Okay, I thought that these global parameters in `parameters.yml` were somehow special, thank you for the answer