Piotr Grabowski  11/23/2023, 12:16 PM

datajoely  11/23/2023, 12:18 PM

Piotr Grabowski  11/23/2023, 12:19 PM

Piotr Grabowski  11/23/2023, 12:42 PM
config = load_local_pipeline_config(Path(__file__))
which gives me OmegaConfigLoader access.
Then I have a pipeline of nodes:

from kedro.pipeline import Pipeline, node, pipeline

def create_pipeline() -> Pipeline:
    """Train GCN model."""
    return pipeline(
        [
            node(
                func=to_pyg_graph,
                inputs=["graph_configured", "config:num_features"],
                outputs="data",
            ),
        ]
    )

Obviously "config:num_features" won't work here, as Kedro thinks I am trying to reference a member of the DataCatalog. Only "params:" works like this, but I'd like the same kind of access to the local config.

Piotr Grabowski
11/23/2023, 12:44 PM

datajoely  11/23/2023, 12:46 PM
Use params:some_config as an input to a node:
• you can then override this with configuration environments
  ◦ kedro run will take the default param from the conf/base structure
  ◦ kedro run --params some_config=10 will take it directly from the CLI
  ◦ kedro run --env prod will take it from conf/prod in the file structure

Piotr Grabowski
11/23/2023, 12:51 PM
I have global parameters in conf/base/parameters.yml and I have those pipeline-specific ones in:
src/package/pipelines/pipeline_foobar/local_config.yml
What's the best way to configure Kedro so that it merges these yaml files when doing kedro run? I could then just access everything via "params:" as designed 🙂

datajoely
11/23/2023, 12:56 PM
You can use alternative configuration environments in the conf/ folder

datajoely  11/23/2023, 12:56 PM
conf/alt_1
conf/alt_2
conf/alt_3

datajoely  11/23/2023, 12:57 PM
kedro run --env alt_1
datajoely  11/23/2023, 12:58 PM
parameters_a.yml, parameters_b.yml etc

[surrounding messages from datajoely and Piotr Grabowski between 12:57 and 12:59 PM contained no recoverable text]

Matthias Roels
11/24/2023, 9:37 AM
parameters_[pipeline_name].yaml
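[Editor's note: Matthias's suggestion matches the convention that kedro pipeline create scaffolds: a per-pipeline parameters file in conf/base whose name matches the default "parameters*" config pattern, so it is merged automatically and stays addressable via "params:". An illustrative layout, with pipeline_foobar taken from Piotr's earlier path and the key purely an example:]

```yaml
# Illustrative layout, not taken from the thread:
#
#   conf/base/parameters.yml                  <- global parameters
#   conf/base/parameters_pipeline_foobar.yml  <- pipeline-specific parameters
#
# Both match Kedro's default "parameters*" config pattern, so their keys
# are merged on kedro run and reachable as "params:..." node inputs.

# conf/base/parameters_pipeline_foobar.yml
num_features: 16
```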
Piotr Grabowski  11/24/2023, 1:51 PM