I have a yaml file full of Kedro parameters like:
data_extraction:
    param1: 5
    param2: "hello"
My nodes then take in inputs like
def mynode(data_extraction_params: dict): ...
and my pipeline does
node(func=mynode, inputs=["params:data_extraction"], ...
I want to use Pydantic models to specify the inputs for each node, kind of like interfaces in TypeScript. So essentially I'd like to do:
from pydantic import BaseModel

class DataExtractionParams(BaseModel):
    param1: int
    param2: str

def mynode(data_extraction_params: DataExtractionParams): ...
How can I most easily do this? That way in the function I can use the model rather than a dict.
The closest thing to this that I've seen is typing.Unpack - see here.
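A minimal sketch of that approach (assuming Python 3.12+ or typing_extensions; note that Unpack only annotates **kwargs, so the node would have to receive the parameters as separate keyword arguments rather than one params dict):

from typing_extensions import TypedDict, Unpack  # typing.Unpack on Python 3.12+

class DataExtractionParams(TypedDict):
    param1: int
    param2: str

def mynode(**params: Unpack[DataExtractionParams]) -> None:
    # type checkers now know params["param1"] is an int and params["param2"] is a str
    ...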
Apart from that, if you want to pass a native dataclass as an argument to node, it's not supported. There is an ongoing discussion here: https://github.com/kedro-org/kedro/discussions/3808
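In the meantime, a common workaround (just a sketch, not an official Kedro feature; the validate_params decorator name here is made up) is to keep the node input as a plain dict on the Kedro side and parse it into the Pydantic model yourself before the node body runs:

from functools import wraps
from pydantic import BaseModel

class DataExtractionParams(BaseModel):
    param1: int
    param2: str

def validate_params(model: type[BaseModel]):
    """Parse the raw params dict Kedro passes in into a Pydantic model."""
    def decorator(func):
        @wraps(func)
        def wrapper(raw_params: dict, *args, **kwargs):
            # model_validate is Pydantic v2; use parse_obj on v1
            return func(model.model_validate(raw_params), *args, **kwargs)
        return wrapper
    return decorator

@validate_params(DataExtractionParams)
def mynode(params: DataExtractionParams):
    return params.param1 * 2  # attribute access instead of dict lookups

# The pipeline wiring stays exactly as before:
# node(func=mynode, inputs=["params:data_extraction"], ...)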