# questions
Hello everyone! I have started to use the kedro data catalog and it is amazing! I have some issues with handling data located in an S3 bucket. I put the required fields in conf/local/credentials.yml and updated my catalog entry to:
waveforms:
  type: ifd.datasets.TDMSDataset
  filepath: s3://my_bucket_name/filename.tdms
  credentials: dev_s3
TDMSDataset is a dataset I created to handle this proprietary format, and it works fine with local files. I get the following error when trying to load the data from a kedro ipython session:
TypeError: AioSession.__init__() got an unexpected keyword argument 'aws_access_key_id'
I could not find much on the internet regarding this error. Does anyone have an idea? Thanks! ☀️
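For context, here is a rough sketch of what I think is happening (this is not the actual TDMSDataset code, just my assumption of how an fsspec-backed dataset is usually wired, with recent s3fs/fsspec): the credentials dict from conf/local/credentials.yml gets unpacked straight into fsspec.

import fsspec

# Hypothetical illustration, not the real TDMSDataset: a custom fsspec-backed
# dataset typically unpacks the credentials dict from conf/local/credentials.yml
# directly into fsspec.filesystem().
credentials = {
    "aws_access_key_id": "token",        # flat keys, as in the generated example
    "aws_secret_access_key": "key",
}
fs = fsspec.filesystem("s3", **credentials)

# s3fs does not recognise these top-level keyword arguments, so on the first
# S3 call it forwards them to aiobotocore's AioSession, which fails with:
# TypeError: AioSession.__init__() got an unexpected keyword argument 'aws_access_key_id'
fs.ls("my_bucket_name")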
Just found out that the examples automatically generated in conf/local/credentials.yml may not be correct. Instead of:
# prod_s3:
#     aws_access_key_id: token
#     aws_secret_access_key: key
I think it should be:
# prod_s3:
#   client_kwargs:
#     aws_access_key_id: token
#     aws_secret_access_key: key
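If my understanding is right (again just a sketch, assuming recent s3fs/fsspec), the nesting matters because s3fs forwards client_kwargs to the underlying botocore client, which does accept aws_access_key_id and aws_secret_access_key:

import fsspec

# With the nested layout, s3fs passes client_kwargs on to the botocore client
# it creates, so the credentials are accepted and no TypeError is raised.
credentials = {
    "client_kwargs": {
        "aws_access_key_id": "token",
        "aws_secret_access_key": "key",
    }
}
fs = fsspec.filesystem("s3", **credentials)
fs.ls("my_bucket_name")  # bucket name from the catalog entry above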
Hello Guillaume, thank you for sharing this! Indeed, it looks like we need to update the generated examples, because the docs have the correct configuration: https://docs.kedro.org/en/stable/data/data_catalog.html#dataset-access-credentials