Elior Cohen
01/22/2023, 11:45 AM
kedro pipeline create my_pipeline --template my_awesome_template
which will include template code for the pipeline
Massinissa Saïdi
01/23/2023, 9:26 AM
credentials: Credentials required to get access to the underlying filesystem.
E.g. for ``S3FileSystem`` it should look like:
`{'key': '<id>', 'secret': '<key>'}`
But is it possible to add the endpoint_url like that: `{'key': '<id>', 'secret': '<key>', 'client_kwargs': {'endpoint_url': 'http://myurl:9000'}}`?
When I use the API code it doesn't work, but when I use the catalog it works.
Prachi Jain
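Regarding the endpoint question above: a minimal sketch of the mapping shape s3fs/fsspec expects, assuming the credentials dict is splatted into `S3FileSystem(**credentials)` — `client_kwargs` must sit at the same level as `key`/`secret`, not nested inside them, and the braces must balance:

```python
# Hedged sketch: credentials mapping for s3fs with a custom endpoint.
# 'client_kwargs' is a sibling of 'key' and 'secret'.
credentials = {
    "key": "<id>",
    "secret": "<key>",
    "client_kwargs": {"endpoint_url": "http://myurl:9000"},
}
```

Since fsspec ultimately does `S3FileSystem(**credentials)`, a stray extra brace or nesting `client_kwargs` in the wrong place makes the constructor call fail.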
01/23/2023, 12:56 PM
kedro run
then it gives me an error saying that "Pipeline contains no nodes after applying all provided filters".
Can someone help here? I am using the latest version of Kedro.
Safouane Chergui
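For context on the error above, a hedged stdlib-only illustration (not Kedro's actual code): run-time filters such as `--pipeline`, `--tags`, or `--nodes` are intersected with the registered nodes, and an empty intersection raises rather than silently running nothing:

```python
# Illustrative sketch only -- mimics the filtering behaviour with
# hypothetical node dicts instead of real Kedro Node objects.
def filter_nodes(nodes, tags=None):
    """Return nodes carrying at least one requested tag."""
    if not tags:
        return list(nodes)
    selected = [n for n in nodes if tags & n["tags"]]
    if not selected:
        raise ValueError(
            "Pipeline contains no nodes after applying all provided filters"
        )
    return selected

nodes = [{"name": "preprocess", "tags": {"etl"}},
         {"name": "train", "tags": {"ml"}}]
```

So the first things to check are that the pipeline name passed to `kedro run` is actually registered in `pipeline_registry.py`, and that any tag or node filters match at least one node.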
01/23/2023, 2:06 PM
Rob
01/23/2023, 3:43 PM
Brandon Meek
01/23/2023, 7:41 PM
MarioFeynman
01/23/2023, 8:39 PM
Alex Ofori-Boahen
01/23/2023, 9:18 PM
Ivan Danov
01/24/2023, 11:34 AM
Dustin
01/25/2023, 12:26 AM
kedro run
(no error but same console information without hooks). Just wondering, do I need to do something to 'reload' settings? or
Joel Ramirez
01/25/2023, 2:33 PM
Miguel Angel Ortiz Marin
01/25/2023, 7:46 PM
Jong Hyeok Lee
01/26/2023, 5:57 AM
Sergei Benkovich
01/26/2023, 11:33 AM
user
01/26/2023, 12:48 PM
Sergei Benkovich
01/26/2023, 1:20 PM
Andrew Stewart
01/27/2023, 4:55 AM
kedro-docker
?
Paul Mora
01/27/2023, 8:44 AM
MemoryDataSets
for those non-dataframe instances. That is all fine and well, though of course not being able to save any transformers becomes quite tedious at some point.
Is there any guidance/development on that front?
Massinissa Saïdi
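On the transformer-saving question above: one hedged option, assuming Kedro 0.18's built-in pickle dataset (the entry name and filepath below are hypothetical):

```yaml
# catalog.yml -- hypothetical entry; pickle.PickleDataSet ships with
# kedro.extras.datasets and serialises arbitrary Python objects.
fitted_transformer:
  type: pickle.PickleDataSet
  filepath: data/06_models/fitted_transformer.pkl
```

Returning the fitted transformer from a node and listing it as an output under such an entry persists it to disk instead of leaving it in a MemoryDataSet.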
01/27/2023, 11:42 AM
DataSetError:
botocore.session.session.create_client() got multiple values for keyword
argument 'aws_access_key_id'.
DataSet 'dataset' must only contain valid arguments for the
constructor of 'kedro.extras.datasets.pandas.csv_dataset.CSVDataSet'.
I run my code from docker-compose with only one container (for now), and I write files to S3. I specified the credentials this way:
aws_credentials:
  aws_access_key_id: XXXXXXX
  aws_secret_access_key: XXXXXXX
and my dataframe in catalog.yml this way:
dataset:
  type: pandas.CSVDataSet
  filepath: ${s3.path}/data/dataset.csv
  credentials: aws_credentials
docker-compose.yml
version: '3.7'
services:
  kedro:
    build:
      context: .
      args:
        PIP_USERNAME: ${PIP_USERNAME}
        PIP_PASSWORD: ${PIP_PASSWORD}
        PIP_REPO: ${PIP_REPO}
      dockerfile: dockerfile.kedro
      cache_from:
        - ia-churn
    image: ia-churn
    command: kedro run --env prod --pipeline data-processing
    volumes:
      - .:/usr/src/app/
      - ./data/01_raw/:/usr/src/app/data/01_raw
In a conda environment everything works. Does someone have an idea, please?
More information: I'm using kedro v0.18.4 and Python 3.10.
Patrick Deutschmann
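One hedged possibility for the error above, assuming the dataset reaches S3 through s3fs/fsspec: s3fs expects the key names `key` and `secret` (as in the docstring quoted earlier in this thread), and boto-style names passed through the dataset can collide with credentials it also picks up from the container's environment, producing the "multiple values" error. Renaming the entries in `credentials.yml` may resolve it:

```yaml
# credentials.yml -- hedged sketch using s3fs's expected key names
aws_credentials:
  key: XXXXXXX
  secret: XXXXXXX
```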
01/27/2023, 1:19 PM
Massinissa Saïdi
01/27/2023, 4:12 PM
sagemaker_entry_point.py
script, example:
...
from pipelines.ml_model.model import train_model
...

def main():
    ...
    regressor = train_model(...)
    ...

if __name__ == "__main__":
    # SageMaker will run this script as the main program
    main()

Because I have this error: ModuleNotFoundError: No module named 'pipelines'
Thanks for your help 🙂
Alexandra Lorenzo
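A hedged sketch of one common workaround for the `ModuleNotFoundError` above, assuming the entry point is uploaded next to a `src/` directory that contains the `pipelines` package (the layout is hypothetical):

```python
# sagemaker_entry_point.py -- make the project's src/ importable before
# importing project modules, since SageMaker runs this file directly
# rather than as part of an installed package.
import sys
from pathlib import Path

SRC = Path(__file__).resolve().parent / "src"
sys.path.insert(0, str(SRC))

# With src/ on sys.path, imports such as
#   from pipelines.ml_model.model import train_model
# can resolve (assuming the package lives at src/pipelines/).
```

Alternatively, the SageMaker SDK's `source_dir` argument to an estimator uploads a whole directory alongside the entry point, which lets local packages resolve without path manipulation.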
01/27/2023, 5:49 PMDepartment 1
|-> Zone 1
|---> IMG_00001.tif
|---> MSK_00001.tifI need to read first IMG_*****.tif then MSK_*****.tif is it possible ? Thanks for your help
raw_images:
type: PartitionedDataSet
dataset:
type: flair_ign.extras.datasets.satellite_image.SatelliteImageDataSet
path: /home/ubuntu/train
filename_suffix: .tif
layer: raw
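On the ordering question above: a PartitionedDataSet hands the node a dict of `{partition_id: load_function}`, so ordering can be imposed inside the node itself. A hedged stdlib sketch (the function name and prefixes are illustrative):

```python
# Illustrative sketch: order partition ids so every IMG_* comes before
# any MSK_*; loading stays lazy via the per-partition callables.
def ordered_partition_ids(partitions):
    images = sorted(k for k in partitions if "IMG_" in k)
    masks = sorted(k for k in partitions if "MSK_" in k)
    return images + masks

fake_partitions = {
    "Zone 1/MSK_00001": lambda: "mask",
    "Zone 1/IMG_00001": lambda: "image",
}
```

Inside the node, iterating over `ordered_partition_ids(partitions)` and calling `partitions[pid]()` then reads all images before any masks.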
Andrew Stewart
01/28/2023, 12:15 AM
Rob
01/28/2023, 6:00 PM
main.py
instead of using the CLI commands to run all the pipelines? (If so, any docs or examples would be great.) (Using kedro==0.17.7)
Ofir
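A hedged sketch of running pipelines from a `main.py`, based on the session API as it existed around Kedro 0.17.x (verify the exact names against the installed version; imports are deferred so the sketch stands alone):

```python
# main.py -- programmatic equivalent of `kedro run`, sketched from the
# 0.17.x KedroSession API.
from pathlib import Path

def run_project(pipeline_name="__default__"):
    # Deferred imports: Kedro is only needed when this is called.
    from kedro.framework.session import KedroSession
    from kedro.framework.startup import bootstrap_project

    project_path = Path.cwd()
    bootstrap_project(project_path)  # reads the project metadata
    with KedroSession.create(project_path=project_path) as session:
        return session.run(pipeline_name=pipeline_name)
```

Calling `run_project("data-processing")` from `main.py` would then be the rough equivalent of `kedro run --pipeline data-processing`.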
01/28/2023, 6:26 PM
Ofir
01/28/2023, 6:33 PM
Sergei Benkovich
01/29/2023, 9:12 AM
split_folder: "split_1"
folders:
  raw: "{split_folder}/01_raw"
but it doesn't work and I just get a new folder called {split_folder}/01_raw.
Is there any way to accomplish this?
I'm running several versions one after the other, and I want each one in a different folder, but I don't want to have to change the paths for all the subdirs I defined...
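One hedged possibility for the interpolation above, assuming Kedro 0.18's TemplatedConfigLoader: it substitutes values using the `${...}` syntax (not `{...}`), pulling them from files matched by `globals_pattern`:

```yaml
# settings.py (Python, shown as a comment for context):
#   from kedro.config import TemplatedConfigLoader
#   CONFIG_LOADER_CLASS = TemplatedConfigLoader
#   CONFIG_LOADER_ARGS = {"globals_pattern": "*globals.yml"}

# conf/base/globals.yml
split_folder: split_1

# conf/base/catalog.yml -- note ${...}, not {...}
folders:
  raw: "${split_folder}/01_raw"
```

Switching `split_folder` in one place in `globals.yml` would then redirect every path that references it, without touching the subdirectory definitions.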