# questions
s
Hello, had a question. The pipeline I'm trying to build includes credentials for a PostgreSQL DB. The idea is to hand off a containerized pipeline and facilitate the necessary data cleaning, transformation, and storage required for further analytics. In `credentials.yml`, I have added the following:
```yaml
postgresql_connection:
  host: "${oc.env:POSTGRESQL_HOST}"
  username: "${oc.env:POSTGRESQL_USER}"
  password: "${oc.env:POSTGRESQL_PASSWORD}"
  port: "${oc.env:POSTGRESQL_PORT}"
```
and each of these values is stored in a `.env` file in the same local folder. However, when I do `kedro run`, `postgresql_connection` isn't recognized and we are unable to pick up the actual values provided in the `.env` file that should be passed on to `credentials.yml`, since I want this to be dynamic and based on user input. Any idea how to resolve this? Additionally, what is the process for getting Kedro to read `credentials.yml` as well? It seems that on `kedro run` it only cares about `catalog.yml`. Is it just a matter of linking the credentials in the catalog? I tried that, but then it reads the dynamic string literally.
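For illustration, a `.env` with these variables might look like this (the variable names come from the `credentials.yml` above; the values are placeholders):

```
POSTGRESQL_HOST=localhost
POSTGRESQL_USER=my_user
POSTGRESQL_PASSWORD=my_password
POSTGRESQL_PORT=5432
```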
s
Hi Sharan, I believe you just need to update the dataset in `catalog.yml` to use the credentials you defined. You do this by adding a `credentials` key, `postgresql_connection`, to the dataset's definition.
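For example (the entry name, dataset type, and table name below are placeholders; note that a SQL dataset such as `pandas.SQLTableDataset` expects its credentials to resolve to a `con` SQLAlchemy connection string, so the keys in your credentials entry may need to match whatever dataset you end up using):

```yaml
# catalog.yml -- entry name, type and table_name are illustrative only
cleaned_transactions:
  type: pandas.SQLTableDataset
  table_name: cleaned_transactions
  credentials: postgresql_connection
```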
m
That’s indeed one piece. Another thing is that a `.env` file is not automatically picked up by Kedro.
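The `${oc.env:...}` resolvers only read variables that are actually set in the process environment when Kedro runs, so the values from the `.env` have to be loaded into the environment first. One way to do that (a sketch, assuming you add `python-dotenv` as a dependency; `settings.py` is just a convenient early import point) is:

```python
# src/<your_package>/settings.py  (sketch; assumes python-dotenv is installed)
from dotenv import load_dotenv

# Load KEY=VALUE pairs from the project's .env into os.environ so that the
# ${oc.env:POSTGRESQL_*} resolvers in credentials.yml can resolve them.
load_dotenv()
```

Alternatively, export the variables in your shell before `kedro run`, or for the containerized case pass them in with `docker run --env-file .env`.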