I am trying to run a Kedro project that is packaged as a Python package and built into a Docker image, so the image contains the Kedro project in the form of a Python package. The image is pushed to AWS ECR. Now I am trying to run the pipeline as a DAG in Airflow. For the DAG to run, the Docker image (the Kedro project as a Python package) needs to be available to the Airflow executor environment.
Could anyone suggest the optimal way to achieve this, i.e. how to make the image in ECR available to Airflow when the DAG is scheduled to run?
Also, is it recommended to run the DAG on the same instance where Airflow is running? If not, what are the alternative options?
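For context, this is roughly what I have in mind: a DAG that launches the ECR image with `DockerOperator` on the Airflow worker itself. The image URI, DAG id, and schedule are placeholders for my setup, not a working configuration:

```python
# Sketch of the DAG I am trying to build (image URI and names are placeholders).
from datetime import datetime

from airflow import DAG
from airflow.providers.docker.operators.docker import DockerOperator

# Placeholder ECR image URI for the packaged Kedro project
ECR_IMAGE = "123456789012.dkr.ecr.us-east-1.amazonaws.com/my-kedro-project:latest"

with DAG(
    dag_id="kedro_pipeline",
    start_date=datetime(2023, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    run_kedro = DockerOperator(
        task_id="run_kedro_pipeline",
        image=ECR_IMAGE,
        command="kedro run",  # entry point of the packaged project inside the image
        docker_url="unix://var/run/docker.sock",  # Docker daemon on the worker
    )
```

My understanding is that for this to work the worker would need to be authenticated to ECR (e.g. `aws ecr get-login-password | docker login ...` or the ECR credential helper), and that running the container off the Airflow instance would mean switching to something like `EcsRunTaskOperator` or `KubernetesPodOperator` instead. Is that the right direction?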