# questions
Hello All! I have an issue with error handling: the pipeline in Airflow shows all of the tasks (each task is a Kedro pipeline) as successful despite errors.

Currently I am running Kedro pipelines as tasks in Airflow. The pipeline is orchestrated to run the tasks in sequence, so given the definition of the DAG, the success status of one task triggers the next task. All of the tasks are defined as notebook tasks, where each task executes the code in the cells of a Jupyter notebook. Once the kedro module is imported, the error display with traceback has the output type "display_data" instead of "error". As a consequence, the task reports success despite the failure to execute the code. My current understanding is that the "rich" library is changing the output type of the cell's error, so the task does not detect the failure, assumes success, and the next task gets triggered in the pipeline running on Airflow.

What is the right version of the rich library that avoids this issue? Is there a way to turn off the traceback module that is changing the output type from "error" to "display_data"?

Note: try/except with a re-raised exception was tried, and the error output type is still "display_data" (or INFO) rather than "error".

CC: @Bernardo Sandi
In your logging.yaml, change this:
```diff
 root:
-  handlers: [console, info_file_handler, error_file_handler]
+  handlers: [console]
```
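For context, here is a minimal sketch of what `conf/logging.yml` could look like after that change, assuming a standard Kedro project layout. The plain `logging.StreamHandler` console handler is an illustrative assumption on my part (your project may use a rich-based console handler instead); it simply writes log records, including tracebacks, as ordinary text on stderr.

```yaml
# conf/logging.yml -- minimal sketch, assuming a standard Kedro project layout.
# The plain StreamHandler console handler is an illustrative assumption;
# your project may configure a rich-based handler here instead.
version: 1
disable_existing_loggers: False

formatters:
  simple:
    format: "%(asctime)s - %(name)s - %(levelname)s - %(message)s"

handlers:
  console:
    class: logging.StreamHandler
    level: INFO
    formatter: simple
    stream: ext://sys.stderr

loggers:
  kedro:
    level: INFO

root:
  handlers: [console]
```

With only the console handler on root, log output (including tracebacks) goes to stderr instead of being split across separate info/error log files.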