Dotun O
03/27/2023, 4:58 PM

datajoely
03/27/2023, 4:59 PM
The on_node_error hook.

Dotun O
03/27/2023, 5:02 PM
In the on_node_error function hook, do we have a way to track the remaining unrun nodes? I only see the next node to run in the function call.
Second question: if I implement this functionality (and test it accordingly) and want it to be part of the general kedro library, what steps do I need to take to have it implemented?

datajoely
04/10/2023, 2:47 PM
Use the before_pipeline_run hook and maintain the run pipeline object in self.
Dotun O
04/10/2023, 2:54 PM

datajoely
04/10/2023, 2:54 PM
from kedro.framework.hooks import hook_impl

class FailureHooks:
    def __init__(self):
        self.run_pipeline = None

    @hook_impl
    def before_pipeline_run(self, pipeline):
        # keep a reference to the pipeline being run
        self.run_pipeline = pipeline

    @hook_impl
    def on_node_error(self, node):
        # everything downstream of the failed node has not run yet
        remaining_pipeline = self.run_pipeline.from_nodes(node.name)
        # do stuff with this
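One note on the sketch above: as I understand it, Pipeline.from_nodes(*node_names) returns a new pipeline containing the named nodes plus everything downstream of them, so remaining_pipeline approximates what still has to run, but it will not include independent branches that had simply not been scheduled yet when the error occurred.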
Dotun O
04/10/2023, 3:02 PM

datajoely
04/10/2023, 3:03 PM
1. In settings.py.
2. Yes, it will be called.
2b. You need to log out the contents of remaining_pipeline in some way that the user can work with.
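A minimal sketch of both points, assuming the FailureHooks class above lives somewhere importable (the my_project.hooks module path and the log wording below are placeholders, not something from this thread): register an instance via the HOOKS tuple in settings.py, and have on_node_error log the remaining node names in a form that can be pasted straight into kedro run --from-nodes.

# settings.py -- register the hook instance with the project
from my_project.hooks import FailureHooks

HOOKS = (FailureHooks(),)

# my_project/hooks.py -- surface the remaining nodes in a usable form
import logging

from kedro.framework.hooks import hook_impl

logger = logging.getLogger(__name__)

class FailureHooks:
    def __init__(self):
        self.run_pipeline = None

    @hook_impl
    def before_pipeline_run(self, pipeline):
        self.run_pipeline = pipeline

    @hook_impl
    def on_node_error(self, node):
        remaining_pipeline = self.run_pipeline.from_nodes(node.name)
        # comma-separated node names that can be passed to: kedro run --from-nodes=<names>
        node_names = ",".join(n.name for n in remaining_pipeline.nodes)
        logger.warning(
            "Run failed at node %s; to resume: kedro run --from-nodes=%s",
            node.name,
            node_names,
        )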
Dotun O
04/10/2023, 3:04 PM
I get from-nodes "", and when I re-run with the command, the entire pipeline runs.
What situations lead to from-nodes being an empty string?
How do I avoid this situation by adding this run-from-unrun-nodes functionality?

datajoely
04/10/2023, 3:08 PM

Dotun O
04/10/2023, 3:09 PM

datajoely
04/10/2023, 3:09 PM

Dotun O
04/10/2023, 3:15 PM
The run_only_missing functionality you shared might be what I need. Is there a way for me to access this function easily within my pipeline runs? Can I do something like pipeline.run_only_missing()?
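For what it's worth, run_only_missing is defined on the runner classes (AbstractRunner and subclasses such as SequentialRunner), not on Pipeline. A minimal toy sketch of the call pattern follows; the exact signature has changed across Kedro versions (newer releases also expect a hook_manager argument), so treat this as an illustration rather than the definitive API.

from kedro.io import DataCatalog, MemoryDataSet
from kedro.pipeline import Pipeline, node
from kedro.runner import SequentialRunner

def double(x):
    return x * 2

# toy one-node pipeline and catalog, purely for illustration
pipeline = Pipeline([node(double, inputs="x", outputs="y", name="double_x")])
catalog = DataCatalog({"x": MemoryDataSet(2), "y": MemoryDataSet()})

runner = SequentialRunner()
# executes only the nodes whose outputs are missing from the catalog;
# on some Kedro versions this call also requires a hook_manager argument
runner.run_only_missing(pipeline, catalog)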
datajoely
04/11/2023, 1:27 PM

Dotun O
04/11/2023, 1:35 PM
self._suggest_resume_scenario(pipeline, done_nodes, catalog)
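As far as I can tell, _suggest_resume_scenario is a private helper on Kedro's AbstractRunner (kedro/runner/runner.py) that runs when a pipeline fails part-way and logs a suggested kedro run --from-nodes command based on the nodes that already completed, presumably the source of the empty from-nodes value mentioned earlier.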
datajoely
04/11/2023, 2:27 PM

Dotun O
04/12/2023, 4:11 PM