# questions
g
hello, I am using `joblib` to run some nodes in parallel. So far it works fine, but it breaks my logging config, since each separate joblib process overwrites the file defined in `logging.yml`:
```yaml
version: 1
disable_existing_loggers: False
formatters:
  simple:
    format: "%(asctime)s - %(name)s - %(levelname)s - %(message)s"
handlers:
  console:
    class: logging.StreamHandler
    level: INFO
    formatter: simple
    stream: ext://sys.stdout
  file:
    class: logging.FileHandler
    level: INFO
    formatter: simple
    filename: "logs/kedro_run.log"
    mode: 'w'
```
I could change the mode to 'a' (append), but then I'd first need a single log file per Kedro run (named with a timestamp, for example) so that the output of multiple runs doesn't end up in one file. How can I do this? Another option would be to clear the log file on every run, but for that I'd need to retrieve the log file path dynamically, since the pipeline runs in several environments.
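For the "clear the log file on every run" option, the path doesn't actually need to be hard-coded: once `logging.yml` has been loaded, each `FileHandler` attached to the root logger already knows its own file via `.baseFilename`. A minimal sketch (the helper name `clear_run_logfile` is made up for illustration, and it assumes the handler uses append mode, so truncating the file out from under it is safe):

```python
import logging

def clear_run_logfile() -> list[str]:
    """Truncate the file behind every FileHandler currently attached to
    the root logger, without hard-coding the path: each handler exposes
    its own file via .baseFilename."""
    cleared = []
    for handler in logging.getLogger().handlers:
        if isinstance(handler, logging.FileHandler):
            # Reopening the file in 'w' mode truncates it; with the
            # handler in append mode, subsequent writes still land at
            # the (new) end of the file.
            with open(handler.baseFilename, "w"):
                pass
            cleared.append(handler.baseFilename)
    return cleared
```

Calling this once at the start of a run keeps the existing `logging.yml` untouched while still giving each run a clean file.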
g
nevermind, a custom logger hook does the job
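For anyone landing here later, here is a rough sketch of what the body of such a hook might look like. This is an assumption-laden illustration, not the poster's actual code: in a real Kedro project the method would be decorated with `@hook_impl` (from `kedro.framework.hooks`) and the class registered in `settings.py`, but the decorator is shown only as a comment here so the snippet runs without Kedro installed. The `logs/` directory and filename pattern are placeholders.

```python
import logging
from datetime import datetime
from pathlib import Path

class TimestampedLogHook:
    """Sketch of a project hook that replaces the static log file from
    logging.yml with a per-run, timestamped one."""

    # @hook_impl  # uncomment inside a real Kedro project
    def after_context_created(self, context=None):
        root = logging.getLogger()
        # Drop any FileHandler installed by the static logging config...
        for h in list(root.handlers):
            if isinstance(h, logging.FileHandler):
                root.removeHandler(h)
                h.close()
        # ...and replace it with one named after this run's start time,
        # e.g. logs/kedro_run_2024-01-15T10-30-00.log (placeholder path).
        Path("logs").mkdir(exist_ok=True)
        stamp = datetime.now().strftime("%Y-%m-%dT%H-%M-%S")
        handler = logging.FileHandler(f"logs/kedro_run_{stamp}.log", mode="a")
        handler.setFormatter(logging.Formatter(
            "%(asctime)s - %(name)s - %(levelname)s - %(message)s"))
        root.addHandler(handler)
```

Because the timestamp is baked into the filename, append mode is safe and each run gets its own file regardless of environment.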