how to turn on logging? #84

I put a lot of logging statements into the recipe class to help with debugging. But it is never easy to turn them on. How can we make pangeo-forge work like prefect, where logs are automatically on by default? In particular, I would really like to see the logs on the dask workers.

Comments
In general, libraries like pangeo-forge should only provide the log messages and leave handler configuration to whoever runs the code. So I imagine that anyone executing the recipes would have something like the following.
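A minimal sketch of that kind of executor-side setup (the `pangeo_forge` logger name and the formatter are assumptions, not taken from the thread):

```python
import logging
import sys

# Send INFO-and-above records from the library's logger to stdout.
logger = logging.getLogger("pangeo_forge")
logger.setLevel(logging.INFO)

handler = logging.StreamHandler(sys.stdout)
handler.setFormatter(
    logging.Formatter("%(asctime)s %(name)s %(levelname)s %(message)s")
)
logger.addHandler(handler)
```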
That configures things to print to stdout for messages with INFO or higher (so not DEBUG). I think pangeo-forge might provide a standard handler and formatter that executors could use. Side-note: libraries should typically use `logging.getLogger(__name__)` and avoid attaching handlers themselves. Configuring the Dask workers is a bit more involved; I'll have to look up some examples, but it follows a similar pattern of adding a handler. The extra complication is that the handlers are on a different machine. We can pull the logs from the workers afterwards, e.g. with `Client.get_worker_logs()`.
Thanks Tom! I found I also needed to follow this advice to get the logs to show in jupyterlab.

I have no idea!

If we execute using prefect, it should be straightforward to get the logs: https://docs.prefect.io/core/concepts/logging.html#extra-loggers
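Not spelled out in the thread, but per the linked docs this comes down to Prefect Core's `extra_loggers` setting; a sketch, where the environment-variable spelling and the logger name are assumptions to double-check:

```python
import os

# Ask Prefect to route records from the "pangeo_forge" logger through its own
# logging setup. Must be set before prefect is imported.
os.environ["PREFECT__LOGGING__EXTRA_LOGGERS"] = "['pangeo_forge']"

import prefect  # noqa: E402  (imported after the config is in place)
```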
I am really stuck on something and I would love some help from a Dask guru (@martindurant or @TomAugspurger). I am debugging a problem with distributed locking (see #86) and I really need to see the pangeo-forge logging from dask workers during pytest execution. I can turn on logging from pytest by passing its logging options (e.g. `--log-cli-level=DEBUG`). However, when I am executing with Dask there is no output, because the logging happens on the workers. I have tried all sorts of hacks to try to get the output to show, but I am stuck. I CAN see stdout from the workers. Any hints would be appreciated.
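Not from the thread, but a small sketch of the symptom being described: `print()` inside a task shows up on the worker's stdout, while library log records are dropped because no handler (and no level) is configured in the worker process:

```python
import logging

from dask.distributed import Client

client = Client(n_workers=1, threads_per_worker=1)


def task():
    print("visible on the worker's stdout")
    # Dropped: the "pangeo_forge" logger has no handler in the worker process
    # and its effective level is the default WARNING.
    logging.getLogger("pangeo_forge").debug("invisible without a handler")
    return True


client.submit(task).result()
```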
Can you add a handler to redirect the dask worker logs to stdout, with something like the following somewhere (at the top of the module, for example)?
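A sketch of that kind of redirection (the logger names here are assumptions); note it only affects whatever process it runs in:

```python
import logging
import sys

# Attach a stdout handler and lower the level for the dask worker and
# library loggers in the current process.
for name in ("distributed.worker", "pangeo_forge"):
    logger = logging.getLogger(name)
    logger.setLevel(logging.DEBUG)
    logger.addHandler(logging.StreamHandler(sys.stdout))
```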
To be clear, I am not looking for the dask worker logs themselves; I want the pangeo-forge logging from the workers. That was one of the things I tried, without success. However, it gave me an idea. I added this to the fixture that creates the dask cluster:

```python
def redirect_logs():
    import logging

    logger = logging.getLogger("pangeo_forge")
    handler = logging.StreamHandler()
    handler.setLevel(logging.DEBUG)
    logger.setLevel(logging.DEBUG)
    logger.addHandler(handler)


client.run(redirect_logs)
```

and it worked! 🎉
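For context, a sketch of how such a fixture might look end to end (the fixture name and cluster parameters are assumptions, not from the thread):

```python
import logging

import pytest
from dask.distributed import Client, LocalCluster


@pytest.fixture
def dask_client():
    # Use real worker processes, so logging genuinely happens off the main process.
    cluster = LocalCluster(n_workers=2, threads_per_worker=1)
    client = Client(cluster)

    def redirect_logs():
        # Runs once in each worker process: attach a handler so pangeo_forge
        # records emitted on the workers become visible.
        logger = logging.getLogger("pangeo_forge")
        handler = logging.StreamHandler()
        handler.setLevel(logging.DEBUG)
        logger.setLevel(logging.DEBUG)
        logger.addHandler(handler)

    client.run(redirect_logs)
    yield client

    client.close()
    cluster.close()
```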
Gotcha, sorry I missed that point. I think the bakeries will want some kind of logging config file, and then we'd set the Dask config to load it (so that it's picked up everywhere). It's a bit finicky to get right, but hopefully we just have to do it once (per bakery).
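One way to express that idea (not from the thread) is through dask's configuration system, which distributed consults when each process initializes its logging; the logger names and levels below are placeholders:

```python
import dask

# This has to be in place before the scheduler and workers start, e.g. via a
# dask configuration file baked into the bakery image, so every process picks
# it up when it initializes logging.
dask.config.set(
    {
        "logging": {
            "distributed.worker": "info",
            "pangeo_forge": "debug",
        }
    }
)
```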