Shared Library and Main Script Logging to the Same Destinations
Edit 2.1: I have something of a solution in Edit 1 and Edit 2, but I’m not 100% sure it’s appropriate or efficient. I’d still appreciate feedback.
*****************
I have what I believe is referred to as a library - a folder of my own classes & functions. I drop the folder into my scripts and import what I need.
I want to formalize my logging via the built-in logging module. I've been using it, but my solution was from my early days of understanding Python and was cobbled together from whatever I could find online while I was new. The name of the logger is the calling script's basename and is passed as a parameter to things called from the library, typically when instantiating a class. I've also used os.path.splitext(os.path.basename(inspect.stack()[1].filename))[0], but I think that stops working consistently if one library file imports another, due to the changing depth of the stack?
This also seems inefficient and janky, so here I am asking.
I want to be able to:
- Generate a separate text log file for each main script that uses the library
- Generate a separate text log file for each day - I plan to use a TimedRotatingFileHandler with when='midnight'
- Log from the library to the main/calling script's log file and not a library-specific log file
- Determine each calling script’s log directory, among other things, via an INI, JSON, or YAML file - the script will be packaged as an EXE, run on Windows via Task Scheduler, and will need to be configurable via this config file
- Eventually I’d like to generate log data in GELF format for a graylog server, if we decide to move forward with using one
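To make the first two bullets concrete, here is a minimal sketch of a TimedRotatingFileHandler that rotates at midnight and writes to a file named after the running script. The log directory here is a temp dir and the backupCount is an arbitrary choice of mine; in your setup both would come from your INI/JSON/YAML config file.

```python
import logging
import logging.handlers
import os
import sys
import tempfile

# Hypothetical log directory; in practice this would be read from the config file.
log_dir = tempfile.mkdtemp()

# Name the log file after the running script so each main script gets its own file.
script_name = os.path.splitext(os.path.basename(sys.argv[0]))[0]
log_path = os.path.join(log_dir, f"{script_name}.log")

handler = logging.handlers.TimedRotatingFileHandler(
    log_path,
    when="midnight",   # start a new file each day
    backupCount=14,    # arbitrary: keep two weeks of old logs
    encoding="utf-8",
)
handler.setFormatter(
    logging.Formatter("%(asctime)s | %(name)s | %(levelname)s | %(message)s")
)

logger = logging.getLogger(__name__)
logger.addHandler(handler)
logger.setLevel(logging.DEBUG)
logger.info("Handler attached; daily rotation at midnight.")
```

The same handler can be declared in dictConfig by using class: logging.handlers.TimedRotatingFileHandler with when: midnight, so you don't have to build it in code.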
I’ve seen over and over again in my searches the following comments:
- Do not use the root log, for security reasons
- You should really use logging.getLogger(name)
I found answers referencing the “Logging Cookbook: Using logging in multiple modules” which of course breaks #2 and names the loggers statically. That sounds like a bad idea, since:
- I’d need to modify each logger for each file in the library every time it was used by a new main script
- I’d need to maintain a separate copy of the library for every main script.
I don’t expect to share the library. Maybe eventually I would with one or two other people, so I didn’t try to package the library. I also have no experience packaging.
Any help or direction would be really appreciated. Examples are super helpful, as I am mostly self-taught and only occasionally dabble in Python. While I may be familiar with a concept, I might not recognize it or know its formal or "pythonic" name. I've done a lot of searching, but I always worry my word choice works against me, and some results read like jargon I can't parse. I also don't touch LLMs, for a number of reasons.
*****************
Edit 1: I have found a possible solution after just rapid trial and error. I’d appreciate any thoughts on how good of a solution it is.
Using dictConfig and a YAML file, I am able to specify non-root logger names and assign them handlers, including file handlers that can all point to the same file.
/log_config.yaml:
version: 1
disable_existing_loggers: False
formatters:
  simple_format:
    format: '%(asctime)s | %(name)s | %(levelname)s | %(message)s'
    datefmt: '%Y%m%d %H:%M:%S'
# Handlers
handlers:
  console:
    class: logging.StreamHandler
    formatter: simple_format
    level: DEBUG
  file_handler:
    class: logging.FileHandler
    filename: 'custom/log/path.log'
    formatter: simple_format
    level: ERROR
# Loggers
loggers:
  __main__:
    handlers:
      - console
      - file_handler
    level: DEBUG
    propagate: False
  library:
    handlers:
      - console
      - file_handler
    level: DEBUG
    propagate: False
/main.py
import logging
import logging.config

import yaml

from library import my_math

with open("log_config.yaml", "r") as f:
    config = yaml.safe_load(f.read())
logging.config.dictConfig(config)

def main():
    logger = logging.getLogger(__name__)
    logger.info("Script Starting.")
    my_math.divide(10, 5)
    my_math.divide(5, 0)
    logger.info("Script Finished.")
    logger.error(f"Test Error from {__name__}")

if __name__ == "__main__":
    main()
/library/my_math.py
import logging

logger = logging.getLogger(__name__)

def divide(x: int, y: int) -> float:
    try:
        quotient = x / y
    except ZeroDivisionError:
        logger.exception("Divide by Zero Error.")
        quotient = 0
    else:
        logger.debug(f"{x} / {y} = {quotient}")
    return quotient
I get log output from both main and anything in /library:
20260316 16:53:15 | library.my_math | ERROR | Divide by Zero Error.
Traceback (most recent call last):
  File "C:\scripts\library_example\library\my_math.py", line 7, in divide
    quotient = x / y
               ~~^~~
ZeroDivisionError: division by zero
20260316 16:53:15 | __main__ | ERROR | Test Error from __main__
Any concerns with this solution?
*****************
Edit 2: I had found the YAML + dictConfig approach earlier, but it was presented as a way to control module logging and used the root logger to combine everything, which was an incomplete solution since I was trying to avoid the root logger. I didn't consider/realize I could also just name and configure __main__ and set up all of logging that way without relying on the root logger. I couldn't find any examples of how to control logging for a more complex folder hierarchy, so in the few minutes I had I rapidly tried different things in the YAML/dictionary to figure out how it worked. I kept going past quitting time and found my solution. Edit 1 was my attempt to post it quickly so I wouldn't waste everyone's time if I'd already found the answer.
After some thinking and what I learned yesterday from experimenting, it would be easy enough to expand the YAML file and keep the logging config as a section of it. That way I could set other script configurations in new sections and use the file's logging section to set the log path. For some reason that logic was eluding me yesterday, so I was trying to figure out how to change the log path outside of the YAML file.
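A sketch of that idea: a single config file with an "app" section (hypothetical name) alongside the "logging" section, where the script builds the log path from the app section and hands only the logging section to dictConfig. For self-containedness the parsed YAML is shown here as a dict literal (yaml.safe_load would return the same structure), and the log directory is a temp dir standing in for a real Windows path.

```python
import logging
import logging.config
import os
import tempfile

# Stand-in for yaml.safe_load(open("config.yaml")) on the combined file.
# "app" is a hypothetical section for general script settings.
config = {
    "app": {
        "log_dir": tempfile.mkdtemp(),  # real file would hold a fixed path
    },
    "logging": {
        "version": 1,
        "disable_existing_loggers": False,
        "formatters": {
            "simple_format": {
                "format": "%(asctime)s | %(name)s | %(levelname)s | %(message)s",
                "datefmt": "%Y%m%d %H:%M:%S",
            }
        },
        "handlers": {
            "file_handler": {
                "class": "logging.FileHandler",
                "filename": "placeholder.log",  # overwritten below
                "formatter": "simple_format",
                "level": "DEBUG",
            }
        },
        "loggers": {
            "__main__": {
                "handlers": ["file_handler"],
                "level": "DEBUG",
                "propagate": False,
            },
            "library": {
                "handlers": ["file_handler"],
                "level": "DEBUG",
                "propagate": False,
            },
        },
    },
}

# Build the real log path from the app section, patch it into the logging
# section, then configure logging from that section alone.
log_path = os.path.join(config["app"]["log_dir"], "myscript.log")
config["logging"]["handlers"]["file_handler"]["filename"] = log_path
logging.config.dictConfig(config["logging"])

# In the main script this would be logging.getLogger(__name__).
logging.getLogger("__main__").info("Configured from a single config file.")
```

Since dictConfig only ever sees config["logging"], any other sections you add to the file are ignored by logging and free for your own settings.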
I’d still like feedback on flaws of my solution from people more experienced than me.