
How To Configure Logging System In One File On Python

I have two files: the first is a TCP server, the second is a Flask app. They belong to one project, but each runs in a separate Docker container. They should write logs to the same file.

Solution 1:

Anything that directly uses files – config files, log files, data files – is a little trickier to manage in Docker than running locally. For logs in particular, it's usually better to set your process to log directly to stdout. Docker will collect the logs, and you can review them with docker logs. In this setup, without changing your code, you can configure Docker to send the logs somewhere else or use a log collector like fluentd or logstash to manage the logs.
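As a minimal sketch of the approach above, the snippet below configures the root logger to write to stdout explicitly (the function name `configure_stdout_logging` is just an illustrative choice, not something from the original code):

import logging
import sys

def configure_stdout_logging(level=logging.INFO):
    # Send all log records to stdout so Docker's logging driver
    # captures them; you can then review them with `docker logs`.
    logging.basicConfig(
        stream=sys.stdout,
        format='%(asctime)s - %(levelname)s - %(name)s - %(message)s',
        level=level,
    )

configure_stdout_logging()
logging.getLogger(__name__).info("service started")

Both containers can use the same configuration; Docker keeps each container's stream separate, and a log collector can merge them later.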

In your Python code, you usually want to configure the detailed logging setup once, at the top level, on the root logger:

import logging

def main():
    logging.basicConfig(
        format='%(asctime)s - %(levelname)s - %(funcName)s - %(message)s',
        datefmt='%d-%b-%y %H:%M:%S',
        level=logging.INFO,
    )
    ...

and in each individual module you can just get a local logger, which will inherit the root logger's setup:

import logging
LOGGER = logging.getLogger(__name__)
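For example, a module in the TCP-server container might look like the sketch below (the name `tcp_server` and the function `handle_connection` are hypothetical, stand-ins for your own modules):

import logging

# In a real module you would pass __name__; "tcp_server" is a
# hypothetical module name used here for illustration.
LOGGER = logging.getLogger("tcp_server")

def handle_connection(addr):
    # No handlers are configured here: records propagate to the
    # root logger, which main() configured to write to stdout.
    LOGGER.info("connection from %s", addr)

Because the module logger carries no handlers of its own, changing the format, level, or destination in one place (the root logger) affects every module.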

With its default setup, Docker captures log messages into JSON files on disk. If a long-running container generates a large volume of log messages, this can exhaust the local disk (though it has no effect on the memory available to processes). The Docker logging documentation advises using the local file logging driver, which does automatic log rotation. In a Compose setup you can specify logging: options:

version: '3.8'
services:
  app:
    image: ...
    logging:
      driver: local

You can also configure log rotation on the default json-file logging driver:

version: '3.8'
services:
  app:
    image: ...
    logging:
      driver: json-file # the default; can be omitted
      options:
        # Quote the values: the json-file driver expects strings
        max-size: "10m"
        max-file: "50"

You "shouldn't" directly access the logs, but they are in a fairly stable format in /var/lib/docker, and tools like fluentd and logstash know how to collect them.

If you ever decide to run this application in a cluster environment like Kubernetes, that will have its own log-management system, but again designed around containers that directly log to their stdout. You would be able to run this application unmodified in Kubernetes, with appropriate cluster-level configuration to forward the logs somewhere. Retrieving a log file from opaque storage in a remote cluster can be tricky to set up.

