Logging

One of the major changes the LOFAR pipeline system made to the cuisine system was the introduction of logging using the standard Python logging module. Every instance of a recipe derived from WSRTrecipe (in other words, every recipe developed using the framework) has an associated logger (in fact, an instance of SearchingLogger, available as the attribute self.logger), which supports the standard logging methods: see the Python documentation for details. The logging system is also available in much the same way on remote hosts via the pipeline's distribution system.
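Since the framework attaches the logger for you, recipe code simply calls the standard logging methods on self.logger. The following is a minimal sketch using a plain stdlib logger to stand in for the framework-supplied SearchingLogger; the ExampleRecipe class and its go() method are illustrative, not part of the framework API.

```python
import logging

# A minimal stand-in for a framework recipe: the real self.logger is a
# SearchingLogger attached by WSRTrecipe; a plain stdlib logger is used
# here purely to illustrate the same standard methods.
class ExampleRecipe:
    def __init__(self):
        self.logger = logging.getLogger("example_recipe")

    def go(self):
        # The standard logging methods, in increasing order of severity.
        self.logger.debug("Detailed diagnostic output")
        self.logger.info("Starting processing run")
        self.logger.warning("Input file is empty; continuing anyway")
        self.logger.error("Could not write output")
        return 0
```

In a real recipe only the calls on self.logger would appear; the surrounding class is provided by the framework.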

Note that by default only messages of level logging.WARNING and higher are logged. Passing the -v or --verbose flag on the command line also logs messages at level logging.INFO; -d or --debug logs messages at level logging.DEBUG.

Logs are output to standard output and also to a file. By default, the file is located in the job_directory, but this, together with the format used for logging, may be configured through the configuration file if required.

The lofarpipe.support module provides a number of helper functions for working with logs, which are documented here.

Searching logs

Sometimes, it is convenient to be able to keep track of messages sent to a logger. For example, pipeline tools may send metadata to the log rather than output it in any other, more useful, fashion.

As mentioned above, all recipes have an associated instance of SearchingLogger. This can have any number of regular expression patterns defined, and it will then store for later reference any log entries which match those patterns. For example, a recipe could include the code:

self.logger.searchpatterns["new_pattern"] = "A log entry"

This would record all log entries matching “A log entry”. Later, a list of all those entries can be retrieved:

matches = self.logger.searchpatterns["new_pattern"].results
self.logger.searchpatterns.clear()

Note that messages generated by all subsidiary loggers – including those on remote hosts – are included. The call to clear() simply instructs the logger to stop searching for that pattern, to avoid incurring overhead in future.
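The mechanism above can be pictured as a dictionary that compiles plain strings into patterns on insertion, together with a Logger subclass that checks every record against them. The following is an illustrative sketch of that idea (the Mini* names are invented for this example; they mirror, but are not, the framework classes documented below):

```python
import logging
import re

# Sketch of the searching-logger idea: a dict that compiles patterns on
# assignment, and a Logger subclass recording any LogRecord whose
# message matches one of them.
class MiniSearchPattern:
    def __init__(self, pattern):
        self.pattern = re.compile(pattern)
        self.results = []

class MiniSearchPatterns(dict):
    def __setitem__(self, name, pattern):
        # Compile the plain string into a pattern object on insertion.
        dict.__setitem__(self, name, MiniSearchPattern(pattern))

    def check(self, record):
        for pattern in self.values():
            if pattern.pattern.search(record.getMessage()):
                pattern.results.append(record)

class MiniSearchingLogger(logging.Logger):
    def __init__(self, name):
        logging.Logger.__init__(self, name)
        self.searchpatterns = MiniSearchPatterns()

    def handle(self, record):
        # Check the record against all registered patterns, then hand it
        # on to the normal handler machinery.
        self.searchpatterns.check(record)
        logging.Logger.handle(self, record)
```

With this sketch, `logger.searchpatterns["name"] = "some text"` followed by log calls populates `logger.searchpatterns["name"].results`, just as in the recipe example above.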

class lofarpipe.support.pipelinelogging.SearchingLogger(*args, **kwargs)

Bases: logging.Logger

A SearchingLogger will act as a normal logger object, forwarding LogRecords to the appropriate handlers. In addition, it will check the LogRecord against a SearchPatterns object and save any useful results.

class lofarpipe.support.pipelinelogging.SearchPattern(pattern)

Match the contents of LogRecords against a regular expression, keeping track of matching records.

class lofarpipe.support.pipelinelogging.SearchPatterns

Bases: dict

A dictionary of SearchPattern objects.

When a new entry is added, its value is automatically compiled into a SearchPattern. Other access patterns are as for a standard dictionary.

lofarpipe.support.pipelinelogging.getSearchingLogger(name)

Return an instance of SearchingLogger with the given name.

Equivalent to logging.getLogger, but returns a SearchingLogger.
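One plausible way to implement such a helper (not necessarily the framework's exact mechanism) is to swap in a custom logger class around a call to logging.getLogger, as sketched below; VerboseLogger and get_verbose_logger are invented names standing in for SearchingLogger and getSearchingLogger:

```python
import logging

class VerboseLogger(logging.Logger):
    """Trivial custom logger class, standing in for SearchingLogger."""

def get_verbose_logger(name):
    # Temporarily install the custom class, fetch (or create) the named
    # logger, then restore the previous class so other code is unaffected.
    old_class = logging.getLoggerClass()
    logging.setLoggerClass(VerboseLogger)
    try:
        return logging.getLogger(name)
    finally:
        logging.setLoggerClass(old_class)
```

Note the usual caveat with this approach: if a logger with that name was already created with a different class, logging.getLogger returns the existing instance unchanged.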

Logging process output

Many pipeline recipes run external executables. These tools may produce useful logging output, either by writing to stdout (or stderr), or by using a library such as log4cplus or log4cxx. The framework makes it possible to ingest that output and re-route it through the standard pipeline logging system.

Standard output/error

lofarpipe.support.pipelinelogging.log_process_output(process_name, sout, serr, logger)

Log stdout/stderr from a process if they contain anything interesting – some line-noise produced by many CEP executables is stripped.

Parameters:
  • process_name – Name to be used for logging purposes
  • sout – Standard out to log (string)
  • serr – Standard error to log (string)
  • logger – Logger to which messages should be sent

The sout and serr parameters are intended to be used with the output of subprocess.Popen.communicate(), but any string-like object should work.
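The pattern this helper supports can be sketched as follows. The log_output function below is an illustrative stand-in for log_process_output (the CEP line-noise filtering is omitted; blank lines are simply skipped):

```python
import logging
import subprocess

# Sketch of routing a child process's stdout/stderr through a logger,
# in the spirit of log_process_output().
def log_output(process_name, sout, serr, logger):
    for line in sout.splitlines():
        if line.strip():
            logger.info("%s: stdout: %s", process_name, line)
    for line in serr.splitlines():
        if line.strip():
            logger.warning("%s: stderr: %s", process_name, line)

# Typical usage with subprocess.Popen.communicate():
proc = subprocess.Popen(
    ["echo", "hello world"],
    stdout=subprocess.PIPE, stderr=subprocess.PIPE, text=True,
)
sout, serr = proc.communicate()
log_output("echo", sout, serr, logging.getLogger("pipeline"))
```

The real helper is used the same way: collect the strings from communicate(), then hand them over together with a name identifying the external tool.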

Logging libraries

The output from log4cplus or log4cxx is currently intercepted by simply redirecting it to a file and then logging the contents of that file as it is updated via the log_file() function.

class lofarpipe.support.pipelinelogging.LogCatcher

Sets up a context in which we can catch logs from an individual pipeline process in a file, then send them to the pipeline logger.

This provides the basic mechanism, but requires subclassing to define self.log_prop and self.log_prop_filename (the name & contents of the log configuration file).

class lofarpipe.support.pipelinelogging.CatchLog4CPlus(working_dir, logger_name, executable_name)

Bases: lofarpipe.support.pipelinelogging.LogCatcher

Implement a LogCatcher for log4cplus (as used by most LOFAR pipeline tools).

class lofarpipe.support.pipelinelogging.CatchLog4CXX(working_dir, logger_name)

Bases: lofarpipe.support.pipelinelogging.LogCatcher

Implement a LogCatcher for log4cxx (as used by ASKAP tools, eg cimager).

lofarpipe.support.pipelinelogging.log_file(filename, logger, killswitch)

Do the equivalent of tail -f on filename – ie, watch it for updates – and send any lines written to the file to the logger.

killswitch is an instance of threading.Event: when set, we bail out of the loop.

Parameters:
  • filename – Full path to file to watch
  • logger – Logger to which to send updates
  • killswitch – instance of threading.Event – stop watching file when set
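The tail -f behaviour described above can be sketched as a simple polling loop. The follow_file function below is an illustrative approximation of log_file (the real function also parses log levels out of the lines, which is omitted here):

```python
import threading
import time

# Sketch of following a file for new lines: poll the file, hand each new
# line to the logger, and stop once the killswitch Event is set.
def follow_file(filename, logger, killswitch, poll_interval=0.1):
    with open(filename, "r") as f:
        while not killswitch.is_set():
            line = f.readline()
            if line:
                logger.info(line.rstrip("\n"))
            else:
                # No new data yet; wait briefly before polling again.
                time.sleep(poll_interval)
```

Typically this runs in a background thread while the external process writes to the file; the caller sets the Event once the process has finished.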

Logging resource usage

This is a context manager which makes it easy to log the amount of time (wall-clock and CPU) used by parts of a pipeline.

lofarpipe.support.pipelinelogging.log_time(*args, **kwds)

Send information about the processing time used by code in this context to the specified logger.

Parameters:
  • logger – Logger to which timing information should be sent
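A helper of this shape can be sketched with contextlib.contextmanager; the clock choices and message format below are illustrative, not necessarily those of the real implementation:

```python
import logging
import time
from contextlib import contextmanager

# Sketch of a log_time-style context manager: report the wall-clock and
# CPU time used by the enclosed block to the given logger.
@contextmanager
def log_time_sketch(logger):
    wall_start = time.time()
    cpu_start = time.process_time()
    try:
        yield
    finally:
        # Report timings even if the enclosed block raises.
        logger.info(
            "Total wall time: %.3f s; CPU time: %.3f s",
            time.time() - wall_start,
            time.process_time() - cpu_start,
        )
```

Usage is simply `with log_time_sketch(self.logger): ...` around the code to be timed.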
