Structured logging - definition & overview

In this article
What is structured logging?
Why use structured logging?
A structured logging example
Implementing structured logging with data parsing tools
Sumo Logic supports structured logging functionality
FAQs

What is structured logging?

Structured logging is the practice of implementing a consistent, predetermined message format for application logs that allows them to be treated as data sets that can be more easily searched and analyzed than text.

Key takeaways

  • Structured logging takes an application log that is delivered as a string of text and converts it into a simple relational data set that can be more easily searched and analyzed.
  • In practice, most developers now implement structured logging to help application users interact with their log files through automated processes.
  • A structured log has a clearly identified event number for reference, along with attributes and values that make up the record and provide additional contextual data.
  • The contents of a log entry are sometimes referred to as a payload; structured payloads follow a predetermined standard or custom configuration, while unstructured payloads appear as free-form text.

Why use structured logging?

Structured logging addresses three key issues that arise when dealing with log files in the standard "plain text" format:

  1. Log files presented in plain text are formatted arbitrarily. The user must implement a customized parsing algorithm to extract attribute data from the string of information presented (see the sketch after this list).

  2. Unstructured log files are not always human-friendly. They can be hard to read and individual values may be difficult to interpret unless the reader knows how the logs are being formatted.

  3. If the format changes, downstream applications that depend on that specific formatting may break or behave incorrectly.
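
To make issues 1 and 3 concrete, here is a minimal Python sketch of a hand-rolled parser for a plain-text log line (the log line, field names and word order are illustrative assumptions, not taken from any particular application):

import json

# Hypothetical plain-text log line; the parser below assumes an exact word order.
line = "2024-11-21 02:53:17 ERROR payment-service Card declined for order 1234"

parts = line.split()
record = {
    "date": parts[0],
    "time": parts[1],
    "level": parts[2],
    "service": parts[3],
    "message": " ".join(parts[4:]),
}
print(json.dumps(record))

# If a developer later inserts a thread ID or reorders the fields, every index
# above points at the wrong token and the parser quietly produces garbage.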

In practice, most developers now implement structured logging to help application users interact with their log files through automated processes. The use of basic or unstructured logs is becoming less widespread as more IT organizations adopt log management tools and processes for security and operational diagnostic purposes.

A structured logging example

Structured logging takes the contents of a log and puts them into a structured format. A structured log has a clearly identified event number for reference, along with attributes and values that make up the record and provide additional contextual data. The contents of a log entry are sometimes referred to as a payload, with a distinction drawn between structured and unstructured payloads. An unstructured payload appears as free-form text, while a structured payload follows a predetermined standard or custom configuration that identifies attributes and values.

Here is an example of an unstructured log record generated through the Google Cloud Platform:

<4>Nov 21 2:53:17 192.168.0.1 fluentd[11111]: [error] Syslog test

This log contains a wealth of information, but it may not be immediately obvious what each part of the message refers to, and some important details might have been left out. We can introduce structured logging to clarify the meaning of this log message and make it more readable for machines. One way to do this is to use the JavaScript Object Notation (JSON) format to change the structure of the payload:

JSON payload: { "pri": "4", "host": "192.168.0.1", "ident": "fluentd", "pid": "11111", "message": "[error] Syslog test" }

As you can see, the modified payload contains essentially the same information as the initial message. The key difference is that attributes have been identified, named, and presented as a set of key-value pairs. Now, a data analysis program can use these attributes to filter search results or to detect patterns in the data.
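
As an illustration of how such a payload might be produced (a generic Python sketch, not Google Cloud's or Sumo Logic's actual tooling; the regular expression simply mirrors the layout of the example line above):

import json
import re

raw = "<4>Nov 21 2:53:17 192.168.0.1 fluentd[11111]: [error] Syslog test"

# Split the syslog line into named fields matching the example payload above.
pattern = re.compile(
    r"<(?P<pri>\d+)>"
    r"(?P<time>\w{3} +\d+ [\d:]+) "
    r"(?P<host>\S+) "
    r"(?P<ident>\w+)\[(?P<pid>\d+)\]: "
    r"(?P<message>.*)"
)
fields = pattern.match(raw).groupdict()

payload = {key: fields[key] for key in ("pri", "host", "ident", "pid", "message")}
print(json.dumps(payload))
# {"pri": "4", "host": "192.168.0.1", "ident": "fluentd", "pid": "11111", "message": "[error] Syslog test"}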

Implementing structured logging with data parsing tools

IT organizations implement structured logging using specialized software tools that parse data from various sources and convert them into a common format. The basic process can be outlined as follows:

  1. The application generates a log entry in response to an application event
  2. The log entry is captured and collected by a log aggregation software tool
  3. The source of the log entry is determined and the correct parsing algorithm is applied to convert the log payload into a structured payload
  4. The log entry can now be combined with other data to support search and analysis functions

Each application generates logs according to the specifications created by the developer. Some applications follow Syslog standards by default, while others may present log entries in an unstructured plain text format. Many applications output structured logs by default, but users may still want to convert the data from all the application logs they collect into a single standard format to better facilitate event log search and data analysis.
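
As a rough sketch of steps 2 through 4 (the function names, source-detection rule and parsers below are illustrative assumptions, not any specific aggregation tool's API), an aggregator might route each raw entry to the parser registered for its source and emit a common structured record:

import json
import re

# Hypothetical per-source parsers, each returning a record in a common shape.
def parse_syslog(raw):
    match = re.match(r"<(?P<pri>\d+)>(?P<body>.*)", raw)
    return {"source": "syslog", "pri": match.group("pri"), "message": match.group("body")}

def parse_plaintext(raw):
    return {"source": "app", "pri": None, "message": raw}

def to_structured(raw):
    # Step 3: determine the source and apply the matching parsing algorithm.
    parser = parse_syslog if raw.startswith("<") else parse_plaintext
    # Step 4: the structured record can now be combined with other data for search and analysis.
    return json.dumps(parser(raw))

print(to_structured("<4>Nov 21 2:53:17 192.168.0.1 fluentd[11111]: [error] Syslog test"))
print(to_structured("payment gateway timeout after 3 retries"))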

Sumo Logic supports structured logging functionality

Effective event logs should stand on their own, providing information in a format that is easily readable for humans and machines without depending on file names, storage locations or automatic metadata tagging from a software tool to provide additional context.

Sumo Logic provides excellent support for structured logging functionality, helping IT organizations make the most of their event logs. With Sumo Logic, you can take a log entry that looks like this:

2017-04-10 09:50:32 -0700 dan12345 10.0.24.123 GET /checkout/flights/ credit.payments.io Success 2 241.98

...and convert it into a structured format like JSON so it looks like this:

{ "timestamp": "2017-04-10 09:50:32 -0700", "username": "dan12345", "source_ip": "10.0.24.123", "method": "GET", "resource": "/checkout/flights/", "gateway": "credit.payments.io", "audit": "Success", "flights_purchased": 2, "value": 241.98 }

You can then parse through the data using Sumo Logic's parse operators to convert the log entry into your preferred structure and format for data analysis. Learn more about Sumo Logic’s log management solutions.
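
Sumo Logic performs this reshaping with its own parse operators; purely to illustrate the transformation (the field names come from the example above, while the token-splitting logic is an assumption), the same conversion could be sketched in Python like this:

import json

raw = "2017-04-10 09:50:32 -0700 dan12345 10.0.24.123 GET /checkout/flights/ credit.payments.io Success 2 241.98"

# The first three whitespace-separated tokens form the timestamp; the rest map
# one-to-one onto the named fields shown in the structured example above.
tokens = raw.split()
names = ["username", "source_ip", "method", "resource", "gateway", "audit", "flights_purchased", "value"]
record = {"timestamp": " ".join(tokens[:3]), **dict(zip(names, tokens[3:]))}
record["flights_purchased"] = int(record["flights_purchased"])
record["value"] = float(record["value"])
print(json.dumps(record, indent=2))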

FAQs

What benefits can structured logging provide for a log analysis tool?

  • Quicker and more efficient analysis
  • Improved troubleshooting and root cause analysis
  • Simpler log correlation 
  • Easier integration and automation
  • Logical, searchable and insightful logs

Why use the JSON format for structured logging?

JSON is the favored format for generating structured logs due to its ability to organize log data in a way that is logical, searchable and insightful. JSON enables easy parsing and searching of log entries, which enhances the efficiency of log analysis. Moreover, the consistent format of structured logs simplifies correlating specific log entries and tracking performance trends, contributing to improved scalability and integration of log analysis tools.
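
As a minimal sketch of emitting JSON-structured logs directly from an application (this uses only Python's standard logging module; the choice of fields is an assumption, not a required schema):

import json
import logging

class JsonFormatter(logging.Formatter):
    # Render each log record as one JSON object per line instead of free-form text.
    def format(self, record):
        return json.dumps({
            "timestamp": self.formatTime(record),
            "level": record.levelname,
            "logger": record.name,
            "message": record.getMessage(),
        })

handler = logging.StreamHandler()
handler.setFormatter(JsonFormatter())
logger = logging.getLogger("checkout")
logger.addHandler(handler)
logger.setLevel(logging.INFO)

# Prints a single JSON object with "timestamp", "level", "logger" and "message" keys.
logger.info("flight purchase succeeded")

Because every entry shares the same keys, a log analysis tool can filter on "level" or group by "logger" without any custom parsing.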

What are the best practices for managing structured log data effectively?

  • Ensure all log entries adhere to a standardized structured format
  • Embed relevant contextual information within log messages
  • Augment log messages with enriched data such as timestamps, log level and source indicators
  • Implement a centralized log management system
  • Use regular expression filtering to extract specific log entries based on predefined criteria
  • Use Mapped Diagnostic Context (MDC) to enrich log messages with contextual information dynamically during runtime (a Python analog is sketched after this list)
  • Automate log data collection
  • Implement log rotation strategies to manage log file sizes
  • Retain historical log data for compliance and analysis purposes
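
MDC itself comes from Java logging frameworks; as a rough Python analog (one possible approach using contextvars and a logging filter, with illustrative field and logger names), contextual data such as a request ID can be attached to every record emitted while handling a request:

import contextvars
import logging

# Context variable holding per-request metadata, similar in spirit to MDC's thread-local map.
request_id = contextvars.ContextVar("request_id", default="-")

class ContextFilter(logging.Filter):
    def filter(self, record):
        # Dynamically enrich each record with the current request context at runtime.
        record.request_id = request_id.get()
        return True

logging.basicConfig(format="%(asctime)s %(levelname)s request_id=%(request_id)s %(message)s")
logger = logging.getLogger("orders")
logger.addFilter(ContextFilter())
logger.setLevel(logging.INFO)

request_id.set("req-42")       # set once at the start of handling a request
logger.info("order accepted")  # the request ID is attached automatically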
