Logging Everything in JSON Format

Written by abhishek-dubey | Published 2020/01/01
Tech Story Tags: devops | development-skills | logs | elasticsearch | kibana | logging | technology | best-practice

TLDR: Logging everything in JSON format is a great way to analyze application logs just like Big Data. Logging and monitoring are like Tony Stark and his Iron Man suit: the two go together. The biggest benefit of JSON logging is its structured format, which makes it possible to analyze application logs. A JSON log is not just readable, but a database that can be queried for each and every field, and every programming language can parse it. Migrating from text logging to JSON logging will not take long.

Logging and monitoring are like Tony Stark and his Iron Man suit: the two go together and work best in tandem because they complement each other.
For many years, logs have been an essential part of troubleshooting application and infrastructure performance. Over time, though, we have realized that logs are not only meant for troubleshooting; they can also be used for business dashboards, visualization, and performance analysis.
So logging application data in a file is great, but we need more.

Why is JSON the best logging format?

To understand the value of JSON logging, consider the following conversation between Anuj (a Systems Engineer) and Kartik (a Business Analyst).
A few days later, Kartik complains that the web interface is broken. Anuj scratches his head, looks at the logs, and realizes that a developer has added an extra field to the log lines, which broke his custom parser.
I am sure many of us have faced a similar situation.
In this case, if the developer had designed the application to write logs as JSON, building a parser would have been a piece of cake for Anuj: he would simply look up fields by their JSON keys, and it would not matter how many new fields were added to the log line.
The biggest benefit of logging in JSON is that it has a structured format. This makes it possible to analyze application logs just like Big Data. A JSON log is not just readable; it is a database that can be queried for each and every field. Also, every programming language can parse it.
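To see why parsing by key is so resilient, here is a minimal Go sketch (the log line and field names in it are hypothetical examples, not taken from the original post): the consumer looks up values by key, so any extra fields that developers add later are simply ignored instead of breaking the parser.

package main

import (
    "encoding/json"
    "fmt"
)

func main() {
    // A hypothetical JSON log line; the field names are only examples.
    line := `{"level":"info","time":"2020-01-01T10:00:00Z","msg":"employee record created","employee_name":"Anuj","employee_city":"Noida"}`

    // Decode into a generic map so that lookups happen by key,
    // not by column position.
    var entry map[string]interface{}
    if err := json.Unmarshal([]byte(line), &entry); err != nil {
        fmt.Println("failed to parse log line:", err)
        return
    }

    // Fields are addressed by name; new fields never shift anything around.
    fmt.Println("message:", entry["msg"])
    fmt.Println("city:", entry["employee_city"])
}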

Magic with JSON logging

Recently, we created a sample Golang application to get hands-on experience with the code build, code test, and deployment phases for Go applications. While writing this application, we incorporated the functionality to write logs in JSON.
The sample logs look something like this:
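The original post showed the sample log lines as a screenshot. As a stand-in, here is a minimal sketch of how a Go service could emit such lines, assuming the logrus library with its JSONFormatter (the employee_name and employee_city fields mirror the Kibana fields mentioned below; the actual sample application may use a different logger).

package main

import (
    log "github.com/sirupsen/logrus"
)

func main() {
    // logrus is used here only for illustration; any JSON-capable logger works.
    // Switch from the default text formatter to JSON output.
    log.SetFormatter(&log.JSONFormatter{})

    // Structured fields become top-level JSON keys in every log line.
    log.WithFields(log.Fields{
        "employee_name": "Anuj",
        "employee_city": "Noida",
    }).Info("employee record created")
}

Each call then prints one JSON object per line, roughly like:

{"employee_city":"Noida","employee_name":"Anuj","level":"info","msg":"employee record created","time":"2020-01-01T10:00:00+05:30"}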
And while integrating ELK for log analysis, the only parsing snippet we have to add in Logstash is:
filter {
    json {
        source => "message"
    }
}
After this, we don't need any further parsing, and we can add as many fields to the log file as we like.
As you can see, all the fields, such as employee name and employee city, are available in Kibana, and we did not have to add any complex parsing in Logstash or any other tool. I can also build a useful business dashboard on top of this data.

Conclusion

It will not take long to migrate from text logging to JSON logging, as JSON-capable log drivers are available for multiple programming languages. I am sure JSON logging will add flexibility to your current logging system.
If your organization already uses a log management platform like Splunk or ELK, JSON logging is a natural companion to it.
Some popular logging libraries that support JSON output include, for example, Logrus and zap for Go, Log4j2 and Logback for Java, Winston and Bunyan for Node.js, and structlog for Python.
I hope now we have a good understanding of JSON logging. So now it’s time to choose your logging wisely.
That's all I have. Thanks for reading! I'd really appreciate any and all feedback, so please leave a comment below if you have feedback or questions.
Cheers till next time!

Written by abhishek-dubey | A DevOps Engineer currently working with OpsTree
Published by HackerNoon on 2020/01/01