ELK: The Passionate Pursuit of Log Analysis

As the number of distributed systems keeps growing, it becomes hard to pinpoint where errors occur. Aggregating our logs in one place gives a unified view of how the overall system behaves and helps us locate errors. The idea behind ELK is also to identify the root cause of errors and take corrective measures before further damage is done.

Overview: During the SDLC (Software Development Life Cycle), we often need to find frequently occurring errors across pages, methods that throw exceptions, the performance of those pages and methods, the availability of the application and server, and more. Applications frequently feed data into one another, so debugging requires logging into each box and combing through its logs. With a small number of apps and boxes this is manageable, but it quickly becomes tedious as their number grows!

Idea: With ELK, identifying such errors even across many boxes becomes easy. It also lets you monitor server performance, including CPU utilization, memory utilization, swap/paging, network performance, disk performance, and more.

Popularly known as the ELK Stack, it has recently been re-branded as the Elastic Stack. It is a powerful collection of three open-source tools: Elasticsearch, Logstash, and Kibana. These three products are most commonly used together for log analysis in different IT environments. Using the ELK Stack, you can perform centralized logging, which helps identify problems with web servers or applications. It lets you search all the logs in a single place and spot issues spanning multiple servers by correlating their logs within a specific time frame. The ELK Stack is designed to let users take data from any source, in any format, and search, analyze, and visualize that data in real time.

Architecture:

ELK Architecture

Technology used: Elasticsearch is a NoSQL datastore built on the Lucene search engine and exposed through RESTful APIs. It is a highly flexible, distributed search and analytics engine that offers simple deployment, high reliability, and easy management through horizontal scalability. It supports advanced queries for detailed analysis and stores all the data centrally, so documents can be searched quickly.
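To make this concrete, here is a minimal sketch of an Elasticsearch Query DSL body. The index name `app-logs` and the fields `level` and `@timestamp` are assumptions for illustration; in practice they depend on how your logs are ingested.

```python
import json

# Hypothetical query body: find the 10 most recent ERROR-level log lines
# from the last hour, in an assumed index named "app-logs".
query = {
    "query": {
        "bool": {
            "must": [
                {"match": {"level": "ERROR"}},                 # assumed field
                {"range": {"@timestamp": {"gte": "now-1h"}}},  # last hour
            ]
        }
    },
    "size": 10,
    "sort": [{"@timestamp": {"order": "desc"}}],
}

# This body would be POSTed to Elasticsearch's REST search endpoint, e.g.:
#   POST /app-logs/_search
print(json.dumps(query, indent=2))
```

The same JSON body can be sent with any HTTP client, which is what makes Elasticsearch's RESTful interface easy to integrate with existing tooling.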

Logstash is the data collection pipeline tool. It is the component of the ELK Stack that collects data inputs and feeds them to Elasticsearch. It gathers many types of data from different sources at once and makes them available immediately for further use.
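A Logstash pipeline is defined by a configuration file with input, filter, and output sections. The sketch below is illustrative only: the log path, line format, and index name are assumptions, not taken from the post.

```
input {
  file {
    path => "/var/log/myapp/app.log"   # assumed log location
    start_position => "beginning"
  }
}

filter {
  grok {
    # Parse lines like: "2020-04-01 12:00:00 ERROR Something failed"
    match => { "message" => "%{TIMESTAMP_ISO8601:timestamp} %{LOGLEVEL:level} %{GREEDYDATA:msg}" }
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "app-logs-%{+YYYY.MM.dd}"
  }
}
```

Structuring logs in the filter stage (here with grok) is what lets Elasticsearch query fields like `level` instead of raw text.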

Kibana is a data visualization tool. It is used to visualize Elasticsearch documents and gives developers immediate insight into them. The Kibana dashboard offers interactive charts, geospatial views, timelines, and graphs to visualize the results of complex Elasticsearch queries, and you can create and save custom visualizations tailored to your specific needs.

Use Cases:

  • DevOps
  • Data Analytics
  • Data Center / Server Performance Monitoring
  • Market Intelligence
  • Security Analysis

Conclusion: ELK offers ease of use and interoperability. It is a wise choice for understanding errors in logs and making informed decisions.

Watch the demo video here
