How to Set Up Monitoring with ELK Stack

DevOps
EmpowerCodes
Oct 31, 2025

In the age of cloud-native applications and distributed systems, monitoring and observability have become essential for maintaining system health and performance. One of the most popular solutions for centralized logging and monitoring is the ELK Stack — a powerful combination of Elasticsearch, Logstash, and Kibana.

This open-source stack allows developers, system administrators, and DevOps engineers to collect, process, and visualize log data from multiple sources in real time. In this blog, we’ll explore what the ELK Stack is, why it’s essential for modern infrastructure, and how to set up an end-to-end monitoring system using it.

What Is the ELK Stack?

The ELK Stack is a collection of three open-source tools developed by Elastic:

  • Elasticsearch: A distributed, RESTful search and analytics engine used to store and index data.

  • Logstash: A data processing pipeline that collects, filters, and sends logs to Elasticsearch.

  • Kibana: A visualization tool that enables users to explore and analyze data stored in Elasticsearch.

When combined, these tools provide a powerful framework for real-time monitoring, log aggregation, and data visualization — making it easier to detect issues, track performance, and make informed decisions.

Why Use the ELK Stack for Monitoring?

Organizations choose the ELK Stack for several reasons:

1. Centralized Logging

ELK collects logs from multiple applications, servers, and containers into a single dashboard, simplifying troubleshooting.

2. Real-Time Insights

You can view live data streams and visualize metrics in near real-time, allowing you to spot anomalies quickly.

3. Scalability

Elasticsearch is built to handle large volumes of data, making it ideal for enterprise-scale environments.

4. Flexibility

The stack supports a wide range of data sources, formats, and environments — from on-premises servers to cloud deployments.

5. Open Source and Cost-Effective

Being open source, the ELK Stack is a cost-efficient alternative to commercial monitoring tools like Splunk or Datadog.

Core Components of the ELK Stack

Before setting up the stack, let’s break down its main components:

Elasticsearch

Elasticsearch is the heart of the stack — it stores, indexes, and searches data efficiently. It enables fast querying and analysis using its REST API.

Logstash

Logstash acts as a log collector and processor. It ingests data from various sources (files, databases, APIs), processes it using filters, and outputs it to Elasticsearch or another destination.

Kibana

Kibana provides a user-friendly interface to visualize data stored in Elasticsearch. You can create dashboards, charts, and alerts for monitoring your applications and infrastructure.

Beats (Optional but Recommended)

Beats are lightweight data shippers, such as Filebeat and Metricbeat, that collect and forward data to Logstash or Elasticsearch. They’re useful for sending logs and metrics directly from your servers.

How the ELK Stack Works

  1. Data Collection: Beats or Logstash gather logs from multiple sources.

  2. Data Processing: Logstash filters, parses, and transforms the data.

  3. Data Storage: Processed data is indexed and stored in Elasticsearch.

  4. Visualization: Kibana queries Elasticsearch to display data in dashboards and visualizations.

Setting Up the ELK Stack

Let’s go step-by-step through setting up the ELK Stack on a Linux-based server.

Step 1: Prerequisites

Before you begin, ensure that you have:

  • A Linux server (Ubuntu 20.04 or higher recommended)

  • Root or sudo access

  • Java 11 or later installed

  • At least 2GB of RAM for testing purposes

You can install Java using:

sudo apt update
sudo apt install openjdk-11-jdk -y

Verify installation:

java -version

Step 2: Install Elasticsearch

Add the Elastic repository and install Elasticsearch:

wget -qO - https://artifacts.elastic.co/GPG-KEY-elasticsearch | sudo apt-key add -
sudo apt install apt-transport-https -y
echo "deb https://artifacts.elastic.co/packages/8.x/apt stable main" | sudo tee /etc/apt/sources.list.d/elastic-8.x.list
sudo apt update
sudo apt install elasticsearch -y

Enable and start Elasticsearch:

sudo systemctl enable elasticsearch
sudo systemctl start elasticsearch

You can verify that Elasticsearch is running:

curl -X GET "localhost:9200/"
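Note that Elasticsearch 8.x enables TLS and authentication by default, so the plain-HTTP request above may be refused. In that case, a secured check looks something like this (the certificate path assumes the default Debian/Ubuntu package layout, and the elastic user's password is printed during installation):

```shell
# Query over HTTPS using the bundled CA certificate and the elastic superuser;
# you will be prompted for the password generated at install time
curl --cacert /etc/elasticsearch/certs/http_ca.crt -u elastic https://localhost:9200/
```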

Step 3: Install Logstash

Next, install Logstash from the Elastic repository:

sudo apt install logstash -y

You can create a simple Logstash configuration file:

sudo nano /etc/logstash/conf.d/logstash.conf

Add the following configuration:

input {
  beats {
    port => 5044
  }
}

filter {
  grok {
    match => { "message" => "%{COMMONAPACHELOG}" }
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "logs-%{+YYYY.MM.dd}"
  }
  stdout { codec => rubydebug }
}
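To get a feel for what the `%{COMMONAPACHELOG}` grok pattern extracts, here is a rough standalone sketch that pulls a few of the same fields out of a sample log line with plain shell tools (this only mimics the idea; Logstash's grok filter does the real structured parsing):

```shell
# A sample request line in Apache common log format
line='127.0.0.1 - frank [10/Oct/2000:13:55:36 -0700] "GET /apache_pb.gif HTTP/1.0" 200 2326'

# grok's %{COMMONAPACHELOG} turns this into fields such as clientip,
# verb, request, response, and bytes; here we extract a few by hand
clientip=$(echo "$line" | awk '{print $1}')
verb=$(echo "$line" | awk -F'"' '{print $2}' | awk '{print $1}')
response=$(echo "$line" | awk '{print $(NF-1)}')

echo "clientip=$clientip verb=$verb response=$response"
# → clientip=127.0.0.1 verb=GET response=200
```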

Start Logstash:

sudo systemctl enable logstash
sudo systemctl start logstash
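You can also validate the pipeline syntax at any time with Logstash's `--config.test_and_exit` flag (the paths below assume the default Debian/Ubuntu package layout):

```shell
# Check the pipeline configuration for syntax errors without starting Logstash
sudo /usr/share/logstash/bin/logstash \
  --config.test_and_exit \
  -f /etc/logstash/conf.d/logstash.conf
```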

Step 4: Install Kibana

Install Kibana with:

sudo apt install kibana -y

Enable and start Kibana:

sudo systemctl enable kibana
sudo systemctl start kibana

Kibana runs on port 5601 by default. You can access it in your browser at:

http://your-server-ip:5601

Step 5: Install Filebeat (Optional but Recommended)

Filebeat simplifies the process of collecting logs from different sources and sending them to Logstash or Elasticsearch.

Install Filebeat:

sudo apt install filebeat -y

Enable either the Elasticsearch or the Logstash output in the Filebeat configuration. Filebeat allows only one output to be active at a time, so comment out the default output.elasticsearch section if you send to Logstash:

sudo nano /etc/filebeat/filebeat.yml

Example output for Logstash:

output.logstash:
  hosts: ["localhost:5044"]

Enable and start Filebeat:

sudo systemctl enable filebeat
sudo systemctl start filebeat
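Filebeat ships with built-in self-test subcommands that are handy at this point; a quick sanity check might look like:

```shell
# Validate the filebeat.yml syntax
sudo filebeat test config

# Confirm Filebeat can reach the configured output (here, Logstash on port 5044)
sudo filebeat test output
```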

Step 6: Verify Data Flow

Once all components are running, you can verify the data flow:

  • Elasticsearch: Stores indexed data

  • Logstash: Processes logs

  • Kibana: Displays data visually

Open Kibana and navigate to the “Discover” tab to start exploring your logs.
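From the command line, you can also confirm that daily indices matching the logs-* pattern have been created (add HTTPS and credentials if security is enabled, as in 8.x defaults):

```shell
# List all indices; you should see entries named like logs-2025.10.31
curl -X GET "localhost:9200/_cat/indices?v"
```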

Creating Dashboards in Kibana

Kibana’s powerful visualization capabilities allow you to create dashboards for real-time monitoring.

Steps to Create a Dashboard:

  1. Go to the Dashboard section in Kibana.

  2. Click on Create Visualization.

  3. Choose a visualization type (Bar, Line, Pie, Metric).

  4. Select the index pattern (e.g., logs-*).

  5. Save your visualization and add it to a dashboard.

You can create dashboards for:

  • Application performance

  • System resource usage

  • Error tracking

  • Network latency

  • User activity metrics

Integrating Alerts with ELK Stack

Kibana allows you to set up alerts for critical events. For example, you can receive an email or Slack message when your server logs show repeated errors.

To set up alerts:

  1. In Kibana, go to Stack Management → Rules (labeled Alerts and Actions in older versions).

  2. Define trigger conditions (e.g., error count exceeds a threshold).

  3. Set up notification channels such as Slack, email, or webhook.

This feature helps you proactively respond to incidents and performance issues.

Scaling the ELK Stack

As your data grows, you can scale ELK horizontally:

  • Elasticsearch: Add more nodes for distributed storage and search.

  • Logstash: Use multiple pipelines to handle higher ingestion rates.

  • Kibana: Connect to multiple Elasticsearch clusters for centralized analytics.

You can also deploy the ELK Stack using Docker or Kubernetes for better scalability and management.
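As a minimal sketch, a single-node Docker Compose setup for local testing might look like the following (the image tag and memory settings are placeholders to adjust for your environment):

```yaml
# docker-compose.yml -- single-node Elasticsearch + Kibana sketch for local testing
services:
  elasticsearch:
    image: docker.elastic.co/elasticsearch/elasticsearch:8.14.0
    environment:
      - discovery.type=single-node
      - xpack.security.enabled=false   # testing only; enable security in production
      - ES_JAVA_OPTS=-Xms1g -Xmx1g
    ports:
      - "9200:9200"
  kibana:
    image: docker.elastic.co/kibana/kibana:8.14.0
    environment:
      - ELASTICSEARCH_HOSTS=http://elasticsearch:9200
    ports:
      - "5601:5601"
    depends_on:
      - elasticsearch
```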

Best Practices for Using ELK Stack

1. Secure Your Stack

Always enable SSL/TLS and authentication for Elasticsearch and Kibana to prevent unauthorized access.

2. Use Index Lifecycle Management (ILM)

Set up ILM policies to automatically delete or archive old indices and save storage space.
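For example, a policy that rolls indices over daily and deletes them after 30 days can be created through the ILM API (the policy name and thresholds here are illustrative, not recommendations):

```shell
# Create an ILM policy named "logs-cleanup" (illustrative name and thresholds)
curl -X PUT "localhost:9200/_ilm/policy/logs-cleanup" \
  -H 'Content-Type: application/json' -d'
{
  "policy": {
    "phases": {
      "hot": {
        "actions": {
          "rollover": { "max_size": "25gb", "max_age": "1d" }
        }
      },
      "delete": {
        "min_age": "30d",
        "actions": { "delete": {} }
      }
    }
  }
}'
```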

3. Optimize Logstash Pipelines

Keep Logstash filters efficient to prevent bottlenecks during log ingestion.

4. Regular Backups

Take periodic snapshots of Elasticsearch indices to avoid data loss.
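Snapshots are driven by the _snapshot API. A sketch using a shared-filesystem repository follows; the location is a placeholder, and the directory must be listed under path.repo in elasticsearch.yml before registering the repository:

```shell
# Register a filesystem snapshot repository (location is a placeholder path)
curl -X PUT "localhost:9200/_snapshot/my_backup" \
  -H 'Content-Type: application/json' \
  -d '{ "type": "fs", "settings": { "location": "/mnt/es-backups" } }'

# Take a snapshot of all indices and wait for it to complete
curl -X PUT "localhost:9200/_snapshot/my_backup/snapshot_1?wait_for_completion=true"
```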

5. Monitor ELK Itself

Use Metricbeat to monitor the performance of the ELK Stack components themselves.
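Assuming Metricbeat is installed from the same Elastic apt repository set up earlier, enabling basic self-monitoring of Elasticsearch can be as simple as:

```shell
sudo apt install metricbeat -y

# Enable the Elasticsearch module, then start the service
sudo metricbeat modules enable elasticsearch
sudo systemctl enable metricbeat
sudo systemctl start metricbeat
```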

Common Use Cases of ELK Monitoring

  • Application Log Analysis: Centralize and analyze application logs to detect issues faster.

  • Security Monitoring: Track suspicious login attempts or access patterns.

  • Infrastructure Monitoring: Monitor CPU, memory, and network usage across servers.

  • Business Intelligence: Visualize key operational data trends.

Conclusion

Setting up monitoring with the ELK Stack empowers organizations to gain real-time visibility into their systems, detect problems early, and optimize performance. With Elasticsearch handling massive data volumes, Logstash streamlining data processing, and Kibana offering intuitive visualization, the ELK Stack delivers a robust and scalable observability solution.

Whether you’re managing a small application or a complex multi-service environment, implementing ELK-based monitoring helps you maintain system reliability, improve operational efficiency, and make data-driven decisions with confidence.