
Build an ELK Log Center for Docker Containers in Minutes

This guide walks you through setting up an ELK stack to collect, centralize, and visualize logs from Dockerized applications, covering rsyslog configuration, container deployment for Elasticsearch, Logstash, Kibana, and Nginx, and verification via Kibana queries.


Overview

When an application is containerized, the first task is to collect the logs it prints inside its Docker container so they can be analyzed. This article explains how to use the ELK (Elasticsearch, Logstash, Kibana) stack to gather and manage these logs, providing visual query and monitoring capabilities.

The architecture: the application container (Nginx here) sends its logs through Docker's syslog log driver to the host's rsyslog service; rsyslog forwards them to Logstash, Logstash writes them into Elasticsearch, and Kibana provides the query UI.

Image Preparation

Elasticsearch image

Logstash image

Kibana image

Nginx image (serves as the containerized application that produces the logs)

Enable Linux Rsyslog Service

Edit the rsyslog configuration file:

<code>vim /etc/rsyslog.conf</code>

Enable the following three parameters. The first two load the TCP input module and start a TCP listener on port 514; the last forwards all log messages (the <code>@@</code> prefix means TCP) to Logstash on port 4560:

<code>$ModLoad imtcp
$InputTCPServerRun 514
*.* @@localhost:4560</code>

Restart the rsyslog service:

<code>systemctl restart rsyslog</code>

Check the rsyslog listening status:

<code>netstat -tnl</code>
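On newer distributions netstat may be absent; ss is the usual replacement. A small sketch that reports whether the port 514 listener is up (the command is standard, the message strings are my own):

```shell
#!/bin/sh
# Check whether rsyslog's TCP listener on port 514 is up.
# `ss -tnl` lists listening TCP sockets, like `netstat -tnl`.
if ss -tnl 2>/dev/null | grep -q ':514 '; then
  echo "rsyslog is listening on TCP 514"
else
  echo "TCP 514 is not open - recheck /etc/rsyslog.conf and restart rsyslog"
fi
```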

Deploy Elasticsearch Service

<code>docker run -d -p 9200:9200 \
  -v ~/elasticsearch/data:/usr/share/elasticsearch/data \
  --name elasticsearch elasticsearch</code>
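Elasticsearch can take a little while to boot. Once the container starts, a quick curl against the mapped port tells you whether it is answering; the message text here is just for illustration:

```shell
#!/bin/sh
# Hit the REST root; a healthy node returns JSON that includes "cluster_name".
if curl -s --max-time 3 http://localhost:9200/ | grep -q cluster_name; then
  echo "elasticsearch is up"
else
  echo "elasticsearch is not reachable yet - give it a few more seconds"
fi
```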

Deploy Logstash Service

Create the configuration file <code>~/logstash/logstash.conf</code>:

<code>input {
  syslog {
    type => "rsyslog"
    port => 4560
  }
}

output {
  elasticsearch {
    hosts => [ "elasticsearch:9200" ]
  }
}</code>

This configuration makes Logstash listen on port 4560 for syslog messages forwarded by the local rsyslog service and ship them to Elasticsearch.
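The same file can also be created non-interactively; this heredoc simply reproduces the configuration shown above:

```shell
#!/bin/sh
# Write the Logstash pipeline config without opening an editor.
mkdir -p ~/logstash
cat > ~/logstash/logstash.conf <<'EOF'
input {
  syslog {
    type => "rsyslog"
    port => 4560
  }
}

output {
  elasticsearch {
    hosts => [ "elasticsearch:9200" ]
  }
}
EOF
# Sanity check: the file should mention both plugins.
grep -E 'syslog|elasticsearch' ~/logstash/logstash.conf
```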

Start the Logstash container:

<code>docker run -d -p 4560:4560 \
  -v ~/logstash/logstash.conf:/etc/logstash.conf \
  --link elasticsearch:elasticsearch \
  --name logstash logstash \
  logstash -f /etc/logstash.conf</code>

Deploy Kibana Service

<code>docker run -d -p 5601:5601 \
  --link elasticsearch:elasticsearch \
  -e ELASTICSEARCH_URL=http://elasticsearch:9200 \
  --name kibana kibana</code>

Run Nginx Container to Produce Logs

<code>docker run -d -p 90:80 \
  --log-driver syslog \
  --log-opt syslog-address=tcp://localhost:514 \
  --log-opt tag="nginx" \
  --name nginx nginx</code>

Docker's syslog log driver sends the Nginx container's output to the local rsyslog service on TCP 514, and rsyslog then forwards it to Logstash for collection.

At this point, four containers are running: Elasticsearch, Logstash, Kibana, and Nginx.
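A quick way to confirm all four are up is to filter <code>docker ps</code> by the container names used above; the guard just lets the snippet run harmlessly on a host without Docker:

```shell
#!/bin/sh
# List the status of the four containers started in this guide.
if command -v docker >/dev/null 2>&1; then
  for name in elasticsearch logstash kibana nginx; do
    docker ps --filter "name=$name" --format '{{.Names}}: {{.Status}}' || true
  done
else
  echo "docker is not installed on this host"
fi
```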

Verification

Open http://localhost:90 in a browser and refresh a few times to generate GET request logs.

Open the Kibana UI at http://localhost:5601.

In Kibana, query <code>program=nginx</code> to view the collected Nginx logs.
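Refreshing by hand works, but a short loop produces a predictable batch of requests (assuming the port mapping above; each line curl prints is an HTTP status code, 200 when the stack is up, 000 if the service is unreachable):

```shell
#!/bin/sh
# Fire five GET requests at the containerized Nginx so it emits access-log lines.
for i in 1 2 3 4 5; do
  curl -s -o /dev/null -w '%{http_code}\n' --max-time 3 http://localhost:90/ || true
done
```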

Tags: Docker, Elasticsearch, logging, ELK, Logstash, Kibana, rsyslog
Written by

Efficient Ops

This public account is maintained by Xiaotianguo and friends and regularly publishes original technical articles. We focus on operations transformation and aim to accompany you throughout your operations career.
