
System Log Aggregation with the Elastic Stack

Myles Young

The Elastic Stack is infinitely configurable for just about any use case that involves collecting, searching, and analyzing data. To make it easy to get up and running, we can use modules to quickly implement a preconfigured pipeline. In this brief tutorial, we are going to use the System module to collect log events from /var/log/secure and /var/log/auth.log and then analyze the log events through module-created dashboards in Kibana. For this demonstration, I am going to be using a t2.medium EC2 instance on the A Cloud Guru Cloud Playground. If you are not an A Cloud Guru subscriber, feel free to follow along with your own cloud server or virtual machine. All you need is a CentOS 7 host with 1 CPU and 4 GB of memory. Otherwise, the server is pre-configured for you!

Linux Academy Cloud Playground


First, we need to install the only prerequisite for Elasticsearch, a Java JDK. I am going to be using OpenJDK, specifically the java-1.8.0-openjdk package:

sudo yum install java-1.8.0-openjdk -y

Now we can install Elasticsearch. I am going to install via RPM, so first let’s import Elastic’s GPG key:

sudo rpm --import https://artifacts.elastic.co/GPG-KEY-elasticsearch

Now we can download and install the Elasticsearch RPM:

curl -O https://artifacts.elastic.co/downloads/elasticsearch/elasticsearch-6.4.2.rpm
sudo rpm --install elasticsearch-6.4.2.rpm
sudo systemctl daemon-reload

Let’s enable the Elasticsearch service so it starts after a reboot and then start Elasticsearch:

sudo systemctl enable elasticsearch
sudo systemctl start elasticsearch
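If you want to confirm the node is up before moving on, the root endpoint returns the cluster name and version. This is just a sanity check, assuming the default port of 9200; Elasticsearch can take 30 seconds or so to start listening.

```shell
# Query Elasticsearch's root endpoint (default port 9200 assumed).
# On success it prints a small JSON document with the cluster name and version.
curl -s --max-time 5 http://localhost:9200 \
  || echo "Elasticsearch is not responding yet; give it a moment and retry"
```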

The ingest pipeline created by the Filebeat system module uses a GeoIP processor to look up geographical information for IP addresses found in the log events. For this to work, we first need to install it as a plugin for Elasticsearch:

sudo /usr/share/elasticsearch/bin/elasticsearch-plugin install ingest-geoip

Now we need to restart Elasticsearch in order for it to recognize the new plugin:

sudo systemctl restart elasticsearch
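To confirm the plugin was picked up after the restart, the _cat/plugins API lists each node's installed plugins. A quick check, again assuming the default port:

```shell
# List installed plugins; ingest-geoip should appear once the restart finishes.
curl -s --max-time 5 http://localhost:9200/_cat/plugins | grep ingest-geoip \
  || echo "ingest-geoip not visible yet; is Elasticsearch done restarting?"
```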


We already have the Elastic GPG key imported, so let’s download and install the Kibana RPM:

curl -O https://artifacts.elastic.co/downloads/kibana/kibana-6.4.2-x86_64.rpm
sudo rpm --install kibana-6.4.2-x86_64.rpm

Now we can start and enable the Kibana service:

sudo systemctl enable kibana
sudo systemctl start kibana

Because Kibana and Elasticsearch both come with sensible defaults for a single-node deployment, we do not need to make any configuration changes to either service.
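For reference, the defaults that make a single-node deployment work out of the box are shipped commented out in /etc/kibana/kibana.yml and look roughly like this:

```yaml
# /etc/kibana/kibana.yml -- shipped defaults (shown for reference; no edits needed)
#server.port: 5601
#server.host: "localhost"
#elasticsearch.url: "http://localhost:9200"
```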


Now we can install the client that will be collecting our logs, Filebeat. Again, because we already have the Elastic GPG key imported, we can download and install the Filebeat RPM:

curl -O https://artifacts.elastic.co/downloads/beats/filebeat/filebeat-6.4.2-x86_64.rpm
sudo rpm --install filebeat-6.4.2-x86_64.rpm

We want to store our log events in Elasticsearch with a UTC timestamp. That way, Kibana can simply convert from UTC to whatever time zone our browser is in at request time. To enable this conversion, let’s uncomment and enable the following variable in /etc/filebeat/modules.d/system.yml.disabled for both the syslog and auth sections:

var.convert_timezone: true
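After the edit, the relevant portion of the module file looks roughly like this (a sketch; the shipped file contains additional commented-out variables):

```yaml
# /etc/filebeat/modules.d/system.yml.disabled (sketch; other variables omitted)
- module: system
  # Syslog
  syslog:
    enabled: true
    var.convert_timezone: true

  # Authorization logs
  auth:
    enabled: true
    var.convert_timezone: true
```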

Now we can enable the System module and push the module assets to Elasticsearch and Kibana:

sudo filebeat modules enable system
sudo filebeat setup
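The setup command loads an index template into Elasticsearch and the dashboards into Kibana. If you want to verify the template half, it should show up under the _cat/templates API (a sanity check, assuming defaults):

```shell
# The template is named after the Filebeat version, e.g. filebeat-6.4.2.
curl -s --max-time 5 'http://localhost:9200/_cat/templates/filebeat-*' \
  | grep filebeat \
  || echo "no filebeat template found; did 'sudo filebeat setup' succeed?"
```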

Finally, we can enable and start the Filebeat service to begin collecting our system log events:

sudo systemctl enable filebeat
sudo systemctl start filebeat
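Within a minute or so, Filebeat should be writing events into a daily index. A quick way to check (again assuming the default Elasticsearch port) is to list the Filebeat indices and their document counts:

```shell
# List Filebeat indices; docs.count should be non-zero once events arrive.
curl -s --max-time 5 'http://localhost:9200/_cat/indices/filebeat-*?v' \
  | grep -E 'filebeat-|docs.count' \
  || echo "no filebeat indices yet; check 'systemctl status filebeat'"
```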


By default, Kibana listens on localhost:5601. So in order to browse Kibana in our local web browser, let’s use SSH to log in to our host with port forwarding:

ssh username@hostname_or_ip -L 5601:localhost:5601

Now we can navigate to http://localhost:5601 in our local web browser to access our remote instance of Kibana. From Kibana’s side navigation pane, select Dashboard and search for “system” to see all the System module dashboards. To take things a step further, you can create your own honeypot by exposing your host to the internet to garner even more log events to analyze.

Syslog Dashboard

Sudo Commands Dashboard

SSH Logins Dashboard

New Users and Groups Dashboard

Want to know more?

From creating beautiful visualizations to managing the Elastic Stack, Kibana helps you get the most out of your data. At A Cloud Guru, we offer a ton of fantastic learning content for Elastic products. Get a brief overview of all the products in the Elastic Stack with the Elastic Stack Essentials course. Or get to know the heart of the Elastic Stack, Elasticsearch, with the Elasticsearch Deep Dive course. When you’re ready, prove your mastery of the Elastic Stack by becoming an Elastic Certified Engineer with our latest certification preparation course. All of these courses are packed with Hands-On Labs and lessons that you can follow along with using your very own A Cloud Guru cloud servers. So what are you waiting for? Let’s get Elastic!

The Elastic Stack Ecosystem

