How To Install ELK Stack on CentOS 7

ELK stack is a popular, open source log management platform. It provides centralized storage, analysis and visualization of logs, which makes it easier to study the logs and identify issues across any number of servers.

Table of Contents

Step 1. First let’s start by ensuring your system is up-to-date.

Step 2. Installing Java.

Step 3. Installing Elasticsearch.

Step 4. Installing Kibana.

Step 5. Configure ELK stack.

Step 6. Configure Logstash.

Step 7. Installing Filebeat on Clients.

Prerequisites

This article assumes you have at least basic knowledge of Linux, know how to use the shell, and, most importantly, host your site on your own VPS. The installation is quite simple and assumes you are running as the root account; if not, you may need to prepend 'sudo' to the commands to get root privileges. I will walk you through the step-by-step installation of the ELK Stack (Elasticsearch, Logstash and Kibana) on a CentOS 7 server.
Install ELK Stack on CentOS 7

Step 1. First let’s start by ensuring your system is up-to-date.

yum clean all
yum -y update
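
The steps below download packages with wget; on a minimal CentOS 7 installation it may not be present yet, in which case you can install it first:

yum -y install wget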

Step 2. Installing Java.

Elasticsearch is written in Java, so it requires a Java Runtime Environment (JRE). You can install the OpenJDK package, which includes the JRE:

yum install java-1.8.0-openjdk.x86_64

Verify the Java version:

java -version
openjdk version "1.8.0_131"
OpenJDK Runtime Environment (build 1.8.0_131-b12)
OpenJDK 64-Bit Server VM (build 25.131-b12, mixed mode)

Step 3. Installing Elasticsearch.

Elasticsearch can be installed by downloading the official RPM package directly from Elastic (we use the 5.5.0 release to match the Kibana version below):

wget https://artifacts.elastic.co/downloads/elasticsearch/elasticsearch-5.5.0.rpm

Then install the RPM package that you just downloaded:

rpm -ivh elasticsearch-5.5.0.rpm

Start and enable the service:

systemctl enable elasticsearch
systemctl start elasticsearch

Now run the following command from the terminal to check whether Elasticsearch is working properly:

curl -X GET http://localhost:9200

You should get the following output:

{
  "name" : "idroot.net",
  "cluster_name" : "elasticsearch",
  "cluster_uuid" : "k27ZZFJPTaOtwg6_pyzEiw",
  "version" : {
    "number" : "5.5.0",
    "build_hash" : "2cfe0df",
    "build_date" : "2017-05-29T16:05:51.443Z",
    "build_snapshot" : false,
    "lucene_version" : "6.5.1"
  },
  "tagline" : "You Know, for Search"
}
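
Optionally, you can also query the cluster health endpoint to confirm the node reports a green or yellow status:

curl -X GET 'http://localhost:9200/_cluster/health?pretty'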

Step 4. Installing Kibana.

Installing Kibana is very simple; you can easily install it using an RPM package:

wget https://artifacts.elastic.co/downloads/kibana/kibana-5.5.0-x86_64.rpm

Then install the downloaded package:

rpm -ivh kibana-5.5.0-x86_64.rpm

Now execute the following commands to start the Kibana service and enable it at boot time:

systemctl daemon-reload
systemctl enable kibana
systemctl start kibana

Kibana is now installed and running on our system. To check the web page, open a web browser and go to the URL below (replace localhost with the IP address of your ELK host if you are accessing it from another machine):

http://localhost:5601
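
By default Kibana listens only on localhost, so the URL above works only from the server itself. To reach the interface from another machine, you can set server.host in /etc/kibana/kibana.yml and, if firewalld is running, open port 5601; a minimal sketch:

nano /etc/kibana/kibana.yml
# set: server.host: "0.0.0.0"
firewall-cmd --permanent --add-port=5601/tcp
firewall-cmd --reload
systemctl restart kibana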

Step 5. Configure ELK stack.

First, we need to create an SSL certificate. This certificate will be used to secure communication between Logstash and the Filebeat clients. Before creating the SSL certificate, we will add our server's IP address to openssl.cnf (on CentOS 7 this file lives under /etc/pki/tls):

nano /etc/pki/tls/openssl.cnf

Look for the [ v3_ca ] section and add a subjectAltName entry with your ELK server's IP address (10.20.30.100 is used as the example IP throughout this guide):

subjectAltName = IP:10.20.30.100

Now change directory to /etc/ssl and create the SSL certificate:

cd /etc/ssl
openssl req -x509 -days 365 -batch -nodes -newkey rsa:2048 -keyout logstash-forwarder.key -out logstash_frwrd.crt
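
You can verify that the certificate picked up the subjectAltName entry with:

openssl x509 -in /etc/ssl/logstash_frwrd.crt -noout -text | grep -A1 "Subject Alternative Name"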

Step 6. Configure Logstash.
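
Note that Logstash itself must be installed on the ELK server before it can be configured. If it is not installed yet, it can be downloaded and installed the same way as Elasticsearch and Kibana; for example, assuming the matching 5.5.0 release:

wget https://artifacts.elastic.co/downloads/logstash/logstash-5.5.0.rpm
rpm -ivh logstash-5.5.0.rpm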

We will now create a configuration file for Logstash under the directory '/etc/logstash/conf.d':

nano /etc/logstash/conf.d/logstash.conf

# input section
input {
  beats {
    port => 5044
    ssl => true
    ssl_certificate => "/etc/ssl/logstash_frwrd.crt"
    ssl_key => "/etc/ssl/logstash-forwarder.key"
  }
}

The next section, the 'filter' section, parses the logs before sending them to Elasticsearch:

# Filter section
filter {
  if [type] == "syslog" {
    grok {
      match => { "message" => "%{SYSLOGLINE}" }
    }
    date {
      match => [ "timestamp", "MMM  d HH:mm:ss", "MMM dd HH:mm:ss" ]
    }
  }
}
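
As an illustration, a hypothetical syslog line such as the one below would be split by %{SYSLOGLINE} into fields like timestamp, logsource, program, pid and message, which the date filter then uses to set the event timestamp:

Jul 10 10:15:01 web01 sshd[2045]: Server listening on 0.0.0.0 port 22.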

The last section is the 'output' section; it defines where the logs will be stored:

# output section
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "%{[@metadata][beat]}-%{+YYYY.MM.dd}"
  }
  stdout {
    codec => rubydebug
  }
}

Save the file and exit. Now start the Logstash service and enable it at boot time:

systemctl start logstash
systemctl enable logstash
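
Optionally, before relying on the pipeline, you can test the configuration for syntax errors and confirm that Logstash is listening on port 5044 (paths are the defaults for the RPM install):

/usr/share/logstash/bin/logstash --path.settings /etc/logstash -t -f /etc/logstash/conf.d/logstash.conf
ss -lntp | grep 5044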

Step 7. Installing Filebeat on Clients.

Now, to be able to communicate with the ELK stack, Filebeat needs to be installed on all the client machines. First, add the Beats yum repository:

nano /etc/yum.repos.d/filebeat.repo
[beats]
name=Elastic Beats Repository
baseurl=https://packages.elastic.co/beats/yum/el/$basearch
enabled=1
gpgkey=https://packages.elastic.co/GPG-KEY-elasticsearch
gpgcheck=1

Now install Filebeat using the following command:

yum install filebeat
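
If yum reports a missing or untrusted GPG key, the Elastic signing key referenced in the repo file above can be imported manually:

rpm --import https://packages.elastic.co/GPG-KEY-elasticsearch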

After Filebeat has been installed, copy the SSL certificate from the ELK server to '/etc/ssl' on the client, for example (a sketch assuming root SSH access and the example server IP used above):
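
scp root@10.20.30.100:/etc/ssl/logstash_frwrd.crt /etc/ssl/

Next, edit the Filebeat configuration file to connect the client to the ELK server: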

nano /etc/filebeat/filebeat.yml

Make the following changes to the file:

. . .
    paths:
      - /var/log/*.log
. . .

. . .
      document_type: syslog
. . .

. . .
output:
  logstash:
    hosts: ["10.20.30.100:5044"]
    tls:
      certificate_authorities: ["/etc/ssl/logstash_frwrd.crt"]
. . .
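
Depending on the Filebeat version pulled in from the repository above, you may be able to test the configuration before restarting the service, for example:

filebeat -c /etc/filebeat/filebeat.yml -configtest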

Now start the service and enable it at boot time:

systemctl restart filebeat
systemctl enable filebeat
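
Back on the ELK server, you can quickly confirm that logs are arriving by querying the Filebeat index created by the output section configured earlier:

curl -XGET 'http://localhost:9200/filebeat-*/_search?pretty'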

Configuration on both the server and the client is now complete. We can now log in to the Kibana web interface to look at the analyzed logs:

http://your-ip-address:5601/

Congratulations! You have successfully installed the ELK Stack on CentOS 7. Thanks for using this tutorial to install the ELK Stack (Elasticsearch, Logstash and Kibana) on CentOS 7 systems. For additional help or useful information, we recommend you check the official ELK Stack website.
