Setting up Elasticsearch 5.x (Distributed) on CentOS 7 Minimal Part 1/3

In this series we will set up Elasticsearch 5 to collect Windows logs.

The point of this tutorial is to build a truly distributed test Elasticsearch cluster that allows us to search through logs in real time and create meaningful visualizations in Kibana. We will focus on collecting Windows logs such as account lockouts, account login failures, task creation, firewall modification, process creation, and network connections.

I would suggest taking 5 minutes to understand what the Elastic Stack is by visiting their main site.

Our setup: a total of 5 CentOS servers, each configured with a static IP address. (Note: I've made another tutorial with the same setup on a single server, which may be accessed here )
———————————————————————————————————–

  • 1 server (name: logstash) will serve as our Logstash parser – this server will receive logs from various sources (Windows, Cisco firewalls, Exchange servers, etc.) and write the data to Elasticsearch.
  • 1 server (name: analyze) will serve as Kibana and an Elasticsearch master – Kibana will serve as our dashboard to view our data.
  • 1 server (name: elastic1) will serve as an Elasticsearch master-eligible node (this node will also receive logs).
  • 2 servers (names: elastic2, elastic3) will serve as Elasticsearch data nodes.

Before moving forward, please read the following information pertaining to memory lock and master/data nodes, as my guide will include both.

Here are the prerequisites for this tutorial:

  • CentOS 7 Minimal – latest build (or the full DVD download)
  • net-tools
  • nano
  • Java 1.8.x

Run an OS update to ensure that you are getting the most up-to-date packages from YUM (perform these steps on all servers).
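
For example:

sudo yum update -y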

Ensure that basic network tools such as "ifconfig" are installed:
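
The ifconfig utility comes from the net-tools package:

sudo yum install -y net-tools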

Run the following to get a text editor:
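
nano (from the prerequisites list) works well for this:

sudo yum install -y nano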

Set Static IP addresses for all servers

sample configuration:
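
For example, /etc/sysconfig/network-scripts/ifcfg-ens192 (the interface name and all addresses below are placeholders; match them to your own interface and network):

TYPE=Ethernet
BOOTPROTO=none
NAME=ens192
DEVICE=ens192
ONBOOT=yes
IPADDR=192.168.1.10
PREFIX=24
GATEWAY=192.168.1.1
DNS1=192.168.1.1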

Restart the network service so the new address takes effect.
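
For example:

sudo systemctl restart network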

At this point I'd suggest you create some DNS records for your systems.
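
If you don't have a DNS server available, equivalent entries in /etc/hosts on each machine also work (the addresses below are placeholders):

192.168.1.10  analyze
192.168.1.11  elastic1
192.168.1.12  elastic2
192.168.1.13  elastic3
192.168.1.14  logstash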

Now that we’re done with basic CentOS 7 items, let’s move on to Elasticsearch Prerequisites.

Step 1: Install the Java 1.8 JDK.
Install it on all servers.
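
Using the OpenJDK package from the base CentOS repositories:

sudo yum install -y java-1.8.0-openjdk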

Set $JAVA_HOME to the path of your Java installation.
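
One way to do this persistently, assuming the OpenJDK 1.8 path used on CentOS 7 (verify the actual path on your system, e.g. with readlink -f $(which java)):

echo 'export JAVA_HOME=/usr/lib/jvm/jre-1.8.0-openjdk' | sudo tee /etc/profile.d/java.sh
source /etc/profile.d/java.sh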

You may run "java -version" to confirm which version of Java is installed.

Step 2: Import Elasticsearch PGP key
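
On each server that will install an Elastic package, import the key with rpm:

sudo rpm --import https://artifacts.elastic.co/GPG-KEY-elasticsearch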

Step 3: Set up repositories for Elasticsearch (4), Kibana (1), and Logstash (1) (the number represents how many servers you should set that repository up on).

For Elasticsearch:
Server names: (analyze, elastic1,elastic2,elastic3)
Navigate to /etc/yum.repos.d and create a new repository file called elasticsearch.repo

And copy the following to it:
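
Following Elastic's documented 5.x YUM repository definition:

[elasticsearch-5.x]
name=Elasticsearch repository for 5.x packages
baseurl=https://artifacts.elastic.co/packages/5.x/yum
gpgcheck=1
gpgkey=https://artifacts.elastic.co/GPG-KEY-elasticsearch
enabled=1
autorefresh=1
type=rpm-md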

CTRL-O and save as elasticsearch.repo.

Do the same for your Kibana Server
Server name: (analyze)
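
The definition points at the same 5.x package source, just with a Kibana section name:

[kibana-5.x]
name=Kibana repository for 5.x packages
baseurl=https://artifacts.elastic.co/packages/5.x/yum
gpgcheck=1
gpgkey=https://artifacts.elastic.co/GPG-KEY-elasticsearch
enabled=1
autorefresh=1
type=rpm-md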

Save as kibana.repo

Lastly, on your logstash server, set up a repo for Logstash.
Server name: (logstash)
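
Again the same 5.x package source, with a Logstash section name:

[logstash-5.x]
name=Elastic repository for 5.x packages
baseurl=https://artifacts.elastic.co/packages/5.x/yum
gpgcheck=1
gpgkey=https://artifacts.elastic.co/GPG-KEY-elasticsearch
enabled=1
autorefresh=1
type=rpm-md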

Save as logstash.repo

Step 4: Install Elasticsearch:
server names: (analyze, elastic1,elastic2,elastic3)
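
Install it from the repository added in Step 3:

sudo yum install -y elasticsearch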

Step 5: Set the service to start automatically

To configure Elasticsearch to start automatically when the system boots up, run the following commands:
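
sudo systemctl daemon-reload
sudo systemctl enable elasticsearch.service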

Elasticsearch can be started and stopped as follows:
sudo systemctl start elasticsearch.service
sudo systemctl stop elasticsearch.service

Step 6: Setup Firewall Rules

Add firewall rules. (Kibana will run on port 5601, Elasticsearch on ports 9200 and 9300, and Logstash on port 5044 or whichever port you decide.)

server names: (analyze)
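
Assuming firewalld, the CentOS 7 default, open the Kibana and Elasticsearch ports:

sudo firewall-cmd --permanent --add-port=5601/tcp
sudo firewall-cmd --permanent --add-port=9200/tcp
sudo firewall-cmd --permanent --add-port=9300/tcp
sudo firewall-cmd --reload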

server names: (elastic1,elastic2, elastic3)
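
Open the Elasticsearch ports:

sudo firewall-cmd --permanent --add-port=9200/tcp
sudo firewall-cmd --permanent --add-port=9300/tcp
sudo firewall-cmd --reload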

server names: (logstash)
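
Open the Logstash listener port:

sudo firewall-cmd --permanent --add-port=5044/tcp
sudo firewall-cmd --reload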

Step 7: Test Elasticsearch

On (analyze), run the following query:

curl -XGET 'analyze:9200/?pretty'

You should see the following:

{
  "version" : {
    "number" : "5.2.0",
    "build_hash" : "24e05b9",
    "build_date" : "2017-",
    "build_snapshot" : false,
    "lucene_version" : "6.4.0"
  },
  "tagline" : "You Know, for Search"
}

Step 8: Configure Elasticsearch:
server names: (analyze, elastic1,elastic2,elastic3)

Edit the following options in /etc/elasticsearch/elasticsearch.yml and make sure you remove the leading # to uncomment and enable them.
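
As a reference, a minimal sketch of the settings a node in this cluster typically needs (the cluster name, node name, and host list below are examples; adjust them for your environment and for each server):

cluster.name: elk-cluster                 # example name, must match on every node
node.name: elastic1                       # set this to the server's own name
network.host: 0.0.0.0                     # or the server's static IP / DNS name
discovery.zen.ping.unicast.hosts: ["analyze", "elastic1", "elastic2", "elastic3"]
bootstrap.memory_lock: true               # ties in with the memory-lock settings below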

Next, uncomment the setting MAX_LOCKED_MEMORY=unlimited in /etc/sysconfig/elasticsearch (remove the #).

Save and exit

We also need to do this at the service level: in the elasticsearch.service systemd unit, uncomment LimitMEMLOCK=infinity.

Save and exit

Once you have edited both configurations, reload the systemd units and restart Elasticsearch so the memory-lock settings take effect:
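
sudo systemctl daemon-reload
sudo systemctl restart elasticsearch.service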

Step 9: Configure Master and Data nodes on Elasticsearch
– For more information pertaining to this setup, please see the Elasticsearch documentation on nodes.

Go back to the (analyze) server only and edit the configuration again.

Find the #Master Node Information section and set the following:

node.master: true
node.data: false

Then restart Elasticsearch so the node role change takes effect:
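
sudo systemctl restart elasticsearch.service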

For elastic2 and elastic3, set them as data nodes.

Find the same #Master Node Information section and set the following:

node.master: false
node.data: true

Save the configuration and restart the Elasticsearch service.

Nothing needs to be done on (elastic1): the defaults (node.master: true, node.data: true) already leave it master-eligible and holding data.

Step 10: Install Kibana
server names: (analyze)
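
Install it from the repository added in Step 3:

sudo yum install -y kibana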

To configure Kibana to start automatically when the system boots up, run the following commands:
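
sudo systemctl daemon-reload
sudo systemctl enable kibana.service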

Kibana can be started and stopped as follows:
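
sudo systemctl start kibana.service
sudo systemctl stop kibana.service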

Step 11: Configure Kibana

Change the following settings:
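
In /etc/kibana/kibana.yml, the settings involved are typically these (the values are examples; the Elasticsearch URL assumes the local master node running on analyze):

server.host: "0.0.0.0"
elasticsearch.url: "http://localhost:9200"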

Save it, and then restart the Kibana service.

You should be able to visit: http://analyze:5601

Step 12: Configure Logstash
server names: (logstash)

Install Logstash
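
From the repository added in Step 3:

sudo yum install -y logstash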

Set up Logstash as a service.
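
On CentOS 7 the RPM integrates with systemd, so enabling it at boot is typically:

sudo systemctl daemon-reload
sudo systemctl enable logstash.service

If the logstash.service unit is missing, the package ships a helper that can generate it: sudo /usr/share/logstash/bin/system-install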

Next we will create a basic Logstash configuration (assuming that you will be using Winlogbeat, covered in the next tutorial, to send your data).

If you will be using Winlogbeat to send data, use the following Logstash configuration (for this tutorial, use this first configuration):
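
A minimal sketch of such a configuration, saved for example as /etc/logstash/conf.d/beats.conf (the Elasticsearch host names and the index pattern are examples):

input {
  beats {
    port => 5044
  }
}

output {
  elasticsearch {
    hosts => ["elastic1:9200", "elastic2:9200", "elastic3:9200"]
    index => "winlogbeat-%{+YYYY.MM.dd}"
  }
}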

If you are sending logs using NXLog, you may use the following Logstash configuration instead:
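
A sketch for NXLog, assuming NXLog is configured to ship events as JSON over TCP (the port, host names, and index pattern are examples):

input {
  tcp {
    port => 3515
    codec => json
  }
}

output {
  elasticsearch {
    hosts => ["elastic1:9200", "elastic2:9200", "elastic3:9200"]
    index => "nxlog-%{+YYYY.MM.dd}"
  }
}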

Save the configuration and restart the Logstash service.

Logstash will now listen on port 5044. You may run netstat -plnt to see the listening port:

You should see something similar to this:

Proto Recv-Q Send-Q Local Address Foreign Address State PID/Program name
tcp6 0 0 :::5044 :::* LISTEN 8595/java

Now you have completed the basic Elasticsearch (ELK) stack setup. In the next tutorial I will go over adding data from a Windows system using Winlogbeat and creating indexes in Kibana.