Centralized Log Setup: awesant, elasticsearch, logstash and kibana3 (part 1)

Overview

This guide will help you set up centralized logging using some of the best open-source products available today. We will use elasticsearch to index and store our logs. awesant is a lightweight daemon that ships log files to our centralized server. logstash collects logs, parses them, and stores them in elasticsearch; logstash can also be used as a shipper, and I will use it that way to ship tomcat/jboss logs. Finally, we will use kibana3 as an interface to logstash and elasticsearch, which allows us to efficiently search, graph, and analyze logs.


Tweak the machine

Add the following to it
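The snippet did not survive in the original post and the file being edited is not named; on a host that will run elasticsearch and redis, the usual candidate is /etc/sysctl.conf. A minimal sketch, assuming that file and typical values:

    # /etc/sysctl.conf -- assumed target file; values are illustrative
    vm.max_map_count = 262144    # elasticsearch relies heavily on mmap
    fs.file-max = 65536          # raise the system-wide open-file ceiling
    vm.overcommit_memory = 1     # recommended by redis for background saves

These take effect on reboot (which the guide does below), or immediately with sysctl -p.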

Next, edit the limits.conf file

Append the following to it
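This snippet is also missing from the source; limits.conf normally lives at /etc/security/limits.conf, and the lines below are an assumed example that raises the per-user open-file limit for all users:

    # /etc/security/limits.conf -- values are assumptions, tune to your load
    * soft nofile 65536
    * hard nofile 65536

elasticsearch keeps many index segment files open at once, which is why nofile is the limit worth raising.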

Reboot the machine

Once back up, install the prerequisites
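The package list is not preserved in the original. elasticsearch and logstash need a JVM, redis needs a build toolchain, and we will be downloading tarballs, so an assumed CentOS/RHEL-style install might look like:

    # assumed CentOS/RHEL host and package names
    yum install -y java-1.7.0-openjdk wget gcc make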

Download and install redis
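The commands are missing from the source; a sketch that builds redis from a release tarball (the version number is an assumption, pick the current stable release):

    cd /opt
    wget http://download.redis.io/releases/redis-2.8.19.tar.gz
    tar xzf redis-2.8.19.tar.gz
    cd redis-2.8.19
    make && make install
    redis-server --daemonize yes   # start redis in the background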

To verify everything went well, type the following in the terminal
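The command itself did not survive in the original; the standard liveness check is redis-cli's ping:

    redis-cli ping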

You should receive PONG as a reply

Download and install elasticsearch
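Again the snippet is gone; an assumed tarball install from the kibana3 era (version and URL are assumptions):

    cd /opt
    wget https://download.elasticsearch.org/elasticsearch/elasticsearch/elasticsearch-1.4.2.tar.gz
    tar xzf elasticsearch-1.4.2.tar.gz
    cd elasticsearch-1.4.2
    bin/elasticsearch -d           # -d daemonizes the process

    curl http://localhost:9200/    # a JSON banner confirms it is up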

Next download and install logstash
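A matching assumed install for logstash (kibana3 pairs with the logstash 1.x line; version and URL are assumptions):

    cd /opt
    wget https://download.elasticsearch.org/logstash/logstash/logstash-1.4.2.tar.gz
    tar xzf logstash-1.4.2.tar.gz
    ln -s /opt/logstash-1.4.2 /opt/logstash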

Create the conf file for logstash

Append the following to it. Logstash will pull logs from the redis instance running locally. Also worth mentioning is the extraction of the timestamp from each event using the SYSLOGTIMESTAMP grok pattern, which matches data like Mar 19 19:22:09 and stores it in a field called timestamp. Each event also carries a @timestamp value, and the two are often not the same: the first is when the event actually occurred on the host, while @timestamp is when logstash first processed the event. We want these to match, and to reconcile the difference we use the filter plugin called date. This plugin ensures a common timestamp value is used for each event, namely the time the event actually occurred on the host.
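The conf file itself is missing from the original. A minimal sketch that matches the description above, assuming the file lives at /etc/logstash/logstash.conf and uses logstash 1.x syntax (the redis key and field names are illustrative):

    input {
      redis {
        host      => "127.0.0.1"
        data_type => "list"
        key       => "logstash"
      }
    }

    filter {
      grok {
        # pull the syslog-style timestamp out of the raw message
        match => [ "message", "%{SYSLOGTIMESTAMP:timestamp} %{GREEDYDATA:msgbody}" ]
      }
      date {
        # overwrite @timestamp with the time the event occurred on the host
        match => [ "timestamp", "MMM  d HH:mm:ss", "MMM dd HH:mm:ss" ]
      }
    }

    output {
      elasticsearch {
        host => "127.0.0.1"
      }
    }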

Create an initialization script

Append the following to it
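The script did not survive; a minimal SysV-style sketch, assuming the paths used above:

    #!/bin/sh
    # /etc/init.d/logstash -- minimal wrapper; all paths are assumptions
    # chkconfig: 2345 90 10
    # description: logstash agent

    LOGSTASH_HOME=/opt/logstash
    CONF=/etc/logstash/logstash.conf
    LOG=/var/log/logstash.log
    PIDFILE=/var/run/logstash.pid

    case "$1" in
      start)
        nohup $LOGSTASH_HOME/bin/logstash agent -f $CONF >> $LOG 2>&1 &
        echo $! > $PIDFILE
        ;;
      stop)
        kill $(cat $PIDFILE) && rm -f $PIDFILE
        ;;
      *)
        echo "Usage: $0 {start|stop}"
        exit 1
        ;;
    esac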

Change the permissions
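Assuming the init script above and a CentOS/RHEL host:

    chmod +x /etc/init.d/logstash
    chkconfig --add logstash   # register with SysV runlevels
    service logstash start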

Now, out of the box your elasticsearch instance indexes every field of every log it receives; if you want to search on a field, it has to be indexed. In addition to indexing, elasticsearch also analyzes each field: it picks a tokenizer to apply to the text, which generates a list of tokens (words), plus a list of token filters that can modify the generated tokens (even adding or deleting some).

If you index a field but don't analyze it, and its text is composed of multiple words, you will only be able to find that document by searching for that exact text, whitespace included. Analyzing costs disk space, so it only makes sense for fields that contain more than a single word. Fields that hold single words rather than sentences are ones we would ideally want elasticsearch to keep indexing but stop analyzing.

Tweak our elasticsearch instance to save a lot of disk space
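The original tweak is not preserved; one common approach from that era (a sketch, assuming logstash's default logstash-* daily indices and elasticsearch's index template API; the template name is illustrative) is to register a template that indexes string fields without analyzing them:

    curl -XPUT http://localhost:9200/_template/logstash_template -d '{
      "template": "logstash-*",
      "mappings": {
        "_default_": {
          "dynamic_templates": [
            {
              "string_fields": {
                "match": "*",
                "match_mapping_type": "string",
                "mapping": { "type": "string", "index": "not_analyzed" }
              }
            }
          ]
        }
      }
    }'

Every new logstash-* index will then map string fields as not_analyzed: still searchable by exact value, but without the token lists that analysis writes to disk.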

This concludes the first part of the three-part series. Next, we will install awesant and logstash on the clients to ship logs to our logstash logger instance, and kibana3 to view our logs.
