Logstash

[[Category:Linux]]

=Installation=

Source: http://logstash.net/docs/latest/repositories

* '''Add Logstash repository''': see Sources#ELK
* '''Install application'''
<syntaxhighlight lang="bash">
apt-get install logstash logstash-contrib
</syntaxhighlight>

>> Binaries in ''/opt/logstash''

>> Configuration in ''/etc/logstash/conf.d/''

>> Logs in ''/var/log/logstash/''

* '''Register application as a service'''
<syntaxhighlight lang="bash">
cd /etc/init.d
update-rc.d logstash defaults 95 10
</syntaxhighlight>

=Configuration=

Edit the configuration file:

<syntaxhighlight lang="bash">
vim /etc/logstash/conf.d/logstash.conf
</syntaxhighlight>
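
Every Logstash configuration file has the same three-part skeleton: inputs that produce events, filters that transform them, and outputs that ship them. The following minimal sketch shows that structure; the ''stdin''/''stdout'' plugins here are only illustrative placeholders, not part of the setup described on this page:

<syntaxhighlight lang="conf">
input {
	# Where events come from (file, syslog, stdin, ...)
	stdin { }
}

filter {
	# Optional transformations applied to each event
}

output {
	# Where events go (elasticsearch, stdout, ...)
	stdout { }
}
</syntaxhighlight>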

==Apache2 logs==

To process your Apache2 logs, you can use the following configuration, which comes from the official Elasticsearch webinar:

<syntaxhighlight lang="bash">
vim /etc/logstash/conf.d/apache2_logs.conf
</syntaxhighlight>

Put the following content:

<syntaxhighlight lang="conf">
## The complete list of available inputs | filters | outputs is on the official website:
## http://logstash.net/docs/latest/index
## Configuration syntax: http://logstash.net/docs/latest/configuration

##### Data sources to process #####
input {
	file {
		path => "/var/log/apache2/combined_log"
		type => "apache"
	}
	file {
		path => "/var/log/messages"
		type => "syslog"
	}
}

filter {
	# REMINDER: you can check in Kibana which field name to use for each filter.

	if [type] == "apache" {
		# Parse the log line (the "message" field) with a grok pattern;
		# %{COMBINEDAPACHELOG} is the standard pattern for Apache combined logs.
		grok {
			match => [ "message", "%{COMBINEDAPACHELOG}" ]
		}
		# Extract the log's timestamp according to a date pattern
		date {
			match => [ "timestamp", "dd/MMM/YYYY:HH:mm:ss Z" ]
		}
		# Extract browser information, if available
		if [agent] != "" {
			useragent {
				source => "agent"
			}
		}
		# Geolocate the client IP, if available
		if [clientip] != "" {
			geoip {
				source => "clientip"
			}
		}
	}
}

output {
	elasticsearch {
		cluster => "clusterName"
		node => "logstash_agent_name"
	}
}
</syntaxhighlight>
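
Before wiring the filters to the real log files, you can try them out interactively. The sketch below is an assumption on my part (the file name ''test_apache.conf'' and the stdin/stdout plumbing are not from the webinar): it reads Apache combined log lines pasted on the console and prints each parsed event, so you can check which fields grok and date actually produce:

<syntaxhighlight lang="conf">
## Hypothetical test file: paste an Apache combined log line on stdin
## and inspect the parsed fields on stdout.
## Run with: /opt/logstash/bin/logstash -f test_apache.conf
input {
	stdin { }
}
filter {
	grok {
		match => [ "message", "%{COMBINEDAPACHELOG}" ]
	}
	date {
		match => [ "timestamp", "dd/MMM/YYYY:HH:mm:ss Z" ]
	}
}
output {
	stdout {
		codec => rubydebug
	}
}
</syntaxhighlight>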

==Application logs==

To be done: LOG4J logs
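
A possible starting point, assuming the ''log4j'' input plugin that ships with Logstash: point a Log4J ''SocketAppender'' at the Logstash host and let Logstash listen on the matching port. This is a sketch, not a tested setup; 4560 is Log4J's default SocketAppender port:

<syntaxhighlight lang="conf">
## Sketch only: receives serialized logging events sent by a
## Log4J SocketAppender over TCP.
input {
	log4j {
		mode => "server"
		port => 4560
		type => "log4j"
	}
}
</syntaxhighlight>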

=Start Logstash=

<syntaxhighlight lang="bash">
service logstash start

## OR ##
/etc/init.d/logstash start
</syntaxhighlight>
=References=
 +
 +
* Very good webinar from the ElasticSearch team: http://www.elasticsearch.org/webinars/introduction-to-logstash/?watch=1

Revision as of 11:13, 18 November 2014

