How to tag your apache logs with Logstash

Lately I've been hooked on the ELK Stack, and I'm trying to monitor all my logs through the Kibana web interface. My Apache logs folder currently holds a separate log file for each of my virtual hosts.
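For illustration, assume a layout along these lines (the virtual-host names here are hypothetical):

```
/var/log/apache2/
├── access.log
├── error.log
├── blog.example.com.log
├── kibana3.example.com.log
└── shop.example.com.log
```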

And, if you are like me, you don't like to type a lot! So you would construct your Logstash config file to look something like this:

input {
	file {
		# Pick up every log file in Apache's logs folder
		path => "/var/log/apache2/*.log"
		type => "apache"
	}
}
 
filter {
	grok {
		# Parse each line with the stock combined-log-format pattern
		match => [ "message", "%{COMBINEDAPACHELOG}" ]
	}
}
 
output {
	elasticsearch {
		bind_host => "127.0.0.1"
		cluster => "elasticsearch"
		host => "127.0.0.1"
	}
}
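For reference, `%{COMBINEDAPACHELOG}` breaks a combined-format access-log line into named fields such as `clientip`, `verb`, and `response`. Here is a rough sketch of that parsing in plain Python — the log line and the simplified regex are illustrative, not Logstash's actual pattern:

```python
import re

# A made-up combined-format access log line (hypothetical host and URL)
line = ('203.0.113.7 - - [10/Oct/2023:13:55:36 +0000] '
        '"GET /index.html HTTP/1.1" 200 2326 '
        '"http://example.com/" "Mozilla/5.0"')

# Rough Python approximation of grok's COMBINEDAPACHELOG pattern
pattern = re.compile(
    r'(?P<clientip>\S+) (?P<ident>\S+) (?P<auth>\S+) '
    r'\[(?P<timestamp>[^\]]+)\] '
    r'"(?P<verb>\S+) (?P<request>\S+) (?P<httpversion>[^"]+)" '
    r'(?P<response>\d{3}) (?P<bytes>\S+) '
    r'"(?P<referrer>[^"]*)" "(?P<agent>[^"]*)"'
)

fields = pattern.match(line).groupdict()
print(fields['clientip'], fields['verb'], fields['response'])
# 203.0.113.7 GET 200
```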

As a result, you get log entries from all of the files in Apache's logs folder mixed together, and it is somewhat tedious to search for the right data.

Luckily, with a simple snippet, it can all be fixed!

All we need to do is add another block to the filter section that matches the path field and extracts the name of the log file. Once we have the log filename, we simply add it as a tag to the log entry, and you are all set!

So, here's the block to be added:

grok {
	# Capture the log file's base name from its path and tag the event with it
	match => [ "path", "/var/log/apache2/(?<VHost>[\w.-]+)\.log" ]
	add_tag => [ "%{VHost}" ]
}
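To see what that pattern captures, here is the same regex tried in plain Python. Note that Python spells named groups `(?P<name>...)` where grok's Oniguruma syntax uses `(?<name>...)`; the path below is a hypothetical example:

```python
import re

# Hypothetical vhost log path; the regex mirrors the grok pattern above,
# with grok's (?<VHost>...) written as Python's (?P<VHost>...)
path = "/var/log/apache2/kibana3.example.com.log"

match = re.search(r"/var/log/apache2/(?P<VHost>[\w.-]+)\.log", path)
print(match.group("VHost"))  # kibana3.example.com
```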

That's it! Now you should be able to search for a specific virtual host by typing something like this into the Kibana interface:

type:apache AND tags:*kibana3*

Hope this helps!
