Logstash is a tool for managing logs: inputs collect events from various sources, filters parse and transform them, and outputs ship the results to destinations such as Elasticsearch, Graphite, or other applications. The slides below walk through collecting Apache logs, parsing them with a Grok filter, and sending the results to Elasticsearch for searching and to Graphite (via StatsD) for metrics.
8. Fine if you have one server. But what if you have 10 or 100 or 1000?
for i in `seq 1 10` ; do ssh server$i blah blah; done
cluster ssh
Splunk perhaps?
Problems with Splunk...
13. How to: install logstash
wget http://logstash.objects.dreamhost.com/release/logstash-1.1.9-monolithic.jar
easy!
14. How to: run logstash
java -jar logstash-1.1.9-monolithic.jar agent -f logstash.conf -- web
easy!
15. How to: get some apache logs in
input {
  tcp {
    type => "apache"
    port => 3333
  }
}
16. How to: get some apache logs in
tail -f /var/log/apache2/access.log | nc localhost 3333
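If no real Apache traffic is handy, one hand-written combined-format line is enough to exercise the input (the sample request below is made up; any combined-format line will do):

```shell
# A single fake combined-format access-log entry.
LINE='127.0.0.1 - - [01/Jan/2013:12:00:00 +0000] "GET /index.html HTTP/1.1" 200 1234 "-" "curl/7.29.0"'
echo "$LINE"
# feed it to the tcp input above (assumes logstash is listening on 3333):
#   echo "$LINE" | nc localhost 3333
```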
17. How to: digest the logs
filter {
  grok {
    type => "apache"
    pattern => "%{COMBINEDAPACHELOG}"
  }
  date {
    type => "apache"
    timestamp => "dd/MMM/yyyy:HH:mm:ss Z"
  }
}
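%{COMBINEDAPACHELOG} is just a bundle of named regexes; the fields it captures (clientip, timestamp, request, response, bytes, referrer, agent) land on the event, and the date filter then parses the captured timestamp field. A rough shell sketch of the same extraction, on a made-up sample line:

```shell
# Combined log format is whitespace-delimited outside the quoted/bracketed
# fields, so the status code and byte count are simply columns 9 and 10.
LINE='127.0.0.1 - - [01/Jan/2013:12:00:00 +0000] "GET / HTTP/1.1" 200 1234 "-" "curl/7.29.0"'
response=$(echo "$LINE" | awk '{print $9}')
bytes=$(echo "$LINE" | awk '{print $10}')
echo "response=$response bytes=$bytes"   # → response=200 bytes=1234
```

Grok does the same job with named captures, so you get the field names for free instead of counting columns.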
18. How to: output to elasticsearch
output {
  elasticsearch {
    embedded => false
  }
}
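With the defaults, events go into daily logstash-YYYY.MM.dd indices, so they are searchable as soon as they arrive. A sketch of querying today's index (the query and the localhost:9200 endpoint are illustrative, assuming a local Elasticsearch):

```shell
# Build today's default index name and print a search against it.
INDEX="logstash-$(date +%Y.%m.%d)"
echo "curl 'http://localhost:9200/${INDEX}/_search?q=response:404&pretty'"
# run the printed command to list today's 404s
```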
19. How to: output to elasticsearch and graphite via statsd
output {
  elasticsearch {
    embedded => false
  }
  statsd {
    increment => "apache.response.%{response}"
  }
}
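The %{response} is sprintf-style interpolation of the grok'd field, so each event bumps a counter like apache.response.200 or apache.response.404, which StatsD flushes on to Graphite. On the wire a counter increment is just a tiny UDP datagram (host, port, and the field value here are illustrative):

```shell
# The statsd line protocol for a counter is <name>:<delta>|c
response=404
metric="apache.response.${response}:1|c"
echo "$metric"   # → apache.response.404:1|c
# to send it for real (assumes a statsd daemon on localhost:8125):
#   printf '%s' "$metric" | nc -u -w1 localhost 8125
```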