Shipping logs to Logstash with Filebeat

I've been spending some time looking at how to get data into my ELK stack, and one of the least disruptive options is Elastic's own Filebeat log shipper. It's among the easiest ways to upgrade applications to centralised logging, as it doesn't require any code or configuration changes - »
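As a taste of what that looks like, here's a minimal filebeat.yml sketch for shipping a directory of logs to Logstash. The paths and the Logstash host are placeholders, and the layout follows the Filebeat 1.x releases that were current at the time:

```yaml
filebeat:
  prospectors:
    # Ship every log file the application writes to its log directory
    - paths:
        - /var/log/myapp/*.log
      input_type: log

output:
  logstash:
    # Assumes a Logstash beats input listening on the default port 5044
    hosts: ["logstash.example.com:5044"]
```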

25 Minute ELK Stack With Docker - Part 4

I'm going to take a slightly different route with this article. Previously (Part 1 | Part 2 | Part 3), we set up an ELK stack almost anyone could use, providing they set up the right grok filters and figure out how to send data to it. This time I'm going to »

25 Minute ELK Stack With Docker - Part 3

In previous articles in this series (Part 1 | Part 2) we set up a functioning ELK stack in 25 minutes, and added nginx as a reverse proxy with some basic authentication. Now that we have that ELK stack, it's time to put a meaningful amount of data into it. Some keen »

25 Minute ELK Stack With Docker - Part 2

In the last article in this series we set up a functioning ELK stack in 25 minutes. But we left a few things still to do for a production-quality system: some form of authentication, tuning Elasticsearch to prevent queue-limit problems with large quantities of data, and having persistent storage so »
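For the persistent-storage piece in particular, the usual Docker approach is to mount Elasticsearch's data directory from the host, so the indexes survive container rebuilds. A rough docker-compose sketch - the host path and image tag here are illustrative, not necessarily what the article itself uses:

```yaml
elasticsearch:
  image: elasticsearch:2
  ports:
    - "9200:9200"
  volumes:
    # Index data lives on the host, so recreating the container keeps it
    - /opt/elk/esdata:/usr/share/elasticsearch/data
```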

ELK Stack in 25 minutes with Docker

Let's say you've got an application which outputs log files to a directory. Somewhere on your server is a file called applog_20151116.log, which contains a whole bunch of unstructured log entries output by your application - debug statements, errors, exceptions, stack traces and all sorts mixed up in »
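The series builds this up piece by piece, but for a feel of where it ends up, the shape of the stack is roughly this docker-compose sketch, using the official images of the time. The Logstash config path is hypothetical, and the articles flesh out the configuration considerably:

```yaml
elasticsearch:
  image: elasticsearch:2

logstash:
  image: logstash:2
  # Run Logstash against a pipeline config mounted from the host
  command: logstash -f /etc/logstash/conf.d/logstash.conf
  volumes:
    - ./logstash.conf:/etc/logstash/conf.d/logstash.conf
  links:
    - elasticsearch
  ports:
    # Beats input, for shippers like Filebeat
    - "5044:5044"

kibana:
  image: kibana:4
  # The official kibana image looks for Elasticsearch at
  # http://elasticsearch:9200 by default, which the link provides
  links:
    - elasticsearch
  ports:
    - "5601:5601"
```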