May 01 2017
 

This is another way to quickly analyze nginx logs.  It will spit out the top 25 IPs, domains, requests, and some other data.  You may need to change the array value to match the format of your nginx logs.

It uses Perl, so it is very fast, taking just seconds to analyze hundreds of thousands of lines in a log file.  It can also be used for Apache or other whitespace-delimited, columnar logs.
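To sketch the idea (this is not the full script, just the shape of it): with the default combined log format, where the client IP is the first whitespace-separated field, a Perl autosplit one-liner can pull the top 25 IPs.  The $F[0] index is the "array value" you would change if your format differs, and the log path here is an assumption:

$ perl -lane 'print $F[0]' /var/log/nginx/access.log | sort | uniq -c | sort -rn | head -25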

Continue reading »

May 01 2017
 

I’ve written about this before in Using Sed to search between dates and offered an ad-hoc solution, but the other day I came up with a much better solution using a little-known option of the ‘date’ command.  With this new method, you just pass the number of minutes prior to the current time.  For example, if you want the last hour, you simply type ‘./sed_time.sh 60’ and it will spit out the correctly formatted sed command like this:

$ ./sed_time.sh 60
sed -n '/01\/May\/2017\:07\:16\:15/,/01\/May\/2017\:08\:16\:15/ p'
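The full script is behind the link, but a minimal sketch of the approach looks like this, assuming GNU date (whose -d/--date option accepts relative times such as ’60 minutes ago’) and the default nginx/apache timestamp format:

#!/bin/bash
# sed_time.sh - print a sed command covering the last N minutes of a log
# Usage: ./sed_time.sh 60
fmt='%d\/%b\/%Y\:%H\:%M\:%S'   # log timestamp format, pre-escaped for sed
start=$(date -d "$1 minutes ago" "+$fmt")
end=$(date "+$fmt")
echo "sed -n '/$start/,/$end/ p'"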

Continue reading »

Mar 06 2013
 

This was tested on CentOS 6.3.  It is running at approximately 900 – 3,000+ log events per second from approximately 30 hosts.

Current load is about 900 messages per second, with a load average of 1.57, 1.35, 1.29 on 8GB of memory.

With the above in mind, there was approximately 165GB of log data after running for 4 days (at 900 messages per second that is roughly 311 million messages, or about 500 bytes stored per message).
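For context on how those ~30 hosts feed the server: syslog forwarding is the usual transport, and a single rsyslog rule on each client is all it takes.  A hypothetical example (hostname and port are placeholders):

# /etc/rsyslog.d/graylog.conf on each client host
*.* @graylog.example.com:514    # single @ forwards over UDP; use @@ for TCP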

graylog.org web site

Elasticsearch web site

MongoDB web site

Phusion Passenger web site

Logstash web site

I wrote a script to install a graylog2 central log server.  It’s a one-shot, run-it-and-be-done kind of thing.

CHANGES 20130309 – see notes in script

Continue reading »