Jul 09 2018
 

If you have worked with MySQL/MariaDB/Galera, sooner or later you are going to have to do a restore. Or if you are setting up a new master-slave pair, the size of the database can greatly affect how long it takes. mysqldump was at one time all that was available, and for it to be accurate you need to lock tables (which can affect production environments), do the dump in another shell, record the master log file and position, transfer the files to another server, import the database, run CHANGE MASTER TO ... very, very time consuming. So here is a way I have found that doesn't lock the tables, doesn't need the master log file or position recorded by hand, and does the dump and import in parallel, greatly speeding things up.
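The step-by-step is below the fold, but to give the flavor of it: instead of dumping SQL, you stream a physical backup straight into the new slave's datadir while it imports on the fly. A rough sketch of that kind of streaming setup, not necessarily the exact commands from the full post, with host names, credentials, and the datadir as placeholders:

# on the current master (or a healthy galera node), stream the backup over ssh
# into an empty datadir on the new slave -- names and paths here are placeholders
mariabackup --backup --user=backupuser --password=secret --stream=xbstream \
  | ssh root@newslave 'mbstream -x -C /var/lib/mysql'

# on the new slave, once the stream finishes
mariabackup --prepare --target-dir=/var/lib/mysql
chown -R mysql:mysql /var/lib/mysql
cat /var/lib/mysql/xtrabackup_binlog_info   # binlog/GTID coordinates captured for you

The nice part is that the replication coordinates land in that last file automatically, so there is nothing to copy down by hand while the dump is running.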

Continue reading »

May 05 2018
 

We have all lost a hard drive at one time or another on a laptop or desktop computer, and it always seems to happen right after several weeks of not performing backups.  Last year, I lost about 15 years of research on an external drive that failed.  I had a system that had worked for as long as I can remember: I simply swapped the external drive for a new one every two years after copying the data over.  Where it failed me was that I became over-confident in it and wiped the older drives to make room for something else; meanwhile, the current drive decided to barf after only about 6 months of use ... literally within a couple of weeks of me wiping the previous drives clean.  I was pretty pissed, to say the least.  So, lesson learned, I decided to implement a better backup plan.  I wanted something that would work and be simple.  Instead of a file server and transferring data over a wire, I wanted an external drive I could plug in and leave plugged in while working, whether at home or in some motel.  I wanted full backups, and I wanted them to be incremental to save space.  This is how I accomplished those tasks ...
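The full setup is below, but the core trick is the sort of thing rsync's --link-dest option exists for: every run creates a dated directory that looks like a complete backup, while unchanged files are just hard links back to the previous run, so only the changes eat space. Whether or not it matches the post exactly, a minimal sketch looks like this (the source path and mount point are only examples):

#!/bin/bash
# dated, hard-linked backups onto an external drive -- paths are placeholders
SRC=/home/user/
DEST=/mnt/backupdrive
TODAY=$(date +%Y-%m-%d)
LAST=$(ls -1d "$DEST"/????-??-?? 2>/dev/null | tail -n 1)

if [ -n "$LAST" ]; then
    rsync -a --delete --link-dest="$LAST" "$SRC" "$DEST/$TODAY/"
else
    rsync -a --delete "$SRC" "$DEST/$TODAY/"
fi

Each dated folder can be browsed (or deleted) on its own, which is what makes this feel like full backups while only paying for the increments.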

Continue reading »

May 01 2017
 

This is another way to quickly analyze nginx logs.  It will spit out the top 25 IPs, domains, requests, and some other data.  You may need to change the array indexes to match the format of your nginx logs.

It uses perl, so it is very fast and takes just seconds to analyze hundreds of thousands of lines in a log file.  It can also be used for apache, or for any other whitespace-delimited, columnar log.
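The perl script itself is below the fold; if you just want a quick-and-dirty version of the same idea from the shell, the classic sort | uniq -c pipeline gets you the top talkers. The field numbers here assume the default combined log format, so adjust them for your own logs:

# top 25 client IPs (field 1) and requested paths (field 7 in the combined format)
awk '{print $1}' /var/log/nginx/access.log | sort | uniq -c | sort -rn | head -25
awk '{print $7}' /var/log/nginx/access.log | sort | uniq -c | sort -rn | head -25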

Continue reading »

May 01 2017
 

I've written about this before in Using Sed to search between dates and offered an ad-hoc solution, but the other day I came up with a much better solution using a little-known option of the 'date' command.  Using this new method, you just pass the number of minutes prior to the current time.  I.e., if you want the last hour, you would simply type './sed_time.sh 60' and it will spit out the correctly formatted sed command like this:

$ ./sed_time.sh 60
sed -n '/01\/May\/2017\:07\:16\:15/,/01\/May\/2017\:08\:16\:15/ p'
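The full script is below the fold, but the heart of it is just GNU date's -d option doing the time math and printing the stamps pre-escaped for sed. A minimal sketch of the idea (the real script may differ in the details):

#!/bin/bash
# sed_time.sh (sketch) -- $1 = how many minutes back from now to start
MINS=${1:-60}
FMT='%d\/%b\/%Y\:%H\:%M\:%S'
START=$(date -d "-${MINS} minutes" +"$FMT")
END=$(date +"$FMT")
echo "sed -n '/${START}/,/${END}/ p'"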

Continue reading »

Feb 17 2016
 

This is a silly script, but you would be surprised how many times a day I have to do this, and no matter how many times I type the command, I always get it wrong (or, more than likely, I forget to escape something).  It's also interesting to note that the scripts I find silly are usually the ones that are the most popular on this site ... so here it is.

Basically, if you copy and paste this script into a file and run it, it will give you the exact dates and times in a ready-made sed command that searches all lines in a log file from the previous hour to now and saves them to another file.
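In other words, it prints something in the general shape of the commands below (this is not the script itself, just the effect, and the log path and output file are only examples; the newer post above generalizes the same trick to an arbitrary number of minutes). The usual caveat with sed address ranges applies: if no line matches the starting timestamp exactly, nothing gets printed.

# grab the previous hour of an access log and save it elsewhere -- paths are examples
START=$(date -d '1 hour ago' +'%d\/%b\/%Y\:%H\:%M')
END=$(date +'%d\/%b\/%Y\:%H\:%M')
sed -n "/${START}/,/${END}/ p" /var/log/nginx/access.log > lasthour.log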

Continue reading »

Mar 16 2015
 
[Screenshot of a psecio-parse scan]

I used rips for many years to help with auditing source code.  Let's face it, any time you can automate a mundane task such as source code auditing, you free up time for other things ... plus, if you have ever stared at source code for 14+ hours straight, reading it line by line by line ... you know how much automation helps save your vision.

Anyways, today I found a new project on github and wanted to document how I set it up.  One thing to keep in mind is that this is a relatively new project, and with any new project of this size and scope we can generally expect a few things: lots of development changes and false positives.  Even knowing that, I still love the direction the project is already moving ... so let's begin.

Continue reading »

Mar 07 2014
 

I love irc.  I love tor.  I love freenode via tor.  But one thing I hate is that sometimes I can't connect, and then I have to open up my torrc file and change the MapAddress cname.  So I created a script today that randomly cycles through the names and changes it for me ... it uses a bash array to accomplish this.
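The script is below; as a rough sketch of the idea, it amounts to something like this (the hostname being remapped and the onion targets here are made-up placeholders, and your torrc path may differ):

#!/bin/bash
# pick a random MapAddress target from a bash array and swap it into torrc
# all addresses below are placeholders, not real onion services
TORRC=/etc/tor/torrc
ADDRS=( exampleonionone.onion exampleoniontwo.onion exampleonionthree.onion )
PICK=${ADDRS[RANDOM % ${#ADDRS[@]}]}
sed -i "s|^MapAddress .*|MapAddress chat.freenode.net ${PICK}|" "$TORRC"
systemctl reload tor    # or: kill -HUP "$(pidof tor)"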

Continue reading »

Feb 26 2014
 

I generally do most everything from a shell.  I also generally script things when I can.  However, I wanted to see the changes made to the arachni web interface, and it had been a while since I had used it.  I'm not sure whether this is automated via the links included in kali linux or not; I just know that when I went to fire up arachni_web it failed, and this is how I fixed it.
Continue reading »

Dec 12 2013
 

I have been using this script for a long time (maybe 13 years) with only very slight changes.  It was probably one of the first cool ideas I had for tracking laptops issued to employees that might get stolen.  Granted, today we use full disk encryption and other cool things that almost make this script obsolete ... but in the event something does get stolen, we can always track it.

The script only requires a crontab entry and a way to send mail (I use ssmtp btw).
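The actual script is below the fold; the skeleton of the idea is just a cron entry plus a tiny script that mails the machine's current addresses to somewhere you control. In sketch form (the recipient, the schedule, and the IP-lookup service are placeholders, not what the post uses):

# crontab entry - check in every 30 minutes (schedule is an example)
*/30 * * * * /usr/local/bin/phonehome.sh

# /usr/local/bin/phonehome.sh (name is an example)
#!/bin/bash
TO="tracking@example.com"
PUBIP=$(curl -s https://ifconfig.me)    # any what's-my-IP service will do
LOCALIP=$(hostname -I)
printf 'To: %s\nSubject: checkin from %s\n\n%s\npublic IP: %s\nlocal IP(s): %s\n' \
    "$TO" "$(hostname)" "$(date)" "$PUBIP" "$LOCALIP" | ssmtp "$TO"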

Continue reading »