Doing some cool things with Linux isn’t as hard as it looks…

This server was down earlier today due to, I believe, a power outage.

Even though the server has pretty massive battery backups, I think the momentary kick over to battery might have caused it to poop out.

In any case, the server issues made me want to start running some regular backups, but I didn’t want some packaged software or whatever that could get complicated and annoying to run all the time.

So I decided to use some command-line tools to accomplish the task. I created a new script called “backup-www”. (Be sure to chmod it 755 so it’ll execute).
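If you’ve never done that part, it’s just a couple of commands. (I’m assuming the script lives at /var/backup-www, since that’s the path my cron job calls later.)

touch /var/backup-www        # create the script file
chmod 755 /var/backup-www    # make it executable
nano /var/backup-www         # open it in your editor of choice and paste in the script below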

Each command in the Bash script goes on its own line (the lines starting with # are just comments).

#!/bin/bash

# Work out of /var, which contains the www folder
cd /var/

# Today's date, e.g. 01-15-2008
TheDate=$(date +%m-%d-%Y)

# Dump all of the MySQL databases into the web root so the tar step picks them up
mysqldump --all-databases -h localhost -u root -ppassword > /var/www/ALL-DB-$TheDate.sql

# TAR up the entire www folder (recursively)
tar -cvvf ALL-WWW-$TheDate.tar www/

# GZIP the archive to save some space
gzip ALL-WWW-$TheDate.tar

# Move it into my web root (personal preference, not necessary)
mv /var/ALL-WWW-$TheDate.tar.gz /var/www/claytond.com

# Upload it to the other machine over FTP
curl -T /var/www/claytond.com/ALL-WWW-$TheDate.tar.gz -u username:password --url ftp://ftp.TheFTPServer.com/
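Before wiring it into cron, it’s worth running the script once by hand to make sure every step works:

/var/backup-www

Then check that the .sql file, the .tar.gz, and the uploaded copy on the FTP server all showed up.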

Here is a brief description of what the above script is doing:

  • “cd /var”
  • Set the date variable
  • Do a MySQL dump of all my databases into the /var/www folder so it gets backed up in the next step
  • TAR up the entire www folder (recursively) under /var, which contains all of my websites
  • GZIP the TAR file I just created to save some space (tar can also do this in one step; see the note after this list)
  • Move it to my web root (personal preference, not necessary)
  • CURL the newly created gzipped TAR file up to a different machine on the same network. Since the network here is gigabit, it takes less than a minute to upload the ~215 MB file via a CURL FTP upload. (It takes longer if you are using a remote server on the net.)
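Side note: tar’s -z flag compresses as it archives, so if you’d rather skip the separate gzip step, something like the following should produce the same .tar.gz in one shot. I haven’t swapped it into my script, so treat it as a sketch:

tar -czvf ALL-WWW-$TheDate.tar.gz www/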

Now, instead of manually running this script every day, I decided to look up cron jobs on Wikipedia and figure out how they work.

I proceeded to create a .crontab file called “backup-www.crontab”… I wanted to schedule this task to run every night at 11:50pm, so I entered the following information into the file.

50 23 * * * /var/backup-www

The “50” is the minute, the “23” is military time for 11pm, and the three asterisks are wildcards for the other time fields (day of month, month, and day of week). I only entered the pertinent ones, then put the command I wanted to run: /var/backup-www.
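For reference, here’s how the five time fields (plus the command) line up:

# minute  hour  day-of-month  month  day-of-week  command
  50      23    *             *      *            /var/backup-www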

After saving the “backup-www.crontab” file, all you have to do to schedule it is type “crontab backup-www.crontab”, and crond will run it every day at the specified time.
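To double-check that the schedule took, you can list the installed cron jobs for your user with crontab -l:

crontab backup-www.crontab   # install the schedule
crontab -l                   # verify: should print the "50 23 * * *" line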

I then waited until just before 11:50pm, started watching the filesystem and the remote FTP server for activity, and voilà, it worked like a charm.

If this seems pretty basic to you Linux gurus, remember, I’m a bit of a noob when it comes to these command-line things.