Amazon S3 backup script


Dan

Moderator
Please see updated S3 script here
---------------------------------------

If there's any interest in a backup script using AWS S3 and GPG encryption I can write up a tutorial.

If you aren't doing backups, or are using a different provider, I suggest looking into S3, as the cost and performance have been great for me. You can also use it for backing up your personal data, and there are some great tools out there for doing so.
 
Hi Dan, I am interested; it would be great if you could do it :) (I am on Plesk, but I can take care of that part of it)
 
Backup script for Amazon S3 using cPanel backups and encryption

Alright guys, give this a run through and let's work out the kinks before I post it to the tutorials section :)
-------------------------------------
This is a post that will show the tools I use and the script I run to create four rotating encrypted backups to Amazon S3.

I am by no means an expert, so if you use or run across something you think will work better, please let us know. And if you have an idea for improving the backup script, please speak up! :D

Requirements
An Amazon Web Services account with S3 enabled. If you already have an Amazon account, S3 gets tied to that; if not, you can sign up for one.
Apache compiled with Curl and Curlssl (OpenSSL). This is done using EasyApache in WHM. NOTE: For some reason, even though I had Curl and Curlssl compiled, SSL access did not work on my server. I ended up having to remove Curl using Yum and then install it from source (you can verify SSL support with the check just below this list). I'm not sure what will happen when I recompile Apache, but consider this a heads-up.
GnuPG. This should already be installed on your server. If you upgrade it from source, be sure to remove the old version using Yum and add it to the exclude list, or you will end up with multiple versions.
Tim Kay's AWS script. This is a script written in Perl (which should also already be installed) that provides access to S3 using Curl. Here is the link for it on the Amazon developer's site too.
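
If you want to confirm whether your curl build supports SSL before recompiling anything, a quick check is to see whether https shows up among its protocols:

Code:
  curl --version | grep -i https

If nothing is printed, your curl was built without SSL support.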

Preparation
1) Log into your server via SSH, go to the website given for the AWS script, and follow the instructions to download it to your server. I put all such things into /home/tools to keep them in one place. Install it using the given instructions. The author states it will be installed to /usr/bin, but the aws script itself stays where you downloaded it, and that copy is what is actually used; links are created in /usr/bin pointing to it, so do not move or delete it. If you do need to remove it, simply delete the aws file and then unlink all the s3* and ecs* links in /usr/bin.
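
For reference, should you ever need to remove it, a cleanup along these lines should work (this assumes you downloaded aws to /home/tools as above; double-check which links the installer actually created before unlinking anything):

Code:
  rm /home/tools/aws
  for link in /usr/bin/s3* /usr/bin/ecs*; do
      unlink "$link"
  done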

2) Edit the aws script to store your secret key file in the /home/tools folder rather than /root.
"Pico aws"
Search for .awssecret:
"Ctrl+w" ".awssecret" then press enter
Change "$home/.awssecret" to "/home/tools/.awssecret".
Save the file and exit:
"Ctrl+x" "y" and press enter.

3) Create the file containing your AWS keys (in /home/tools, where the script now expects it):
"touch .awssecret"
Make sure only you have access to it: "chmod 400 .awssecret"
Edit the file to put your keys in it: "pico .awssecret" paste your access key on the first line then move to the next line and paste your secret access key there.
Save the file and exit: "Ctrl+x" "y" and press enter.
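
For clarity, the finished file should contain exactly two lines: the access key, then the secret key. Here is the layout using Amazon's published example placeholders (substitute your real keys):

Code:
  cat /home/tools/.awssecret
  AKIAIOSFODNN7EXAMPLE
  wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY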

4) Create a bucket for your backups to be stored in:
"s3mkdir <bucketname>"
I've read that bucket names have to be unique across all of S3, but this hasn't been an issue for me so far; I have never been told I couldn't create a bucket. You do not have to create the weekn directories; they will be created automatically when the script uploads to them.
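
You can confirm the bucket was created by listing your buckets with the s3ls link the installer set up:

Code:
  s3ls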

The script and GPG key
Here is the script file I have come up with:

Code:
  #!/bin/bash
  # Run the cPanel backup utility for each user's home directory,
  # encrypt the result, and upload it to S3 under weekn (0-3).
  # <bucketname> must exist beforehand (s3mkdir <bucketname>).
  (
  # Load the week counter, advance it, and wrap back to 0 after week 3.
  . /home/tools/counterfile
  week=$(( week + 1 ))
  if [ "$week" -eq 4 ]
  then week=0
  fi
  echo "week=$week" > /home/tools/counterfile
  . /home/tools/counterfile
  # Package, encrypt, and upload each cPanel account.
  find /var/cpanel/users -type f -printf "%f\n" |
  while read user; do
  /scripts/pkgacct "$user"
  gpg -r <gpguser> -e /home/*gz
  rm -f /home/*gz
  s3put <bucketname>/week"$week"/cpmove-"$user".tar.gz.gpg /home/cpmove-"$user".tar.gz.gpg
  rm -f /home/*gpg
  done
  # Tar the Apache config, shared applications, and MySQL backups,
  # then encrypt them and upload to S3 under weekn (0-3).
  tar -Pczf /home/apache-conf.tar.gz /usr/local/apache/conf
  tar -Pczf /home/shared-htdocs.tar.gz /usr/local/share/htdocs
  tar -Pczf /home/sqlbu.tar.gz /home/sqlbu/export
  gpg -r <gpguser> -e /home/apache-conf.tar.gz
  gpg -r <gpguser> -e /home/shared-htdocs.tar.gz
  gpg -r <gpguser> -e /home/sqlbu.tar.gz
  rm -f /home/*gz
  s3put <bucketname>/week"$week"/apache-conf.tar.gz.gpg /home/apache-conf.tar.gz.gpg
  s3put <bucketname>/week"$week"/shared-htdocs.tar.gz.gpg /home/shared-htdocs.tar.gz.gpg
  s3put <bucketname>/week"$week"/sqlbu.tar.gz.gpg /home/sqlbu.tar.gz.gpg
  rm -f /home/*gpg
  )
1) Copy and modify the code as needed using your own text editor then copy it to the clipboard.

2) Create the file:
"touch /home/tools/s3backup"

3) Edit the file:
"pico /home/tools/s3backup"
Paste the code into the file (right click should paste it in if you're using Putty). Replace everything between <> with your pertinent information.

4) Save the file and exit:
"ctrl+x" "y" and press enter.

5) Make the file executable:
"chmod 744 /home/tools/s3backup"

6) Create the counter file:
"touch /home/tools/counterfile"
This file will be sourced and week=n incremented every time s3backup is run. You can leave the file empty and the counter will automatically start at 1; it cycles through 1, 2, 3, then resets to 0.
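
If you would rather start the cycle at a known point, you can seed the counter explicitly; for example, this makes the next run use week1:

Code:
  echo "week=0" > /home/tools/counterfile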

7) Create a gpg key:
"gpg --gen-key"
Answer the questions, entering your name and email address; for the key settings I accepted the defaults. Be sure to use a secure passphrase. If you have already created a GPG key, you may of course use that one.
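
Once the key is generated, you can confirm it exists and see the exact name/email to use for <gpguser> in the script:

Code:
  gpg --list-keys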

8) Edit the crontab to run the file:
"crontab -e"
This is the schedule I use:
0 2 * * 0 /home/tools/s3backup
This will run the file on Sundays at 2:00 AM. Modify as you like of course.
Save the file and exit: Control+x, y, enter
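
If you want a simple record of each run, you can redirect the script's output to a log file in the cron entry (the log path here is just an example):

Code:
  0 2 * * 0 /home/tools/s3backup >> /var/log/s3backup.log 2>&1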

Notes on GPG
Files are encrypted using GPG, which you should already have installed on your server. I upgraded to the latest version from source on my VPS; if you want to do that, we can address it separately. GPG encrypts the file and appends .gpg to the name to indicate that it is GPG encrypted. This command: "gpg -r <gpgname> -e <path/file>" will encrypt the file with your key.

In order to restore files from your backups you will need to decrypt them. To do this on your server, simply type "gpg <path/filename>"; you will be prompted for the passphrase, and the file will be decrypted and restored.
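
Here is a worked restore as a sketch (file names are hypothetical): fetch the backup with the s3get link the installer created, decrypt it to a named output file, and list the archive contents to confirm it is intact:

Code:
  s3get <bucketname>/week1/cpmove-user.tar.gz.gpg > /home/cpmove-user.tar.gz.gpg
  gpg -o /home/cpmove-user.tar.gz -d /home/cpmove-user.tar.gz.gpg
  tar -tzf /home/cpmove-user.tar.gz | head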

To do this on Windows you will need to export your key from your server first: "gpg --export-secret-keys <gpgname> > <filename.gpg>". This will write your public and private keys to the file you specify as <filename.gpg>, which you can then download to your local machine. Then there is a piece of software called GPG4Win (http://www.gpg4win.org). It includes an Outlook plug-in, but the plug-in will not work with Outlook 2007, so tell the installer not to install it. After installation, run GPA, import the keys you downloaded, and then right-click on the key to set the Owner Trust to ultimate. After that you can simply right-click on an encrypted file, select GPGee | Verify/Decrypt, enter your passphrase when prompted, and your file will be decrypted and restored.

Notes about cPanel backups
These backups are created using cPanel's own script, so they are complete backups including email, DNS zone files, the home directory, etc. Each is a tar.gz from which you can extract and restore a single file, or perform a full domain restoration.
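
For example, to pull a single file out of an account backup without doing a full restore (the internal path below is a guess; run tar -tzf first to see your archive's real layout):

Code:
  tar -xzf cpmove-user.tar.gz cpmove-user/homedir/public_html/index.php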

Additional files
Using these commands:

Code:
  tar -Pczf /home/apache-conf.tar.gz /usr/local/apache/conf
  tar -Pczf /home/shared-htdocs.tar.gz /usr/local/share/htdocs
  tar -Pczf /home/sqlbu.tar.gz /home/sqlbu/export

I tar and back up my Apache config, my globally shared applications (Roundcube, Group-Office, etc.), and separate database backups which are run more frequently. If you want to do anything like this, modify these lines to suit your own paths; otherwise delete or comment out these lines and everything down through the closing ")".


Notes on cron
1 2 3 4 5

1 = Minute (0-59)
2 = Hour (0-23; 0 = midnight)
3 = Day of month (1-31)
4 = Month (1-12)
5 = Day of week (0-6; 0 = Sunday)

An * signifies all possible values, so * * * * * would execute every minute of every hour of every day.
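
A couple of illustrative schedules (the */6 step syntax assumes Vixie cron, which cPanel servers use):

Code:
  30 1 * * * /home/tools/s3backup   # every day at 1:30 AM
  0 */6 * * * /home/tools/s3backup  # every six hours, on the hour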

Notes on accessing S3
There are a number of utilities you can use to store and access your files in your S3 space. One I consider a must-have is the Firefox add-on S3Fox.
I also use my S3 space to store files from other computers; for this I purchased a copy of JungleDisk, which allows you to map your S3 storage to a drive letter like a network drive. JungleDisk stores files in its own format, meaning you can't access them directly with S3Fox or anything else, but you can install it on as many computers as you want and even put it onto a USB memory stick, so that's not a big deal for me.
Another very nice piece of software, which I ran across when I could not delete a folder, is Bucket Explorer; it is also a mature, robust tool.

Thanks!
Thanks to Josh, Khiltd, and KH-Paul for their input and direction on the original script this was derived from :D
 
Thanks Dan -

Being a 'country boy' with two cats, a dog, a 12-year-old, and an expected flat tire... this may take me till next week.
 
Thanx very much Dan, it should take me a while too; I'll try to go through the steps and post back.
 
Hey LeMarque and rezag,

I have gone ahead and posted in the Tutorials section, and that post has a zip file attached with the script and other files needed. This newer version also has logging, so check it out there rather than here.
 