Backup script for Amazon S3 using cPanel backups, encryption, and logging

Dan

Moderator
This post shows the tools I use and the script I run to create four rotating, encrypted backups on Amazon S3 with logging.

With the original script you would only receive the log email sent by cPanel's backup script. Now that log is written locally, additional logging is added, and the whole thing is then emailed to you. This script will also encrypt your backups using GPG before uploading them.

I am by no means an expert, so if you use or run across something you think will work better, please let us know. And if you have an idea for improving the backup script, please speak up! :D

Requirements
An Amazon Web Services account with S3 enabled. If you already have an Amazon account, the AWS account gets tied to it; if not, you can always sign up for one.
Apache compiled with Curl and Curlssl (openssl). This is done using EasyApache in WHM. NOTE: For some reason, even though I had Curl and Curlssl compiled, SSL access did not work on my server. I ended up having to remove curl using yum and then install it from source. I'm not sure what will happen when I next recompile Apache, but consider this a heads up.
GnuPG. This should already be installed on your server. If you upgrade it from source, be sure to remove the packaged version using yum and add it to the exclude list, otherwise you will end up with multiple versions.
Tim Kay's AWS script. This is a script written in Perl (which should also already be installed) that provides access to S3 using curl. There is a link to it on the Amazon developer site as well.

Preparation
1) Log into your server via SSH, go to the website given for the AWS script, and follow the instructions to download it to your server. I put all such tools into /home/tools to keep them in one place. Install it using the given instructions. The author states it will be installed to /usr/bin, but the aws script itself stays where you downloaded it and that copy is what is actually used; links are created in /usr/bin pointing to it, so do not move or delete it. If you do need to remove it, simply delete the aws file and then unlink all of the s3* and ecs* links in /usr/bin.

2) Edit the aws script so it looks for your secret key file in /home/tools rather than /root:
"pico aws"
Search for .awssecret:
"Ctrl+W", type ".awssecret", then press Enter.
Change "$home/.awssecret" to "/home/tools/.awssecret".
Save the file and exit:
"Ctrl+X", "y", then press Enter.

3) Create a bucket for your backups to be stored in:
"s3mkdir <bucketname>"
I've read that bucket names have to be unique across all of S3, but this hasn't been an issue for me so far; I have never been told I couldn't create a bucket. You do not have to create the weekn directories; they will be created automatically when the script uploads to them.
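For example (the bucket name here is just a placeholder; s3ls is one of the s3* links the installer creates):
Code:
s3mkdir mydomain-backups
s3ls                      # list your buckets to confirm it was created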

The script and GPG key
Attached to this post is a zip file containing the script, a .awssecret file, and a counterfile. Text after a # is a comment and does not affect the code in any way.

A note on paths: if the script does not run or strange errors are logged, you may need to determine/fix the paths to the commands it calls. You can do this in SSH by running "whereis <command>"; it will tell you the path to that command, and you can then modify the script with the correct path.
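For example:
Code:
whereis gpg
# gpg: /usr/bin/gpg /usr/share/man/man1/gpg.1.gz   (sample output; yours may differ)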

1) Download the zip file locally and extract the files.
Modify the code in s3backup as needed using your own text editor:
On line 16 replace <bucketname> with the name of the bucket you created.
On line 17 replace <gpguser> with the name of your GPG key.
If you want to back up your Apache config, uncomment lines 42-57.
On line 60 replace nobody@nowhere.com with your email address.
Modify .awssecret with your access key and secret key.
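As a rough illustration (the values below are placeholders, and the exact layout of the attached s3backup may differ slightly; the variable names match the ones that appear later in this thread), the edited lines end up looking something like this, with .awssecret holding the access key on the first line and the secret key on the second:
Code:
# in s3backup
bucket="mydomain-backups"        # line 16: the bucket you created with s3mkdir
gpgid="you@example.com"          # line 17: the name/email of your gpg key

# /home/tools/.awssecret
AKIAEXAMPLEACCESSKEY
ExampleSecretKeyExampleSecretKeyExample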

2) Upload the files to /home/tools and also create /home/tools/logs. All files should be uploaded/owned by root.
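For example:
Code:
mkdir -p /home/tools/logs
# after uploading the files, make sure everything belongs to root
chown -R root:root /home/tools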

3) Log in via SSH as root.

4) Make the script executable: chmod 700 /home/tools/s3backup
You should also make sure .awssecret can only be accessed by you, just in case: chmod 700 /home/tools/.awssecret

5) Create a gpg key:
"gpg --gen-key"
Answer the questions, entering your name and email address. For the key settings I accepted the defaults. Be sure to use a passphrase that is secure. If you have already created a GPG key, you may of course use that one.
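The prompts vary a bit between GPG versions, but roughly:
Code:
gpg --gen-key
# accept the default key type and size, enter your name and email address,
# and choose a strong passphrase when prompted
gpg --list-keys    # confirm the key and note the name/email you will use as <gpguser>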

6) Edit the crontab to run the script:
"crontab -e"
This is the schedule I use:
0 2 * * 0 /home/tools/s3backup
This will run the script on Sundays at 2:00 AM. Modify it as you like, of course.
Place it between these MAILTO lines to avoid an email from cPanel's backup script saying it couldn't change the date/time:
MAILTO=""
0 2 * * 0 /home/tools/s3backup
MAILTO=root
Save the file and exit: Ctrl+X, y, Enter.

Notes on GPG
Files are encrypted using GPG, which you should already have installed on your server. I upgraded to the latest version from source on my VPS; if you want to do that, we can address it separately. GPG encrypts the file and appends .gpg to the name to indicate that it is GPG encrypted. This command: "gpg -r <gpgname> -e <path/file>" will encrypt the file with your key.

In order to restore files from your backups you will need to decrypt them first. To do this on your server, simply type "gpg -r <gpgname> <path/filename>"; you will be prompted for the passphrase, and the file will then be decrypted and restored.
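Here is a minimal round trip as a sketch (the key name and file are just examples):
Code:
gpg -r you@example.com -e /home/cpmove-account.tar.gz   # creates /home/cpmove-account.tar.gz.gpg
gpg /home/cpmove-account.tar.gz.gpg                     # prompts for the passphrase and restores the .tar.gz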

To do this on Windows, you will need to export your key from your server first. The command "gpg --export-secret-keys <gpgname> > <filename.gpg>" writes your public and private keys to the file you specify as <filename.gpg>, which you can then download to your local machine. On Windows there is a piece of software called GPG4Win. It also includes an Outlook plug-in, but this does not work with Outlook 2007, so tell the installer not to install it. After installation, run GPA, import the keys you downloaded, and then right-click the key to set its Owner Trust to ultimate. After that you can simply right-click an encrypted file, select GPGee | Verify/Decrypt, enter your passphrase, and your file will be decrypted and restored.

Notes about cPanel backups
These are backups created using cPanel's own script, so they are complete backups including email, DNS zone files, the home directory, etc. Each is a tar.gz from which you can extract and restore a single file, or perform a full domain restoration.

Additional files
Using this command:
tar -Pczf /home/apache-conf.tar.gz /usr/local/apache/conf

I tar and back up my Apache config, which is run more frequently. If you want to do anything like this, you will need to modify this line to suit your own paths; otherwise, delete it or comment it out along with the lines down to the closing ")".
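Each of these extra backups follows the same pattern in the script: a tar command followed by a status check that logs success or exits on failure. The Apache config one looks like this (the uncommented form of the section quoted later in this thread):
Code:
tar -Pczf /home/apache-conf.tar.gz /usr/local/apache/conf >> $logfile 2>&1
(if [ $? -eq 0 ]; then
  echo `date '+%D %T'` "Compressed Apache config backup." >> $logfile
  else exit $?
fi)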


Notes on cron
1 2 3 4 5

1 = Minute (0-59)
2 = Hour (0-23, 0 = midnight)
3 = Day of month (1-31)
4 = Month (1-12)
5 = Day of week (0-6, 0 = Sunday)

And * signifies all, so * * * * * would execute every minute of every hour of every day, and so on.

Notes on accessing S3
There are a number of utilities you can use to store and access your files in your S3 space. One I consider a must-have is the Firefox add-on S3Fox.
I also use my S3 space to store files from other computers, and for this I purchased a copy of JungleDisk, which lets you map your S3 storage to a drive letter like a network drive. JungleDisk stores files in its own format, meaning you can't access them directly with S3Fox or anything else, but you can install it on as many computers as you want and even put it on a USB memory stick, so that's not a big deal for me.
I no longer use JungleDisk (as of 12/16/13); instead I use CloudBerry Lab's S3 Explorer. It does not assign a drive letter to your S3 buckets but is still straightforward; its interface is just like an FTP client. You shouldn't need to purchase the full version unless you want the features it adds.
Another very nice piece of software, which I ran across when I could not delete a folder, is Bucket Explorer, also a mature and robust tool.

Thanks!
Thanks to Josh, Khiltd, and KH-Paul for their input and direction on the original script this was derived from :D
------------------
Edit 3/23/10: Correct crontab commandline file name.
Edit 4/8/10: Add variables to script for bucket name and gpg user.
Edit 12/16/13: Add lines about uncommenting lines for Apache config backup and entering email address. Removed option for MySQL backups. Add comment about no longer using Jungledisk. Reattached zip file to post.
 

Attachments

  • s3backup.zip (1.1 KB)
Thanks for your time and for writing this, but I have some problems.

I get the following errors, and although I get a confirmation email, nothing is uploaded to S3. I'm assuming it's because curl can't open the gpg files.
curl: Can't open '/home/tools/cpmove-webhost.tar.gz.gpg

and in the terminal it spits back
line 24: gpguser: No such file or directory
for each line that gpguser is on

On a side note, I also get
rdate: rdate: could not set system time: Operation not permitted
but maybe that's because I'm running in a VPS?

thanks for your help.
 
I get the following errors, and although I get a confirmation email, nothing is uploaded to S3. I'm assuming it's because curl can't open the gpg files.
curl: Can't open '/home/tools/cpmove-webhost.tar.gz.gpg

Did you by chance edit line 30 to upload from /home/tools? That's about the only way you could be getting this error. The cPanel backup script puts the backup files into /home, not /home/tools.

and in the terminal it spits back
line 24: gpguser: No such file or directory
for each line that gpguser is on

If you created a gpg key in step 5 and you are using the correct name for that key (can you encrypt a test file OK?), then perhaps you have changed the path on these lines as well? Again, these files will be in /home.

On a side note, I also get
rdate: rdate: could not set system time: Operation not permitted
but maybe that's because I'm running in a VPS?

Yes, this is a byproduct of the cPanel script and being on a VPS :)

thanks for your help.

No problems and we'll get ya going!

If you can't get it working, I can PM you my email address and take a look at your script.
 
Worked like a champ! I made the mistake of doing a find and replace and screwed up the file paths. (Doh!)
I also figured out I had left the angle brackets on my gpg name. (Double doh!)
Beautiful script! I love the way it works with AWS. I hope to figure out how to add daily buckets now instead of weekly.
Edit: I'm thinking it might just be a matter of increasing the number of buckets (7 instead of 4), changing the counterfile check in the script (if [ $week == 7 ]), and then adding a cron entry to roll every night. Sound close?
Thanks for taking the time to write such a great short script! Your brain is much appreciated!
 
Worked like a champ! I made the mistake of doing a find and replace and screwed up the file paths. (Doh!)
I also figured out I had left the angle brackets on my gpg name. (Double doh!)
Beautiful script! I love the way it works with AWS. I hope to figure out how to add daily buckets now instead of weekly.
Edit: I'm thinking it might just be a matter of increasing the number of buckets (7 instead of 4), changing the counterfile check in the script (if [ $week == 7 ]), and then adding a cron entry to roll every night. Sound close?
Thanks for taking the time to write such a great short script! Your brain is much appreciated!

hehehe I can't even tell you how many times copy and paste has bit me in the a$$ ;)

Glad you got it going!

Sure, running it daily would be fine. You could change 'week' to 'day' to be more accurate (don't forget to change the counterfile too). I would also line it up so 0 = Sunday, etc., by modifying the counterfile so that the next increment matches the current day (e.g., if today is Wednesday (3), set the counterfile to 2 (Tuesday), so that +1 on the next run gives 3).
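As a sketch, the counter logic for daily rotation would look something like this (the variable and file names follow the post, but the exact lines in s3backup may differ):
Code:
# read the counter, increment it, and roll over after day 6
day=`cat /home/tools/counterfile`
day=$(($day + 1))
if [ $day == 7 ]; then
  day=0
fi
echo $day > /home/tools/counterfile
# the uploads later in the script would then target "$bucket"/day"$day"/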
 
Backup script for Amazon S3 using cPanel backups, encryption, and logging

You can configure automated backup creation by going to WHM >> Backup >> Configure Backup. If you want to create backups for some specific accounts only, click the "Select >>" button at the bottom of the backup configuration screen and select only the accounts that need to be backed up.

Regards,
Paul

PS: Thread was moved to Linux VPS - cPanel
 
Backup script for Amazon S3 using cPanel backups, encryption, and logging

I have very basic knowledge of cPanel. I created a backup yesterday, then tried to install a MOD, but it went wrong, so now I want to restore the backup. The backup is located on the server itself. How do I restore it? Please help.
 
I have very basic knowledge of cPanel. I created a backup yesterday, then tried to install a MOD, but it went wrong, so now I want to restore the backup. The backup is located on the server itself. How do I restore it? Please help.

Hello SaCron,

You can either follow the directions above pertinent to restoring a backup or in WHM go to Backup | Restore Backups.
 
I am unable to find where to edit the file to change /root to /home/tools. I have searched it over and over, but it's not there.
 
I am unable to find where to edit the file to change /root to /home/tools. I have searched it over and over, but it's not there.

Hi tomdchi,

The file does not have /root in it, it has $home. If you search for .awssecret you should find it.

Hope that helps
 
I'm trying to get this to work. Is it supposed to back up all the cPanel backups to /home/tools/backups and then, once all the sites are backed up to that folder, move the folder to Amazon S3? Or is it supposed to work another way?

At the bottom of Configure Backup, what should be added/changed to move the backups to /home/tools/backups?

Here is an image, but I can't post links yet so I had to split the URL up... just remove the spaces :)
htt p:// img708.imageshack .us/img708/6578/screenshot01x.jpg

thx !
 
Hi Jayme,

You do not need to move the backup files anywhere. One backup file will be created at a time in the default location which is /home. It will then be copied to S3. Afterward it is deleted and the next backup starts.

If you use this S3 script you will want to disable backups in WHM as the script calls the cPanel backup script directly.
 
Hi Jayme,

You do not need to move the backup files anywhere. One backup file will be created at a time in the default location which is /home. It will then be copied to S3. Afterward it is deleted and the next backup starts.

If you use this S3 script you will want to disable backups in WHM as the script calls the cPanel backup script directly.

I disabled it in WHM and my crontab (crontab -e) looks like this:

Code:
27 0 * * * /scripts/upcp
0 1 * * * /scripts/cpbackup
0 2 * * * /scripts/mailman_chown_archives
35 * * * * /usr/bin/test -x /usr/local/cpanel/bin/tail-check && /usr/local/cpanel/bin/tail-check
11,26,41,56 * * * * /usr/local/cpanel/whostmgr/bin/dnsqueue > /dev/null 2>&1
30 */4 * * * /usr/bin/test -x /scripts/update_db_cache && /scripts/update_db_cache
45 */8 * * * /usr/bin/test -x /usr/local/cpanel/bin/optimizefs && /usr/local/cpanel/bin/optimizefs
*/5 * * * * /usr/local/cpanel/bin/dcpumon >/dev/null 2>&1
2,58 * * * * /usr/local/bandmin/bandmin
0 0 * * * /usr/local/bandmin/ipaddrmap
55 0 * * * /usr/local/cpanel/whostmgr/docroot/cgi/cpaddons_report.pl --notify
0 6 * * * /scripts/exim_tidydb > /dev/null 2>&1
MAILTO=""
0 2 * * * /home/tools/backups
MAILTO=root
How can I test this to see if it's working properly?
 
Hi Jayme,

That will run the script daily at 0200. If that is what you want, it looks good, as long as the script is executable.

I do see that you still have /scripts/cpbackup in your crontab, though. If backups are disabled in WHM, then I believe that line should not be there; at least it isn't on my server.

To test the script you can simply run it from the command line. Change to the directory it is in ('cd /home/tools') and then run it ('./backups'). You could also run it with 'bash -x backups', which steps through the script and shows you each command as it runs.
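For example, using the file name from your crontab:
Code:
cd /home/tools
./backups            # run the script directly and watch for errors
bash -x ./backups    # or trace each command as it executes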

Hope that helps :)
 
crontab question

MAILTO=""
0 2 * * 0 /home/tools/backups
MAILTO=root
Hi dan:

I am VERY new at this, and I was wondering: I have set up the cron just like you said, but when I downloaded the .zip package it came with a file called s3backup. Is this the file I should run in the cron? So would my code look like:
MAILTO=""
0 2 * * * /home/tools/s3backup
MAILTO=root

instead of "backups"?

Also, if I wanted to run this script daily, would it overwrite after four days? So I would need to set the limit on the counter to, say, 28 instead?

Thanks for your help on this!
-Mike
 
Hi Mike :)

Yes you are absolutely correct about the file name, it should be s3backup. I have corrected the original post to reflect this.

Yes, again, you are correct: as it is, the script will overwrite after 4 days if you run it daily, and you will need to edit it to roll over at 28 rather than 4. If you want to keep 28 days' worth, change "if [ $week == 4 ]" to "if [ $week == 28 ]". You will also need to create week0 through week27 in your S3 bucket.

I have not tested this script with that many, but I expect it will work fine. The only possible problem I can see is incrementing to double digits. If you have any problems, just let me know and we can look into it.

Hope that helps!
 
It's Alive!

Thanks so much for your help Dan!

It's working! However, I was scrolling through the log and noticed two errors:

tar: /usr/local/share/htdocs: Cannot stat: No such file or directory
tar: Error exit delayed from previous errors
tar: /home/sqlbu/export: Cannot stat: No such file or directory
tar: Error exit delayed from previous errors

I am guessing I just need to find the correct paths to these files? Is there more than one "htdocs" on a Linux platform (yes, I'm that new to this...)?

Thanks so much. I literally threw my hands in the air when it worked and I saw it in my S3 bucket!

-Mike
 
Hi newe1344,

Glad to hear you got it working! :)

I back up additional files using that section:

Additional files
Using these commands:
tar -Pczf /home/apache-conf.tar.gz /usr/local/apache/conf
tar -Pczf /home/shared-htdocs.tar.gz /usr/local/share/htdocs
tar -Pczf /home/sqlbu.tar.gz /home/sqlbu/export

I tar and back up my Apache config, my globally shared applications (Roundcube, Group-Office, etc.), and separate database backups, which are run more frequently. If you want to do anything like this, you will need to modify these lines to suit your own paths; otherwise, delete them or comment them out along with the lines down to the closing ")".
You can adjust those paths for your own needs if there is anything additional you want to back up, or you can simply comment out that whole section:
Code:
#Tar Apache config, shared applications, and MySQL backups. Encrypt them and then upload to S3 weekn (0-4).
#tar -Pczf /home/apache-conf.tar.gz /usr/local/apache/conf >> $logfile 2>&1
#(if [ $? -eq 0 ]; then
#  echo `date '+%D %T'` "Compressed Apache config backup." >> $logfile
#  else exit $?
#fi)
#tar -Pczf /home/shared-htdocs.tar.gz /usr/local/share/htdocs >> $logfile 2>&1
#(if [ $? -eq 0 ]; then
#  echo `date '+%D %T'` "Compressed shared htdocs backup." >> $logfile
#  else exit $?
#fi)
#tar -Pczf /home/sqlbu.tar.gz /home/sqlbu/export >> $logfile 2>&1
#(if [ $? -eq 0 ]; then
#  echo `date '+%D %T'` "Compressed SQL backup." >> $logfile
#  else exit $?
#fi)
#gpg -r "$gpgid" -e /home/apache-conf.tar.gz >> $logfile 2>&1
#(if [ $? -eq 0 ]; then
#  echo `date '+%D %T'` "Encrypted Apache config backup." >> $logfile
#  else exit $?
#fi)
#gpg -r "$gpgid" -e /home/shared-htdocs.tar.gz >> $logfile 2>&1
#(if [ $? -eq 0 ]; then
#  echo `date '+%D %T'` "Encrypted shared htdocs backup." >> $logfile
#  else exit $?
#fi)
#gpg -r "$gpgid" -e /home/sqlbu.tar.gz >> $logfile 2>&1
#(if [ $? -eq 0 ]; then
#  echo `date '+%D %T'` "Encrypted SQL backup." >> $logfile
#  else exit $?
#fi)
#rm -f /home/*gz
#s3put "$bucket"/week"$week"/apache-conf.tar.gz.gpg /home/apache-conf.tar.gz.gpg >> $logfile 2>&1
#(if [ $? -eq 0 ]; then
#  echo `date '+%D %T'` "Uploaded Apache config backup to S3 "$bucket"/week"$week"." >> $logfile
#  else exit $?
#fi)
#s3put "$bucket"/week"$week"/shared-htdocs.tar.gz.gpg /home/shared-#htdocs.tar.gz.gpg >> $logfile 2>&1
#(if [ $? -eq 0 ]; then
#  echo `date '+%D %T'` "Uploaded shared htdocs backup to S3 "$bucket"/week"$week"." >> $logfile
#  else exit $?
#fi)
#s3put "$bucket"/week"$week"/sqlbu.tar.gz.gpg /home/sqlbu.tar.gz.gpg >> $logfile 2>&1
#(if [ $? -eq 0 ]; then
#  echo `date '+%D %T'` "Uploaded SQL backup to S3 "$bucket"/week"$week"." >> $logfile
#  else exit $?
#fi)
#rm -f /home/*gpg
Hope that helps!
 
I forgot my gpg username before I could finish setting this up :( so I'm getting this error

./s3backup: line 24: gpguser: No such file or directory

Other than changing where it says <gpguser> on line 24, is there anything else in s3backup I need to change? And how do I retrieve my username/password if possible? If not, how do I erase the user I created so I can recreate the gpg user?
 
Hi Jayme,

There were originally multiple instances of <gpguser> to change.

You can list your keys by issuing 'gpg --list-keys', although you cannot list the passphrase. If you want to simply delete the key and create a new one, the command is 'gpg --delete-key <keyname>'.
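For example (note that if a matching secret key exists, gpg makes you delete it first with --delete-secret-keys before --delete-key will work):
Code:
gpg --list-keys                      # list the key names/IDs available
gpg --delete-secret-keys <keyname>   # only needed if you also have the secret key
gpg --delete-key <keyname>
gpg --gen-key                        # then generate a replacement key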

I just uploaded a new version of the script in which the bucket name and gpg user are variables you change in only one place. It might be easier for you to use that version.

Hope that helps!
 