[SPLIT] backup script

Status
Not open for further replies.

I use BQbackup as well and have written a script that simply runs the cPanel backup script, tars a couple of things, and is run by cron on schedule. I would be happy to share it.

Would you please? Maybe post it as a tutorial?

Regards,
 
****Please see updated backup script here****

Heya Josh,

First let's see if there's any ironing out that we can do to it. I'm by no means an expert in any way, shape, or form :)

I am on cPanel, so the steps here use cPanel's backup scripts. If your VPS provides a backup script that you can run through SSH, then there is no reason things cannot be modified for your environment.

First, I went through BQbackup's instructions, which I will replicate here:

1) Log in through SSH. Their instructions talk about rsync and the access it has to files, but I am not using rsync here because I wanted to be able to send one compressed file.

2) Create an RSA encryption key for use with the SSH transport. You may check if a key already exists by executing the following command:

# cat ~/.ssh/id_rsa.pub

If the file already exists, you may skip to step 3. Otherwise, create a key with the ssh-keygen utility:

# ssh-keygen -t rsa -N '' (note: these are two single quotes)

This is done so you do not have to supply a username and password. The key is not password-protected here, so be sure to keep it secure.
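A hedged aside on keeping it secure: at minimum, the private key file must be readable only by its owner, or ssh will refuse to use it. The snippet below demonstrates the permission against a scratch file rather than a real key:

```shell
# Scratch file standing in for ~/.ssh/id_rsa (do not run chmod blindly
# against a key you have not created yet).
key=$(mktemp)
chmod 600 "$key"      # owner read/write only; ssh rejects looser modes
stat -c '%a' "$key"   # prints 600 (GNU stat)
```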

3) Copy your RSA encryption key to the BQ Internet backup system. You may do this through the shell as well.

# scp ~/.ssh/id_rsa.pub <bqbackupusername>@<bqbackupusername>.bqbackup.com:keys/server1

# ssh <bqbackupusername>@<bqbackupusername>.bqbackup.com mergekeys

They have you name it server1 but I gave it my server's name instead. Up to you what you do here.

4 and 5) At this point they go into rsync and its use, but as stated I am not using rsync. The earlier scp commands had intrigued me, so I did some digging and figured scp would work very well for me.

Here is the script that I came up with. Text after a # is a comment, which does not affect the code in any way.
Code:
#!/bin/sh
#This is a file written to run the cPanel backup utility for each user's home directory and then upload it to BQbackup.
#Four weeks of backups will be rotated and kept.
#First rotate/create directories.

( ssh <bqbackupusername>@<bqbackupusername>.bqbackup.com rm -rf week4
ssh <bqbackupusername>@<bqbackupusername>.bqbackup.com mv week3 week4
ssh <bqbackupusername>@<bqbackupusername>.bqbackup.com mv week2 week3
ssh <bqbackupusername>@<bqbackupusername>.bqbackup.com mv week1 week2
ssh <bqbackupusername>@<bqbackupusername>.bqbackup.com mkdir week1
#This directory is for SQL backups created using PHPMyBackupPro (http://sourceforge.net/projects/phpmybackup/).
ssh <bqbackupusername>@<bqbackupusername>.bqbackup.com mkdir week1/sqlbu

#Change to home directory.  Run cPanel backup script.  Upload to BQbackup.  Delete backup file.  Repeat as necessary for each account.

cd /home
/scripts/pkgacct <account1>
scp cpmove-<account1>.tar.gz <bqbackupusername>@<bqbackupusername>.bqbackup.com:week1/
rm -f cpmove-<account1>.tar.gz

#Tar Apache config.  Upload to BQbackup.  Delete backup file.

cd /usr/local/apache
tar czf /home/apache-conf.tar.gz conf
scp /home/apache-conf.tar.gz <bqbackupusername>@<bqbackupusername>.bqbackup.com:week1/
rm -f /home/apache-conf.tar.gz

#Tar shared applications.  Upload to BQbackup.  Delete backup file.

cd /usr/local/share
tar czf /home/shared-htdocs.tar.gz htdocs
scp /home/shared-htdocs.tar.gz <bqbackupusername>@<bqbackupusername>.bqbackup.com:week1/
rm -f /home/shared-htdocs.tar.gz

#Change to SQL backup folder.  Upload files to BQbackup.
cd /home/sqlbu/export
scp *gz <bqbackupusername>@<bqbackupusername>.bqbackup.com:week1/sqlbu/
)
Replace everything between <> with your pertinent information.

I chose to tar and back up my Apache config and my globally shared applications (Roundcube, Group-Office, etc.). You may not choose to do so; totally up to you of course.

The code shown is in a file named 'backups', which is called via cron. To create/save this file, you first need to decide where to keep it. I keep it in a folder in my home directory in an attempt to avoid having bits and pieces scattered everywhere.

1) Copy and modify the code as needed using your own text editor then copy it to the clipboard.

2) Log in via SSH

3) Create a directory to keep such things in: mkdir /home/tools

4) Create the file: touch /home/tools/backups

5) Edit the file: pico /home/tools/backups

6) Paste the code into the file (right click should paste it in).

7) Save the file and exit: control+x, y, enter

8) Make the file executable: chmod 764 /home/tools/backups

9) Edit the crontab to run the file: crontab -e

10) This is the schedule I use: 0 2 * * 0 /home/tools/backups

This will run the file on Sundays at 2:00 AM. Modify as you like of course.

Here is a brief rundown of cron scheduling:

1 2 3 4 5

1 = Minutes (0-59)
2 = Hour (0-23 0=midnight)
3 = Day of month (1-31)
4 = Month (1-12)
5 = Day of week (0-6 0=Sunday)

And * is used to signify all. So * * * * * would execute every minute every hour etc.
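A few illustrative schedule lines (the script path is just the example used above; the */N step syntax is a Vixie/GNU cron extension, so check your cron's man page before relying on it):

```shell
# min hour dom mon dow  command
0 2 * * 0      /home/tools/backups   # Sundays at 2:00 AM (the schedule above)
30 4 1 * *     /home/tools/backups   # 4:30 AM on the 1st of each month
*/15 * * * *   /home/tools/backups   # every 15 minutes (step syntax)
```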

11) Save the file and exit: Control+x, y, enter

I recommend running everything via the command line to ensure accuracy before committing it to file and cron.
 
Dan said:
#This is a file written to run the cPanel backup utility for each user's home directory and then upload it to BQbackup.

Will this script run through every account in cPanel?

i.e. <account1>. Doesn't that mean you want us to stick the username there?

Let's say we want this script to backup every account?
 
Josh,

Yes you replace <account1> with the first account name and then repeat that section for each additional account.
 
How can we get this to automatically pull each account and do it? I don't want to sit and edit this every time a new account is added.

Thanks for your help,
 
You could either walk the /home directory or parse /etc/passwd.

Code:
cat /etc/passwd | awk -F':' '/\/home/ { print $1 }' |
while read user; do

        #Do something useful here
	echo "$user"

done
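To see what that filter does without touching the real /etc/passwd, here is the same pipeline fed a few hypothetical passwd-style lines (fields are user:x:uid:gid:gecos:home:shell); only accounts homed under /home make it through:

```shell
# Sample /etc/passwd-style input; only alice and bob have homes under /home.
printf '%s\n' \
    'root:x:0:0:root:/root:/bin/bash' \
    'alice:x:1001:1001::/home/alice:/bin/bash' \
    'bob:x:1002:1002::/home/bob:/bin/bash' |
awk -F':' '/\/home/ { print $1 }' |
while read user; do
    # Stand-in for the real backup commands.
    echo "would back up: $user"
done
# prints:
#   would back up: alice
#   would back up: bob
```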
 
Thank you very much, khiltd. Not to sound too naive, but how do I incorporate your script into Dan's? :p
 
You'd put whatever code needs a specific username in place of the echo.

Does this look ok?

#!/bin/sh
#This is a file written to run the cPanel backup utility for each user's home directory and then upload it to BQbackup.
#Four weeks of backups will be rotated and kept.
#First rotate/create directories.

( ssh <bqbackupusername>@<bqbackupusername>.bqbackup.com rm -rf week4
ssh <bqbackupusername>@<bqbackupusername>.bqbackup.com mv week3 week4
ssh <bqbackupusername>@<bqbackupusername>.bqbackup.com mv week2 week3
ssh <bqbackupusername>@<bqbackupusername>.bqbackup.com mv week1 week2
ssh <bqbackupusername>@<bqbackupusername>.bqbackup.com mkdir week1
#This directory is for SQL backups created using PHPMyBackupPro.
ssh <bqbackupusername>@<bqbackupusername>.bqbackup.com mkdir week1/sqlbu

#Change to home directory. Run cPanel backup script. Upload to BQbackup. Delete backup file. Repeat as necessary for each account.

cat /etc/passwd | awk -F':' '/\/home/ { print $1 }' |
while read user; do

#Do something useful here
cd /home
/scripts/pkgacct "$user"
scp cpmove-"$user".tar.gz <bqbackupusername>@<bqbackupusername>.bqbackup.com:week1/
rm -f cpmove-"$user".tar.gz

done


#Tar Apache config. Upload to BQbackup. Delete backup file.

cd /usr/local/apache
tar czf /home/apache-conf.tar.gz conf
scp /home/apache-conf.tar.gz <bqbackupusername>@<bqbackupusername>.bqbackup.com:week1/
rm -f /home/apache-conf.tar.gz

#Tar shared applications. Upload to BQbackup. Delete backup file.

cd /usr/local/share
tar czf /home/shared-htdocs.tar.gz htdocs
scp /home/shared-htdocs.tar.gz <bqbackupusername>@<bqbackupusername>.bqbackup.com:week1/
rm -f /home/shared-htdocs.tar.gz

#Change to SQL backup folder. Upload files to BQbackup.
cd /home/sqlbu/export
scp *gz <bqbackupusername>@<bqbackupusername>.bqbackup.com:week1/sqlbu/
)
 
You won't need to change the working directory in every iteration. Actually, I think most of those "cd"s are superfluous; it's always best to just put the desired output location right there in each command that supports it and leave the working directory alone unless you're going to restore it when you're done.
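For instance (a minimal sketch using a throwaway directory under /tmp standing in for /usr/local/apache), tar's -C flag names the source directory in the command itself, so the Apache-config step needs no cd at all:

```shell
# Build a scratch tree standing in for /usr/local/apache/conf.
mkdir -p /tmp/cd-demo/apache/conf
echo 'ServerName example' > /tmp/cd-demo/apache/conf/httpd.conf

# -C sets tar's source directory for this one command; the shell's
# working directory never changes.
tar czf /tmp/cd-demo/apache-conf.tar.gz -C /tmp/cd-demo/apache conf

# Confirm the archive holds paths relative to the -C directory.
tar tzf /tmp/cd-demo/apache-conf.tar.gz   # lists conf/ and conf/httpd.conf
```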
 
NICE!! going to have to look up that cat command and see what the hey is going on there!

Thanks Khiltd!
 
Since there is access to root WHM, why not use the built-in backup facility in WHM to do account backups, and then FTP or rsync the files to a remote backup server?
 
The base backup utility is fairly limited on backup rotation, and historically when I used it, it was very CPU-intensive. Granted, I do not know how it is now, but this way I am able to rotate my backups on a schedule I know, and I can also back up more than just accounts.
 
OMG that is SO sweet! All you need to run the backups is this
Code:
#!/bin/sh
( ssh <bqbackupuser>@<bqbackupuser>.bqbackup.com rm -rf week4;mv week3 week4;mv week2 week3;mv week1 week2;mkdir week1;mkdir week1/sqlbu
cat /etc/passwd | awk -F':' '/\/home/ { print $1 }' |
while read user; do
/scripts/pkgacct "$user"
scp /home/cpmove-"$user".tar.gz <bqbackupuser>@<bqbackupuser>.bqbackup.com:week1/
rm -f /home/cpmove-"$user".tar.gz
done
)
Four weeks of backups are kept and rotated. All users are backed up using cPanel's backup script, so full restores can be done.

NICE

Thanks Khiltd for the help and Josh for asking the questions! :D
 

Great, thanks so much! Do we need to create the directories week1, week2 etc. first?
 
It keeps saying at the start of the backup:

mv: cannot stat `week4': No such file or directory
mv: cannot stat `week3': No such file or directory
mv: cannot stat `week2': No such file or directory

etc.

even after creating those directories....
 
You are absolutely right, Josh. I combined these thinking I'd get it all done in one pass, but the shell treats each semicolon as the end of the ssh command, so only the rm actually ran on the remote server and the mv/mkdir commands ran locally. Let's break them back out into separate commands. And yes, those directories should be created beforehand too.

Code:
  ( ssh <bqbackupuser>@<bqbackupuser>.bqbackup.com rm -rf week4
  ssh <bqbackupuser>@<bqbackupuser>.bqbackup.com mv week3 week4
  ssh <bqbackupuser>@<bqbackupuser>.bqbackup.com mv week2 week3
  ssh <bqbackupuser>@<bqbackupuser>.bqbackup.com mv week1 week2
  ssh <bqbackupuser>@<bqbackupuser>.bqbackup.com mkdir week1
  ssh <bqbackupuser>@<bqbackupuser>.bqbackup.com mkdir week1/sqlbu
  cat /etc/passwd | awk -F':' '/\/home/ { print $1 }' |
  while read user; do
  /scripts/pkgacct "$user"
  scp /home/cpmove-"$user".tar.gz <bqbackupuser>@<bqbackupuser>.bqbackup.com:week1/
  rm -f /home/cpmove-"$user".tar.gz
  done
  )
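One further refinement worth considering: in the loop above, a failed scp still deletes the local archive, losing that week's copy for the account. Here is a sketch of guarding the rm on the upload's exit status ('false' stands in for a failing scp so this can be tried without a remote host):

```shell
# Stand-in archive; in the real script this is /home/cpmove-"$user".tar.gz.
archive=$(mktemp)

# 'false' simulates a failed upload; replace it with the real line, e.g.
#   scp "$archive" <bqbackupuser>@<bqbackupuser>.bqbackup.com:week1/
if false; then
    rm -f "$archive"                                # delete only on success
else
    echo "upload failed; keeping $archive" >&2      # keep the local copy
fi

test -f "$archive" && echo "local copy preserved"   # prints: local copy preserved
```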
 
I would change this part:

Code:
cat /etc/passwd | awk -F':' '/\/home/ { print $1 }' |

to:

Code:
find /var/cpanel/users -printf "%f\n" |
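A hedged note on that suggestion: without -type f, find also emits the name of the /var/cpanel/users directory itself as its first line, which would hand pkgacct a bogus 'users' account. A quick demo against a scratch directory standing in for /var/cpanel/users (whose layout, as I understand it, is one file per account; -printf is GNU find):

```shell
# Scratch directory standing in for /var/cpanel/users.
dir=$(mktemp -d)
touch "$dir/alice" "$dir/bob"

# -type f skips the directory entry itself; %f prints just the filename.
find "$dir" -type f -printf '%f\n' | sort   # prints alice, then bob
```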
 
The WHM backup appears to be the same as the cPanel full backup process. It backs up the account's home directory (website and email files, etc.) and also databases, plus other related files for the account, so you can restore an account from a single .gz file. It also seems to back up other miscellaneous files outside the account's space, i.e. some system and configuration files.

It has options to do a daily, weekly and monthly backup of all or selected accounts. It has full or incremental backup options. You can backup the accounts on the server hard disk itself or ftp it to a remote server.

Yes, the server load goes up while running backups, but I think that will be the case if you run cPanel account backups as well.
 