Backup script for Amazon S3 using cPanel backups, encryption, and logging

JaymeNYC

New Member
Hi Jayme,

There were originally multiple instances of <gpguser> to change.

You can list the keys by issuing 'gpg --list-keys' although you cannot list the passphrase. If you want to simply delete the key and create a new one then the command is 'gpg --delete-key <keyname>'.

I just uploaded a new version of the script so that bucketname and gpguser now use a variable which you change only in one place. Might be easier for you to use that if you like.

Hope that helps!
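As a sketch, the single-place configuration might look like the following; the variable names are assumptions for illustration, not taken from the actual s3backup script:

```shell
#!/bin/sh
# Hypothetical sketch of single-place configuration at the top of s3backup.
# The variable names below are assumptions, not from the real script.
BUCKETNAME="my-backup-bucket"      # S3 bucket -- change only here
GPGUSER="backup@example.com"       # GPG key used for encryption -- change only here

# Every later reference then uses the variables, e.g.:
#   /usr/bin/gpg --encrypt --recipient "$GPGUSER" /home/cpmove-site1.tar.gz
#   s3cmd put /home/cpmove-site1.tar.gz.gpg "s3://$BUCKETNAME/"
echo "Encrypting for $GPGUSER and uploading to s3://$BUCKETNAME/"
```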
Thanks for the help! I remade my gpguser, and with the new s3backup file I can get "./s3backup" to run, but I'm getting:
/usr/bin/sha1sum: /home/cpmove-site1.tar.gz.gpg: No such file or directory
/usr/bin/sha1sum: /home/cpmove-site2.tar.gz.gpg: No such file or directory
/usr/bin/sha1sum: /home/cpmove-site3.tar.gz.gpg: No such file or directory
Would this have something to do with /usr/bin/sha1sum? I ran it right after typing "cd /home/tools".
 

Dan

Moderator
Hi Jayme,

I just got your log file! Idiot me forgot to change the email at the bottom to the default <nobody@nowhere.com>. Just change it to your email and it should be good.

Let me look at this error message and I will get back to you.
 

Dan

Moderator
Hi Jayme,

It looks like it is a path problem.

Find out where gpg and sha1sum are at on your server by doing 'whereis gpg' and 'whereis sha1sum'. Then replace all instances of '/usr/local/bin/gpg' with the correct path to your gpg and all instances of '/usr/bin/sha1sum' with the correct path to your sha1sum.
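The replacement step above could even be scripted; `fix_paths` here is a hypothetical helper, and the `sed` expressions assume the two paths appear literally in s3backup:

```shell
#!/bin/sh
# Hypothetical helper: replace the hard-coded tool paths in a script with
# whatever 'command -v' finds on this server (falling back to the common
# /usr/bin locations if the tool is not found). A .bak copy is kept.
fix_paths() {
    script="$1"
    gpg_path=$(command -v gpg || echo /usr/bin/gpg)
    sha1_path=$(command -v sha1sum || echo /usr/bin/sha1sum)
    sed -i.bak \
        -e "s|/usr/local/bin/gpg|$gpg_path|g" \
        -e "s|/usr/bin/sha1sum|$sha1_path|g" \
        "$script"
}

# Usage (path from this thread): fix_paths /home/tools/s3backup
```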
 

JaymeNYC

New Member
Thanks for the replies, Dan!

I just ran whereis for both and got this:

http://img580.imageshack.us/img580/5334/screenshot01.jpg

So gpg would be /usr/bin/gpg and sha1sum would be /usr/bin/sha1sum? And do I change those paths in s3backup or somewhere else?
 

Dan

Moderator
Hi Jayme :)

You change them in the s3backup script and then you should be good to go!

And you are quite welcome!
 

samsh

New Member
File size problem in S3 - 5GB limit

How can I add splitting of files to the script?
The problem is that S3 does not take files larger than 5 GB, and some backups are much larger than that.

Thanks,
Sam.
 

Dan

Moderator
Hi Sam,

I think this can be addressed, but it will take some work to figure out the best way to do it. I might even need to ask for help :)

But just so you know I am mulling it over and looking at it!
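For what it's worth, one possible approach (not something the script does today) is to cut the encrypted archive into fixed-size pieces with `split` before uploading, then `cat` them back together on restore; the function name and part size here are purely illustrative:

```shell
#!/bin/sh
# Hypothetical sketch: split an archive into parts small enough for S3's
# 5 GB single-object limit. Parts are named <file>.part-aa, .part-ab, ...
split_archive() {
    archive="$1"    # file to split
    size="$2"       # part size, e.g. 4G to stay safely under 5 GB
    split -b "$size" "$archive" "$archive.part-"
}

# Upload each piece:  for p in <file>.part-*; do s3cmd put "$p" s3://bucket/; done
# Restore later with: cat <file>.part-* > <file>
```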
 

joshuayip

New Member
Thanks!

Hi Dan,

Thanks for pointing this to me.

I actually went with the following before I heard from you:
jackal777.wordpress.com/2011/03/22/cpanel-backup-to-amazon-s3

But I'm kind of stuck with my shell script and not sure why I'm getting all the compile errors. Would you mind taking a look at it?
 

Dan

Moderator
Hi joshuayip,

I don't use that script, and from the looks of it the backups are stored not only on S3 but also on your VPS, so I personally would not be keen to use it.

Without seeing the error and knowing which step it actually occurs in, I would be shooting in the dark. You mention a compile error but then talk about your shell script, when the only thing you should be compiling is s3cmd.

This is unrelated to this how-to, so if you want, shoot me a PM and we can discuss it further.
 

ericabiz

New Member
Thanks! Got this working. Just a note: with my default cPanel install I had to change /usr/local/bin/gpg in the s3backup file to /usr/bin/gpg. Other than that, it appears to work great. Thanks for the script!
 

Dan

Moderator
Glad to hear it :)
 

Dan

Moderator
Hi ivanbg,

Give it a try now. When they changed the forum software it must not have gotten ported over. I've done some revisions to the script as well which should make it easier to get up and running.

Please let me know if you run into any difficulties.
 