brand new: Backup best practices


brewster
November 20th, 2005, 07:43 PM
I have a new PHP photo site up and before it really gets going, I'd like to know what are the best things I can do to back everything up.

I tried the backup feature in the interface but got these errors:

PhotoPost Database Backup

Command being used to execute dumps (specific tables are executed individually):

/usr/bin/mysqldump --opt -h localhost --[password and username omitted]
Results:


Warning: filesize(): Stat failed for /home/mydomain/public_html/backup/pp_users.sql (errno=2 - No such file or directory) in /home/mydomain/public_html/gallery/adm-misc.php on line 1545
Warning: pp_users.sql is 0 bytes or did not get created.

You should double-check your .sql files to be sure the backup completed successfully.

I made sure to “close” the gallery but still get the error.

“backup” is a directory I created. Should I not do that and point the backup path to another area of the site?

Also: where do the actual .sql files reside?

Chuck S
November 20th, 2005, 11:56 PM
Hello, did you make the backup directory 777?
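
(For anyone unsure how: over SSH that would be something like

chmod 777 /home/your_home_dir/public_html/gallery/backup

where the path is just an example, use wherever your backup directory actually lives; or use the "change permissions" option in your FTP client and tick all the boxes.)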

brewster
November 21st, 2005, 02:44 PM
Hello, did you make the backup directory 777?

Chuck, thanks, I'll try that.

Any idea how long the site needs to be down for a backup, with about 50 MB of photos? -- Dan

Chuck S
November 21st, 2005, 03:02 PM
I thought you were talking about a MySQL backup, which is what the feature you're trying is.

To back up data you would use your site's backup utility, or FTP to back up the files. To back up files via FTP you do not need to shut down your site.

brewster
November 22nd, 2005, 01:29 AM
I did the backup from the admin area that backs up the SQL files. (for the site content, I just FTP'd the whole thing down)

I gave the backup folder 777 permissions and the backup seemed to go fine, no errors this time, but all of the backup .sql files seem to be empty of data; they just have headers. I did close the gallery while I did it.

Here is an example of the users file:

-- MySQL dump 9.11
--
-- Host: localhost    Database: mydatabase_name_photopostpp_settings
-- ------------------------------------------------------
-- Server version 4.0.25-standard

That is the pp_users.sql file, and all of the files are like this, empty of any data.

Chuck S
November 22nd, 2005, 10:30 AM
I get the same thing, just headers, with MySQL 4. Let me go check some things and I will get back to you.

In the meantime you can use your control panel's phpMyAdmin to back up the databases.

Zachariah
November 22nd, 2005, 11:00 AM
Another way

Log into your cPanel/vDeck for your domain/webhost and use the "backup" area to save your database.

There are usually options to save the DB on the localhost to FTP later, or to download it instantly.

brewster
November 22nd, 2005, 09:28 PM
OK, phpMyAdmin was able to do this very easily: all of the SQL build scripts in one file, as well as all of the data currently in all the tables. Very cool.

Also my host lets me download a daily backup as a tar.gz file of everything on my server space, so I am covered in all ways.

Axe
November 22nd, 2005, 10:56 PM
Just wanted to add this in case any of you were interested: I just use mysqldump from the command line in a cron job to back up my database, and I have another cron job running locally on a Linux laptop here to download it...

This is the sqlbackup.sh file on the server:

#!/bin/sh
# Dump the whole database to a .sql file in the home directory
mysqldump -udatabase_username -pdatabase_password --opt --allow-keywords --databases database_name > /home/your_home_dir/sqldump.sql
# Compress it (becomes sqldump.sql.gz)
gzip /home/your_home_dir/sqldump.sql
# Remove last run's copy from the web-accessible backup directory, then move the new one in
rm -f /home/your_home_dir/public_html/backup/sqldump.sql.gz
mv /home/your_home_dir/sqldump.sql.gz /home/your_home_dir/public_html/backup
Remember to set your permissions accordingly so that other users on the system (for you shared hosting folks) can't view it and see your SQL passwords n' whatnot, but make sure you set it executable.

I put it into a .htaccess password protected directory...

Then on the Linux laptop I'm running this script (which is scheduled to run about an hour after the one above):

#!/bin/sh
# Fetch the compressed dump from the password-protected backup directory
cd /home/your_home_dir/
wget --http-user=htaccess_user --http-password=htaccess_password http://www.yourdomain.com/backup/sqldump.sql.gz
# Decompress it and load it into the local MySQL copy of the database
gzip -d /home/your_home_dir/sqldump.sql.gz
mysql -udatabase_username -pdatabase_password -Ddatabase_name < /home/your_home_dir/sqldump.sql

database_username == The database username
database_password == Your database user's password
database_name == The name of the database
/home/your_home_dir/public_html/backup/ == (1st chunk of code) The appropriately protected directory on your server
htaccess_user == The username of the .htaccess protected directory
htaccess_password == The password for the .htaccess protected user account you picked above
/home/your_home_dir/ == (2nd chunk of code) The local directory you want to download and extract into
I gzip because the .sql file created by the server is about 950 MB, and that would take a while to download. Gzipped it's around 59 MB or so.

If you're running Windows locally and just want to keep a backup (and aren't bothered about mirroring your site on a local development/test machine, which is why I have it going straight to my Linux laptop), you should be able to download it (even with the .htaccess protection) using the Micro$oft Scheduler that comes with Windows.
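
For reference, the crontab entries driving the two scripts might look roughly like this (the times and the second script's name are just examples; the only point is that the laptop's job runs about an hour after the server's):

# on the server (crontab -e)
0 3 * * * /home/your_home_dir/sqlbackup.sh
# on the laptop
0 4 * * * /home/your_home_dir/sqlfetch.sh

Where sqlfetch.sh is whatever you saved the second script as.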

James Goddard
November 22nd, 2005, 11:04 PM
I have a new PHP photo site up and before it really gets going, I'd like to know what are the best things I can do to back everything up.

Here is what I do:

I have my sites running on dedicated servers running Linux. I also have a Linux server at my house.

I regularly run two backup methods via cron:

1) Dumps the MySQL database of my sites locally to a file.
2) Uses rsync to duplicate all the files and DB directories of all of my sites to my local server.

Works well for me. YMMV...
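
For anyone who wants a starting point, a rough sketch of those two cron jobs, with placeholder names and paths rather than my actual setup, would be something like:

# 1) on the web server (cron): dump the database to a local file
mysqldump -udb_user -pdb_password db_name > /home/your_home_dir/backups/db_name.sql

# 2) on the home server (cron): pull the site files and the dump down over SSH
rsync -avz -e ssh your_user@yourdomain.com:/home/your_home_dir/ /backups/yourdomain.com/

rsync only copies what has changed, so after the first full run the second job is quick.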

V-Rodforums
November 22nd, 2005, 11:08 PM
I see a lot of options for backups of SQL data and photo data, but a remote backup is always the safest route to go. I don't use cPanel, only SSH, so I can't tell you how to do it with cPanel, but I run a backup server at my house that I keep a carbon copy of my online servers on.

Have cron do the SQL dumps on the main server, and then on your backup server have cron perform a simple command shortly after the SQL dump takes place. Make sure it does cd /var/www (or whatever your backup directory is) first and then runs something like rsync -avzP --rsh=ssh whatever@whatever.com:/home . and it will make a carbon copy of the entire directory. If you use the same password on the backup server as on the primary, none will be needed; if they are different it will prompt for a password.

This might not be exactly what you're asking for, but one of my galleries is almost 5 gigs and I would hate to lose the data, so I keep three copies total: two on running machines and one burned to DVD on a weekly basis.

V-Rodforums
November 22nd, 2005, 11:12 PM
I see James beat me to the rsync command while I was typing. Glad to see I'm not the only one that uses this practice. It's also nice to have the backup server to use as a proxy if your work runs websense or some other stupid filter that will take you about 5 seconds to get around. :)

James Goddard
November 22nd, 2005, 11:16 PM
I see James beat me to the rsync command while I was typing. Glad to see I'm not the only one that uses this practice. It's also nice to have the backup server to use as a proxy if your work runs websense or some other stupid filter that will take you about 5 seconds to get around. :)

Yea, if you're running Linux then rsync is definitely the way to go (assuming remote hosting; if you're hosting locally then a good tape backup does the job every time).

My only concern is if my dedicated server host and my house burn down at the same time. Though I imagine that if that happens, my boards won't be high on my priority list...

James

V-Rodforums
November 22nd, 2005, 11:38 PM
James it is worth pointing out to others that may not understand that the rsync command is powerful because it compares current data with saved data and only gets files that have been changed or added. This means you can update your 20 gig backup of data in about 5 minutes if only a couple of hundred megs have changed. Very powerful tool and very handy.
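
(A handy way to see exactly what it would transfer, without copying anything, is rsync's dry-run flag; the hostname and path here are placeholders:

rsync -avzn --rsh=ssh whatever@whatever.com:/home .

The -n just lists what would be sent, so you can sanity-check it before the real run.)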

James Goddard
November 22nd, 2005, 11:46 PM
James it is worth pointing out to others that may not understand that the rsync command is powerful because it compares current data with saved data and only gets files that have been changed or added. This means you can update your 20 gig backup of data in about 5 minutes if only a couple of hundred megs have changed. Very powerful tool and very handy.


Yep. For public knowledge, it's used by most mirroring systems. And if done properly it's secure. You can store public/private keys on the systems and do the entire backup over a secure socket.

http://www-128.ibm.com/developerworks/linux/library/l-backup/?ca=dgr-lnxw07Backup
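
If anyone wants to set that up, the usual recipe, with placeholder names, is roughly: generate a key pair on the backup machine, copy the public key to the server, and then rsync over ssh stops asking for a password:

ssh-keygen -t rsa
ssh-copy-id your_user@yourdomain.com
rsync -avz -e ssh your_user@yourdomain.com:/home/your_home_dir/ /backups/yourdomain.com/

(Leave the passphrase empty when ssh-keygen asks, if the job has to run unattended from cron.)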

brewster
November 23rd, 2005, 01:50 AM
Yea, if you're running Linux then rsync is definitely the way to go (assuming remote hosting...

James: Do you have a good recommendation for a Linux Distro for using rsync?

James Goddard
November 23rd, 2005, 08:33 AM
James: Do you have a good recommendation for a Linux Distro for using rsync?

Most any distro with a decent package system would work. Are you looking for your home system? If so I'd go with Mandrake as it's a little more user friendly for a workstation. If you want something more server like, I'd go with CentOS.

Actually, if you're just looking for your home system to store a backup, you really don't have to use Linux at all. You can install Cygwin, which has an optional rsync package.

James