bear Posted December 14, 2006

Here's the one I've been using with great success:

#!/bin/sh
# This script will back up one or more MySQL databases
# and then optionally email them and/or FTP them.
# It creates a different backup file for each database by day of the week,
# i.e. 1-dbname1.sql.gz for database=dbname1 on Monday (day=1).
# This is a trick so that you never have more than 7 days' worth of backups
# on your FTP server: as the weeks rotate, the files from the same day of
# the previous week are overwritten.

############################################################
#===> site-specific variables - customize for your site

# List all of the MySQL databases that you want to back up here,
# each separated by a space
databases="db_name"

# Directory where you want the backup files to be placed
backupdir=/home/username/backup

# MySQL dump command, use the full path name here
mysqldumpcmd=/usr/bin/mysqldump

# MySQL username and password
userpassword=" --user=user_name --password=pass_here"

# MySQL dump options
dumpoptions=" --quick --add-drop-table --add-locks --extended-insert --lock-tables"

# Unix commands
gzip=/bin/gzip
uuencode=/usr/bin/uuencode
mail=/bin/mail

# Send backup? Would you like the backup emailed to you?
# Set to "y" if you do
sendbackup="n"
subject="Backup is done"
mailto="email@domain.tld"

#===> site-specific variables for FTP - use FQDN or IP
ftpbackup="y"
ftpserver="ftp.domain.tld"
ftpuser="ftp_username"
ftppasswd="ftp_pass"
# If you are keeping the backups in a subdir of your FTP root
ftpdir="/"

#===> END site-specific variables - customize for your site
############################################################

# Get the day of the week (0-6).
# This lets us save one backup for each day of the week.
# Just alter the date command if you want to use a timestamp instead.
DOW=`date +%w`

# Create our backup directory if it's not already there
mkdir -p ${backupdir}
if [ ! -d ${backupdir} ]
then
  echo "Not a directory: ${backupdir}"
  exit 1
fi

# Dump all of our databases
echo "Dumping MySQL Databases"
for database in $databases
do
  $mysqldumpcmd $userpassword $dumpoptions $database > ${backupdir}/${DOW}-${database}.sql
done

# Compress all of our backup files
echo "Compressing Dump Files"
for database in $databases
do
  rm -f ${backupdir}/${DOW}-${database}.sql.gz
  $gzip ${backupdir}/${DOW}-${database}.sql
done

# Send the backups via email
if [ $sendbackup = "y" ]
then
  for database in $databases
  do
    $uuencode ${backupdir}/${DOW}-${database}.sql.gz > ${backupdir}/${DOW}-${database}.sql.gz.uu
    $mail -s "$subject : $database" $mailto < ${backupdir}/${DOW}-${database}.sql.gz.uu
  done
fi

# FTP it to the off-site server
echo "FTP file to $ftpserver FTP server"
if [ $ftpbackup = "y" ]
then
  for database in $databases
  do
    echo "==> ${backupdir}/${DOW}-${database}.sql.gz"
    ftp -n $ftpserver <<EOF
user $ftpuser $ftppasswd
bin
prompt
cd $ftpdir
lcd ${backupdir}
put ${DOW}-${database}.sql.gz
quit
EOF
  done
fi

# And we're done
ls -l ${backupdir}
echo "Dump Complete!"
exit

Name it "whatever_you_like.sh", chmod it 755, then create a cron job that calls the script with:

sh /script/location/scriptname.sh > /dev/null

(You may need to use /bin/sh or similar on your server; keep it away from publicly accessible directories.) It creates a daily backup, FTPs it to a remote server, can email a copy, and rotates the files every week. It's saved my butt a few times. Enjoy.
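To see the day-of-week rotation in action, here's a minimal sketch of how the script above derives its filenames (the database name is a placeholder):

```shell
# Minimal sketch of the rotation naming used by the backup script.
# `date +%w` prints the day of the week, 0 (Sunday) through 6 (Saturday),
# so each weekday's run overwrites the same file from the previous week.
DOW=$(date +%w)
database="dbname1"   # placeholder database name
backupfile="${DOW}-${database}.sql.gz"
echo "Today's backup file would be: ${backupfile}"
```

Running it on a Monday prints `1-dbname1.sql.gz`; seven days later the same name comes around again, which is what keeps the FTP server at seven files per database.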
YodaCows Posted December 15, 2006

Thanks bear.
webresellers Posted February 28, 2007

Please move this thread to CONTRIBUTIONS
dbosch Posted March 6, 2007

great stuff bear
Angler Posted April 26, 2007

works a treat, thanks
Patty Posted April 27, 2007

A very easy one: just run a cronjob and it will back up databases from any server! Change the placeholder values (username, password, domain and paths) to your own. Cronjob:

curl http://cPanelusername:password@domain.com:2082/getsqlbackup/DB.gz --output /home/user/DB.gz

That's it! Now if you want to get fancy, run the script below with cron to create a directory in your remote server to store the backups and then delete it after a given number of days (15 in this case, but you can change it). This is how I do db backups for all my clients.

1) Create a directory above public_html (ie: backup)
2) Create a file with any name (ie: managedbbackups.php) and copy the script below into it. Change "user" to your cPanel username. Save it.
3) Upload the file to the directory you created
4) Create the following cronjob: /usr/local/bin/php /home/cPanel_user/backup/managedbbackups.php

<?php
// In this script we do the following:
// 1. create a folder called 'backup-d-a-t-e'
// 2. move all *.gz files into the backup folder created in step 1
// 3. delete the folder 'backup-deldate', where deldate is 15 days back
// That way we have a backup for each of the last 15 days, and the oldest
// directory gets deleted. Run it as a cron job once a day, after the
// backups of all domains are done.

// Step 1: what's the date?
$date = DATE('\b\a\c\k\u\p\-Y-m-d', time());

// Make a dir named after the date
// (replace "user" with your cPanel username)
$path = "/home/user/" . $date;
mkdir($path, 0700);

// Step 2: find files in the home directory with the .gz extension
$files = array();
// replace "user" with your cPanel username
$dir = opendir("/home/user/");
while (($file = readdir($dir)) !== false) {
    if ($file !== '.' && $file !== '..' && !is_dir($file)) {
        $files[] = $file;
    }
}
closedir($dir);
natcasesort($files);

// Move all *.gz files to the backup folder
for ($i = 0; $i < count($files); $i++) {
    $exts = split("[/\\.]", $files[$i]);
    $n = count($exts) - 1;
    $exts = $exts[$n];
    if ($exts == "gz") { // move the file only if its extension is gz
        // replace "user" with your cPanel username
        rename("/home/user/" . $files[$i], $path . "/" . $files[$i]);
    }
}

// Step 3: deleting/removing the old backup folder
// Now let's delete the folder that is 15 days old
$deldate = DATE('\b\a\c\k\u\p\-Y-m-d', time() - 15*24*60*60);
// replace "user" with your cPanel username
$path = "/home/user/" . $deldate;

function rmdirr($dir) {
    if (substr($dir, -1) != "/") $dir .= "/";
    if (!is_dir($dir)) return false;
    if (($dh = opendir($dir)) !== false) {
        while (($entry = readdir($dh)) !== false) {
            if ($entry != "." && $entry != "..") {
                if (is_file($dir . $entry) || is_link($dir . $entry))
                    unlink($dir . $entry);
                else if (is_dir($dir . $entry))
                    rmdirr($dir . $entry);
            }
        }
        closedir($dh);
        rmdir($dir);
        return true;
    }
    return false;
}

$res = rmdirr($path);
// Bye bye
?>
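For the prune-after-15-days step, a shell equivalent of the PHP script above, assuming the backups land in one directory with date-stamped names (all paths here are placeholders), could look like this:

```shell
#!/bin/sh
# Sketch: keep only the last 15 days of date-stamped .gz backups.
# Paths and names are placeholders. The curl line is the real fetch from
# the cron job above, commented out so the sketch runs standalone.
backupdir="/tmp/db_backup_demo"
mkdir -p "$backupdir"
stamp=$(date +%Y-%m-%d)
# curl http://cPanelusername:password@domain.com:2082/getsqlbackup/DB.gz \
#     --output "$backupdir/DB-$stamp.gz"
touch "$backupdir/DB-$stamp.gz"    # stand-in for the downloaded dump
# Remove any stamped backups older than 15 days
find "$backupdir" -name 'DB-*.gz' -mtime +15 -exec rm -f {} \;
ls "$backupdir"
```

Pruning by file age with `find -mtime` avoids the folder-per-day bookkeeping the PHP version does, at the cost of keeping everything in one directory.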
ren Posted July 27, 2007

Will this work on Windows servers? I tried running the script but am getting an error:

"Warning: exec(): Unable to fork [/usr/bin/mysqldump...]"

/usr/bin/mysqldump relates to Linux servers, doesn't it? Thanks
jayh Posted July 29, 2007

This is for *nix servers and will not work on Windows.
reds Posted September 5, 2007

This looks great. Thanks very much!
reds Posted September 28, 2007

Having some problems with this. In cPanel my command to run is:

sh /home/username/dbbackup.sh > /dev/null

I have the details in the .sh file filled out. I get an email at the specified time each morning with the following:

/home/username/dbbackup.sh: line 2: : command not found
/home/username/dbbackup.sh: line 5: : command not found
/home/username/dbbackup.sh: line 10: : command not found
/home/username/dbbackup.sh: line 13: : command not found
/home/username/dbbackup.sh: line 17: : command not found
/home/username/dbbackup.sh: line 20: : command not found
/home/username/dbbackup.sh: line 23: : command not found
/home/username/dbbackup.sh: line 26: : command not found
/home/username/dbbackup.sh: line 29: : command not found
/home/username/dbbackup.sh: line 34: : command not found
/home/username/dbbackup.sh: line 40: : command not found
/home/username/dbbackup.sh: line 48: : command not found
/home/username/dbbackup.sh: line 51: : command not found
/home/username/dbbackup.sh: line 56: : command not found
/home/username/dbbackup.sh: line 68: syntax error near unexpected token `do '
/home/username/dbbackup.sh: line 68: `do '

Can you help out please?
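Those `: command not found` errors on otherwise-blank lines, and `do '` with a trailing space, are the classic symptoms of a script saved with Windows (CRLF) line endings. Assuming that's the cause here, stripping the carriage returns fixes it:

```shell
# Simulate a script saved with Windows (CRLF) line endings, then repair it
# by deleting every carriage return. (dos2unix does the same job if it's
# installed on your server.) Paths are throwaway demo files.
printf 'echo hello\r\n\r\n' > /tmp/crlf_demo.sh
tr -d '\r' < /tmp/crlf_demo.sh > /tmp/crlf_fixed.sh
sh /tmp/crlf_fixed.sh
# prints: hello
```

The broken original would instead try to run the command `hello\r` and a bare `\r`, producing exactly the `: command not found` noise quoted above.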
congkai Posted October 23, 2007

[root@sgoxygen ~]# sh /etc/bdbackup.sh > /bin/sh
-bash: /bin/sh: Text file busy
[root@sgoxygen etc]# ./dbbackup.sh
: No such file or directory
Unspecific Posted October 23, 2007

[root@sgoxygen ~]# sh /etc/bdbackup.sh > /bin/sh
-bash: /bin/sh: Text file busy
[root@sgoxygen etc]# ./dbbackup.sh
: No such file or directory

You're running this as 'root'?!
bear Posted October 23, 2007 Author

You're running this as 'root'?!

Testing as root, probably. Cron doesn't typically run as root for this sort of thing. If he can't run it as root from the command line, he's having an issue that likely can't be fixed by testing as the user account.

[root@sgoxygen ~]# sh /etc/bdbackup.sh > /bin/sh

This should really be located within the user account that's running it: /home/user/backup.sh or similar, so the cPanel user can access it via cron. Note the spelling error as well: dbbackup vs bdbackup. Also, that `> /bin/sh` is redirecting the script's output over /bin/sh itself, which is what caused the "Text file busy" error; redirect to /dev/null instead.
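To set it up under the user account rather than root, the crontab entry might look like this sketch (the path and schedule are examples, not prescribed values):

```shell
# Example cron line for the cPanel user's own crontab (not root's).
# Runs the backup at 2:30 AM daily; the script path is a placeholder.
cronline='30 2 * * * /bin/sh /home/user/backup.sh > /dev/null 2>&1'
echo "$cronline"
# To install it non-interactively, one could run:
#   ( crontab -l 2>/dev/null; echo "$cronline" ) | crontab -
```

Redirecting stdout and stderr to /dev/null keeps cron from mailing the `echo` output from every run; remove the redirection while debugging so you actually see the errors.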
dream1 Posted January 21, 2008

Thanks Bear, works great.
columbusgeek Posted April 1, 2008

FYI, this is great for the lazy. http://www.mysqldumper.de/en/
minadreapta Posted April 21, 2008

I can't get this to work. I get this error:

/home/myusername/db_backup.sh: line 86: mysql_backupz/1-db_name.sql.gz.uu: No such file or directory

I've set it up to email the db...
joe123 Posted April 21, 2008

I can't get this to work. I get this error: /home/myusername/db_backup.sh: line 86: mysql_backupz/1-db_name.sql.gz.uu: No such file or directory

Because the script is in /home/myusername/public_html/db_backup.sh, and you'd better put it in a protected directory.
minadreapta Posted April 22, 2008

No, the script is not in public_html, it is in /home/myusername/
bear Posted April 22, 2008 Author

I can't get this to work. I get this error: /home/myusername/db_backup.sh: line 86: mysql_backupz/1-db_name.sql.gz.uu: No such file or directory

UU? Check the path to the file it's looking for, and check spelling, etc. I usually place the script and backups in the same folder, since it's above root.
minadreapta Posted April 22, 2008

I've put the script in the same directory. Here is how it looks in the file:

# List all of the MySQL databases that you want to backup in here
databases="username_db"

# Directory where you want the backup files to be placed
backupdir=/home/username/mydbs

# MySQL dump command, use the full path name here
mysqldumpcmd=/usr/bin/mysqldump

It doesn't seem to work: I receive this error by email:

/home/username/mydbs/mydbs.sh: line 86: /home/username/mydbs/2-username_db.sql.gz.uu: No such file or directory

Now when I check the mydbs directory, it has 3 files:

2-username_db.sql.gz
username_db.sql.gz.uu
mydbs.sh

What am I doing wrong? The mydbs directory and mydbs.sh are both 755.
minadreapta Posted April 22, 2008

I GOT IT: there is an error in the script, at line 85:

$uuencode ${backupdir}/${DOW}-${database}.sql.gz > ${backupdir}/${database}.sql.gz.uu

This should be:

$uuencode ${backupdir}/${DOW}-${database}.sql.gz > ${backupdir}/${DOW}-${database}.sql.gz.uu

There was a ${DOW} missing.
bear Posted April 22, 2008 Author

Was just coming here to say that. I'll adjust the first post with the script.
minadreapta Posted April 22, 2008

Another problem: it seems the emailed .gz archive is corrupted every time... I will test it some more and I'll get back.
minadreapta Posted April 22, 2008

When sent to a Yahoo email, I receive only this text:

begin 644 /home/username/mydbs/22-username_db.sql.gz
`
end

The same email sent to the server's email address has a .gz archive attached which seems to be corrupt; I can't manage to open it. THE PROBLEM IS that the archive is OK on the server, but not in the email. Any ideas?
minadreapta Posted April 22, 2008

Does this have to do with the uu encoding of the archives? Would it be a big problem if the databases are not encoded and sent in .sql.gz format?
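That "begin 644 ... end" text is the raw uuencoded payload: most modern mail clients (Yahoo included) no longer decode uuencode, which would explain the "corrupt" attachments even though the file on the server is fine. One workaround, assuming mutt is installed on the server, is to send the .gz as a proper MIME attachment instead of piping uuencoded text through mail:

```shell
# Sketch: mail the gzip as a MIME attachment rather than uuencoding it.
# The file path and address are placeholders, and mutt's availability on
# the server is an assumption -- check with `which mutt` first.
backupfile="/home/username/mydbs/2-username_db.sql.gz"
mailto="email@domain.tld"
# echo "Backup attached" | mutt -s "DB backup" -a "$backupfile" -- "$mailto"
echo "Would send $backupfile to $mailto as a MIME attachment"
```

In the original script this would replace the $uuencode/$mail pair inside the sendbackup loop.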