Automated Database Backup


bear


Here's the one I've been using with great success:

#!/bin/sh

# This script will back up one or more MySQL databases
# and then optionally email them and/or FTP them

# This script will create a different backup file for each database by day of the week
# e.g. 1-dbname1.sql.gz for database=dbname1 on Monday (day=1).
# This trick ensures you never have more than 7 days' worth of backups on your FTP server:
# as the weeks rotate, the file from the same day of the previous week is overwritten.

############################################################
#===> site-specific variables - customize for your site

# List all of the MySQL databases that you want to back up here,
# each separated by a space
databases="db_name"

# Directory where you want the backup files to be placed
backupdir=/home/username/backup

# MySQL dump command, use the full path name here
mysqldumpcmd=/usr/bin/mysqldump

# MySQL Username and password
userpassword=" --user=user_name --password=pass_here"

# MySQL dump options
dumpoptions=" --quick --add-drop-table --add-locks --extended-insert --lock-tables"

# Unix Commands
gzip=/bin/gzip
uuencode=/usr/bin/uuencode
mail=/bin/mail

# Send Backup?  Would you like the backup emailed to you?
# Set to "y" if you do
sendbackup="n"
subject="Backup is done"
mailto="email@domain.tld"

#===> site-specific variables for FTP - use FQDN or IP
ftpbackup="y"
ftpserver="ftp.domain.tld"
ftpuser="ftp_username"
ftppasswd="ftp_pass"
# If you are keeping the backups in a subdir of your FTP root
ftpdir="/"

#===> END site-specific variables - customize for your site
############################################################

# Get the day of the week (0-6)
# This lets us save one backup for each day of the week
# Just alter the date command if you want to use a timestamp
DOW=$(date +%w)

# Create our backup directory if it isn't already there
mkdir -p "${backupdir}"
if [ ! -d "${backupdir}" ]
then
  echo "Not a directory: ${backupdir}"
  exit 1
fi

# Dump all of our databases
echo "Dumping MySQL Databases"
for database in $databases
do
  $mysqldumpcmd $userpassword $dumpoptions $database > ${backupdir}/${DOW}-${database}.sql
done

# Compress all of our backup files
echo "Compressing Dump Files"
for database in $databases
do
  rm -f ${backupdir}/${DOW}-${database}.sql.gz
  $gzip ${backupdir}/${DOW}-${database}.sql
done

# Send the backups via email
if [ "$sendbackup" = "y" ]
then
  for database in $databases
  do
     $uuencode ${backupdir}/${DOW}-${database}.sql.gz > ${backupdir}/${DOW}-${database}.sql.gz.uu
     $mail -s "$subject : $database" $mailto < ${backupdir}/${DOW}-${database}.sql.gz.uu
  done
fi

# FTP the backups to the off-site server
if [ "$ftpbackup" = "y" ]
then
  echo "FTP files to $ftpserver FTP server"
  for database in $databases
  do
     echo "==> ${backupdir}/${DOW}-${database}.sql.gz"
ftp -n $ftpserver <<EOF
user $ftpuser $ftppasswd 
bin
prompt
cd $ftpdir
lcd ${backupdir}
put ${DOW}-${database}.sql.gz
quit
EOF
  done
fi

# And we're done
ls -l ${backupdir}
echo "Dump Complete!"
exit

Name it "whatever_you_like.sh", chmod it to 755, then create a cron job that calls the script using:

sh /script/location/scriptname.sh > /dev/null

(You may need to use /bin/sh or a similar full path on your server. Keep the script away from publicly accessible directories.)
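If you'd rather put it straight into the crontab yourself, an entry like the following runs the script nightly at 2 a.m. The paths here are examples, not the poster's actual layout - adjust them for your server:

```shell
# minute hour day-of-month month day-of-week  command
# Run the backup script every night at 2:00 a.m.; paths are examples.
0 2 * * * /bin/sh /home/username/scripts/dbbackup.sh > /dev/null 2>&1
```

Redirecting stderr as well (`2>&1`) keeps cron from mailing you on every run; drop that part while you're still testing so errors reach your inbox.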

 

It creates a daily backup, FTPs to a remote server, can email a copy, and rotates them every week...saved my butt a few times.

Enjoy.


A very easy one: just run a cronjob and it will back up databases from any server!

Change the cPanel username, password, domain, and file paths in the command to your own.

 

Cronjob:

 

curl http://cPanelusername:password@domain.com:2082/getsqlbackup/DB.gz --output /home/user/DB.gz

 

That's it!
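If you want these downloads to rotate by day of the week like the shell script above, you can date-stamp the output filename. A sketch - the host, credentials, and paths are the placeholders from the post, and the curl line is left commented out so you can fill in your own before running it:

```shell
# Day of week (0-6) gives seven rotating filenames, one per weekday.
DOW=$(date +%w)
outfile="/home/user/${DOW}-DB.gz"
# Substitute your own cPanel credentials and domain, then uncomment:
# curl "http://cPanelusername:password@domain.com:2082/getsqlbackup/DB.gz" --output "$outfile"
echo "$outfile"
```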

 

 

Now if you want to get fancy, run the script below with cron to create a dated directory on your remote server to store the backups, and then delete directories older than a given number of days (15 in this case, but you can change it). This is how I do db backups for all my clients. :)

 

1) Create a directory above public_html (ie: backup)

2) Create a file with any name (ie: managedbbackups.php) and copy the script below into it. Change "user" to your cPanel username. Save it.

3) Upload the file to the directory you created

4) Create the following cronjob:

 

/usr/local/bin/php /home/cPanel_user/backup/managedbbackups.php

 

 

 

<?php
// In this script we do the following:
// 1. create a folder called 'backup-d-a-t-e'
// 2. move all *.gz files to the backup folder created in step 1
// 3. delete the folder 'backup-deldate', where deldate is 15 days back
// This way we keep a backup for every day of the last 15 days; the oldest directory gets deleted.
// Run it as a cron job once a day, after the backups of all domains are done.

// step 1: what's the date?
$date = date('\b\a\c\k\u\p\-Y-m-d', time());
// make a dir named after the date inside the home folder
// replace 'user' with your cPanel username
$path = "/home/user/".$date;
if (!is_dir($path)) mkdir($path, 0700);

// step 2:
// find files in the home directory
$files = array();
// replace 'user' with your cPanel username
$dir = opendir("/home/user/");
while (($file = readdir($dir)) !== false)
{
    if ($file !== '.' && $file !== '..' && !is_dir("/home/user/".$file))
    {
        $files[] = $file;
    }
}
// close dir
closedir($dir);
natcasesort($files);
// move all *.gz files to the backup folder
for ($i = 0; $i < count($files); $i++)
{
    // move the file only if its extension is .gz
    $ext = pathinfo($files[$i], PATHINFO_EXTENSION);
    if ($ext == 'gz')
    {
        // replace 'user' with your cPanel username
        rename("/home/user/".$files[$i], $path."/".$files[$i]);
    }
}

// step 3: deleting / removing the old backup folder
// now let's delete the folder that is 15 days old
$deldate = date('\b\a\c\k\u\p\-Y-m-d', time() - 15*24*60*60);
// delete the dated dir inside the home folder
// replace 'user' with your cPanel username
$path = "/home/user/".$deldate;
function rmdirr($dir) {
 if (substr($dir,-1) != "/") $dir .= "/";
 if (!is_dir($dir)) return false;
 if (($dh = opendir($dir)) !== false) {
  while (($entry = readdir($dh)) !== false) {
    if ($entry != "." && $entry != "..") {
      if (is_file($dir . $entry) || is_link($dir . $entry)) unlink($dir . $entry);
      else if (is_dir($dir . $entry)) rmdirr($dir . $entry);
    }
  }
  closedir($dh);
  rmdir($dir);
  return true;
 }
 return false;
}
// Example:
$res = rmdirr($path); // Bye bye
?>


Will this work on Windows servers?

 

I tried running the script but am getting an error:

"Warning: exec(): Unable to fork [/usr/bin/mysqldump...]"

 

/usr/bin/mysqldump relates to Linux servers, doesn't it?

 

Thanks


Having some problems with this. Can you help out?

 

In cpanel my command to run is

sh /home/username/dbbackup.sh > /dev/null 

 

I have the details in the .sh file filled out. I get an email at the specified time each morning with the following:

/home/username/dbbackup.sh: line 2: 
: command not found
/home/username/dbbackup.sh: line 5: 
: command not found
/home/username/dbbackup.sh: line 10: 
: command not found
/home/username/dbbackup.sh: line 13: 
: command not found
/home/username/dbbackup.sh: line 17: 
: command not found
/home/username/dbbackup.sh: line 20: 
: command not found
/home/username/dbbackup.sh: line 23: 
: command not found
/home/username/dbbackup.sh: line 26: 
: command not found
/home/username/dbbackup.sh: line 29: 
: command not found
/home/username/dbbackup.sh: line 34: 
: command not found
/home/username/dbbackup.sh: line 40: 
: command not found
/home/username/dbbackup.sh: line 48: 
: command not found
/home/username/dbbackup.sh: line 51: 
: command not found
/home/username/dbbackup.sh: line 56: 
: command not found
/home/username/dbbackup.sh: line 68: syntax error near unexpected token `do
'
/home/username/dbbackup.sh: line 68: `do
'

 

Can you help out please? :)
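For what it's worth, `: command not found` errors on otherwise blank lines are the classic symptom of a script saved with Windows (CRLF) line endings - the shell sees the invisible carriage return as part of each command. A quick way to reproduce and fix it, demonstrated on a throwaway file (run the `sed` on your own script path):

```shell
# Simulate a script saved on Windows: every line ends in \r\n.
printf 'echo hello\r\n' > /tmp/crlf_demo.sh
# Strip the carriage returns in place (dos2unix does the same, if installed):
sed -i 's/\r$//' /tmp/crlf_demo.sh
sh /tmp/crlf_demo.sh   # prints "hello" once the \r is gone
```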

You're running this as 'root'?! :?

Testing as root, probably. Cron doesn't typically run as root for this sort of thing. If he can't run it as root from the command line, he has an issue that likely can't be fixed by running it as the user account. ;)

 

[root@sgoxygen ~]# sh /etc/bdbackup.sh > /bin/sh

This should really be located within the user account that's running it:

/home/user/backup.sh or similar, so the cPanel user can access it via cron.

 

Note the spelling error as well:

dbbackup vs bdbackup

I can't get this to work.

I get this error:

/home/myusername/db_backup.sh: line 86: mysql_backupz/1-db_name.sql.gz.uu: No such file or directory

I've set it up to email the db.

The script is in

/home/myusername/public_html/db_backup.sh

(and you'd better put it in a protected directory).


i can't get this to work.

 

i get this error:

/home/myusername/db_backup.sh: line 86: mysql_backupz/1-db_name.sql.gz.uu: No such file or directory

 

UU?

Check the path to the file it's looking for, and check spelling, etc. I usually place the script and backups in the same folder, since it's above the web root.


I've put the script in the same directory; here is how it looks in the file:

 

# List all of the MySQL databases that you want to backup in here

databases="username_db"

 

# Directory where you want the backup files to be placed

backupdir=/home/username/mydbs

 

# MySQL dump command, use the full path name here

mysqldumpcmd=/usr/bin/mysqldump

 

 

 

 

It doesn't seem to work: I receive this error by email:

/home/username/mydbs/mydbs.sh: line 86: /home/username/mydbs/2-username_db.sql.gz.uu: No such file or directory

 

 

 

Now when I check the mydbs directory, it has 3 files:

2-username_db.sql.gz

username_db.sql.gz.uu

mydbs.sh

 

What am I doing wrong?

The mydbs directory and mydbs.sh are both 755.


I GOT IT:

There is an error in the script:

 

at line 85:

$uuencode ${backupdir}/${DOW}-${database}.sql.gz > ${backupdir}/${database}.sql.gz.uu

 

this should be:

$uuencode ${backupdir}/${DOW}-${database}.sql.gz > ${backupdir}/${DOW}-${database}.sql.gz.uu

 

There was a ${DOW} missing.


When sent to a Yahoo email address, I receive only this text:

"begin 644 /home/username/mydbs/22-username_db.sql.gz

`

end"

 

The same email sent to the server's email address has a .gz archive attached, which seems to be corrupt; I can't manage to open it.

 

 

THE PROBLEM IS that the archive is OK on the server, but not in the email.

 

Any ideas?
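One guess: many webmail providers no longer decode uuencoded message bodies, which would explain the raw `begin 644 ... end` text at Yahoo. A possible workaround, assuming `mutt` is installed on the server, is to send the dump as a real MIME attachment instead of piping uuencoded text through `mail`. The path and address below are the placeholders from this thread, not real values:

```shell
# Sketch: mail a gzipped dump as a MIME attachment rather than uuencoding it.
# Requires mutt; the file path and recipient are placeholders.
send_dump() {
  mutt -s "Backup is done" -a "$1" -- "$2" < /dev/null
}
# Example (fill in real values first):
# send_dump /home/username/mydbs/2-username_db.sql.gz email@domain.tld
```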
