MySQL full and incremental backup with mysqldump
- 2020-06-03 08:35:27
- OfStack
This article shows how to implement full and incremental MySQL backups with shell scripts. The incremental backup copies the mysql-bin.00000* binary log files to a designated directory at 3:00 AM, Monday through Saturday. The full backup uses mysqldump to export all databases; it runs every Sunday at 3:00 AM and removes the mysql-bin.00000* files left over from the previous week. Each backup run is recorded in the bak.log file, along these lines:
The .sql.tgz succ lines are written by DBFullyBak.sh once a week; the mysql-bin.000001 copying, mysql-bin.000002 skip! and Bakup succ! lines are written by DBDailyBak.sh once a day.
Implementation:
1. Write full backup scripts
2. Write incremental backup scripts
3. Set up the crontab task and execute the backup script every day
1. Write full backup scripts
# vim /root/DBFullyBak.sh // Add the following
#!/bin/bash
# Program
# use mysqldump to Fully backup mysql data per week!
# History
# 2013-04-27 guo first
# Path
# ....
BakDir=/home/mysql/backup
LogFile=/home/mysql/backup/bak.log
Date=`date +%Y%m%d`
Begin=`date +"%Y-%m-%d %H:%M:%S"`
cd $BakDir
DumpFile=$Date.sql
GZDumpFile=$Date.sql.tgz
/usr/local/mysql/bin/mysqldump -uroot -p123456 --quick --all-databases --flush-logs --delete-master-logs --single-transaction > $DumpFile
/bin/tar czvf $GZDumpFile $DumpFile
/bin/rm $DumpFile
Last=`date +"%Y-%m-%d %H:%M:%S"`
echo "Start: $Begin End: $Last $GZDumpFile succ" >> $LogFile
# Clear out the previous week's incremental binlog copies
cd $BakDir/daily
rm -f *
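The archive-and-clean step of the full backup can be exercised without a live MySQL server. The sketch below is an assumption-laden stand-in: the mysqldump call is replaced with a placeholder echo, and a temporary directory plays the role of /home/mysql/backup, but the tar/rm/log sequence matches the script above.

```shell
# Sketch of the archive step from DBFullyBak.sh, run in a temp dir.
BakDir=$(mktemp -d)            # stand-in for /home/mysql/backup
LogFile=$BakDir/bak.log
Date=$(date +%Y%m%d)
cd "$BakDir" || exit 1
DumpFile=$Date.sql
GZDumpFile=$Date.sql.tgz
# Assumption: the real script runs mysqldump here; faked for the demo.
echo "-- placeholder for mysqldump output" > "$DumpFile"
tar czf "$GZDumpFile" "$DumpFile"   # compress the dump
rm "$DumpFile"                      # keep only the archive
echo "$GZDumpFile succ" >> "$LogFile"
```

After the run, only the dated .tgz archive and bak.log remain, which is exactly the state the weekly job leaves behind.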
2. Write incremental backup scripts
# cat /root/DBDailyBak.sh // The content is as follows
#!/bin/bash
# Program
# use cp to backup mysql data everyday!
# History
# 2013-05-02 guo first
# Path
# ....
BakDir=/home/mysql/backup/daily
BinDir=/home/mysql/data
LogFile=/home/mysql/backup/bak.log
BinFile=/home/mysql/data/mysql-bin.index
/usr/local/mysql/bin/mysqladmin -uroot -p123456 flush-logs
# flush-logs closes the current binary log and opens a new mysql-bin.00000* file
Counter=`wc -l $BinFile |awk '{print $1}'`
NextNum=0
# This for loop compares $NextNum with $Counter: the last file in the index (still being written) is skipped, and each earlier file is copied only if it has not already been backed up.
for file in `cat $BinFile`
do
base=`basename $file`
# basename strips the leading ./ from index entries such as ./mysql-bin.000005, leaving just the mysql-bin.00000* file name
NextNum=`expr $NextNum + 1`
if [ $NextNum -eq $Counter ]
then
echo $base skip! >> $LogFile
else
dest=$BakDir/$base
if [ -e $dest ]
# -e tests whether the target file already exists; if it does, write exist! to $LogFile
then
echo $base exist! >> $LogFile
else
cp $BinDir/$base $BakDir
echo $base copying >> $LogFile
fi
fi
done
echo `date +"%Y-%m-%d %H:%M:%S"` Bakup succ! >> $LogFile
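The skip/copy logic above can be tested in isolation with a synthetic mysql-bin.index, no MySQL server needed. The directory layout and binlog file names below are made up for the demo; the loop itself mirrors the script:

```shell
# Demo of DBDailyBak.sh's skip/copy logic against a fake binlog index.
Work=$(mktemp -d)
BakDir=$Work/daily;  mkdir "$BakDir"
BinDir=$Work/data;   mkdir "$BinDir"
LogFile=$Work/bak.log
BinFile=$BinDir/mysql-bin.index

# Fake three binlogs; the index lists them with a leading ./ as mysqld does.
for n in 000001 000002 000003; do
    echo "binlog $n" > "$BinDir/mysql-bin.$n"
    echo "./mysql-bin.$n" >> "$BinFile"
done

Counter=`wc -l $BinFile | awk '{print $1}'`
NextNum=0
for file in `cat $BinFile`
do
    base=`basename $file`
    NextNum=`expr $NextNum + 1`
    if [ $NextNum -eq $Counter ]
    then
        echo $base skip! >> $LogFile      # newest log is still in use
    else
        if [ -e $BakDir/$base ]
        then
            echo $base exist! >> $LogFile
        else
            cp $BinDir/$base $BakDir
            echo $base copying >> $LogFile
        fi
    fi
done
cat $LogFile
# mysql-bin.000001 copying
# mysql-bin.000002 copying
# mysql-bin.000003 skip!
```

Running the loop a second time would log exist! for the first two files instead of copying them again, which is how the daily job stays idempotent.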
3. Set up the crontab task and execute the backup script every day
# crontab -l // The entries are as follows
# Run the full backup script at 3:00 AM every Sunday
0 3 * * 0 /root/DBFullyBak.sh >/dev/null 2>&1
# Run the incremental backup at 3:00 AM Monday through Saturday
0 3 * * 1-6 /root/DBDailyBak.sh >/dev/null 2>&1
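Instead of editing the crontab interactively, the two entries can also be written to a file and loaded with `crontab <file>`. The sketch below only generates the file; actually loading it is left as a comment, since `crontab <file>` replaces the user's entire existing crontab:

```shell
# Generate a crontab file containing both backup schedules.
CronFile=$(mktemp)
cat > "$CronFile" <<'EOF'
# Full backup at 3:00 AM every Sunday
0 3 * * 0 /root/DBFullyBak.sh >/dev/null 2>&1
# Incremental backup at 3:00 AM Monday through Saturday
0 3 * * 1-6 /root/DBDailyBak.sh >/dev/null 2>&1
EOF
cat "$CronFile"
# To install (this replaces the current crontab): crontab "$CronFile"
```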
Appendix:
sh -n /root/DBFullyBak.sh checks the script for shell syntax errors without executing it.
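Because `sh -n` only parses the script, it is safe to run against backup scripts that would otherwise touch the database. A quick demonstration with two throwaway scripts (file names are generated here just for the demo):

```shell
# sh -n exits 0 for a syntactically valid script...
Good=$(mktemp)
echo 'echo hello' > "$Good"
sh -n "$Good" && echo "good: syntax OK"

# ...and non-zero for a broken one (if without a closing fi).
Bad=$(mktemp)
printf 'if [ -e /tmp ]\nthen echo yes\n' > "$Bad"
sh -n "$Bad" 2>/dev/null || echo "bad: syntax error detected"
```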