7. Nov. 2008

Some of the low-cost hosting providers do not even offer an interface to back up your webspace.

Personally, I find it annoying to download all the previously uploaded files again and again through a slow and not very reliable FTP client, when an easy script executed on the server can do the job so much quicker for you instead.

To give you an idea of what I'm talking about, I've written up an easy example which you may feel free to use in your own environment.

It is able to do both: create a local backup in a separate folder on your server, and transfer this backup to your local machine or even to another FTP server on the net:

#!/bin/sh
# FTP Backup
# Script: ftpbackup.sh
# Author: Michael Lohmar
# Contact: info@mikelo.com

if [ $# != 3 ]; then
  echo ""
  echo "Shell script for backing up one given domain."
  echo "Usage: $(basename $0) domain_to_backup [FTP/NOFTP] [DEL/NODEL]"
  echo ""
  exit 1
fi

##### ENTER YOUR INFO HERE #####

# Your remote server's IP address
# EG: serverip=
serverip=

# The FTP login for the remote server
# EG: serveruser=bob
serveruser=

# The FTP password for the remote server
# EG: serverpass=mypassword
serverpass=

# Local folder where the backup archive is created
# EG: localdir=/backup/folder/daily
localdir=

# Folder containing the domain directory to back up
# EG: sourcedir=/domain/source/folder
sourcedir=

# FTP directory where you want to save files to
# This directory must exist on the FTP server!
# EG: remotedir=/serverdirectory
remotedir=

##### END YOUR INFO HERE #####

# CHMOD the script to 755: chmod 755 ftpbackup.sh

# Add the script to a scheduled cron job to run as often as you like (if wished!)

# In SSH do crontab -e, then paste in the following
# 0 6 * * 0,1,3,5 /home/admin/ftpbackup.sh
# This does an FTP backup roughly every other day (Sun, Mon, Wed, Fri at 6:00);
# look up cron for more info on setting dates and times.
# Don't forget to substitute the path to the script with your details.

host=$(hostname)

cd "$sourcedir" || exit 1

echo "Starting FTP backup on $host"

# Create a local tar.gz archive
tar cfvz "$localdir/$1_$(date +%y_%m_%d).tar.gz" "$1"

# Transfer the tar.gz archive to the remote server
if [ "$2" = "FTP" ]; then
  cd "$localdir" || exit 1
  echo "user $serveruser $serverpass
cd $remotedir
put $1_$(date +%y_%m_%d).tar.gz
" | ftp -i -n "$serverip"
fi

# Delete the local tar.gz archive again
if [ "$3" = "DEL" ]; then
  rm "$localdir/$1_$(date +%y_%m_%d).tar.gz"
fi

echo "FTP backup complete on $host"
exit 0
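The local archiving step of the script can be tried out in isolation before you wire up FTP and cron. The following minimal sketch uses a made-up domain folder (example.com) and backup folder; it runs the same tar invocation the script uses and then lists the archive contents to confirm the backup is complete:

```shell
#!/bin/sh
# Stand-alone sketch of the script's local archiving step.
# "example.com" and "backup" are made-up names for this demo.
mkdir -p example.com backup
echo "hello" > example.com/index.html

# Archive the domain folder into a date-stamped tar.gz,
# exactly as the backup script does.
tar cfz "backup/example.com_$(date +%y_%m_%d).tar.gz" example.com

# List the archive contents to verify the backup.
tar tfz "backup/example.com_$(date +%y_%m_%d).tar.gz"
```

If the listing shows your files, the FTP and DEL steps are the only parts left to configure.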

4. Nov. 2008

Sooner or later, with a permanently growing database, we will end up in a situation where the night just isn't long enough anymore to back up our database completely.

Now you might wonder what size of database I'm talking about. But seriously, databases of several terabytes are nothing uncommon anymore today. Especially when it comes to SAP or similar applications working on them, databases can grow to exceptional sizes.

The Oracle database version 10g now comes along with a new feature helping us to deal with that situation. It is called Block Change Tracking, and it records all database blocks modified by transactions in an additional external file.

When a commit is issued against a data block, the block change tracking information is copied to a shared area in the Large Pool called the CTWR buffer; during the next checkpoint, the CTWR process writes the information from this RAM buffer to the previously defined change-tracking file.

When doing an incremental backup with Oracle 10g, RMAN now has a mechanism to identify and bypass those data blocks which have not changed, by simply following the list of changed blocks within this file.
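For illustration, such an incremental backup is a plain RMAN command; the level-1 backup is the one that benefits from the tracking file, and it requires a level-0 base backup to exist first:

```
RMAN> backup incremental level 0 database;
RMAN> backup incremental level 1 database;
```

The level-0 run is typically the weekend full backup, while level 1 covers the fast nightly runs.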

The syntax for Oracle block-level change tracking is simple:

    ALTER DATABASE ENABLE BLOCK CHANGE TRACKING USING FILE 'os_file_name';

By default, Oracle does not record block change information!

To enable this feature, we need to issue the following command:

SQL> alter database enable block change tracking;

To disable this feature, we issue this command:

SQL> alter database disable block change tracking;
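Whether tracking is currently active, and which file it writes to, can be checked in the V$BLOCK_CHANGE_TRACKING dictionary view:

```
SQL> select status, filename from v$block_change_tracking;
```

STATUS shows ENABLED or DISABLED, and FILENAME is the change-tracking file on disk.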

So it's an absolutely easy-to-configure mechanism that can speed up our nightly incremental backups dramatically, leaving the full backups for the weekends, which customers hopefully won't mind.

The only thing we have to be aware of is the space this file will need. Depending on our database's transaction load, the change-tracking file can certainly reach a serious size.

31. Oct. 2008

Many web developers do excellent work, and as we all know their web pages impress us every day, again and again. But while being experts on the one hand, at other topics, like doing the necessary database backups, they often act like absolute beginners.

When it comes to database interactions using a CMS (Content Management System), they often neglect even the basics, like doing backups frequently. But it can be that easy to have well-ordered backups of your CMS's underlying database done automatically in a most convenient way. And in case you are in 'need', you just restore your database backup and you are back in business again.

Sure, you can code that backup all by yourself, but today we will show you how the combination of just a few things can do all that backup work automatically for you. At least when it comes to MySQL databases, which are widely used for all kinds of web-related database activities.

We first start with a freely available PHP-based backup tool in which we will define the backup to take, while later on we will use some easy-to-implement tricks to automate the backups for you.

The tool of our choice is called phpMyBackupPro, which is available for free at phpMyBackupPro.net. (The tool itself is free, but the author, who has really done a great job, invites you to donate some money to help him continue his excellent work.)

Installing the tool is easily done by extracting the archive and uploading its content to your webserver, but of course it also works on your local system as well.

the basic configuration page
Once uploaded, the tool can be invoked directly, and we start with some basic configuration questions. There isn't much to know or guess when filling out that page; you should know those details already.

the scheduled MySQL backup page
From there we head over to the more interesting part of scheduling a backup. There we first select the databases to back up on the left side. Depending on your environment there might be just one, but in a more complex situation you can also choose several, or all of them, to back them up together later on.

It's no problem to go with all the other settings as they are defined by default, but feel free to play with them if you wish. Even the dangerous-looking option to add a 'drop table' command is fine and should be checked: it refers to a possible later restore of a database, and in that case a prior drop table instruction should certainly be executed.

When we now press the button 'show script', we will see some PHP code as a result, which we can just copy & paste into a new file (let's call it cron.php) that we will later upload into the root folder of our phpMyBackupPro installation.

Depending on our basic installation, invoking this script in any of the usual browsers will now generate the desired database backup.

The result can then either be stored in a subfolder inside the tool's folder, emailed to one of our email accounts, or, quite handily, transferred via FTP to some remote server, really creating an offsite backup there.

Now, calling this script again and again from your browser is not what I would really call automated at all. But honestly, it does not need much more now.

Once we have tested that the script works, we just need some mechanism calling it for us again and again.

This can be done by the OS itself if we help it a bit. What we need now is a line-mode browser invoking the script. On UNIX platforms this can be done using either lynx or curl.

An easy and simple command line script will do:

  • for lynx (which is pretty much common on RedHat systems) we use:

    /usr/bin/lynx -source http://www.yourdomain.com/phpMyBackupPro/cron.php > /dev/null 2>&1

  • and with curl (mostly used on Debian distributions) we use:

    /usr/bin/curl http://www.yourdomain.com/phpMyBackupPro/cron.php

Either one of these lines goes into another new file (let's call it cron.sh, and don't forget to chmod it properly), which we then finally schedule with the usual UNIX cron utility.
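As a minimal sketch, creating that wrapper file and making it executable looks like this (the curl variant; the URL is a placeholder you need to adapt to your own installation):

```shell
#!/bin/sh
# Write the wrapper script; the URL is a placeholder for your own setup.
cat > cron.sh <<'EOF'
#!/bin/sh
/usr/bin/curl -s http://www.yourdomain.com/phpMyBackupPro/cron.php > /dev/null 2>&1
EOF

# Make it executable so cron can run it.
chmod 755 cron.sh
ls -l cron.sh
```

The lynx variant works the same way; only the command line inside the file changes.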

So with the command crontab -e we invoke our crontab file, and by adding a line like:

0 0 * * * /home/<user>/cron.sh

the system will do an automated backup for us seven days a week around midnight.

For a higher frequency or other specialties, please refer to the separate cron documentation first.
