9. Nov. 2008

Sometimes a situation occurs in which we need to do a bulk rename of files. Think of a situation in which we want to rename all .html files to make them inaccessible for a while, or we want to rename all the photos we just downloaded from our camera.

On Unix this can be done easily with a simple four-line shell script like the one below:

for a in *.html; do
  t=`echo "$a" | sed 's/\.html$/.html.en/'`
  mv "$a" "$t"
done

In this example we rename every .html file to .html.en, e.g. index.html becomes index.html.en, which comes in handy in case you want to add another set of web pages to your server supporting a second language.
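One simple safety net before running it for real: replace the mv with an echo to preview the renames a dry run would perform:

for a in *.html; do
  t=`echo "$a" | sed 's/\.html$/.html.en/'`
  echo "mv $a $t"
done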

I guess it is not really necessary to tell you that you should make a copy of your files first?

9. Nov. 2008

Sooner or later a situation comes up in which we want to work with some blogging software or CMS, or we need to upload a bigger set of photos to our server because we want to publish them in an online slideshow or album.

Sure, we can upload all those zillion files with a handy FTP client, file by file, but we will soon figure out that it takes ages to complete. Or we can try to speed it up using the best of all worlds.

Most of today's web hosting is done on UNIX systems, and even if we are not familiar with the UNIX operating system itself, knowing a few basic UNIX commands will help us speed up file handling in general. Assuming we DO have shell access to our web server (often called SSH (Secure Shell) access), the only things we need are an SSH client and a minimum set of UNIX instructions to succeed.

As you might have guessed already, I am suggesting you upload all your files in a single set, as an archive. Zip or gzip your files locally and upload them to your server in one chunk. As you will see later on, this has a lot of advantages: it keeps the amount of data we have to transfer small, since it is zipped and therefore compressed, and it lowers or eliminates the transfer problems you usually run into when uploading several hundred individual files instead.
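For example, assuming your web pages live in a local folder called mysite (the name is just a placeholder), you could pack them up with either of these commands before uploading:

zip -r mysite.zip mysite
tar czf mysite.tar.gz mysite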

After uploading our file we enter the server with our SSH client and switch to the folder into which we uploaded our archive earlier. We can do that with a simple command like cd /home/username/htdocs

Depending on which archive format we used, we can now directly unzip the archive with: unzip filename.zip
(ZIP support is built in on all major UNIX implementations). But in case we are dealing with a .gz file (mostly used in the UNIX world for software distributions), we have to use a different set of commands.
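A side note on unzip before we move on: it can also extract straight into a target folder via its -d option, which saves us the cd step (the path below is just an example):

unzip filename.zip -d /home/username/htdocs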

In the .gz case, two different tools were used to provide us with the software package: the GZIP compression utility, and before that the archive builder TAR, which was used to bundle all the individual files and folders that make up the final tool we want to use.

While GZIP is just a compressor, TAR can archive and, by calling GZIP itself, compress in one step. But to keep it easy here, I will explain it in two separate steps, pointing out how the two tools work together to give you an idea of how it all fits.
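For completeness, the combined one-step forms look like this, with filenames as placeholders; the z flag tells tar to run the data through gzip itself:

tar czf filename.tar.gz foldername   # archive and compress in one go
tar xzf filename.tar.gz              # decompress and extract in one go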

In this second case we usually start with a file called filename.tar.gz. First we have to decompress the archive with the command: gzip -d filename.tar.gz  which leaves us with a decompressed file now named just filename.tar.

As the final step in getting our software at hand, we extract all the content out of the archive file with the following command: tar xf filename.tar

Doing so extracts the archive's content into the folder we are currently sitting in, recreating all the directory structures stored inside the archive. The catch is that we could end up with an unwanted folder like product-version-x.x containing all the files, instead of having them directly in the root folder of our hosted environment as intended.
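You can spot this beforehand: tar's t flag lists the stored paths without extracting anything, so a quick look tells you whether everything sits inside one top-level folder:

tar tf filename.tar | head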

This extra folder is not put there by the developer to annoy us, as one could assume! On the contrary, it is done for safety, preventing us from messing up our file structures by mixing content that does not belong together, and it is simply best practice on UNIX.

To correct this we switch into the new subfolder with cd foldername and issue the command mv * ../ which invokes the UNIX command line utility mv (move) and moves all the content one folder up.
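One caveat with the * glob: it does not match hidden files such as .htaccess, which are quite common on web servers. If the archive contains any, move them explicitly as well; a small sketch:

mv * ../
mv .[!.]* ../ 2>/dev/null   # hidden files too; the redirect hides the error if there are none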

Getting rid of the now empty subfolder is easily done with rmdir foldername  or, but be careful with that, rm -R foldername, which deletes everything within and below it. Is it necessary to mention that we first have to switch back to the folder above with cd ..?

And finally, in case we are happy with the new subfolder but want to get rid of the version number, we can simply rename the folder by using the move utility again: mv oldname newname, e.g. mv wordpress-2.6.3 wordpress

7. Nov. 2008

Some of the low-cost hosting providers do not even offer an interface to back up your webspace.

Personally, I find it annoying to download all the previously uploaded files again and again through a slow and not very reliable FTP client when a simple script executed on the server can do it so much quicker for you instead.

To give you an idea of what I am talking about, I have written up a simple example which you are free to use in your own environment.

It can do both: create a local backup in a separate folder on your server, and transfer this backup to your local machine or even to another FTP server on the net:

#!/bin/bash
# FTP Backup by Michael Lohmar
# Script: ftpbackup.sh
# Author: Michael Lohmar
# Contact: info@mikelo.com

if [ $# -ne 3 ]; then
echo ""
echo "Shell script for backing up one given domain."
echo "Usage: $(basename "$0") domain_to_backup [FTP/NOFTP] [DEL/NODEL]"
echo ""
exit 1
fi

version=1.0

##### INSTALL INSTRUCTIONS: STEP 1 #####
##### START ENTER YOUR INFO HERE #####

serverip=yourserver.com
# Your remote server's address (hostname or IP)
# EG: serverip=192.168.1.1

serveruser=youruser
# The FTP login for the remote server
# EG: serveruser=bob

serverpass=yourpassword
# The FTP password for the remote server
# EG: serverpass=mypassword

localdir=/home/your/local/folder
# WHERE THE BACKUP ARCHIVE IS STORED LOCALLY
# NO TRAILING SLASH
# EG: localdir=/backup/folder/daily

sourcedir=/home/your/source/folder
# WHERE THE FOLDERS TO BACK UP ARE LOCATED
# NO TRAILING SLASH
# EG: sourcedir=/domain/source/folder

remotedir=your/remote/folder
# FTP directory where you want to save files to
# This directory must exist on the FTP server!
# NO TRAILING SLASH
# EG: remotedir=/serverdirectory

##### END YOUR INFO HERE #####

##### INSTALL INSTRUCTIONS: STEP 2 #####
# CHMOD the script to 755: # chmod 755 ftpbackup.sh

# Optionally, add the script to a scheduled cron job to run as often as you like

# In SSH do crontab -e, then paste in the following
# 0 6 * * 0,1,3,5 /home/admin/ftpbackup.sh yourdomain FTP DEL
# This runs the FTP backup at 6:00 on Sun, Mon, Wed and Fri; look up cron for more info on setting dates and times.
# Don’t forget to substitute the path to the script and the arguments with your details
##### INSTALL COMPLETE #####
# DO NOT MODIFY ANYTHING BELOW #

host=`hostname`
cd "$sourcedir"

echo "Starting FTP backup on $host"

# Create a local tar.gz archive of the given domain folder
tar czvf "$localdir/$1_`date +%y_%m_%d`.tar.gz" "$1"

# Transfer the tar.gz Archive to remote server
if [ "$2" = "FTP" ]; then
cd "$localdir"
echo "user $serveruser $serverpass
cd $remotedir
bin
verbose
put $1_`date +%y_%m_%d`.tar.gz
" | ftp -i -n "$serverip"
fi

# Delete local tar.gz Archive again
if [ "$3" = "DEL" ]; then
rm "$localdir/$1_`date +%y_%m_%d`.tar.gz"
fi

echo "Ftp backup complete on " $host
exit 0
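A quick usage example: assuming you made the script executable as described above and want to back up the folder mydomain.com (just a placeholder) from your source directory, transfer the archive via FTP and remove the local copy afterwards, the call would be:

./ftpbackup.sh mydomain.com FTP DEL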
