Wednesday, May 24, 2006

Linux: Burn CD using CLI

To make a CD image of a directory with Rock Ridge and Joliet file support:

mkisofs -o name_of_file.iso -J -r /home/username/name_of_directory

Then

cdrecord -v speed=8 dev=0,0,0 name_of_file.iso

(Note that you need to get the info for the 'dev=' part by running 'cdrecord -scanbus')

To copy a CD to an image file you could use this:

dd if=/dev/scd0 of=/home/username/isoimage.iso

or

cat /dev/scd0 > /home/username/isoimage.iso
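Either way, you can sanity-check the copy by comparing checksums of the device and the image. Here is a sketch using a scratch file in place of a real /dev/scd0, so it can be tried anywhere:

```shell
# Sketch: verify an image copy by checksum. A scratch file stands
# in for the real /dev/scd0 device.
dd if=/dev/urandom of=/tmp/fake_disc.img bs=1k count=64 2>/dev/null
dd if=/tmp/fake_disc.img of=/tmp/copy.iso 2>/dev/null
md5sum /tmp/fake_disc.img /tmp/copy.iso   # the two hashes should match
```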

Sunday, May 21, 2006

MySQL: Incremental Backup

Author: phpfreak
Date: 08/27/2005
Version 1.0
Website: http://www.phpfreaks.com/tutorials/136/0.php

Backing up a database is something that should be done regularly on a dynamically driven website. You never know when the server will crash or your database will become corrupted, so it is wise to take precautions and back up your database as often as possible.

In this very short tutorial, we're going to cover some of the "tried and true" methods to backup your database on the server your website is hosted on.
Shell Backup

If you have the ability to back up your database using some kind of shell scripting, this is the easiest approach. You can simply upload a shell script similar to this one, place it on your server, schedule it with the cron daemon, and forget about it!

#!/bin/bash
TIMESTAMP=`date +%m-%d-%y-%H%M`
cd /home/backups/mysql
# remove backups older than 7 days
find /home/backups/mysql/*.tar.gz -mtime +7 -exec rm -f {} \;
# my database
DBNAME=my_database
mysqldump --opt -c -e -Q -u MYSQL_USER -p'MYSQL_PASSWORD' $DBNAME > $DBNAME.sql
tar czpf $DBNAME.$TIMESTAMP.tar.gz $DBNAME.sql
rm -f $DBNAME.sql

Now let's discuss this shell script a little in depth.

The script begins with the shebang line, followed by a TIMESTAMP variable so that you can identify when the backup was created without relying on the timestamp of the file itself. It is slightly redundant, but it prevents you from overwriting a backup made ten minutes ago, because each file gets a different name.

#!/bin/bash
TIMESTAMP=`date +%m-%d-%y-%H%M`
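To see what that date format produces, you can expand the same command substitution by hand (the archive name here is just an illustration):

```shell
# The backquoted command substitution expands to e.g. "05-24-06-1430"
# (month-day-year-hourminute), giving every archive a unique name.
TIMESTAMP=`date +%m-%d-%y-%H%M`
echo "my_database.$TIMESTAMP.tar.gz"
```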

Next, we change directory (cd) to /home/backups/mysql. It is always safer to cd into the directory an automated script is going to work in, to avoid accidentally damaging some other part of the file system (just my preference).

cd /home/backups/mysql

Next, we run this handy find command to locate any .tar.gz file in this directory that was modified more than 7 days ago, thus keeping only the past 7 days of backups. Usually we don't need more than that, but it's smart to keep at least that many if you can. You never know: you may go away on vacation, or out of town, and not have access to your website for a few days. If you only keep one day's worth of backups, you could be out of luck when it comes time to restore from the last known GOOD backup.

find /home/backups/mysql/*.tar.gz -mtime +7 -exec rm -f {} \;
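You can convince yourself of what -mtime +7 matches with two scratch files, one fresh and one backdated (the demo directory and file names are made up):

```shell
# Sketch: -mtime +7 matches only files modified more than 7 days ago.
mkdir -p /tmp/mtime_demo
touch /tmp/mtime_demo/new.tar.gz
touch -d '10 days ago' /tmp/mtime_demo/old.tar.gz   # GNU touch
find /tmp/mtime_demo/*.tar.gz -mtime +7 -exec rm -f {} \;
ls /tmp/mtime_demo    # only new.tar.gz survives
```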

Moving along, the following code defines the name (DBNAME) of the database we want to back up, dumps it, and archives the dump. There are a few reasons to define the name as a variable; much as you would create variables in PHP, we do it to make the code easier to reuse.

# my database
DBNAME=my_database
mysqldump --opt -c -e -Q -u MYSQL_USER -p'MYSQL_PASSWORD' $DBNAME > $DBNAME.sql
tar czpf $DBNAME.$TIMESTAMP.tar.gz $DBNAME.sql
rm -f $DBNAME.sql

The previous code dumps the database using the best combination of options we could find. Many times in the past, using mysqldump without any flags caused the import process to fail, so we've had a great success rate with this combination: --opt -c -e -Q

Here's a trick for you. If you have multiple databases, you can simply copy the previous code block over and over, changing the DBNAME variable each time, and back up multiple databases in one run.

# my second database
DBNAME=my_database2
mysqldump --opt -c -e -Q -u MYSQL_USER -p'MYSQL_PASSWORD' $DBNAME > $DBNAME.sql
tar czpf $DBNAME.$TIMESTAMP.tar.gz $DBNAME.sql
rm -f $DBNAME.sql

# my 3rd database
DBNAME=my_database3
mysqldump --opt -c -e -Q -u MYSQL_USER -p'MYSQL_PASSWORD' $DBNAME > $DBNAME.sql
tar czpf $DBNAME.$TIMESTAMP.tar.gz $DBNAME.sql
rm -f $DBNAME.sql

# my 4th database
DBNAME=my_database4
mysqldump --opt -c -e -Q -u MYSQL_USER -p'MYSQL_PASSWORD' $DBNAME > $DBNAME.sql
tar czpf $DBNAME.$TIMESTAMP.tar.gz $DBNAME.sql
rm -f $DBNAME.sql
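Instead of copy-pasting the same block, you could also fold the repetition into a loop. Here is a sketch of that structure, with a simple echo standing in for the real mysqldump call so the skeleton can be tried without a MySQL server (all names are placeholders):

```shell
# Sketch: back up several databases in a loop. "echo" stands in for
# the real mysqldump --opt -c -e -Q ... call.
TIMESTAMP=`date +%m-%d-%y-%H%M`
mkdir -p /tmp/mysql_demo && cd /tmp/mysql_demo
for DBNAME in my_database my_database2 my_database3; do
    echo "-- dump of $DBNAME" > "$DBNAME.sql"
    tar czpf "$DBNAME.$TIMESTAMP.tar.gz" "$DBNAME.sql"
    rm -f "$DBNAME.sql"
done
ls /tmp/mysql_demo
```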

Now that you have your script written, give it proper execute permissions and set it up as a cron job.
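For example, assuming the script were saved somewhere like /tmp/cron_demo/backup.sh (a placeholder path for this sketch), the setup would look like:

```shell
# Sketch: create the script file, make it executable, then schedule it.
# The path is a placeholder; adjust it to where you saved the script.
mkdir -p /tmp/cron_demo
echo '#!/bin/bash' > /tmp/cron_demo/backup.sh
chmod 700 /tmp/cron_demo/backup.sh
# crontab entry (add via `crontab -e`) to run it at 2:30am daily:
# 30 2 * * * /tmp/cron_demo/backup.sh
```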
Restoring Your Database Backup

To restore your database, simply extract the database file you want ( tar zxpf database_name.TIMESTAMP.tar.gz ) and then import the mysql file:

mysql -u MYSQL_USER -p'MYSQL_PASSWORD' DATABASE_NAME < database_name.sql

If you get errors on import, you might need to drop the tables inside the database first; however, the dump command above already includes the DROP TABLE statements for you.

If you have a similar backup scenario, or a completely different one, feel free to discuss other options in the discussion thread at the end of this tutorial.

Saturday, May 20, 2006

Linux: Cronjob

Author : Laurence Hunter
Website : http://esm2.imt-mrs.fr/~staffelb/guide_linux/part1/cron.html

The Crontab File's Syntax

To tell cron what you want it to run, and how often you want it to run it, you need to create a crontab file. A crontab file is just a text file with the following syntax:

minute hour day-of-month month-of-year day-of-week command

Each of the above columns can be in one of the following formats (these examples are for the minute column):

30
Run command at 30 minutes past the hour.

0-59/10
Run command once every 10 minutes, for the entire hour.

15-30
Run command once every minute, from 15 to 30 minutes past the hour.

0,10,50
Run command at 0 minutes past the hour, 10 minutes past the hour, and 50 minutes past the hour.

*
Run command once every minute.

And here's the range of numbers available for each of the time and date columns:

minute : 0-59
hour : 0-23
day-of-month : 1-31
month-of-year: 1-12
day-of-week : 0-6
(0=Sun, 1=Mon, 2=Tue, 3=Wed, 4=Thu, 5=Fri, 6=Sat)

Here's an example crontab file:

30 0 * * * ./backup.sh
0,10,50 9-15 * * * ./compute.sh
0-59/10 * * * 1,3,5 ./netgrab.sh

30 0 * * * ./backup.sh
Run the backup.sh script (located in your home directory) at half-past (30) midnight (0), on every day of the month (*), every month of the year (*), and every day of the week (*).

0,10,50 9-15 * * * ./compute.sh
Run the compute.sh script at 0, 10, and 50 minutes past the hour (0,10,50), between 9am and 3pm (9-15), every day of the year.

0-59/10 * * * 1,3,5 ./netgrab.sh
Run the netgrab.sh script every 10 minutes (0-59/10), every Monday, Wednesday, and Friday (1,3,5).
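To install entries like these, you can put them in a file and load the whole file with the crontab command (a sketch; /tmp/mycron is just a scratch filename):

```shell
# Write the entries to a file...
cat > /tmp/mycron <<'EOF'
30 0 * * * ./backup.sh
0,10,50 9-15 * * * ./compute.sh
EOF
# ...then install it with:  crontab /tmp/mycron
# and verify with:          crontab -l
```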
