[ic] Interchange backup & external database

VS-HOST.COM ADMIN interchange-users@icdevgroup.org
Sat Dec 7 15:25:01 2002


-----Original Message-----
Subject: [ic] Interchange backup & external database

[SNIP]
Does anybody have some clever way of exporting all their tables and
backing up without having to export all their tables individually? Or is
exporting their tables in IC not the ideal way to go with this?

Paul

[/END SNIP]
----------------------------------------------------------------

The Perl script below will compress and back up all Interchange and catalog
files (and other important server files, including MySQL databases) to
a remote server. It runs quite quickly here on our production servers in
our internal network, via cron every evening. If running on more than
one server, offset the cron job times so the jobs don't slow each other
down contending for the backup server's disk writes and network bandwidth.
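
For example, staggered crontab entries on two servers might look like
this (the script path and times are hypothetical; adjust to your setup):

# server 1 (/etc/crontab format: min hour dom mon dow user command)
30 2 * * * root /usr/local/bin/backup.pl
# server 2 starts an hour later so both aren't writing to the backup
# server at the same time
30 3 * * * root /usr/local/bin/backup.pl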

[[Start Script]]

#!/usr/bin/perl
# Backup Script - Copyright MediaServices Network 2000-2002
# Requires DBI/DBD::mysql plus the mysqldump, gzip, tar, and ncftpput binaries.

use DBI;

# List directories that are to be backed up.
# Precede with remote backup file name and then the local directory path.

#archive these paths
%path=('IC'=>'/usr/local/interchange/',
       'IC_Catalogs'=>'/path/to/catalogs',
       'Server'=>'/etc/httpd/ /etc/sysconfig/ /etc/mail/ /etc/*.conf /etc/*.cf /etc/aliases* /etc/xinetd.d/ /etc/ftp*',
       'var'=>'/var/lib/ /var/spool/ /var/named/',
       );

# Update this area with local and remote variables.

# Keep local backup files up to $max_age days old; older ones are deleted.
$max_age=2;

# Precede all backup file names with this (could be a server ID).
$prefix='Server1_';

# Backup server IP that will be used in connection process.
$backupserverIP='000.000.000.000';

# If you are backing up MySQL databases, make sure to list a user ID
# that has privileges to list and dump all databases.
$mysqllogin='login';
$mysqlpassword='password';

# Local temp directory which must exist and remote
# server directory which also must exist.
$backup_local_dir='/usr/backup';
$backup_remote_dir='/usr/backup';

# END OF USER CONFIG (Do Not Edit Below This Line).


# Collect the name of every MySQL database on this host.
$dbh=DBI->connect('dbi:mysql:mysql',$mysqllogin,$mysqlpassword);
$sth=$dbh->prepare("show databases");
$sth->execute();
while (($dbname)=$sth->fetchrow_array())
        {
        push (@db,$dbname);
        }
$sth->finish();
$dbh->disconnect();

# Build a date stamp such as 2002_Dec_07 from the localtime string
# ("Sat Dec  7 15:25:01 2002"); the tr turns the space in a space-padded
# day of the month into a leading zero.
$data=scalar(localtime(time()));
$data=~s/(...) (...) (..) (..):(..):.. (....)/$6_$2_$3/;
$data=~tr/ /0/;
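# (An equivalent stamp could be built with the core POSIX module, which
# may read more clearly: use POSIX qw(strftime);
# $data=strftime('%Y_%b_%d',localtime);)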

foreach $baza (@db)
    {
    # Dump each database, compress it, and ship it to the backup server.
    $fname="$backup_local_dir/${prefix}db_${baza}_$data.gz";
    system("/usr/bin/mysqldump --add-drop-table -u $mysqllogin -p$mysqlpassword $baza | gzip > $fname");
    system("ncftpput -V -u backup -p notInhere02 $backupserverIP $backup_remote_dir $fname");
    }

foreach $baza (keys %path)
    {
    # Tar and gzip each directory group, then ship it to the backup server.
    $fname="$backup_local_dir/${prefix}dir_${baza}_$data.tgz";
    system("/bin/tar -P -czf $fname $path{$baza} 2> /dev/null");
    system("ncftpput -V -u backup -p notInhere02 $backupserverIP $backup_remote_dir $fname");
    }

# Prune local copies older than $max_age days; the remote copies remain.
@files=glob("$backup_local_dir/*");
foreach $file (@files)
    {
    $age=int(-M $file);
    if ($age>$max_age)
        {
        unlink $file;
        }
    }

# End Script
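
To restore one of the database dumps, pull it back and pipe it into
mysql. A minimal sketch (the file name, database, and credentials here
are hypothetical; substitute your own prefix, date, and login):

ncftpget -V -u backup -p notInhere02 000.000.000.000 . /usr/backup/Server1_db_mydb_2002_Dec_07.gz
gunzip < Server1_db_mydb_2002_Dec_07.gz | mysql -u login -ppassword mydb

Since the dumps are made with --add-drop-table, the restore drops and
recreates each table it loads.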



Hope this helps some

Thank You
Russ Smith
Mediaservices Network
http://mediaservices.net
----------------------------------------------