TAR Backups - Scalix 10 on CentOS 4.3

Discuss the Scalix Server software

Moderators: ScalixSupport, admin

LHD-Tech
Posts: 74
Joined: Tue Feb 20, 2007 1:25 pm
Location: Lompoc, CA
Contact:

TAR Backups - Scalix 10 on CentOS 4.3

Postby LHD-Tech » Mon Nov 05, 2007 8:01 pm

Every night we run our backup script, which backs up /var/opt/scalix. Each morning I validate the backup tar file with WinRAR. More often than not, my backups stop halfway through with a corruption error, even though the file size suggests it's the full backup. I was wondering if I am going about this the right way, and if there is anyone who could help me fix this so that I have good backups all the time.

My Backup script

Code: Select all

/opt/scalix/bin/omshut                   >> $MAINTLOG 2>&1
sleep 60
tar -cf $BACKUP_DEVICE /var/opt/scalix   >> $MAINTLOG 2>&1
/opt/scalix/bin/omrc                     >> $MAINTLOG 2>&1
mv /backup/2* /misc/mail_backup/$BACKUP_DATE-backup.tar


$BACKUP_DEVICE = mounted folder on local RAID
$BACKUP_DATE = names tar file with the date of the backup (2007-11-01 format)
/mail_backup = mounted network share for off-server storage of backups



Any and all help is much appreciated!
-James-

kanderson

Postby kanderson » Tue Nov 06, 2007 2:08 am

Since you're not compressing it anyway, I'd drop tar and use rsync. The backups will go MUCH faster, and you'll end up with a plain filesystem copy that's easier to recover files from.

If the remote server is Windows, look here: http://www.itefix.no/cwrsync/

Note that the initial sync will take a very long time. Also note that it can run while your server is running; it'll miss a few files because they're in use, but don't worry about that. They'll be picked up as soon as you stop the server and run rsync again to pull over just the updates.

Kev.

mito
Posts: 194
Joined: Fri Mar 24, 2006 11:33 am

Postby mito » Tue Nov 06, 2007 3:01 pm

Not to overstep Kev here, as he is very good at what he does, but I would note that with a plain rsync there is no way to restore anything older than the most recent backup. That is to say, if you rsync daily, there is no way to restore an email that someone accidentally deleted from his spam folder three days ago. Tarballs alleviate this, because you can keep as many days of backups as you have storage space for.

I would venture to say, though, that your backups are corrupt because some files cannot be backed up while the server is running. The solution is to shut down the Scalix services, rsync the files to a local dir, then start the server again. At that point, do a tar -cpf (the -p to keep file permissions and ownership; note that -f must come directly before the archive name, so it's -cpf, not -cfp) to tar up the local backup dir into your backup file. This way you still get a fully restorable system (just untar and you're set), with the added bonus of being able to restore any day you have a tarball for.

There is a backup script around here in the forums that can automate that, and I use a modified version of that script myself.

Mito

kanderson

Postby kanderson » Tue Nov 06, 2007 3:27 pm

So, the solution to the lack of old versions is this:

On Monday, pull the rsync from your Scalix server and drop it into a folder called Monday.

On Tuesday, copy Monday into Tuesday, then sync into Tuesday to make it current.

On Wednesday, copy Tuesday into Wednesday, then sync into Wednesday.

And so on.

This will allow your server to be down for FAR less time than the TAR, and you can manipulate the backup as you see fit on the second server.

OR...

just do 1 rsync, but zip it up on the backup server rather than the Scalix server. Save it out from there.

I prefer rsync because it's really fast. The less time the server is down, the better.

Mito, your input is just welcomed here. I still occasionally post things that are incorrect or not recommended. Heck, Florian corrected me on a post today... :) The main idea is to offer advice on what works based on our experience. Combined, we have WAY more experience than either of us alone.

Kev.

mito
Posts: 194
Joined: Fri Mar 24, 2006 11:33 am

Postby mito » Tue Nov 06, 2007 3:40 pm

kanderson wrote:OR...

just do 1 rsync, but zip it up on the backup server rather than the Scalix server. Save it out from there.


Just a clarification here...

What's the difference between this and what I just said? The Scalix services come back up just as fast either way, since both use rsync to a local dir. The only difference is that mine runs tar on the local system, preserving file permissions, symlinks, etc., and keeping it a fully restorable filesystem. Using the backup server to do the zip isn't possible if the backup target is nothing more than a NAS, and even if it's a Windows system, you lose the fully-restorable-system part.

If the worry is that using the Scalix server to perform the tar will put too great a load on the system, you can get fancy with various throttling methods, but in my experience the load isn't high enough to make the system unusable. (Then again, I will admit that my system is only 75-100 users.)

Again, I'm not saying Kev is wrong, just providing an alternative that still allows a full restore (where using zip from a Windows-based backup server would not). Running the tar from a *nix-based remote backup server, using NFS instead of SMB for the network share, would also give you a fully restorable solution while limiting the overall time and CPU used.

Mito

kanderson

Postby kanderson » Tue Nov 06, 2007 3:48 pm

Nothing. I'd rsync it to the remote machine rather than locally, though, because that way the processor load from the tar/zip wouldn't impact the mail clients. I'd take the time hit of an omcheck during a restore over the daily hit of compressing. Both depend on need and server load: if you need fast recovery, compress locally so you retain permissions; if your server is really busy, or if you're more comfortable with WinRAR or WinZip, compress remotely.

Kev.

LHD-Tech
Posts: 74
Joined: Tue Feb 20, 2007 1:25 pm
Location: Lompoc, CA
Contact:

Postby LHD-Tech » Thu Nov 08, 2007 2:15 pm

So my script should look something more like this:

Code: Select all

/opt/scalix/bin/omshut                         >> $MAINTLOG 2>&1
sleep 60
cp -a /var/opt/scalix /backup/var/opt/scalix   >> $MAINTLOG 2>&1
/opt/scalix/bin/omrc                           >> $MAINTLOG 2>&1
tar -cpf $BACKUP_DEVICE /backup/var/opt/scalix >> $MAINTLOG 2>&1
mv /backup/2* /misc/mail_backup/$BACKUP_DATE-backup.tar

mito
Posts: 194
Joined: Fri Mar 24, 2006 11:33 am

Postby mito » Thu Nov 08, 2007 2:28 pm

LHD-Tech wrote:so my script should look something more like this:


Using that as a basis, I would change it to this:

Code: Select all

/opt/scalix/bin/omshut                               >> $MAINTLOG 2>&1
sleep 60
rsync -rtlpvH /var/opt/scalix /backup/var/opt/scalix >> $MAINTLOG 2>&1
/opt/scalix/bin/omrc                                 >> $MAINTLOG 2>&1
tar -cpf $BACKUP_DEVICE /backup/var/opt/scalix       >> $MAINTLOG 2>&1
rsync -rtlpvH /backup/2* /misc/mail_backup/$BACKUP_DATE-backup.tar


The huge reason to use rsync is, first, that it is FAST and only copies what is needed. Beyond that, it also checksums the transfer, so you know the source and destination finished identical and the copy was complete. The other options on the command line preserve links, times, and permissions, something plain cp does not always do.

Also, the reason not to move the files is that you lose rsync's speed when there is no existing copy at the destination, since rsync only transfers files whose checksums differ from the source.

LHD-Tech
Posts: 74
Joined: Tue Feb 20, 2007 1:25 pm
Location: Lompoc, CA
Contact:

Postby LHD-Tech » Thu Nov 08, 2007 3:51 pm

So the first rsync will probably need to be done manually, since the time it takes is unknown, and then each night afterwards can be done by the script?

And is this rsync something I need to install? - Never mind, I see that it's already installed.
Thanks for the help, guys! I will test this on the test server and probably go live with it next week.

mito
Posts: 194
Joined: Fri Mar 24, 2006 11:33 am

Postby mito » Thu Nov 08, 2007 3:58 pm

LHD-Tech wrote:So the first rsync will probably need to be done manually, since the time it takes is unknown, and then each night afterwards can be done by the script?

And is this rsync something I need to install?


rsync should be installed by default, but if not, you can install it via 'yum install rsync'. As for the first rsync, it doesn't *have* to be done by hand, but doing so would definitely help speed up the nightly backup. If you do perform the first rsync by hand, make sure to include the same options that are in the script. You might also want to add --progress immediately after the other options so it shows its transfer progress; otherwise it looks like it's not doing anything while it transfers the files.

As an option, instead of calling the Scalix startup and shutdown commands directly, you can use ' /sbin/service scalix stop ' and ' /sbin/service scalix start '. Those are what I always use to start and stop Scalix; I'm not sure what the difference in output would be.

LHD-Tech
Posts: 74
Joined: Tue Feb 20, 2007 1:25 pm
Location: Lompoc, CA
Contact:

Postby LHD-Tech » Fri Nov 09, 2007 5:24 pm

WOW! The first rsync is waaaay long. I started it this morning and it's still running. I have about 170 users, which makes a tar file around 26GB in size. I'll make sure to post the rest of my findings while testing this out.

Again, thanks for the help and info!

LHD-Tech
Posts: 74
Joined: Tue Feb 20, 2007 1:25 pm
Location: Lompoc, CA
Contact:

Postby LHD-Tech » Tue Nov 13, 2007 1:31 pm

It seems as if adding /sbin/service scalix stop has fixed my problem with the corrupt tar files. It has only been a couple of days, so I will give it a couple more before I call it a complete success.

Thanks a million for all the help!
-James-

techsharp
Posts: 436
Joined: Tue Jan 16, 2007 9:01 pm

Postby techsharp » Tue Nov 13, 2007 1:39 pm

LHD-Tech wrote:WOW! The first rsync is waaaay long. I started it this morning and it's still running. I have about 170 users, which makes a tar file around 26GB in size. I'll make sure to post the rest of my findings while testing this out.

Again, thanks for the help and info!


The first run will take the longest; the following ones are much shorter.

LHD-Tech
Posts: 74
Joined: Tue Feb 20, 2007 1:25 pm
Location: Lompoc, CA
Contact:

Postby LHD-Tech » Tue Nov 13, 2007 6:14 pm

Yeah, but I ended up not using the rsync command. I have been able to get my tar files to complete by simply using

Code: Select all

/sbin/service scalix stop


I am guessing there were still files in use in some way when using

Code: Select all

/opt/scalix/bin/omshut

LHD-Tech
Posts: 74
Joined: Tue Feb 20, 2007 1:25 pm
Location: Lompoc, CA
Contact:

Postby LHD-Tech » Wed Nov 14, 2007 3:19 pm

Hmm, well, the backup was corrupt this morning... I may just have to use the rsync command after all...

