Free Republic

To: ThePythonicCow
For backup, I'd recommend USB external drives. Have at least three of them, and rotate them every week or three. Then even if one goes out once a year, it won't cost you too much lost data. Keep the ones not in use a fair distance from the computer, so that thieves, fires, and other disasters that get to one don't get to the others.

To protect myself against malware, I've been thinking it would be nice to have a backup system where one computer does nothing except run a backup server (and should thus be immune to nasties). The backup server should enforce versioning of files in certain directories, so that if a file gets changed the backup server will keep old and new versions indefinitely; it should not be possible to overwrite the old version from the network.

I would think that such a system, properly configured, would provide considerable protection against malware and such. Anyone know of any software to help implement such a thing?
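The versioning behavior described above can be sketched in a few lines. This is only an illustration of the idea, not any particular backup product; the `backup_file` helper and its naming scheme are invented here. Each save goes into the vault under a timestamped, checksum-tagged name, and an existing version file is never overwritten, so earlier versions survive even if the current file is corrupted.

```python
import hashlib
import shutil
import time
from pathlib import Path

def backup_file(src: Path, vault: Path) -> Path:
    """Copy src into vault, keeping every prior version.

    Each version is stored under a timestamped, checksum-tagged
    name; existing version files are never overwritten, so a
    writer that can only add new files cannot destroy history.
    """
    vault.mkdir(parents=True, exist_ok=True)
    digest = hashlib.sha256(src.read_bytes()).hexdigest()
    stamp = time.strftime("%Y%m%d-%H%M%S")
    dest = vault / f"{src.name}.{stamp}.{digest[:12]}"
    if not dest.exists():  # never clobber an existing version
        shutil.copy2(src, dest)
    return dest
```

A real backup server would also run this as the only service on the machine and refuse network requests to delete or rewrite anything already in the vault.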

Also, are there any good programs for inventorying the contents of hard disks, including hash values for files, so as to allow a comparison of contents?
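A minimal version of such an inventory tool is easy to write yourself. The sketch below (the `inventory` and `compare` names are mine, not from any existing program) walks a directory tree, records a SHA-256 digest for every file, and diffs two inventories to report additions, deletions, and changed contents.

```python
import hashlib
from pathlib import Path

def inventory(root: Path) -> dict[str, str]:
    """Map each file's path (relative to root) to its SHA-256 hex digest."""
    result = {}
    for path in sorted(root.rglob("*")):
        if path.is_file():
            h = hashlib.sha256()
            with path.open("rb") as f:
                # Hash in 1 MiB chunks so large files don't fill memory.
                for chunk in iter(lambda: f.read(1 << 20), b""):
                    h.update(chunk)
            result[str(path.relative_to(root))] = h.hexdigest()
    return result

def compare(a: dict[str, str], b: dict[str, str]):
    """Return (only_in_a, only_in_b, changed) between two inventories."""
    only_a = sorted(a.keys() - b.keys())
    only_b = sorted(b.keys() - a.keys())
    changed = sorted(k for k in a.keys() & b.keys() if a[k] != b[k])
    return only_a, only_b, changed
```

Saving an inventory to disk after each backup lets you detect silent corruption or tampering by re-running it later and comparing.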

23 posted on 10/14/2006 5:02:25 PM PDT by supercat (Sony delenda est.)
[ Post Reply | Private Reply | To 15 | View Replies ]


To: supercat
Eh ... well, yes. But I wrote it for my own use. It's too weird to be generally usable. For example, when I want to do a restore, I have to custom-write a restore script on the fly. And it relies on intricacies of the SCCS file format that few people on the planet will ever want to rely on. It keeps all versions forever, using a crypto-strong checksum to identify any change. I can recover old CompuServe files from over a decade ago, and restore any portion of my file system to any point in time. It is very economical in resource use while backing up. I have it running a backup a couple of times a day, and without looking at the cron script, I don't even know when those times are anymore.

One thing I do differently from what you describe: instead of backing up to a separate dedicated server, I back up to an online disk partition, then replicate that backup partition using rsync to a second backup on a removable SATA drive, which is mounted only briefly during the backup.
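The replication step works because rsync only copies files that differ, by default comparing size and modification time. As a crude stand-in for `rsync -a src/ dst/`, here is that comparison logic in Python (the `mirror` function is my own illustration, not part of the setup described above):

```python
import shutil
from pathlib import Path

def mirror(src: Path, dst: Path) -> list[str]:
    """One-way mirror of src into dst, copying only new or changed files.

    Files are skipped when size and modification time match the
    copy already in dst, which is rsync's default quick check.
    """
    copied = []
    for path in src.rglob("*"):
        if not path.is_file():
            continue
        rel = path.relative_to(src)
        target = dst / rel
        s = path.stat()
        if target.exists():
            t = target.stat()
            if t.st_size == s.st_size and int(t.st_mtime) == int(s.st_mtime):
                continue  # unchanged since last mirror
        target.parent.mkdir(parents=True, exist_ok=True)
        shutil.copy2(path, target)  # copy2 preserves the mtime
        copied.append(str(rel))
    return copied
```

Because `copy2` preserves modification times, a second run over an unchanged tree copies nothing, which is what makes the briefly-mounted removable drive cheap to refresh.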

It is a big time saver to have near-real-time backups online whenever I need them. Given the multiple file copies, stored under non-obvious names (their hex checksums) in non-obvious places, I am at very low risk of some malware figuring this all out and corrupting all my backups, including the copies rotated offline.
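The checksum-named storage mentioned above is a content-addressed layout. A minimal sketch of the idea (the `store` helper and directory fan-out are assumptions of mine, not the SCCS-based tool described in this post): each file lands in the pool under its SHA-256 digest, so identical contents cost nothing to store twice and nothing in the pool reveals the original file name.

```python
import hashlib
import shutil
from pathlib import Path

def store(src: Path, pool: Path) -> Path:
    """Store one file in the pool under its SHA-256 digest.

    Identical contents map to the same pool path, so an unchanged
    file is free on a second backup, and the pool's file names say
    nothing about what they contain.
    """
    digest = hashlib.sha256(src.read_bytes()).hexdigest()
    dest = pool / digest[:2] / digest  # fan out by first hex byte
    if not dest.exists():
        dest.parent.mkdir(parents=True, exist_ok=True)
        shutil.copy2(src, dest)
    return dest
```

A full backup then only needs a small index mapping original paths and timestamps to digests; the pool itself never changes an existing entry, only adds new ones.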

The best bet I know of that's generally available and usable is rdiff-backup. I use it myself on one system.

Another one that looks promising, but that I haven't tried, is RSyncBackup.

24 posted on 10/14/2006 5:44:32 PM PDT by ThePythonicCow (We are but Seekers of Truth, not the Source.)
[ Post Reply | Private Reply | To 23 | View Replies ]

To: supercat
I think you are describing Tripwire here. From the site Tripwire.com:

Open Source Tripwire and Tripwire Enterprise

If you need to detect changes made to your Linux and UNIX servers, you have three choices - Open Source Tripwire, Tripwire for Servers, and Tripwire Enterprise. Although they all share a common heritage, these solutions have significant differences that make them appropriate for different IT environments:


25 posted on 10/14/2006 5:54:54 PM PDT by ThePythonicCow (We are but Seekers of Truth, not the Source.)
[ Post Reply | Private Reply | To 23 | View Replies ]



FreeRepublic, LLC, PO BOX 9771, FRESNO, CA 93794
FreeRepublic.com is powered by software copyright 2000-2008 John Robinson