Posted on 09/20/2005 4:13:50 PM PDT by Zuben Elgenubi
A new way to stop digital decay
From The Economist print edition
Computing: Could a virtual computer, built from software, help to save today's digital documents for historians of the future?
WHEN future historians turn their attention to the early 21st century, electronic documents will be vital to their understanding of our times. Old web pages may not turn yellow and brittle like paper, but the digital documents of today's culture face a more serious threat: the disappearance of computers able to read them. Even a relatively simple electronic item, such as a picture, requires software to present it as a visible image, but 100 years from now, today's computers will have long since become obsolete. More complex items, like CD-ROMs or videos, will be unreadable even sooner.
In 1986, for example, 900 years after the Domesday Book, the BBC launched a project to compile data about Britain, including maps, video and text. The results were recorded on laserdiscs that could be read only by a special system based around a BBC Micro home computer. But since the discs were unreadable on any other system, this pioneering example of multimedia was nearly lost for ever. It took two and a half years of patient work with one of the few surviving machines to move the data on to a modern PC (it can be seen online at www.domesday1986.com).
National libraries are just starting to grapple with this problem as part of their new mandate to preserve digital culture. "It is a major problem, but it is remarkable how little known it is," says Hilde van Wijngaarden, head of digital preservation at the National Library of the Netherlands. "People just accept that things no longer work after ten years."
Keeping working examples of all computer hardware is impractical, so the most popular preservation strategy is to copy files from one generation of hardware to the next. The problem is that today's word processors and web browsers, for example, do not always display files in the same way that older software did. An accumulation of subtle errors can eventually make the original item unreadable. An alternative approach, called emulation, uses software to simulate the old hardware on a modern computer, to allow old software to run. But today's emulators will need another emulator to run on the next generation of hardware, which will need another emulator for the next generation, and so on. This can also introduce errors.
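To see why migration worries archivists, here is a toy sketch in Python; the formats and field names are invented for illustration, not drawn from any real converter. Each generation's software copies forward only the fields it understands, so detail leaks away silently.

```python
# Sketch of how migration accumulates loss: each generation's
# converter copies only the fields it understands. The "formats"
# and field names here are invented for illustration.

GENERATIONS = [
    {"title", "body", "author", "styles"},   # fields gen-1 software knows
    {"title", "body", "author"},             # gen-2 dropped style support
    {"title", "body"},                       # gen-3 dropped authorship
]

def migrate(doc, known_fields):
    """Copy a document forward, silently discarding unknown fields."""
    return {k: v for k, v in doc.items() if k in known_fields}

doc = {"title": "Report", "body": "...", "author": "anon",
       "styles": "serif"}
for known in GENERATIONS:
    doc = migrate(doc, known)
print(doc)   # {'title': 'Report', 'body': '...'} -- detail lost en route
```

No single step looks destructive, which is exactly why the errors the article mentions are so hard to catch.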
So the National Library of the Netherlands is exploring a third option, using a simulated computer that exists only in software. It is called the Universal Virtual Computer (UVC) and is being developed by IBM, a computer giant. The researchers are writing programs to run on this virtual computer that decode different document formats. Future libraries will have to write software that emulates the virtual computer on each new generation of computer systems. But once that is done, they will be able to view all their stored documents using the decoders written for the virtual computer, which only have to be written once. "The decoder can be tested for correctness today, while the format is still readable," says Raymond van Diessen of IBM.
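As a rough illustration of the division of labour the article describes (this is not IBM's actual design; the UVC's real instruction set is far richer), here is a toy interpreter in Python with an invented five-instruction set and a made-up archive format. Only the interpreter would need re-implementing on each future platform; decoder programs written for it are frozen once and reused for ever.

```python
# Toy illustration of the UVC idea: a tiny fixed instruction set,
# interpreted in software. Only this interpreter must be ported to
# future hardware; decoder programs written for it never change.

def run_uvc(program, data):
    """Interpret a decoder program against raw archived bytes."""
    stack, out, pc, pos = [], [], 0, 0
    while pc < len(program):
        op, *args = program[pc]
        if op == "READ":          # push the next input byte
            stack.append(data[pos]); pos += 1
        elif op == "PUSH":        # push a constant
            stack.append(args[0])
        elif op == "ADD":         # pop two values, push their sum
            b, a = stack.pop(), stack.pop(); stack.append(a + b)
        elif op == "EMIT":        # pop one value into the decoded output
            out.append(stack.pop())
        elif op == "JNZ":         # jump back while input remains
            if pos < len(data):
                pc = args[0]; continue
        pc += 1
    return out

# A decoder for a made-up archived format: each stored byte is the
# true value minus 10, so the decoder adds 10 back and emits it.
decoder = [("READ",), ("PUSH", 10), ("ADD",), ("EMIT",), ("JNZ", 0)]
archived = bytes([55, 62, 98])
print(run_uvc(decoder, archived))   # [65, 72, 108]
```

The point of the design is that the decoder list above never has to change; only `run_uvc` gets rewritten for each new generation of hardware, and it can be verified against decoders whose output is still checkable today.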
His team has written decoders for two common image formats, JPEG and GIF. They plan to move on to Adobe's PDF format. IBM is also talking to drug firms, which are required to store data from clinical trials for long periods. Ultimately, the aim is to be able to preserve anything from simple web pages to complex data sets. Ominously, some scientific data from the 1970s has already crumbled into unreadable digital bits.
What happens in the not too distant future when the hardware no longer supports a particular version of the Universal Virtual Computer?
Then won't they need to write another UVC to run the original UVC?
But the hardware will continue to evolve, and you will need an ever-lengthening chain of UVC emulators, each running the previous one, just to read the files.
This is all very silly.
All we need is a complete list of rules for reading each file type. At any point in the future, someone can write a program to read any file whose rules have been preserved (see the sketch after this comment).
In the future when humans all have brain implants or have been replaced by AI robots it will take microseconds to generate the code and run it against the sum of all documents.
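For what it's worth, the "keep the rules, write the reader later" idea can be made concrete. A minimal sketch in Python, using the plain PBM image format as the example only because its complete published rules fit in a paragraph:

```python
# Decoding a file purely from its published rules: plain PBM (P1).
# The rules: magic number "P1", optional '#' comments, whitespace-
# separated width and height, then width*height pixels as 0 or 1.

def decode_pbm(text: str) -> list:
    """Return the bitmap as rows of 0/1 ints, using only the spec."""
    tokens = []
    for line in text.splitlines():
        line = line.split('#', 1)[0]      # comments run to end of line
        tokens.extend(line.split())
    if tokens[0] != 'P1':
        raise ValueError('not a plain PBM file')
    width, height = int(tokens[1]), int(tokens[2])
    bits = [int(c) for c in ''.join(tokens[3:])]  # pixels may be packed
    if len(bits) != width * height:
        raise ValueError('pixel count does not match header')
    return [bits[r * width:(r + 1) * width] for r in range(height)]

sample = "P1\n# a 3x2 checker\n3 2\n1 0 1\n0 1 0\n"
print(decode_pbm(sample))   # [[1, 0, 1], [0, 1, 0]]
```

Of course, the hard part is making sure the rules for formats far messier than PBM really are complete, and that they survive alongside the files.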
CD-ROMs have a much shorter lifespan than good-quality paper. Hard disks and tapes last even less.
Floppies, CDs, etc. decay physically.
Computer PRINTOUTS on good paper will survive.
Good paper lasts longer, and it has passed the test of time.
And all it takes is one careless flick of a Bic lighter to burn that theory to ash... hence my point, that information (data) has always been lost through the ages, regardless of its media type.
As far as the daguerreotype goes, well, I'm not an expert on the evolution of photography, so I couldn't tell you anything about that. :)
You are absolutely correct. We have early 19th-century records which are holding up much better than a lot of our 20th-century records. But it's always good to have a backup in case, God forbid, a fire should occur.
I wish they had microfilmed all the military records that were burned in the St. Louis fire in the early 1970s. My dad's WWII records were lost. They recreated as much as possible, but who knows what else was there.
True, there is magnetic degradation. But signal-processing techniques should make it possible to discern data that has faded too much to be readable via standard hardware (which must 'get' everything in one go); see the sketch below.
Of course, if the oxide falls off the disk that's another story.
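A minimal sketch of the idea in Python, assuming (hypothetically) a drive that exposes repeated raw passes over a marginal sector; `read_sector_raw` here is a stand-in that just simulates bit-flip noise, since real consumer hardware rarely exposes such an interface:

```python
# Hypothetical sketch: recover a faded sector by majority vote
# across many repeated noisy reads. read_sector_raw() is an assumed
# interface, simulated here with random bit flips.
import random

def read_sector_raw(truth: bytes, error_rate: float = 0.2) -> bytes:
    """Stand-in for one noisy raw pass: flips random bits of `truth`."""
    out = bytearray(truth)
    for i in range(len(out)):
        for bit in range(8):
            if random.random() < error_rate:
                out[i] ^= 1 << bit
    return bytes(out)

def majority_read(truth: bytes, passes: int = 31) -> bytes:
    """Combine many noisy reads: each bit takes its most common value."""
    reads = [read_sector_raw(truth) for _ in range(passes)]
    recovered = bytearray(len(truth))
    for i in range(len(truth)):
        for bit in range(8):
            votes = sum((r[i] >> bit) & 1 for r in reads)
            if votes * 2 > passes:        # a majority of passes saw a 1
                recovered[i] |= 1 << bit
    return bytes(recovered)

data = b"faded magnetic sector"
print(majority_read(data) == data)   # usually True with enough passes
```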
All depends on how well the film is made. Documents which are being archived for the purpose of preserving a good quality copy of the original will probably do pretty well. Documents which are thrown on microfilm because regulation XYZ says to do so even though nobody ever looks at them anyway may not do so well.
Why don't they just ask the BATFEC and the IRS what to do about it? I'm sure the pasty-faced droids up there have figured out how to store every detail of personal information about every living human on Earth for at least the next 103,487 years with no possibility of escape... oops, excuse me, I mean data degradation.
JPEG and TIFF accommodate embedded metadata of that type (and more). See http://www.aspjpeg.com/manual_06.html for details on embedded image metadata insertion and extraction.
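For anyone who wants to poke at embedded metadata without that ASP component, here is a minimal sketch using the Python Pillow library's `Image.getexif()`; `photo.jpg` is a placeholder path:

```python
# Minimal sketch: reading embedded EXIF metadata from a JPEG with
# the Pillow library. "photo.jpg" is a placeholder file name.
from PIL import Image, ExifTags

with Image.open("photo.jpg") as img:
    exif = img.getexif()                          # Exif tag mapping
    for tag_id, value in exif.items():
        name = ExifTags.TAGS.get(tag_id, tag_id)  # numeric id -> name
        print(f"{name}: {value}")
```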
The advertisement column is funny and misleading. Parsers for all interesting formats are easily written and maintained on UNIX systems. Frequently updated OS emulators are unnecessary, although many of those are also being written and maintained.
Don't forget the wealth of historic records and photonegatives that were lost when the World Trade Center was destroyed. The basement of that building was used to house tons of historic content that is now forever lost.
Making daguerreotypes is what killed Daguerre.
CDs, if kept at a good temperature, humidity, and kept away from light, will last several hundred years.
I have wondered about this "digital decay" every time I send off a sound recording to the Library of Congress for copyright registration. Not only will CDs be unreadable and obsolete in the near future (if they aren't already), but I have heard that the digital data on the discs themselves naturally depletes over time.
Baconian, Websterian, Collierian, call it what you will, encyclopedic efforts are inspired by the very nature that dooms them - the light that eclipses itself.
Maybe "dark ages" are of the same order of "ice ages."
Rose petals in Grandma's bible.
The only way to freeze time is to unplug the clock.