I've noted these issues before, and they've come up again recently in the CC Debate forum (in relation to long-term prospects for project formats), but I'll post some more, since this still seems to me to be a major area of unsolved problems.
from the article:
"If anyone was prepared for digitization, it should have been the studios — they were behind the push for digital theatrical distribution to begin with. And yet what the report found was an environment in which long-term planning for preserving digital information was not being done, in which the existing technology wasn’t adequate for archival needs, and finally, in which film preservation would require “significant and perpetual spending” far above what was necessary for analog preservation."
"... a digital film archive needs to invest heavily in data migration to maintain its assets ... If the last decade has taught us nothing else, it’s that our system rewards executives who make horrible long-term decisions for short-term results. ... currently, studios also store a 35mm print, either a color negative or a black-and-white YCM color separation. This doesn’t preserve all of the information as perfectly as the digital copy, but if properly stored, this copy can last 100 years before it begins to deteriorate."
"One of the conclusions The Digital Dilemma reached was that the motion-picture industry should push for the development of a true digital archival format, capable of surviving 100 years of benign neglect."
author Matthew Dessem in the comments:
"... at least one person I spoke to said that some of the old [original camera files] were already impossible to read because of companies folding and taking proprietary compression algorithms with them. Actually a standardized filing/labeling system for these would also come in handy as each editor tends to name their work files according to whatever system they use to work, so basically a lot of what gets taken in by studios is just tons of files with no identifying information."
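The standardized filing/labeling system Dessem wishes for could be as simple as a sidecar manifest written at ingest time, recording exactly the identifying information that gets lost in editor hand-offs. A minimal sketch in Python — the field names and `.manifest.json` layout here are my own invention for illustration, not any studio or archive standard:

```python
import hashlib
import json
import os
from datetime import datetime, timezone

def make_manifest(path, title, role):
    """Build a sidecar manifest for one asset file.

    Records what tends to get lost when editors hand off work files:
    which production the file belongs to, what role it plays, and a
    checksum so later readers can tell whether the bytes survived.
    """
    sha256 = hashlib.sha256()
    with open(path, "rb") as f:
        # Stream in 1 MB chunks so large assets never need to fit in memory.
        for chunk in iter(lambda: f.read(1 << 20), b""):
            sha256.update(chunk)
    return {
        "filename": os.path.basename(path),
        "title": title,            # production the asset belongs to
        "role": role,              # e.g. "camera original", "VFX comp"
        "bytes": os.path.getsize(path),
        "sha256": sha256.hexdigest(),
        "ingested": datetime.now(timezone.utc).isoformat(),
    }

def write_sidecar(path, title, role):
    """Write the manifest next to the asset as <file>.manifest.json."""
    manifest = make_manifest(path, title, role)
    with open(path + ".manifest.json", "w") as f:
        json.dump(manifest, f, indent=2)
    return manifest
```

The point isn't this particular schema; it's that the manifest is plain JSON, readable without any proprietary software, which sidesteps the "companies folding and taking their formats with them" problem for the metadata at least.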
user "Kelly" in the comments:
"Analog is seen as the superior format right now partially because we've already spent a century making mistakes and eventually figuring out how best to handle it, and of course there are plenty of advantages to film (being able to physically wind into a film with your bare hands and see an image, for instance), but it's never been an easy or perfect process. Film prints can still fade and turn weird colors even when in proper storage, just because they weren't developed or washed properly, or because they sat in a hot warehouse for a year before they arrived to the proper storage. It's just a bit more of a gradual yet inevitable loss than digital is at the moment. Digital preservation and data migration have a lot of issues and cost outlay at the moment, but they are also the inevitable future, so our efforts are best put into being practical, considered, well-informed and open to change when figuring out how best to move forward."
user "Cinecraft" in the comments:
"Countless hours of television footage is at risk because the tape stock it is on is decaying. File based formats are in even worse shape. In a particularly notorious example, nearly half of the original files for Toy Story had to be re-rendered because the LTO tape they were stored on had become unreadable or corrupt. And LTO has heretofore been regarded as the MOST secure digital asset protection medium. It is what my company uses to back up its digital video files."
"But there are a lot of unknowns, and there is still the issue of the drive becoming unreadable due to the format becoming obsolete. ... Lord knows what will be in store for future archivists as they attempt to decode each codec. We need to agree upon a standard codec that can be revised with time, but also remains backwards compatible. And we need to arrive at a standard drive format and connectivity. We need to design a system that can be modified but without discarding what already exists, so it becomes possible to work one's way back when circumstances dictate."
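Part of what makes the Toy Story-style loss so painful is that the corruption was discovered only when someone tried to read the files, years later. A routine fixity audit — recompute checksums and compare against the values recorded at ingest — catches this while the files may still be recoverable from another copy. A sketch (the two-function layout is mine, not any particular archive's tooling):

```python
import hashlib

def sha256_of(path):
    """Stream a file through SHA-256 so large assets don't need to fit in memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def audit(expected):
    """Compare each file against its recorded checksum.

    `expected` maps path -> sha256 hex recorded at ingest. Returns the
    paths whose bytes no longer match, or that can't be read at all --
    i.e. the files that would otherwise fail silently years later.
    """
    failed = []
    for path, digest in expected.items():
        try:
            ok = sha256_of(path) == digest
        except OSError:
            ok = False
        if not ok:
            failed.append(path)
    return failed
```

Run on a schedule against every copy, this turns "unreadable or corrupt" from a surprise at restore time into an alert while a good copy still exists.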
Additionally, from last year:
"... not too long ago, studios simply threw films away. Paramount planned to burn its old nitrate. MGM was set to dump its original negatives — including those for Gone With the Wind and The Wizard of Oz — into the ocean. What did they need those for, they figured? They'd made copies.
Luckily for the studios, archivists at UCLA and Eastman House took the prints instead. Because, years later, MGM wanted to digitize its old movies and needed the originals back. The copies they'd made, on Kodak stock, had faded."
"And even after the films are converted to digital, Jan-Christopher Horak, director of the UCLA Film & Television Archive, calls the challenges of preserving them "monumental." Digital is lousy for long-term storage.
The main problem is format obsolescence. File formats can go obsolete in a matter of months. On this subject, Horak's every sentence requires an exclamation mark. "In the last 10 years of digitality, we've gone through 20 formats!" he says. "Every 18 months we're getting a new format!"
So every two years, data must be transferred, or "migrated," to a new device. If that doesn't happen, the data may never be accessible again. Technology can advance too far ahead."
All of this, though, is about end-result preservation, and doesn't really speak to things like project files, etc. There is mention of the "Toy Story" incident, which perhaps speaks to this more directly (they had to re-render parts of the film).
from the L.A. Weekly link:
"Five years after the first Toy Story came out, producers wanted to release it on DVD. When they went back to the original animation files, they realized that 20 percent of the data had been corrupted and was now unusable."
The implication here (though I can't find a confirmation) is that they had to re-render one fifth of the film from some form of original project files. If anyone has further info on this, I'd be interested.
(Note: this is not the tangential and unrelated story of the Toy Story 2 incident - YouTube below - and this is unrelated to the 3D rendering of TS 1 & 2, though one suspects they might be going back to original work for that ... )
A further tangent, but related:
When Artworks Crash: Restorers Face Digital Test
"When Whitney curators decided to resurrect the piece last year, the art didn’t work. Once innovative, “The World’s First Collaborative Sentence” now mostly just crashed browsers. The rudimentary code and links were out of date. There was endlessly scrolling and seemingly indecipherable text in a format that had long ago ceased being cutting edge."
"Their work involved not only updating servers and running legacy browsers on vintage computer systems, but also considering theoretical and ethical aspects of the conservation, conducting interviews with the original programmer to document what the lost software did, writing the new code for the work, addressing a host of thorny technical issues and documenting the results. In short, the project is an excellent case study of the issues involved with digital conservation ..."
Edit: original metafilter thread with further discussion:
"6 or 7 years ago I started looking at archiving options for the company I was with at the time and I quickly came to realize that archiving in the classic sense (put a high quality master in a controlled environment) really didn't exist in the digital age because codecs, mediums and software changed too rapidly. Digital archiving is more like a never ending series of migrations to newer codecs, containers, and storage mediums."
"As one of your examples mentioned, actual film can be seen by the naked eye and, without much trouble, a basic viewing device could be constructed to play it back. That's certainly not the case with digital where there are multiple hardware and software variables (many of them proprietary) that must be accounted for."