Film still has much higher resolution, a wider color gamut, and greater dynamic range than digital sensors; however, the convenience, instant feedback, and cost savings offered by digital photos and video will eventually confine analog film to niche uses. Today’s (2012) consumer-level digital resolutions already capture images that exceed most people’s visual acuity for small formats (e.g., Apple’s “Retina” displays, 5″x7″ prints), but feature movies are still shot on 35-mm film to safely scale to large theater screens.
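The acuity claim is easy to sanity-check with back-of-envelope arithmetic. The sketch below computes the pixel density at which adjacent pixels become indistinguishable, assuming the commonly cited 1-arcminute angular resolution for 20/20 vision and a 12-inch viewing distance (both assumptions of mine, not figures from the text):

```python
import math

def required_ppi(viewing_distance_in, acuity_arcmin=1.0):
    """Pixels per inch needed so a single pixel subtends no more
    than `acuity_arcmin` of arc at the given viewing distance."""
    pixel_size_in = viewing_distance_in * math.tan(math.radians(acuity_arcmin / 60.0))
    return 1.0 / pixel_size_in

# A phone held at ~12 inches needs roughly 286 ppi to look "pixel-free";
# the iPhone 4's 326-ppi Retina display clears that bar.
print(round(required_ppi(12)))  # → 286
```

At arm's length the threshold drops further, which is why modest pixel counts already satisfy the eye for small prints and screens.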
Ken Rockwell’s sage take on analog vs. digital in 2002 remains true today:
Convenience has always won out over ultimate quality throughout the history of photography. Huge home-made wet glass plates led to store-bought dry plates which led to 8 x 10″ sheet film which led to 4 x 5″ sheet film which led to 2-1/4″ roll film which led to 35mm which led to digital. As the years roll on the ultimate quality obtained in each smaller medium drops, while the average results obtained by everyone climbs. In 1860 only a few skilled artisans like my great-great-great grandfather in Scotland could coax any sort of an image at all from a plate camera while normal people couldn’t even take photos at all. In 1940 normal people got fuzzy snaps from their Brownies and flashbulbs while artists got incredible results on 8 x 10″ film. Today artists still mess with 4 x 5″ cameras and normal people are getting the best photos they ever have on 3 MP digital cameras printed at the local photo lab.
Most of the “digital vs. film” essays on the internet actually compare digitized scans of film against directly captured digital images and have the implicit goal of justifying a professional photographer’s expensive digital camera purchase. Unfortunately, the scanner usually limits resolution on the film side but rarely receives reviewer attention!
When critiquing articles, watch for comparisons that use a microscope to examine the film, and look for discussions of how the overall workflow affects each imaging system. Only recently have lens MTF testing and discussion been revived for digital photography; film buffs in the ’70s and ’80s regularly read optical lab reports comparing lenses. This expansion of the conversation shows that high-end camera sensors have finally reached the resolutions film had in the ’70s — the point at which lens quality can once again limit overall image quality. Despite Kodak’s financial difficulties, their research labs have continued improving film, and digital still has ground to cover before it reaches the absolute resolutions available on film. 35-mm movies could deliver even higher resolution, if they needed to, by using larger formats; for instance, VistaVision exposes twice as much negative area (8-perf, horizontal frames exposed the same way a 35-mm still camera exposes them). But they don’t need to — other costs and limitations in the workflow matter more.
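To see why film's absolute resolution is still a moving target for sensors, one can convert line-pairs-per-millimeter figures into an equivalent pixel count. The sketch below uses the Nyquist rule of two pixels per line pair; the 50 and 100 lp/mm values are illustrative assumptions of mine (roughly a conservative emulsion versus a fine-grained one), not measurements from the text:

```python
def film_megapixels(width_mm, height_mm, lp_per_mm):
    """Rough digital-equivalent pixel count for a film frame:
    each resolved line pair needs at least two pixels (Nyquist)."""
    px_w = width_mm * lp_per_mm * 2
    px_h = height_mm * lp_per_mm * 2
    return px_w * px_h / 1e6

# Full-frame 35-mm still (36 x 24 mm), same negative area as
# VistaVision's horizontal 8-perf pull:
print(film_megapixels(36, 24, 50))   # → 8.64 (conservative emulsion)
print(film_megapixels(36, 24, 100))  # → 34.56 (fine-grained emulsion)
```

Even the conservative figure matched high-end sensors of the early 2010s, and the fine-grained figure exceeded most of them — consistent with the claim that digital still had ground to cover.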
Color gamut, frame rate, and dynamic range remain problematic for digital imaging too. HDR algorithms and better sensors have only started addressing these problems. Panavision’s John Galt provides some good detail in “The Truth About 2K, 4K and The Future of Pixels” where he advocates for higher frame rates as the quickest way to improve perceived resolution.
My conclusion? While film is technically superior, none of this really matters for me yet; the creative input of the photographer/director dominates the quality of the result. Even an iPhone, in the hands of an expert photographer, can outperform any camera in the hands of an amateur. Instead of investing in increasingly higher resolution cameras or reverting to film, I’m heading to the library to improve the equipment between my ears!