Michael has a photography magazine in one of his bathrooms that’s about a year old, touting on its cover the latest-greatest camera: a 16.2 Mega-Pixel model. It was thought-provoking to see it sitting there, a year old, and wonder, “wow, what resolution are they up to now?”

And then I thought “y’know, I don’t think I need more than the ~7 Mega Pixels I’m getting now.”

These “who would ever need more computer power?” comments are collected and laughed at by techies everywhere. Probably the best known is a comment falsely attributed to Bill Gates that 640K should be “enough for anybody”. Anyone who knows what RAM is should realize how wrong this idea is, how much we now want more RAM, in fact NEED it, if we want to run any software written in the past 10-15 years. So Gates gets laughed at, even though this time he appears not to deserve it.

So why would I go “on the record” here in my blog and say that I doubt I’ll ever want a camera with 16.2 Mega Pixels? The difference is clear: in this case, the limiting factor is not the imagination of computer programmers and software designers. In this case, the limit is my eyes.

Consider this: around 1990, us nerds talked a fair amount about printing resolution. I was pretty excited by the first 300 DPI (that’s “dots per inch”) printers. Then came the 600 DPI lasers, and they looked great. Now, 15 years later, we are typically using 100 times as much RAM in our computers, but our printers are still typically 300-600 DPI. Why no better? Did the engineers lose the ability to go further? Not in my opinion. My opinion is: most of us, especially those of us who aren’t trained as printing professionals, can’t see the benefit of higher resolution. Maybe not the pros either. In any event, us lay people certainly aren’t going to pay more for resolution we basically can’t see. A casual observer glancing at a printout likely won’t notice the difference between 300 and 600 DPI; they both look pretty good.

My assertion that digital camera resolution will stay pretty stagnant from here on out is similar. My 7 megapixel images don’t fit (at full resolution) on any of the monitors I own. Sure, it’s nice to be able to zoom in, but… I’m buying a camera, not a telescope. So mostly that high resolution is good for good-looking printouts. These images from my camera will fit (at full resolution) onto 8.5×11 paper at 300 DPI (they will actually be 8.25×11), and I hardly ever actually do that. They can do just under 4×6 at 600 DPI; I don’t do that much either. For standard 4×6 prints at 300 DPI, I don’t even use the full resolution I have right now. So I don’t have much incentive to buy more.

Some people do, of course. A full 8.5×11 printout (with no border) at 600 DPI would be something like 30-40 Megapixels, and I can see that people like professional photographers will want that. Maybe lay people will too, but they won’t pay much extra for it once they have 300 DPI at the same size, which many of us already do. Now, if plotters became cheap and commonplace, maybe we’d all be making poster-sized printouts of our families and stuff, and then there’d be at least SOME incentive for cameras with much greater resolution. But, until something like that happens, I think camera manufacturers are going to have a hard time selling us common folk on much more than 7 Megapixels. (By the way, if you want to play with these numbers, there’s a nice image resolution calculator online.)
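If you’d rather check the arithmetic than trust mine, the pixels-to-print-size relationship is simple enough to work out yourself. Here’s a little Python sketch (the print sizes are just the examples I’ve been throwing around above):

```python
def megapixels_needed(width_in, height_in, dpi):
    """Megapixels required to print at the given size (inches) and resolution."""
    return (width_in * dpi) * (height_in * dpi) / 1_000_000

# The print sizes discussed above:
for w, h, dpi in [(4, 6, 300), (8.5, 11, 300), (4, 6, 600), (8.5, 11, 600)]:
    print(f"{w} x {h} in at {dpi} DPI -> {megapixels_needed(w, h, dpi):.1f} MP")
```

Sure enough, 4×6 at 300 DPI needs only about 2.2 MP, a borderless 8.5×11 at 300 DPI needs about 8.4 MP, and the same page at 600 DPI needs about 33.7 MP, right in that 30-40 Megapixel range.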

Besides the human factor, there are technology-related reasons that average people won’t want much more resolution right now. With the current state of computers and networks, it’s pretty clunky to deal with full-resolution pictures. I took about 400MB of photos celebrating Xmas with my family. That’s 1% of the total disk space on my laptop. Maybe that doesn’t sound like much, but if I did that once a week for a year (not really that many photos), my laptop’s hard drive would be 50% full with photos. These things will change, though. It’s the maximum resolution we can distinguish with our eyes that won’t.
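For what it’s worth, the back-of-the-envelope storage math goes like this (a sketch; the disk size here is just inferred from the “400MB is 1%” figure above):

```python
weekly_mb = 400                 # one family event's worth of photos
disk_mb = weekly_mb / 0.01      # 400 MB is 1% of the disk -> 40,000 MB
year_mb = weekly_mb * 52        # a year of weekly photo sessions
print(f"{year_mb / disk_mb:.0%} of the disk after one year")
```

Which comes out to 52%, hence the “50% full” above.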

This isn’t limited to image resolution, either. Audio quality as defined by the Compact Disc (16 bit / 44,100 Hz) is still basically as good as we ever go, and that was defined over 25 years ago. It would be trivial to go higher quality today, but no one typically does, because humans can’t hear the difference when you do: sampling at 44,100 Hz already captures frequencies up to about 22 kHz, just past the roughly 20 kHz ceiling of human hearing. Our ears aren’t sensitive enough. Similarly, Hollywood has used the same rate of 24 frames per second for something like 100 years. You occasionally see some distortion because of this (think of a wagon wheel appearing to spin backwards on the big screen), but there’s not much of a push to make it better. Again, human perception isn’t good enough to justify the expenditure.

All this is basically to serve as a reminder that technological advances are not always limited by technical issues; sometimes it’s our physiology that limits things. I’ve hinted previously that once we have enough network bandwidth to do high-quality video and audio in real time, there won’t be much incentive to go faster. I’m not ready to stick my neck out on that yet, but I will say that I don’t see much demand for anything that would require more bandwidth than that. Still, that’s quite a ways away…