A few weeks ago, just before my keyboard died, my monitor momentarily flickered ever so subtly between displaying white as full white and white as soft pink. It happened so quickly, and the change was so faint, that at first I thought my eyes were playing tricks on me. Fortunately, a day or two later the same thing happened, allowing me to determine that the monitor itself was hinky.
While I have no qualms about opening up a keyboard to see if I can rectify a problem, or just about any other gadget you could name, I draw the line at messing around inside devices that contain potentially lethal capacitors. Combine that reluctance with the flickering I had seen, and the low, staticky hum that had been building in my monitor for the past year or two, and it suddenly seemed prudent to once again survey the state-of-the-art display offerings on the market before the very device I would need to rely on to do so failed completely.
(There are all kinds of things that can go wrong with a computer, and the most maddening aspect of many of them is that they immediately cut off your access to the internet, which is where all the solutions are. If your operating system locks up you need another computer to research the problem. If your monitor dies you need another display on hand in order to order a replacement, which you would not need if you already had one on hand. Speaking of which, even if you use an add-on graphics card, the motherboard in your computer should always have its own graphics chip for exactly that reason. If your card dies — and graphics cards are always dying, or freaking out — you can still drive your monitor and access the web.)
As was the case with my venerable old keyboard, I was not at all surprised that my monitor might be at the end of its useful life. In fact — and you will no doubt find this amusing or absurd — I am still using a second-hand CRT that I bought in the mid-aughts for the lofty price of twenty-five dollars. While that in itself is comical, the real scream is that the monitor was manufactured in 2001, meaning it’s close to fifteen years old. Yet until a couple of weeks ago it had been working flawlessly all that time.
The monitor is a 19″ Viewsonic A90, and I can’t say I’ve ever had a single complaint about it. It replaced my beloved old Sony Trinitron G400, which borked one day without the slightest hint that something might be amiss. Scrambling to get myself back in freelance mode I scanned Craigslist and found a used monitor that would allow me to limp along until I found a better permanent solution. Eight or so years later here I am, still using the same A90. (In internet time, of course, those eight years are more like eight hundred. Not only were LCDs, and later LEDs, pricey back then and still quite raw in terms of performance, but you could also reasonably expect to use Craigslist without being murdered.)
Between then and now I have kept track of changes in the price, size, functionality and technology of flat-panel monitors, and more than once researched display ratings with the thought that I might join the twenty-first century. Each time, however, three issues kept me from pulling the trigger.
First, while all that snazzy new technology was indeed snazzy and new, relative to CRT technology it was still immature, requiring compromises I was not willing to make in terms of display quality and potential effects on my eyes. Having always been sensitive to flickering monitors, I was not eager to throw money at a problem I did not have — or worse, buy myself a problem I did not want. (As a general rule, putting off any tech purchase as long as possible pays off twice, because what you end up with later is almost always better and cheaper than what you can purchase today.)
Second, at the time I was primarily freelancing in the interactive industry, which meant I was working with a lot of beta-version software that had not been fully tested with every conceivable display technology. Using lagging tech at both the graphics and display level meant I could be reasonably confident that whatever I was working on would draw to my screen, at least sufficiently to allow me to do my part.
Third — and this relates somewhat to the second point — one advantage CRTs had and still have over LCD/LED displays is that they do not have a native resolution:
The native resolution of an LCD, LCoS or other flat panel display refers to its single fixed resolution. As an LCD display consists of a fixed raster, it cannot change resolution to match the signal being displayed as a CRT monitor can, meaning that optimal display quality can be reached only when the signal input matches the native resolution.
Whether you run a CRT at 1024×768 or 1600×1200 you’re going to get pretty much the same image quality, albeit at different scales. The fact that I could switch my A90 to any resolution was a boon while working in the games biz, because I could adjust my monitor to whatever resolution suited each game while still preserving the detail and clarity of the documents I was working on.
While imagery is and always has been the lusty focus of monitor reviews, there is almost nothing more difficult to clearly render using pixels of light than the sharply delineated, high-contrast symbols we call text. Because LCD/LED monitors have a native resolution, attempting to scale text (or anything else) introduces another problem:
While CRT monitors can usually display images at various resolutions, an LCD monitor has to rely on interpolation (scaling of the image), which causes a loss of image quality. An LCD has to scale up a smaller image to fit into the area of the native resolution. This is the same principle as taking a smaller image in an image editing program and enlarging it; the smaller image loses its sharpness when it is expanded.
The key word there is interpolation. If you run your LCD/LED at anything other than its native resolution, what you see on your screen will almost inevitably be less sharp. While that may not matter when you’re watching a DVD or playing a game, interpolating text is one of the more difficult things to do well. Particularly in early flat panels the degradation from interpolation was considerable, making anything other than the native resolution ill-suited for word processing.
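To make the problem concrete, consider the arithmetic. Feed a 1024×768 signal to a panel with a 1280×1024 native raster and every row must be stretched by a factor of 1.25 and every column by roughly 1.33. Device pixels cannot be subdivided, so the scaler blends neighboring values instead, and crisp black-on-white edges become gray ramps. Here is a minimal sketch of that effect in Python, assuming the Pillow imaging library is installed (the stroke pattern and filenames are purely illustrative):

    # Draw sharp, one-pixel-wide strokes at a 1024x768 "signal" size,
    # then scale them up the way a fixed-raster panel's scaler must.
    from PIL import Image, ImageDraw

    signal = Image.new("L", (1024, 768), color=255)    # white canvas
    draw = ImageDraw.Draw(signal)
    for x in range(100, 900, 8):                       # crisp vertical strokes
        draw.line([(x, 100), (x, 200)], fill=0, width=1)

    # A 1280x1024 native raster can't map these strokes to whole pixels,
    # so the resize must interpolate, smearing each edge across columns.
    native = signal.resize((1280, 1024), Image.BILINEAR)

    signal.save("signal_1to1.png")     # every stroke lands on whole pixels
    native.save("interpolated.png")    # the same strokes, now soft and gray

Open the two files side by side and you will see exactly what early flat-panel owners saw every time they strayed from the native resolution.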