Life's Too Short for Video Ports


As a jack-of-all-trades IT pro, I have a lengthy list of frustrations – both of the large and small variety. However, after more than 15 years working IT in a small office environment, there’s one nuisance that has come to trump all others.


In the world of computing, old technologies become deprecated. Slowly but surely, they fade away and are replaced by newer, unquestionably better technologies. Not so with video ports. Imagine if the dinosaurs had never died out – except that in this scenario, the dinosaurs are both terrible and boring.

The result of this annoying lack of evolution-induced extinction is a product ecosystem with a terrible mishmash of old and new technologies, many of which are not compatible, some of which are terrible, and almost all of which are unnecessary.

Allow me to elucidate:


VGA

The Video Graphics Array is one of the connectors you’re probably most familiar with. That – in and of itself – makes me sad.

Introduced in 1987, VGA is 31 years old, and thus older than some of the people I go drinking with. VGA is that old chestnut of a video interface that people use because it’s there and familiar. But by today’s technological standards, it sucks.

VGA only supports analog signals – something that is unacceptable in the era of ubiquitous, pixel-perfect LCD displays with high-DPI resolutions. Graphics chipset vendors Nvidia and Intel officially dropped support for VGA three years ago. Yet – somehow – if you buy a monitor today, chances are it will have a VGA port on it. Many of the cheapest monitors still come with only one port: VGA.

VGA’s most infuriating quality is that it refuses to die in the face of vastly superior technology, thanks in part to a user base that is weirdly attached to these blue, D-sub-esque, be-thumbscrewed connectors. Apple catches a lot of flak for killing anachronistic technologies like the floppy drive and the headphone jack, but at least they’re forcing the death of bad, antiquated tech.


“Feed me easily-bent pins, human…”


DVI

Unfortunately, VGA’s successor wasn’t stellar. The Digital Visual Interface (introduced in 1999, as if the quirky substitution of “visual” for “video” didn’t give it away) had the immediate benefit of being – obviously – digital. This eliminates the quirks of analog video, such as image misalignment, refresh-clock tweaking, and cable interference. The computer can also get precise information about the display by communicating with it (though this functionality was later kludged into VGA as well).
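That display-identification channel is the Display Data Channel (DDC), over which the monitor hands the computer an EDID block describing itself. As a rough illustration of what that data looks like, here’s a minimal Python sketch that decodes the three-letter manufacturer ID from a raw EDID byte string – the function name and the fabricated sample bytes are my own, for illustration only:

```python
def edid_manufacturer(edid: bytes) -> str:
    """Decode the three-letter PnP manufacturer ID from an EDID block."""
    # Every EDID block starts with the same fixed 8-byte header.
    if edid[:8] != b"\x00\xff\xff\xff\xff\xff\xff\x00":
        raise ValueError("not an EDID block")
    # Bytes 8-9 pack three letters, 5 bits each, with 'A' encoded as 1.
    word = (edid[8] << 8) | edid[9]
    return "".join(
        chr(ord("A") + ((word >> shift) & 0x1F) - 1) for shift in (10, 5, 0)
    )

# A fabricated 128-byte block carrying Dell's well-known ID (0x10AC -> "DEL"):
sample = b"\x00\xff\xff\xff\xff\xff\xff\x00" + b"\x10\xac" + bytes(118)
print(edid_manufacturer(sample))  # -> DEL
```

Real EDID blocks also carry the product code, supported timings, and the display’s physical size – which is how a modern OS knows what resolution to pick without you touching a knob.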

The downsides of DVI are many. The connector is incredibly bulky – substantially bigger than VGA’s. Also, since the format’s creators were worried about backwards compatibility (namely, new computers driving older monitors), they made the early DVI connectors a hybrid, carrying both digital and analog signals. This is part of why the DVI connector contains a whopping 28 prong-style pins (easily bent, for extra debugging fun times). The digital-and-analog inter-compatibility also means there are five different specialized ports and cables, each meant for specific applications. Some of the analog-only cables – seemingly intent on baffling users – had a DVI connector on one end and a VGA connector on the other.

While VGA was simple for users to understand, DVI practically required a manual to grasp. The good news is that DVI had a relatively short run; it was superseded just a few years later. The bad news is, we couldn’t make up our minds about what should supersede it.

HDMI and DisplayPort

If there’s one thing the AV world holds as gospel, it’s: “Why have one universal format when you can have two competing formats which do basically the same thing?” Ah, ‘tis a fine tradition, stretching from VHS versus Betamax to Blu-ray versus HD-DVD.

Both DisplayPort and the High-Definition Multimedia Interface can carry digital video and audio on the same cable. Both can carry HD video (and, in later versions, 4K video). Both use the same digital copy-protection standard. DisplayPort, the newer format, can even carry an HDMI signal over a DisplayPort connection. In fact, you really have to dig into the technical specs to differentiate between the two. The main practical difference is that televisions and audio receivers have HDMI exclusively, while computers and monitors can readily have one, the other, or both.

If there’s a problem with HDMI and DisplayPort, it’s that both (and not just one of them) exist. This leads into the final frustration…

Adapters. Adapters Everywhere.

Without going too deep into the technical details, all three of the aforementioned digital formats can be converted to each other without any loss in signal quality. The only roadblock is that you’re going to need an adapter. That adapter, in turn, usually only works in one direction. All of this conspires to create chaos in an office IT environment.

Let’s say you have a desktop computer. On the back are two video ports: one HDMI and one DisplayPort. You have a spare pair of monitors you want to use, each with a VGA port and a DVI port. Converting from a digital port to VGA usually results in a loss of quality, so we won’t be doing that. We need one adapter that converts HDMI to DVI, and another that converts DisplayPort to DVI. Depending on the genders on either end of those adapters, we might need an HDMI cable, a DisplayPort cable, two DVI cables, some combination of the above, or all of the above.
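To make the direction-of-conversion headache concrete, here’s a toy Python sketch of that matching exercise. The adapter table is illustrative and far from exhaustive, and the names are my own – but it captures the rule that a passive adapter runs from a specific output to a specific input, never the other way around:

```python
# Illustrative one-way passive conversions, written (source output, target input).
# HDMI<->DVI is electrically compatible in both directions; a dual-mode (DP++)
# DisplayPort output can emit HDMI or DVI signalling, but nothing converts
# *to* DisplayPort passively.
PASSIVE = {
    ("HDMI", "DVI"),
    ("DVI", "HDMI"),
    ("DisplayPort", "HDMI"),
    ("DisplayPort", "DVI"),
}

def hookups(outputs, inputs):
    """List workable (output, input, adapter_needed) combinations."""
    return [
        (out, inp, out != inp)
        for out in outputs
        for inp in inputs
        if out == inp or (out, inp) in PASSIVE
    ]

# The scenario above: HDMI + DisplayPort on the PC, DVI on each monitor.
print(hookups(["HDMI", "DisplayPort"], ["DVI"]))
# -> [('HDMI', 'DVI', True), ('DisplayPort', 'DVI', True)]
```

Run it the other way – a DVI-only output into a DisplayPort-only monitor – and the list comes back empty, which is exactly the trap that sends you ordering an active converter online.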

What’s worse is that – even though DisplayPort is on almost all new computers – electronics stores infuriatingly do not carry full-size DisplayPort adapters. My local Staples doesn’t have any DisplayPort accessories, while the Best Buy has a limited supply of their in-store brand (of dubious quality). London Drugs carries adapters for the Mini DisplayPort (aka Thunderbolt) ports used by Macs and some other laptops, but not the full-size versions, nor adapters to convert Mini DisplayPort to full size (in fact, I have never seen one in person). This means that if you need an adapter before you can finish setting up someone’s computer, you can’t just run to the store and buy one, you need to order it online and wait.

The cost of adapters adds up. The cables are expensive too (no thanks to the exorbitant licensing fees charged by the consortium of HDMI’s founders). And not all cables are created equal. DisplayPort cables, ideally, have a pair of latches on the side to hold the cable in the port (unless released by pushing down a tab on the plug). Cheaper cables omit the latches, and invariably fall out or don’t sit properly, creating video glitches at best and a blank screen at worst.

Oh, and did I forget to mention that two years ago, Apple ditched the perfectly good Mini DisplayPort/Thunderbolt format for USB-C? Ironically, they did this to rid their laptops of every port type but one. Heh.

Some days, it almost feels like the manufacturers of displays, TVs, desktop computers, laptops, and cables are deliberately frustrating their customers. As someone who deals with a collection of displays and computers on a regular basis, I sometimes find myself longing for the simpler days when everything was VGA. Sure, I occasionally had to degauss my CRT, but at least I knew which cable I was going to use.