
Triple Vision

At work I've been doing a lot more frontend work. This involves writing HTML templates, CSS and JavaScript. I also do the backend work (web services, databases, processing), and my experience is that the more you can see at once, the more productive you can be [1]. That means keeping multiple terminals, browsers, admin tools, DOM inspectors, consoles and editors visible simultaneously.

For many years I've used the biggest monitors I could afford or find [2]. There are various reports of multi-monitor productivity, but it is hard to distinguish the effect of total display real estate from the effect of the number of monitors. For me, dual monitors have been how I increase my display real estate. (Obligatory Onion, web coding experience, engineering culture)

My motherboard has four video outputs (three digital, one analogue), so adding a third monitor should have been a no-brainer. Heck, there is even a video showing that it works just fine. I purchased a third monitor from Costco and plugged it into the VGA port (a stopgap while I waited for a digital cable to arrive). It didn't work: I ended up with zero, one or two working displays, but never three. Even more amusing was how confused everything got - the BIOS seemed to pick the oldest connector technology to display on, while Linux kept remembering which display was primary in some convoluted way that meant it was almost always wrong.
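
If you hit the same primary-display roulette under X11, you can at least pin the choice yourself via RandR. A minimal sketch (Python wrapping the xrandr command line; the output name DP-1 is a placeholder - run xrandr with no arguments to see what yours are called):

    import subprocess

    # Hypothetical output name - substitute whatever `xrandr` reports
    # for your monitor (e.g. HDMI1, DP-1, VGA-1).
    PRIMARY = "DP-1"

    # Ask the X server to mark this output as primary, so the desktop
    # stops guessing which screen should hold the panels.
    subprocess.run(["xrandr", "--output", PRIMARY, "--primary"],
                   check=True)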

Eventually I tracked down the problem. A piece of programmable hardware known as a CRTC reads the display memory and clocks it out at a rate suitable for the output it is connected to. Despite having four outputs, my chipset has only two CRTCs. A CRTC can be hooked up to multiple outputs providing the monitor/output clocking is identical, which usually requires identical monitors (the same resolution is not sufficient, since the timings can still differ).
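
You can see the CRTC wiring from userspace: xrandr --verbose prints, underneath each output, the CRTC currently driving it and the CRTCs it is allowed to use. A quick sketch that pulls those lines out (assumes X11 with the xrandr utility installed):

    import re
    import subprocess

    # `xrandr --verbose` shows a "CRTC: n" line (in use) and a
    # "CRTCs: ..." line (usable) underneath each output's header.
    text = subprocess.run(["xrandr", "--verbose"], capture_output=True,
                          text=True, check=True).stdout

    output = None
    for line in text.splitlines():
        header = re.match(r"^(\S+) (?:dis)?connected", line)
        if header:
            output = header.group(1)
        crtc = re.match(r"\s+CRTCs?:\s+\S", line)
        if crtc and output:
            print(output, line.strip())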

With a heavy heart I decided to retire my smaller 16:10 monitor and get another monitor from Costco to replace it [3]. I now had two identical monitors that could share a CRTC, and my original largest 16:10 using the other one. Because of connectors and cables, the largest had to use the DisplayPort connection with an adapter to HDMI.
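
For the record, once the hardware cooperates, arranging the three screens is a single RandR call. A sketch of the sort of side-by-side layout described here (the output names are placeholders, and I'm assuming the 16:10 panel sits in the middle as the primary):

    import subprocess

    # Hypothetical output names - substitute your own from `xrandr`.
    # Left and right are the identical 1920x1080 panels; the centre
    # 1920x1200 16:10 panel is primary.
    subprocess.run([
        "xrandr",
        "--output", "HDMI1", "--mode", "1920x1080", "--pos", "0x0",
        "--output", "DP-1",  "--mode", "1920x1200", "--pos", "1920x0",
                             "--primary",
        "--output", "HDMI2", "--mode", "1920x1080", "--pos", "3840x0",
    ], check=True)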

You know what is coming next - the display wouldn't get a signal. It turns out that not only did they cheap out on the CRTCs, they also cheaped out on the DisplayPort connection. A passive adapter relies on the GPU being able to emit an HDMI signal over the DisplayPort pins, and despite there being a spare CRTC, and DisplayPort supporting multiple final outputs, they didn't bother. I had to buy an active adapter, which converts the DisplayPort signal itself.

Finally, after all that, my three screens work. I have a 28 inch 16:10 display in the centre (1920x1200) and 27 inch 16:9 screens on either side (1920x1080). The exception is Windows, which detects all three screens just fine but refuses to save the settings when you enable the third. I don't use Windows for real work, so it doesn't matter.

What should have taken 30 minutes turned into a several-week saga. Apparently computers are supposed to increase productivity.

[1] When I first started programming as a kid on Apple IIs and Sinclair Spectrums, the screens were small and you'd be lucky to get something like 32 x 20 characters on the screen. Monitors slowly got slightly larger - 11 inches replaced by 12, 12 by 13, and so on. I don't remember those as unproductive times. By 2000 I was spending thousands on 19 and 21 inch monitors when I could.
[2] When working professionally in the 90s my colleagues got various workstations. I picked the NCD 19c, an X terminal (a dumb display device) with a whopping 19 inch display at 1280 x 1024 resolution. The display was great and I ran apps all over the place, all displaying to the terminal. I didn't switch to a different primary device until 1999 or so.
[3] There was also a detour via Nvidia. The only PC game I play is 10 years old, and my work is mostly text editors, browsers and terminals, none of which requires high end acceleration. I got a low end Nvidia card (a 6100). It had two outputs and in theory would work alongside the Intel HD4000 for the third monitor.

In practice things didn't work well, especially on Linux, where Nvidia's driver doesn't support the standard multi-GPU display infrastructure, and being a closed binary only Nvidia can add that functionality. It turns out there is a beta driver that did support it, but it then refused to cooperate with the Intel driver. On Windows it worked fine.
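
The standard infrastructure in question is RandR 1.4's "provider" support, which is what lets outputs from two different GPUs join one desktop. You can check what your drivers expose; a sketch, with the actual routing step left commented out because the provider names vary by driver and system:

    import subprocess

    # RandR 1.4 "providers" are how X combines outputs from multiple
    # GPUs. A driver without provider support won't usefully appear here.
    print(subprocess.run(["xrandr", "--listproviders"],
                         capture_output=True, text=True, check=True).stdout)

    # With cooperating drivers you can then route one GPU's outputs
    # through another, along the lines of (names are examples only):
    # subprocess.run(["xrandr", "--setprovideroutputsource",
    #                 "nouveau", "Intel"], check=True)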

I got a second Nvidia card since they claimed it would work via a "mosaic" mode. After eventually getting the right voodoo in the configuration (and it working fine on Windows), their driver refused to do it on the grounds that the chipset wasn't authorised, implying that if I had spent $400 per card they'd have been okay doing it. The cards were also noisy. I gave up on Nvidia: I got a quieter machine back, with no drivers arbitrarily erroring out on authorisation grounds.
