IBM T221 3840×2400 204dpi Monitor – Part 7: Positive Update

For once I appear to have a positive update on the subject of Nvidia drivers: it seems that patching the latest (319.23) driver is no longer required on Linux. Even better, there is a way to achieve a working T221 setup without RandR getting in the way by insisting that the two halves are separate monitors. I covered the issues with Nvidia drivers in a previous article.

The build part now works as expected out of the box. Simply:

export IGNORE_XEN_PRESENCE=1
bash ./NVIDIA-Linux-x86_64-319.23.run

and everything should “just work”.
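
If you want to confirm that the module really did build and load against the running kernel, the usual places to look are (nothing version-specific here, just a sanity check):

# Check the kernel module is loaded, and see which driver/kernel it was built for
lsmod | grep nvidia
cat /proc/driver/nvidia/version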

Best of all, there appears to be a workaround for the RandR information being visible even when Xinerama is being overridden. It turns out that Xinerama and RandR seem to be mutually exclusive. So even though the option that explicitly disables RandR seems to get silently ignored, enabling Xinerama works around it. And since the Nvidia driver’s Xinerama info override still works, this solves the problem!

You may recall from a previous article the following in xorg.conf:

[...]
Section "ServerLayout"
	Identifier "Layout0"
	Screen 0 "Screen0" 0 0
	Option "Xinerama" "0"
EndSection
[...]
Section "Screen"
	Identifier "Screen0"
[...]
	Option "NoTwinViewXineramaInfo" "True"
	Option "TwinView" "1"
	Option "TwinViewOrientation" "RightOf"
	Option "metamodes" "DFP-0:1920x2400, DFP-3:1920x2400"
[...]
EndSection

It turns out the solution is to simply enable Xinerama:

Section "ServerLayout"
	Identifier "Layout0"
	Screen 0 "Screen0" 0 0
	Option "Xinerama" "1"
EndSection

This implicitly disables RandR, and the Nvidia driver’s Xinerama info override takes care of the rest. Magic. :)
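
A quick way to check that the override has taken effect is to ask the X server what it is now advertising (the expected results are my assumption, based on the configuration above):

xdpyinfo -ext XINERAMA    # should report a single 3840x2400 head
xrandr --query            # RandR should no longer describe two separate 1920x2400 screens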

Update:
If you are still having problems when using KDE, there is another trick you can use to force Xinerama and disable RandR. Amend the following line in kdmrc:

/etc/kde/kdm/kdmrc:
ServerArgsLocal=-extension RANDR +xinerama -nr -nolisten tcp
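
For reference, only two of those server arguments are doing the work here; the rest are the stock EL6 defaults. This is simply a reading of the standard X server flags, nothing kdm-specific:

# -extension RANDR : start the X server with the RANDR extension disabled
# +xinerama        : enable the Xinerama extension
# -nolisten tcp    : the usual "don't listen on TCP" default, unrelated to the T221
ServerArgsLocal=-extension RANDR +xinerama -nr -nolisten tcp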

WQUXGA – IBM T221 3840×2400 204dpi Monitor – Part 6: Regressing Drivers and Xen

I recently built a new machine, primarily because I got fed up with having to stop what I’m working on and reboot from Linux into Windows whenever my friends and/or family invited me to join them in a Borderlands 2 session. Unfortunately, my old machine was just a tiny bit too old (Intel X38 based) to have the full, bug-free VT-d/IOMMU support required for VGA passthrough to work, so after 5 years I finally decided it was time to rectify this. More on this in another article, but the important point I am getting to is that VGA passthrough requires a recent version of Xen. And that is where this part of the story really begins.

Some of you may have figured out that RHEL derivatives are my Linux distribution of choice (RedSleeve was a big hint). Unfortunately, RedHat have dropped support for Xen Dom0 kernels in EL6, but thankfully other people have picked up the torch and provide a set of up-to-date, supported Xen Dom0 kernels and packages for EL6. So far so good. But it was never going to be so simple, at a time when drivers are simultaneously getting dumber, more feature-sparse and more bloated. That is really what this story is about.

For a start, here are a few details about the system setup that I am using, and have been using for years:

  • I am a KDE rather than Gnome user. EL6 comes with KDE 4, which uses the X RandR extension rather than Xinerama to establish the geometry of the screen layout. This isn’t a problem in itself, but there is no way to override whatever RandR reports, so on a T221 you end up with a regular desktop on one half of the monitor and an empty desktop on the other half, which looks messy and unnatural.
  • EL6 has had a Xorg package update that bumped the ABI version from 10 to 11.
  • Nvidia drivers changed the way TwinView works after version 295.x (the TwinView option in xorg.conf is no longer recognized).
  • Nvidia drivers 295.x do not support Xorg ABI v11.
  • Nvidia kernel drivers 295.x do not build against 3.8.x kernels.

And therein lies the complication.

Nvidia drivers v295, when used with the TwinView and NoTwinViewXineramaInfo options, also seem to override the RandR geometry to show that there is a single, large screen available, rather than two screens. This is exactly what we want when using the T221. Drivers after 295.x (304.x seems to be the next version) don’t recognize the TwinView configuration option, and while they provide a Xinerama geometry override when using the NoTwinViewXineramaInfo option, they do not override the RandR information any more. This means that you end up with a desktop that looks as you would expect it to if you used two separate monitors (e.g. the status bar is only on the first screen, no wallpaper stretch, etc.), rather than a single, seamless desktop.

As you can see, there is a large compound issue in play here. We cannot simply use the 295.x drivers, because:

  1. They don’t support Xorg ABI 11 – this can be solved by downgrading the xorg-x11-server-* and xorg-x11-drv-* packages to an older version (1.10 from EL 6.3). Easily enough done – just make sure you add xorg-x11-* to the exclude line in /etc/yum.conf after downgrading, to avoid accidentally updating them in the future (see the sketch after this list).
  2. They don’t build against 3.8.x kernels (which is what the Xen kernel I am using is – and that is quite apart from the long-standing semi-allergy of Nvidia binary drivers to Xen). This is more of an issue, but with a bit of manual source editing I was able to solve it.
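
A rough sketch of the downgrade from point 1 (the package globs and the exclude line are assumptions on my part – match them against what your EL 6.3 repository actually carries):

# Roll the X server and drivers back to the ABI 10 (1.10) packages from EL 6.3
yum downgrade xorg-x11-server-\* xorg-x11-drv-\*
# Then pin them so that a later "yum update" does not pull the ABI 11 packages back in.
# In /etc/yum.conf, add (or extend) the exclude line:
#   exclude=xorg-x11-*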

Here is how to get the latest 295.x driver (295.75) to build against Xen kernel 3.8.6. You may need to do this as root.

Kernel source acquisition and preparation:

# Fetch and unpack the Xen kernel source RPM
wget http://uk1.mirror.crc.id.au/repo/el6/SRPMS/kernel-xen-3.8.6-1.el6xen.src.rpm
rpm -ivh kernel-xen-3.8.6-1.el6xen.src.rpm
cd ~/rpmbuild/SPECS
rpmbuild -bp kernel-xen.spec
# Configure the unpacked tree with the running Xen kernel's config and build it
cd ~/rpmbuild/BUILD/linux-3.8.6
cp /boot/config-3.8.6-1.el6xen.x86_64 .config
make prepare
make all

Now that you have the kernel sources ready, get the Nvidia driver 295.75 and the patch, apply the patch, and build the driver.

# Fetch the driver and the patch
wget http://uk.download.nvidia.com/XFree86/Linux-x86_64/295.75/NVIDIA-Linux-x86_64-295.75.run
wget https://dl.dropboxusercontent.com/u/61491808/NVIDIA-Linux-x86_64-295.75.patch
# Unpack the driver without installing it, then apply the patch
bash ./NVIDIA-Linux-x86_64-295.75.run --extract-only
patch < NVIDIA-Linux-x86_64-295.75.patch
cd NVIDIA-Linux-x86_64-295.75
# Point the installer at the prepared Xen kernel tree and run it non-interactively
export IGNORE_XEN_PRESENCE=y
export SYSSRC=~/rpmbuild/BUILD/linux-3.8.6
cp /usr/include/linux/version.h $SYSSRC/include/linux/
./nvidia-installer -s

And there you have it: Nvidia driver 295.75 builds cleanly and works against 3.8.6 kernels. The same xorg.conf given in part 3 of this series will continue to work.

It is really quite disappointing that all this is necessary. What is more concerning is that the ability to use a monitor like the T221 is diminishing by the day. Without the ability to override what RandR returns, it may well be gone completely soon. It seems the only remaining option is to write a fakerandr library (similar to fakexinerama). Any volunteers?

It seems that Nvidia drivers are losing features and becoming more bloated at the same time. 295.75 is 56MB; 304.88 is 65MB. That is 16% bloat for a driver that has regressively lost a feature – in this case an important one. Can there really be any doubt that the quality of software is deteriorating at an alarming rate?

WQUXGA – IBM T221 3840×2400 204dpi Monitor – Part 5: When You Are Really Stuck With a SL-DVI

I recently had to make one of these beasts work bearably well with only a single SL-DVI cable. This was dictated by the fact that I needed to get it working on a graphics card with only a single DVI output, and my 2xDL-DVI -> 2xLFH-60 adapter was already in use. As I mentioned previously, I found the 13Hz refresh rate you get from a single SL-DVI link to be just too slow (I could see the mouse pointer skipping along the screen), but the default 20Hz from 2xSL-DVI was just fine for practically any purpose.

So, faced with the need to run with just a single SL-DVI port, it was time to see if a bit of tweaking could be applied to reduce the blanking periods and squeeze a few more FPS out of the monitor. In the end, 17.1Hz turned out to be the limit of what could be achieved. And it turns out, this is sufficient for the mouse skipping to go away and make the monitor reasonably pleasant to use.

(Note: My wife disagrees – she claims she can see the mouse skipping at 17.1Hz. OTOH, she is unable to read my normal font size (MiscFixed 8-point) on this monitor at full resolution. So how you get along with this setup will largely depend on whether your eyes’ sensitivity is skewed toward high pixel density or high frame rates.)

The xorg.conf I used is here:

Section "Monitor"
  Identifier    "DVI-0"
  HorizSync    31.00 - 105.00
  VertRefresh    12.00 - 60.00
  Modeline "3840x2400@17.1"  165.00  3840 3848 3880 4008  2400 2402 2404 2406 +hsync +vsync
EndSection

Section "Device"
  Identifier    "ATI"
  Driver        "radeon"
EndSection

Section "Screen"
  Identifier    "Default Screen"
  Device        "ATI"
  Monitor        "DVI-0"
  DefaultDepth    24
  SubSection "Display"
    Modes    "3840x2400@17.1"
  EndSubSection
EndSection

The Modeline could easily be used to create an equivalent setting in Windows using PowerStrip or a similar tool, or you could hand-craft a custom monitor .inf file.
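
The 17.1Hz figure is not arbitrary: 165MHz is the pixel clock ceiling of a single-link DVI connection, and dividing that by the total (visible plus blanking) geometry in the modeline gives the refresh rate:

# refresh rate = pixel clock / (horizontal total x vertical total)
echo 'scale=2; 165000000 / (4008 * 2406)' | bc
# 17.11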

In the process of this, however, I have discovered a major limitation of some of the Xorg drivers. The generic frame buffer (fbdev) and VESA (vesa) drivers do not support Modelines, and will in fact ignore them. ATI’s binary driver (fglrx) also doesn’t support Modelines. The Linux CCC application mentions a section for custom resolutions, but there is no such section in the program. So if you want to use a monitor in any mode other than what its EDID reports, you cannot use any of these drivers. This is a hugely frustrating limitation. In the case of the fbdev driver it is reasonably forgivable, because it relies on whatever modes the kernel frame buffer exposes. In the case of the VESA driver it is understandable that it only supports standard VESA modes. But ATI’s official binary driver lacking this feature is quite difficult to forgive – it has clearly been dumbed down too far.

WQUXGA – IBM T221 3840×2400 204dpi Monitor – Part 3: ATI vs. Nvidia

It is at times like this that I get to fully consider just how bad a decision it was to jump ship from ATI to Nvidia when it came to graphics cards. But now that sense has been forced back upon me, I will hopefully not consider such madness again for the best part of the next decade.

Due to the ATI drivers being fundamentally unable to handle the T221 reliably, I bit the bullet and decided to put my old 8800GT card back in. The first WTF came about when it transpired that ATI drivers cannot be uninstalled from Windows XP using their bundled uninstaller in Safe or VGA modes. This is quite bad when you consider that it could be the ATI drivers that are making the machine not boot into normal mode. Credit where it is due, though: the ATI uninstaller was not too bad once I ran it in normal mode, and after using it to remove all ATI software and uninstalling the ATI devices in Device Manager, there wasn’t enough left to cause problems on the next reboot, during which the machine contained an Nvidia card. Everything booted up fine, and after a quick run of the Auslogics Registry Cleaner (just to make sure – easily the best registry cleaner I have used to date), everything was ready for the installation of Nvidia drivers. Everything went quite painlessly, and a reboot later I had the T221 configured for the 2x1920x2400@20Hz mode. The only thing that didn’t come up perfectly by default was that I had to add the 1920×2400@20Hz mode in the Nvidia Control Panel (click the Customize button).

By this point, the superior features of Nvidia were already becoming apparent:

  • Text and low-resolution mode anti-aliasing in firmware – such modes look vastly better than on ATI hardware.
  • Until the driver enables the secondary port, it remains disabled. This is really nice on the T221 because it means you don’t get the same thing on the screen twice during the BIOS POST and early stages of the boot process. I can imagine the duplicated output also being annoying on a multi-head setup.
  • The primary port doesn’t switch away for no apparent reason in Windows when multiple screens are plugged in.
  • Best of all – Windows XP drivers Just Work ™. They don’t forget their settings between reboots.
  • No tearing down the middle of the screen where the two halves meet. With the ATI card, the mouse couldn’t be drawn on both halves at the same time; in the middle, you could make it virtually disappear. Not a big deal, but yet another example of general bugginess. Also, in games the tearing along the same line disappeared (I always run with vsync forced on, and it was still visible from time to time with the ATI card).

Just the properly working drivers would have easily convinced me of the error of my recent ways, but all the other niceties really make for a positive difference to the experience.

After I got Windows working (it only took 20 minutes, after giving up on ATI having wasted a whole day on getting it to work properly and remember its settings between reboots), it was time to get things working in Linux. The first thing that jumped out at me about this part of the exercise was just how much better ATI’s Linux drivers are compared to their Windows drivers. It is obvious that they are actually being developed by somebody competent. Unlike the Windows drivers, the Linux drivers worked out of the box, and the only unusual thing that I needed to do was to make sure Fake Xinerama was configured and preloaded. Removing the ATI drivers before switching to the Nvidia card was a simple case of:

# rpm -e fglrx

Simple, efficient, reliable. Seems ATI‘s Windows driver team have a lot to learn from their Linux driver team.

The machine came up fine with the nouveau drivers loaded, but I wanted to get Nvidia’s binary drivers working. The experience here was a little more problematic than it had been with the ATI drivers. The nvidia-xconfig and nvidia-settings utilities weren’t as intuitive as the ATI configuration utility, and the setup suffered from a particularly annoying problem where GPU scaling would default to on. This resulted in the screen mode being left stretched and unusable, though sometimes just starting the nvidia-settings program would fix some of it. In the end I just gave up on it and wrote my own xorg.conf according to the documentation – and that worked perfectly. You may want to set the following environment variable to force vsync in GL modes (e.g. for mplayer’s GL output):

# export __GL_SYNC_TO_VBLANK=1

This ensured there was no tearing visible during video playback.
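
If you want that setting applied to every session rather than exporting it by hand, dropping it into a profile script is the usual EL6 approach (the file name below is just my choice):

# Make the vsync override apply to all users (file name is arbitrary)
echo 'export __GL_SYNC_TO_VBLANK=1' > /etc/profile.d/gl-vsync.sh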

One thing worth noting is that Nvidia drivers bring their own Xinerama layer with them, so the Xorg Xinerama should be disabled. There is also an option for faking the Xinerama information (NoTwinViewXineramaInfo), so no need for Fake Xinerama, either.
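
For reference, the relevant options boil down to something like the excerpt below – essentially the same configuration shown in the part 7 excerpt earlier on this page, trimmed to the essentials (the DFP names and metamodes are specific to my card and cabling):

Section "ServerLayout"
	Identifier "Layout0"
	Screen 0 "Screen0" 0 0
	Option "Xinerama" "0"
EndSection

Section "Screen"
	Identifier "Screen0"
	Option "TwinView" "1"
	Option "TwinViewOrientation" "RightOf"
	Option "NoTwinViewXineramaInfo" "True"
	Option "metamodes" "DFP-0:1920x2400, DFP-3:1920x2400"
EndSection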

Update: Nvidia Linux drivers after 295.xx have removed the NoTwinViewXineramaInfo option, and due to some strange set of side effects, the Fake Xinerama doesn’t work as a workaround, either. So if you plan to use this on Linux, for now make sure you use a driver no later than 295.xx.

In conclusion, it is quite clear that Nvidia win hands down in terms of features and user experience on Windows, due to their more stable drivers and more intuitive configuration utilities. The story is different on Linux – I would put ATI slightly ahead on that platform, at least in terms of configuration utilities. Having to use Fake Xinerama isn’t a big deal for the technically minded. Even on Linux, however, in terms of the overall outcome and the end experience, I feel Nvidia still come out ahead, since ATI drivers still occasionally produce visible tearing when playing back high definition video.

All this made me think about what the most important thing about a product such as a graphics card really is. In the end it is not just about performance. Performance is only a part of the overall package. What I find is that the most important thing about a product is the whole experience of configuring it and using it. How easy is it to get working under edge-case conditions? How reliable is it – once it is working, does it stay working? Are there any experience-ruining artifacts such as tearing visible in applications, even with vsync enabled? These sorts of things, along with crowning touches such as anti-aliasing of low-resolution modes and only having one active video output until the drivers specifically enable the others, are what really impacts the experience. And based on my experience of Nvidia and ATI cards over the past few years, I hope somebody talks some sense into me if I consider an ATI product again – except perhaps if their FireGL team starts writing their Windows drivers.

Update 2: It would appear that since the Radeon HD4xxx series, the ATI products have degenerated even further. Both their Windows and Linux drivers have removed the functionality of providing custom screen modes; modelines in Linux and monitor .inf driver files get completely ignored – the card will only do the modes that the EDID provides. Worse, ATI cards newer than the HD4xxx only come with a single DL-DVI port. The other DVI port is single-link only. That rules out driving a T221 via the 2xDL-DVI adapter. As a final kick in the teeth, newer ATI cards also only support up to 3 simultaneous outputs, so driving the T221 via 4xSL-DVI channels will not work, either. This makes ATI cards completely unsuitable for use with the T221 monitors. Thankfully all Nvidia cards still support dual-link on all DVI ports, and accept custom EDID modes.

WQUXGA – IBM T221 3840×2400 204dpi Monitor – Part 2: Windows

When I set out to do this, I thought getting everything working under Windows would be easier than it was under Linux. After all, the drivers should be more mature and AMD would have likely put more effort into making sure things “just work” with their drivers. The experience has shown this expectation was unfounded. Getting the T221 working in SL-DVI 3840×2400@13Hz mode was trivial enough, but getting the 2xSL-DVI 2x1920x2400@20Hz mode working reliably has proven to be quite impossible.

The first problem has been the utter lack of intuitiveness in the Catalyst Control Center. It took a significant amount of research to finally find that the option for desktop stretching across two monitors lies behind a right click menu on an otherwise unmarked object:

[Screenshot: CCC Desktop Stretch Option]

Results, however, were intermittent. Sometimes the resolution for the second half of the screen would randomly get mis-set, sometimes it would work. Sometimes the desktop stretching would fail. Eventually, when it all worked (and it would usually require a lot of unplugging of the secondary port to get a usable screen back), it would be fine for that Windows session, but it would all go wrong again after a reboot. The screen would just go to sleep at the point where the login screen should come up, and the only way to wake it up was to unplug the secondary DVI link, log in, and then plug the second cable back in, usually a few times, before it would come up in a usable mode. Then the same resolution and desktop stretching configuration process would have to be repeated – with a non-deterministic number of attempts required, using both the Windows display settings and the Catalyst Control Center.

At first I thought it could be due to the fact that I am using a 4870X2 card, so I disabled one of the GPUs. That didn’t help. Then I tried using a different monitor driver, rather than the “Default Monitor” which is purely based on the EDID settings the monitor provides. I tried a ViewSonic VP2290b driver (this was a rebranded T221), and a custom driver created using PowerStrip based on the EDID settings, and neither helped. Since I only use Windows for occasional gaming and not for any serious work, this isn’t a show stopping issue for me, but I am stunned that AMD‘s Linux drivers are more stable and usable than the Windows ones when using even slightly unusual configurations.

To add a final insult to injury, the 4870X2 card doesn’t end up running the monitor with one GPU driving each 1920×2400 section. Instead, one GPU ends up running both, and the second GPU remains idle. At first I attributed the tearing between the two halves of the screen to each half being rendered by a different GPU. Unfortunately, considering that all tests show that one GPU remains cold and idle while the other one is shown to be under heavy load, I have to conclude that this is not the case. This is particularly disappointing because the experience is both visually bad (tearing between the two 1920×2400 sections) and poorly performing (one GPU always remains idle and the frame rates suffer quite badly – 7-9fps in the Crysis Demo Benchmark). I clearly recall that the Nvidia 9800GX2 card I had before had a configuration option to enable a dual-screen, dual-GPU mode.

I am just about ready to give up on AMD GPUs, purely because the drivers are of such poor quality and lack important features (e.g. the requirement for fakexinerama under Linux, something that Nvidia drivers have a built-in option for). I’m going to dig out my trusty old 8800GT card and see how that compares.

WQUXGA – IBM T221 3840×2400 204dpi Monitor – Part 1: Linux

I’m not sure how many people occasionally stop to notice this sort of thing, but to me it frequently seems that technology regresses for long periods from its infrequent peaks. In the 60s we saw flights of the likes of the XB-70 Valkyrie and the SR-71 Blackbird, and people walked on the moon. Yet in 2011 we are reading about the last flight of the Space Shuttle rather than about the first colony on Mars. It makes a quote from Idiocracy all the more uncanny: “… sadly the world’s greatest minds and resources were focused on conquering hair loss and prolonging erections.”

The same pattern seems to apply to some aspects of the computer industry, when cost pressures take precedence over quality, features and innovation. In 2001 we saw the introduction of the IBM T220 monitor, with a resolution of 3840×2400 on a 22.2″ panel. It was later superseded by the T221 with very similar specifications, but it was ultimately discontinued in 2005. Nothing matching it has been available since. Today, screen resolutions seem to be undergoing an erosion. On small panels the “standards” (sub-standards?) have settled at the completely unusable 1024×600, and with a total of five exceptions from Dell (3007WFP, 3008WFP, U3011), Samsung (305T) and Apple (Cinema HD), the commonly available screens are limited to 1920×1080 resolution. Even 1920×1200 screens are getting more and more rare, especially on laptops, because screens are marketed by diagonal size and for any given diagonal length, 16:9 ratio screens have a smaller surface area than 16:10 ratio screens.

IBM T221 monitors, especially of the latest DG5 variety, are very hard to come by and still expensive if you can ever find one. Typically they sell for double what you can get a Dell 3007WFP for. But you do get more than twice the pixel count and more than twice the pixel density. I have recently acquired a T221 and if your eyes can handle it (and mine can), the experience is quite amazing – once you get it working properly. Getting it working properly, however, can be quite a painful experience if you want to get the most out of it.

My T221 came with a single LFH-60 -> 2x SL-DVI (single-link DVI) cable. There are two LFH-60 connectors on the T221, which allows the screen to be run using 4x SL-DVI inputs. This provides a maximum refresh rate of 48Hz. There is also a way to run this monitor using 2x DL-DVI inputs at 48Hz, but this requires special adapters; that is a subject for another article, since I haven’t got any of those yet.

Using a single LFH-60 -> 2x SL-DVI cable, there are only two modes in which the T221 can be run:

1) As a single 3840×2400 panel @ 13Hz using a single SL-DVI port

2) As two separate monitors, each being 1920×2400 @ 20Hz, using two SL-DVI ports

The 13Hz mode is completely straightforward to get working on both RHEL6 and XP x64, but 13Hz is just not fast enough. You can actually see the mouse pointer skipping as you move it, and playing back a video also results in visible frame skipping. So I have spent the effort to get the 2x1920x2400@20Hz mode working on my ATI HD4870X2. The end results are worth it, but the process isn’t entirely straightforward. The important thing to consider is that, when running in anything other than the 3840×2400@13Hz mode, the T221 appears to the computer as two completely separate 1920×2400 monitors.

IBM T221 with Linux

ATI‘s Linux drivers aren’t really mature enough for the job, and to achieve the best results, you have to use aticonfig to generate xorg.conf without xinerama support, start X-Windows, fire up the amdcccle configuration utility for ATI cards, enable dual screens, then add xinerama support. If all this sounds complicated to you – it is, and it took a lot of trial and error to get right. So to save you the effort, here is a copy of my xorg.conf file. This is from a RHEL6 machine using the ATI fglrx driver. It will almost certainly work on other distributions, too, with little or no modification.
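
As a rough sketch of that sequence (the aticonfig flags are from my memory of the fglrx tooling, so treat them as assumptions and check aticonfig --help before relying on them):

aticonfig --initial            # generate a baseline xorg.conf, without Xinerama
# start X, then enable the second screen from ATI's control centre:
amdcccle
aticonfig --xinerama=on        # finally switch Xinerama on and restart X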

This still won’t work quite as you’d hope, though – Xinerama passes information to the applications about the geometry of the desktop, and apps will only maximize to one screen. This also goes for the task bar, and applies to video playback. The last bit of magic involves faking the Xinerama information. Nvidia drivers come with a built-in option for this: “NoTwinViewXineramaInfo”. Unfortunately, ATI drivers have no such option. But, this being the world of Linux, there is a backup plan. There is an LD_PRELOAD library called Fake Xinerama that can be used to override the screen geometry passed to applications, and make the applications think they are on a single 3840×2400 screen. All you need to do is the following:

1) Compile Fake Xinerama from the link above
2) Add the line “/usr/local/lib64/libXinerama.so” to your /etc/ld.so.preload file.
3) Create a file ~/.fakexinerama containing:

1
0 0 3840 2400

The first line contains the number of screens; the second line’s format is:
<origin X> <origin Y> <width> <height>
If you are booting into a graphical environment immediately (runlevel 5), you will need the .fakexinerama file in root’s home directory, too, since gdm/kdm run as root.
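
Put together as commands (the paths are the ones from the steps above; adjust them if your Fake Xinerama build installs the library somewhere else):

# Preload the fake Xinerama library system-wide
echo "/usr/local/lib64/libXinerama.so" >> /etc/ld.so.preload
# Describe a single 3840x2400 screen, for your user and for root (gdm/kdm run as root)
printf '1\n0 0 3840 2400\n' > ~/.fakexinerama
cp ~/.fakexinerama /root/.fakexinerama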

And if you have managed to follow all that, you will have a single seamless 3840×2400@20Hz desktop.