WQUXGA a.k.a. OMGWTF – IBM T221 3840×2400 204dpi Monitor – Part 3: ATI vs. Nvidia

It is at times like this that I get to fully consider just how bad a decision it was to jump ship from ATI to Nvidia when it came to graphics cards. But now that sense has been forced back upon me, I will hopefully not consider such madness again for the best part of the next decade.

Due to the ATI drivers being fundamentally unable to handle the T221 reliably, I bit the bullet and decided to put my old 8800GT card back in. The first WTF came about when it transpired that ATI drivers cannot be uninstalled from Windows XP using their bundled uninstaller in Safe or VGA modes. This is quite bad when you consider that it could be the ATI drivers themselves that are stopping the machine from booting into normal mode. Credit where it is due, though: the ATI uninstaller was not too bad once I ran it in normal mode, and after using it to remove all ATI software and uninstalling the ATI devices in Device Manager, there wasn’t enough left to cause problems on the next reboot, during which the machine contained an Nvidia card.

Everything booted up fine, and after a quick run of the Auslogics Registry Cleaner (just to make sure – easily the best registry cleaner I have used to date), everything was ready for the installation of the Nvidia drivers. Installation went quite painlessly, and a reboot later I had the T221 configured for 2x1920x2400@20Hz mode. The only thing that didn’t work by default was the 1920×2400@20Hz mode itself, which I had to add manually in the Nvidia Control Panel (via the Customize button).

By this point, the superior features of Nvidia were already becoming apparent:

  • Text and low-resolution mode anti-aliasing in firmware – such modes look vastly better than on ATI hardware.
  • The secondary port remains disabled until the driver explicitly enables it. This is really nice on the T221, because it means you don’t see the same thing on the screen twice during the BIOS POST and the early stages of the boot process. I can imagine the duplicated output also being annoying on a multi-head setup.
  • The primary port doesn’t switch for no apparent reason in Windows with multiple screens plugged in.
  • Best of all – Windows XP drivers Just Work ™. They don’t forget their settings between reboots.
  • No tearing down the middle of the screen where the two halves meet. With the ATI card, the mouse pointer couldn’t be drawn on both halves at the same time – in the middle, you could make it virtually disappear. Not a big deal, but yet another example of general bugginess. The tearing along the same line also disappeared in games (I always run with vsync forced on, and it was still visible from time to time with the ATI card).

Just the properly working drivers would have easily convinced me of the error of my recent ways, but all the other niceties really make for a positive difference to the experience.

After I got Windows working (it only took 20 minutes, after giving up on ATI having wasted a whole day on getting it to work properly and remember the settings between reboots), it was time to get things working in Linux. The first thing that jumped out at me about this part of the exercise is just how much better ATI’s Linux drivers are compared to their Windows drivers. It is obvious that they are actually being developed by somebody competent. Unlike the Windows drivers, the Linux drivers worked out of the box, and the only unusual thing that I needed to do was to make sure Fake Xinerama was configured and preloaded. Removing the ATI drivers was a simple case of:

# rpm -e fglrx

Simple, efficient, reliable. It seems ATI’s Windows driver team have a lot to learn from their Linux driver team.
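For reference, the Fake Xinerama setup mentioned above amounts to something like the following. This is a sketch only – the per-user geometry file format (screen count on the first line, then “x y width height” per screen) and the library path are assumptions that vary between fake-Xinerama implementations, so check the one you use:

```shell
# Describe the two 1920x2400 halves of the T221 as separate Xinerama screens.
# NOTE: the ~/.fakexinerama file format and library path below are assumptions;
# check the docs of your particular fake-Xinerama shim.
cat > "$HOME/.fakexinerama" <<'EOF'
2
0 0 1920 2400
1920 0 1920 2400
EOF

# Then preload the shim for the X session, e.g. from your session startup script:
# export LD_PRELOAD=/usr/lib/libfakexinerama.so
```

With the shim preloaded, window managers that honour Xinerama hints will maximise windows to one 1920×2400 half rather than across the whole 3840×2400 desktop.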

The machine came up fine with the nouveau drivers loaded, but I wanted to get Nvidia’s binary drivers working. The experience here was a little more problematic than it had been with the ATI drivers. The nvidia-xconfig and nvidia-settings utilities weren’t as intuitive as the ATI configuration utility, and the setup suffered from a particularly annoying problem where GPU scaling would default to on. This resulted in the screen mode being left stretched and unusable, though sometimes just starting the nvidia-settings program would fix some of it. In the end I gave up on the utilities and wrote my own xorg.conf according to the documentation – and that worked perfectly. You may want to set the following environment variable to force vsync in GL modes (e.g. for mplayer’s GL output):

# export __GL_SYNC_TO_VBLANK=1

This ensured there was no tearing visible during video playback.

One thing worth noting is that Nvidia drivers bring their own Xinerama layer with them, so the Xorg Xinerama should be disabled. There is also an option for faking the Xinerama information (NoTwinViewXineramaInfo), so no need for Fake Xinerama, either.
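For anyone attempting the same, the TwinView part of a hand-written xorg.conf would look something along these lines. This is a sketch, not my exact file – the MetaModes geometry, offsets and option spellings are assumptions from the driver documentation of that era and will need validating against your own setup:

```
Section "Device"
    Identifier "Videocard0"
    Driver     "nvidia"
    # Drive both T221 halves from a single X screen via TwinView:
    Option     "TwinView" "true"
    Option     "TwinViewOrientation" "RightOf"
    Option     "MetaModes" "1920x2400 +0+0, 1920x2400 +1920+0"
    # TwinView reports each half as a separate Xinerama screen by default,
    # which is what replaces the Fake Xinerama hack; uncommenting the option
    # below would suppress that and present one big screen instead:
    # Option   "NoTwinViewXineramaInfo" "true"
EndSection
```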

In conclusion, it is quite clear that Nvidia win hands down in terms of features and user experience, especially on Windows, due to their more stable drivers and more intuitive configuration utilities. The story is different on Linux – I would put ATI slightly ahead on that platform, at least in terms of configuration utilities. Having to use Fake Xinerama isn’t a big deal for the technically minded. Even on Linux, however, in terms of the overall outcome and the end experience, I feel Nvidia still come out ahead, since the ATI drivers still occasionally produce visible tearing when playing back high definition video.

All this made me think about what the most important thing about a product such as a graphics card really is. In the end it is not just about performance – performance is only a part of the overall package. What I find matters most is the whole experience of configuring and using the product. How easy is it to get working under edge-case conditions? How reliable is it – once it is working, does it stay working? Are there any experience-ruining artifacts, such as tearing visible in applications even with vsync enabled? These things, along with crowning touches such as anti-aliasing of low-resolution modes and having only one active video output until the drivers specifically enable the others, are what really shapes the experience. And based on my experience of Nvidia and ATI cards over the past few years, I hope somebody talks some sense into me if I ever consider an ATI product again – except perhaps if their FireGL team starts writing their Windows drivers.

WQUXGA a.k.a. OMGWTF – IBM T221 3840×2400 204dpi Monitor – Part 2: Windows

When I set out to do this, I thought getting everything working under Windows would be easier than it was under Linux. After all, the drivers should be more mature and AMD would have likely put more effort into making sure things “just work” with their drivers. The experience has shown this expectation was unfounded. Getting the T221 working in SL-DVI 3840×2400@13Hz mode was trivial enough, but getting the 2xSL-DVI 2x1920x2400@20Hz mode working reliably has proven to be quite impossible.

The first problem has been the utter lack of intuitiveness in the Catalyst Control Center. It took a significant amount of research to finally find that the option for desktop stretching across two monitors lies behind a right click menu on an otherwise unmarked object:

CCC Desktop Stretch Option

Results, however, were intermittent. Sometimes the resolution for the second half of the screen would randomly get mis-set; sometimes it would work. Sometimes the desktop stretching would fail. Eventually, when it all worked (and it would usually require a lot of unplugging of the secondary port to get a usable screen back), it would be fine for that Windows session, but it would all go wrong again after a reboot. The screen would just go to sleep at the point where the login screen should come up, and the only way to wake it up was to unplug the secondary DVI link, log in, and then plug the second cable back in, usually a few times, before it would come up in a usable mode. Then the same resolution and desktop stretching configuration process would have to be repeated – with a non-deterministic number of attempts required, using both the Windows display settings and the Catalyst Control Center.

At first I thought it could be due to the fact that I am using an ATI HD4870X2 card, so I disabled one of the GPUs. That didn’t help. Then I tried using a different monitor driver, rather than the “Default Monitor” which is purely based on the EDID settings the monitor provides. I tried a ViewSonic VP2290b-3 driver (this was a rebranded T221), and a custom driver created using PowerStrip based on the EDID settings, and neither helped. Since I only use Windows for occasional gaming and not for any serious work, this isn’t a show stopping issue for me, but I am stunned that AMD’s Linux drivers are more stable and usable than the Windows ones when using even slightly unusual configurations.

To add a final insult to injury, the 4870X2 doesn’t end up running the monitor with one GPU driving each 1920×2400 section. Instead, one GPU ends up running both, and the second GPU remains idle. At first I attributed the tearing between the two halves of the screen to each half being rendered by a different GPU. Unfortunately, considering that all tests show one GPU remaining cold and idle while the other is under heavy load, I have to conclude that this is not the case. This is particularly disappointing because the experience is both visually bad (tearing between the two 1920×2400 sections) and poorly performing (one GPU always remains idle and the frame rates suffer quite badly – 7-9fps in the Crysis Demo Benchmark). I clearly recall that the Nvidia 9800GX2 card I had before had a configuration option to enable dual-screen dual-GPU mode.

I am just about ready to give up on AMD GPUs, purely because the drivers are of such poor quality and lacking in important features (e.g. the need for Fake Xinerama under Linux, something the Nvidia drivers have a built-in option for). I’m going to dig out my trusty old 8800GT card and see how that compares.

Genesi Efika MX Smartbook’s 0 Button Mouse

I love my Genesi Efika MX Smartbook – it’s an awesome little machine. But there have been three things that have bothered me about it since I got mine, and they are the sort of things that can make a difference between sub-mediocrity and brilliance. I have already covered one of the issues in a previous post concerning the screen upgrade.

The second big problem I have with it is that the buttons on the touchpad are completely unusable. This is not an exaggeration. Due to the way they are designed, it is only possible to use them for dragging with a copious amount of luck – not skill, luck. Clicking using the touchpad buttons requires only an infinitesimally smaller amount of luck than dragging. This isn’t acceptable, and since I otherwise rather like the Smartbook, I decided to find a good workaround that doesn’t involve carrying a mouse or a trackball with me – that would ruin one of the best things about it: the portability.

I used to have Sony Vaio PCG-U1 and PCG-U3 machines in the past. They were quite awesome, and competed quite successfully on spec with the Genesi Efika MX Smartbook – which is fairly impressive considering the Vaios in question were produced in 2002, 9 years ago. The main reason why I finally needed to upgrade from the old Vaio was that its 1024×768 screen resolution simply stopped being sufficient for any serious use. The standard Efika would have failed this requirement even worse were it not for the possibility of the 1280×720 screen upgrade. Plus, the Efika is much thinner and doesn’t require a battery pack as big as the rest of the laptop for 6 hours’ battery life. But I digress. The main point I was getting to is that the Vaio had mouse buttons that were quite separate from the joypad, while still being very ergonomic and easy to use. This made me think about using a similar trick on the Efika. All I needed was two conveniently placed yet redundant keys on the keyboard to remap into mouse buttons. The “House” key (the one with an icon of a house, as opposed to “Home”) and the “Alt” key in the bottom left corner seemed perfect for this task.

To do this, we need to do three things:

  1. Disable Xorg’s usage of the keys using xmodmap. I put mine in /etc/X11/xmodmap.
  2. Configure actkbd to trap the low-level keystrokes and execute xdotool commands that issue Xorg mouse button events. Put this configuration in /etc/actkbd.conf.
  3. Put the two together and make it happen automatically on login using a script /etc/X11/Xsession.d/95-keyremap.
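The three pieces might look something like the following sketch. The keycodes are placeholders – the kernel keycodes in the actkbd configuration (172 for the “House” key, 56 for Left Alt here) and the X keycodes in the xmodmap file (180 and 64 here) differ from each other and vary by keyboard, so check yours with `actkbd -s` and `xev` respectively. The sketch writes example files in the current directory; the real files go in the paths listed above:

```shell
# 1. Tell X to ignore the two keys so they stop generating Alt/House key
#    presses (goes in /etc/X11/xmodmap; X keycodes below are assumptions):
cat > xmodmap.example <<'EOF'
keycode 180 = NoSymbol
keycode 64 = NoSymbol
EOF

# 2. Have actkbd translate the raw kernel key events into X mouse button
#    events via xdotool (goes in /etc/actkbd.conf). Using mousedown on key
#    press and mouseup on release is what makes dragging work; the kernel
#    keycodes 172 and 56 are assumptions:
cat > actkbd.conf.example <<'EOF'
172:key::env DISPLAY=:0 xdotool mousedown 1
172:rel::env DISPLAY=:0 xdotool mouseup 1
56:key::env DISPLAY=:0 xdotool mousedown 3
56:rel::env DISPLAY=:0 xdotool mouseup 3
EOF

# 3. Apply the keymap on login (goes in /etc/X11/Xsession.d/95-keyremap;
#    actkbd itself runs as a system daemon started at boot):
cat > 95-keyremap.example <<'EOF'
xmodmap /etc/X11/xmodmap
EOF
```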

That is pretty much it. The “House” and “Left Alt” keys will now act as the left and right mouse buttons respectively. I hope you find it to be as big an improvement as I did. It feels like having mouse buttons again after being stuck with a 0 button mouse.

These instructions are for Ubuntu, since that is what the Efika ships with and I haven’t gotten around to putting Fedora on it yet. It shouldn’t be difficult to adapt the above approach for other distributions.