Dual-head setup with ATI Radeon HD 4650

Every once in a while I need to set up a new computer. This process is usually smooth because my computers are fairly uniform in their setup; they all run Debian unstable with KDE4. I install a minimal Debian testing from the current net-installer, update to unstable, and install further packages as necessary.

A few things can trip me up during this process; for example, I would usually try to set up an encrypted partition, but would have forgotten the details, and would have to spend an unnecessary amount of time playing with the partitioner figuring them out again. Whenever something like this happens, I resolve to write down the details for posterity, but of course never actually do it. Until today.

Configuring X was once considered the trickiest part of installing a GNU/Linux system, at least for Debian. Thanks to the progress made in recent years, this is no longer true. First there was debconf to help you generate a configuration file, and now you don't even need a configuration file in most cases. Except when you have a dual-head setup.

I first had the opportunity to use a dual-head setup (two monitors connected to a computer) about four years ago, and have since become a big fan. I had an Nvidia card then, and setting up the system involved downloading Nvidia's proprietary driver and kernel module, and hand-crafting a suitable xorg.conf file. I have since forgotten the details, but I ended up using something Nvidia referred to as TwinView, which resulted in a very sensible combination of essentially two separate KDE desktops but with the ability to drag windows from one to the other, or resize a window to cover both. The desktops were separate in the sense that they had separate panels, separate desktop backgrounds, etc., and new windows would appear in either of the desktops, not across both. In particular, krunner (Alt-F2) would appear centered on one of the desktops, and not straddling the two. In other words, it was not simply a very large desktop with half showing in each monitor. Maintenance-wise, the only irritating aspect was that with every kernel upgrade, the kernel module had to be recompiled (and X would stop working till that was done). module-assistant was helpful, although not always as smooth as I would have liked.
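
For reference, the relevant part of that old Nvidia xorg.conf looked roughly like the excerpt below. This is reconstructed from memory, so treat it as a sketch rather than a working configuration; the MetaModes resolutions in particular are placeholders for whatever your monitors actually support.
Section "Device"
    Identifier "nvidia0"
    Driver     "nvidia"
    Option     "TwinView" "true"
    Option     "TwinViewOrientation" "RightOf"
    Option     "MetaModes" "1280x1024,1280x1024"
EndSection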

When I moved to ISI late last year, I naturally put in a request for a dual-head setup. Bureaucracy being what it is (and March being the end of the financial year), I finally got a dual-head graphics card installed today (the rest of the computer has been up and running for a while already). Setting it up was not trivial, and this time, I decided to actually write down my experience.

I had originally asked for a Radeon HD 4650XT, because HP mentioned it as a dual-head card compatible with the desktop, which was an HP "business desktop". The card I got was an ATI Radeon HD 4650 (i.e., no mention of the XT part --- I'm not sure what the difference is). Once it was plugged into the proper PCIe x16 slot, and we got over the initial scare regarding whether a DVI-D cable was supposed to fit into a DVI-I connector (it was), we started up the computer again and got back the previous desktop, now mirrored on the two monitors, but with no hardware acceleration. The next few hours were spent getting a proper dual-head configuration working with accelerated graphics. That is what this post is about.

At this point, I did not have an xorg.conf file. The default driver selected by X turned out to be "radeon". There was another alternative, "radeonhd". A little research suggested that neither was likely to have 3D acceleration (radeonhd claims experimental 3D support for RV730, the chipset for the HD 4650), and furthermore it wasn't immediately clear how to set up a dual-head system with them. More importantly, the same research suggested that ATI/AMD provides a proprietary binary Linux driver, much like Nvidia, for its newer cards. I decided to check it out. For the record, the open source drivers can probably also do dual-head, and I might try them out sometime. This wiki entry claims that 3D acceleration is currently better with the proprietary driver, but 2D acceleration is worse. I haven't found an explicit comparison with regard to dual-head setups, but I haven't looked that hard.
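
Incidentally, if you want to confirm which card you have and which driver X actually picked, something like the following works (assuming the usual log location of /var/log/Xorg.0.log):
lspci -nn | grep -i vga
grep -i LoadModule /var/log/Xorg.0.log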

Anyway, the first promising document I found was an ATI FAQ, which started off by saying that
The ATI Proprietary Linux driver currently provides hardware acceleration for 3D graphics and video playback. It also includes support for dual displays...
Exactly what I want! There is also mention of some sort of "control panel". Even better. Unfortunately, the FAQ was a bit vague about where I would get said wonderful driver.

It turns out that the magic word is fglrx, which stands for "FireGL and Radeon for X". Debian's non-free section has packages for the proprietary driver, and installation is described in the wiki, which does say something a little scary:
For 3D acceleration, fglrx requires an associated kernel module, its compilation can be automated via module-assistant or DKMS.
I started off by installing fglrx-driver (and dependencies), and fglrx-control, which was suggested by the former and apparently contained the aforementioned control panel. I was mentally preparing myself for a brief tussle with module-assistant, but the installation step automatically downloaded fglrx-modules-dkms and compiled something, and seemed to be happy with the results. I assumed (correctly) that no further kernel module compilation was required.
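
For the record, the whole thing boiled down to something like the following, assuming non-free (and contrib, for the DKMS bits) is already enabled in /etc/apt/sources.list; the mirror URL is just an example:
deb http://ftp.debian.org/debian unstable main contrib non-free
sudo apt-get update
sudo apt-get install fglrx-driver fglrx-control
The nice part of the DKMS route is that the module gets rebuilt automatically on every kernel upgrade, so the old recompile-after-every-upgrade dance (and the accompanying X downtime) should be a thing of the past.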

The recent move of X to an xorg.conf-less setup caught me a little by surprise, and I'm not yet sure when an explicit configuration file is necessary. So, as my next step, I rebooted my computer (I could have just restarted X, but the wiki made noises about unloading kernel modules, so I didn't risk it). Naturally nothing interesting happened, and I ended up still using the "radeon" driver. I think X simply has a default driver for each card, irrespective of what other drivers are available on the system (which is sensible).

I decided to try out the control panel. The package was fglrx-control, and the web had references to an executable called fglrxconfig, but there was no such thing on my system. I ended up calling
wajig list-files fglrx-control
to see what it had installed, and guess what? The executable I was looking for was amdcccle (the CCC stands for Catalyst Control Center; Catalyst is the name of the configuration software). I started it up and got a not entirely unexpected message:
There was a problem initializing Catalyst Control Center Linux edition. It could be caused by the following.

No ATI graphics driver is installed, or the ATI driver is not functioning properly. Please install the ATI driver appropriate for you ATI hardware, or configure using aticonfig.
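
(As an aside, wajig list-files is just a friendlier front end to dpkg; the same listing, narrowed down to the executables, can be had without wajig via
dpkg -L fglrx-control | grep bin
which is how amdcccle turns up.)
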
So I ran aticonfig (supposedly a utility to "parse an existing X-Server configuration file and modify it to operate with ATI products") to get
$ sudo aticonfig --initial=dual-head
[sudo] password for deepayan:
Could not find configuration file, so create an empty one
Uninitialised file found, configuring.
Using /etc/X11/xorg.conf
Saved back-up to /etc/X11/xorg.conf.original-0
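
For the curious, the generated xorg.conf contains Device (and corresponding Screen and Monitor) sections along the following lines; the identifier naming is aticonfig's own convention and may differ slightly between driver versions:
Section "Device"
    Identifier "aticonfig-Device[0]-0"
    Driver     "fglrx"
EndSection
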
Rebooting again, I got one screen (the other went blank), with apparently no 3D acceleration, but with the "fglrx" driver now in use. By the lack of 3D acceleration, I actually mean that the compositing effects of KDE were disabled. The KDE control panel said "Compositing is temporarily disabled" without any further explanation. It turns out that 3D acceleration was working fine, as evidenced by running fgl_glxgears.

Instead of describing the next phase of discovery in excruciating detail, I will just summarize the results. amdcccle worked perfectly this time around, and once I got used to it, it turned out to be everything I could have wanted (except that it required an X restart every time you changed something, which is not really its fault). There were two approaches that could give me the dual-head setup I wanted. One was Xinerama (a term I remembered as being associated with dual-head setups, though I didn't remember the details); the other was what ATI refers to as Big Desktop mode (similar to Nvidia's TwinView).

Enabling Xinerama using amdcccle was a two-step process. First, go to Display Manager and choose "Single Display Desktop (multi-desktop)", restart X, then go to Display Options and choose Xinerama (the Xinerama option would be disabled without the first step). This was otherwise nice, but KDE's display control module now said that "Compositing is not supported on your system. Required X extensions (XComposite and XDamage) are not available."
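
If you would rather avoid the GUI, aticonfig appears to have a flag for the same thing (an X restart is still needed afterwards):
sudo aticonfig --xinerama=on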

It turns out that Xinerama is incompatible with the X extensions required for compositing. Xinerama predates the days of ubiquitous dual-head graphics cards, and was designed for sharing a desktop across multiple monitors connected to multiple graphics cards. For proper dual-head cards, that is, when both monitors are connected to the same GPU, the RandR extension provides a functionally similar alternative that does not suffer from this drawback. Xinerama is apparently now deprecated in favour of RandR. This is (I think) what underlies ATI's Big Desktop mode.
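
With a RandR-capable driver, the analogous layout can be requested at runtime with xrandr. The output names below (DVI-0, DVI-1) are guesses; run xrandr with no arguments to see what your system reports. Whether fglrx of this vintage honours RandR 1.2 requests is another matter, but the open source drivers do:
xrandr
xrandr --output DVI-0 --auto --output DVI-1 --auto --right-of DVI-0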

To enable this using amdcccle, one needs to choose "Multi-display desktop [with display...]" from Display Manager and restart X. (If Xinerama was active, that needs to be disabled first, requiring an extra X restart.) And that was it.
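
The command-line equivalent appears to be aticonfig's "big desktop" option; I haven't verified the exact invocation, but it is something like:
sudo aticonfig --dtop=horizontal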

Interestingly, although not surprisingly, fgl_glxgears gives a frame rate of ~1200 FPS without compositing, and ~300 FPS with it. So, disable compositing if you want high OpenGL performance in an application.
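
In KDE 4 you don't even need to open System Settings for this: compositing can be suspended on the fly with the default Alt+Shift+F12 shortcut, or, if you want it in a game-launching script, via KWin's D-Bus interface (the service and method names below are KDE 4's, as far as I know):
qdbus org.kde.kwin /KWin toggleCompositing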

2 comments:

Philipp said...

Thanks man! I'd never have guessed that Xinerama was the problem! Turning it off in amdcccle and then setting my displays to "Multi-Desktop display with Screens (2)" did the trick.

Dave said...

I should point out that the lack of compositing in the Xinerama mode was causing Ubuntu's Unity (which uses Compiz) to fail to show up entirely, leaving me without any kind of desktop environment. Switching back out to non-Xinerama made things come back, and it looks like they've fixed the desktop geometry issues with asymmetrical monitor sizes (previously, the desktop would be a rectangle consisting of the maximum extents of all monitors, which meant there was a lot of "dead" space around the borders that your cursor could get lost in).

Anyway, thanks for the post! It was helpful.