Here I go again.
A friend had recently decided that he wanted a new laptop/netbook. After some research, he decided on a netbook, and we went over to Nehru Place yesterday to actually buy it. He was interested in installing Linux on it (of course!), and was a little bit worried about that. This was to be my first close-up encounter with a netbook, so I volunteered to take care of the Linux issue for him. This post, like the last one, is to document this learning experience.
My friend had decided (with substantial prodding from me) on the black ASUS Eee PC 1005PE. Unfortunately, if the shopkeepers at Nehru Place were to be believed, that model is not yet available in India. The model that was available was the 1005HA, but the only option had a horrible color combination (white overall, but the boundary of the screen was black---yuck!) and a cheap-feeling keyboard. This was a 2GB RAM, 250GB HDD model for ~Rs 17k, but we decided to forgo it.
Luckily, a completely different ASUS model caught our eye: the N10J. It's not part of the Eee series; rather, it's marketed as a business-oriented "ultraportable subnotebook", even though it looked like a netbook to us: a 1.6GHz processor, 2GB RAM, 160GB HDD, 10.2" screen. It had some useless (to us) fluff like a fingerprint reader and an HDMI output, but its main distinguishing feature appears to be an extra high-performance Nvidia graphics card. The N10J is part of the N10 series, which has several other members, confusingly named N10E, N10C, N10H, N10J, etc. We didn't know all of this then, of course, but some good background is available at http://n10.wikia.com/wiki/Overview.
Anyway, this post is about installing Linux on the N10J, so let me get to that without further ado. We already had an Ubuntu Netbook Remix (Karmic Koala) bootable USB stick all prepared, so we started off trying to install from it. In hindsight, we should have just used plain Ubuntu. Also, it turns out this was just a couple of days before the Lucid Lynx release; had I known that, I would have used the LL release candidate. But I don't keep myself up to date on Ubuntu (which, incidentally, is an ancient African word meaning "I can't configure Debian"), so I didn't know. We also briefly considered Debian, but there were some reports of wireless problems with it; as this was not my machine, I decided not to risk it.
The first non-trivial step was to get the N10J to boot up from USB. We were forewarned that this might be an issue, and not surprisingly, changing the boot sequence to have "External devices" before "Hard disk" did not do the trick. By the way, the N10J comes preconfigured to not show any bootup messages, so you need to know the magic key to enter the BIOS, which in this case was F2.
Once we enabled bootup messages, there was some mention of pressing F9(?) for PBOOT(?) options. Doing so gave us the option of choosing the boot device: the regular SATA hard disk, or the Kingston USB drive we had plugged in. We chose the latter, and soon we had a live Ubuntu session running. Wireless seemed to work. The desktop menu (a special feature of Netbook Remix) seemed very sluggish. We assumed that this was due to the default "nv" driver for the Nvidia card, and that installing the proprietary Nvidia driver would solve the issue. Ubuntu even offered to do that for us, but of course we didn't do it in the live session.
Things seemed to be OK, so we went ahead and installed Ubuntu. Helpfully, the disk was already partitioned into 11GB (Vista recovery) + 30GB (Vista) + 100GB (DATA), and we simply deleted the 100GB partition using gparted and chose the "largest contiguous free space" option to install Ubuntu. Installation went smoothly. However, boot time seemed very slow, the desktop menu was still sluggish, and suspend/resume didn't work (i.e., suspend worked fine, but then a hard reboot was needed to resume). This time, Ubuntu did not offer to install the proprietary Nvidia driver, and trying to do it manually (System->Admin->Hardware drivers) did not work either.
Leaving aside the Nvidia issue for the moment, we next tried to update the package database in order to install new software. First we had to choose a mirror. We selected the "India" mirror and tried updating the list of available packages. This step finished very fast, but only because nothing was actually found (all attempts were listed as "Failed" in the 'Details' section). As we had nothing else to guide our choice, we next tried the "Choose Best Server" option, which apparently pings all the known mirrors to make a decision. It chose https://ubuntuarchive.hnsdc.com, which of course turned out to be the same as the India mirror. After a lot of trial and error spanning several hours, we eventually settled on a US mirror. Upgrading to the latest versions happened in a reasonable amount of time, and we were also able to install new packages such as emacs, latex, and KDE4.
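For the record, switching mirrors by hand boils down to editing the mirror URL in /etc/apt/sources.list and refreshing the package lists. Here is a minimal sketch on a scratch file (the mirror hostnames and the demo path are assumptions for illustration, not taken from our actual session):

```shell
# Demo on a scratch copy; on a real system the file is /etc/apt/sources.list
# and you would follow the edit with "sudo apt-get update".
SOURCES=/tmp/sources.list.demo
cat > "$SOURCES" <<'EOF'
deb http://in.archive.ubuntu.com/ubuntu karmic main restricted universe
deb http://in.archive.ubuntu.com/ubuntu karmic-updates main restricted universe
EOF
# Point every entry at the US mirror instead of the Indian one:
sed -i 's|in\.archive\.ubuntu\.com|us.archive.ubuntu.com|g' "$SOURCES"
cat "$SOURCES"
```

The GUI mirror chooser we used edits the same file behind the scenes.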
By this time, I had realized that all the vague talk about "switchable graphics" meant that the choice between the high-performance Nvidia card and the regular Intel card was to be made by flipping a physical switch on the left of the machine, and then rebooting. We switched to Intel and rebooted, and all our sluggishness and resume problems vanished; even 3D acceleration seemed to work. For now, we are just sticking to the Intel card, as setting up and maintaining the switchable graphics seemed a little painful.
We are happy with the results so far.
P.S.: There were reports of the webcam output being upside down. We tested this using cheese (which came with Ubuntu), and everything seemed to work fine, and we were happy. Then we tried the webcam on Skype, and the video came up upside down. Go figure!
Dual-head setup with ATI Radeon HD 4650
Every once in a while I need to set up a new computer. This process is usually smooth because my computers are fairly uniform in their setup; they all run Debian unstable with KDE4. I install a minimal Debian testing from the current net-installer, update to unstable, and install further packages as necessary.
A few things can trip me up during this process; for example, I would usually try to set up an encrypted partition, but would have forgotten the details, and would have to spend an unnecessary amount of time playing with the partitioner figuring them out again. Whenever something like this happens, I resolve to write down the details for posterity, but of course never actually do it. Until today.
Configuring X was once considered the trickiest part of installing a GNU/Linux system, at least for Debian. Thanks to the progress made in recent years, this is no longer true. First there was debconf to help you generate a configuration file, and now you don't even need a configuration file in most cases. Except when you have a dual-head setup.
I first had the opportunity to use a dual-head setup (two monitors connected to a computer) about four years ago, and have since become a big fan. I had an Nvidia card then, and setting up the system involved downloading Nvidia's proprietary driver and kernel module, and hand-crafting a suitable xorg.conf file. I have since forgotten the details, but I ended up using something Nvidia referred to as TwinView, which resulted in a very sensible combination of essentially two separate KDE desktops but with the ability to drag windows from one to the other, or resize a window to cover both. The desktops were separate in the sense that they had separate panels, separate desktop backgrounds, etc., and new windows would appear in either of the desktops, not across both. In particular, krunner (Alt-F2) would appear centered on one of the desktops, and not straddling the two. In other words, it was not simply a very large desktop with half showing in each monitor. Maintenance-wise, the only irritating aspect was that with every kernel upgrade, the kernel module had to be recompiled (and X would stop working till that was done). module-assistant was helpful, although not always as smooth as I would have liked.
When I moved to ISI late last year, I naturally put in a request for a dual-head setup. Bureaucracy being what it is (and March being the end of the financial year), I finally got a dual-head graphics card installed today (the rest of the computer has been up and running for a while already). Setting it up was not trivial, and this time, I decided to actually write down my experience.
I had originally asked for Radeon HD 4650XT, because HP mentioned it as a dual-head card compatible with the desktop, which was an HP "business desktop". The card I got was an ATI Radeon HD 4650 (i.e., no mention of the XT part --- I'm not sure what the difference is). Once it was plugged into the proper PCIe16 slot, and we got over the initial scare regarding whether a DVI-D cable was supposed to fit into a DVI-I slot (it was), we started up the computer again and got back the previous desktop, now mirrored on the two monitors, but with no hardware acceleration. The next few hours would be spent on getting a proper dual-head configuration working with accelerated graphics. That is what this post is about.
At this point, I did not have an xorg.conf file. The default driver selected by X turned out to be "radeon". There was another alternative, "radeonhd". A little research suggested that neither was likely to have 3D acceleration (radeonhd claims experimental 3D support for RV730, the chipset for the HD 4650), and furthermore it wasn't immediately clear how to set up a dual-head system. More importantly, the same research suggested that ATI/AMD provides a proprietary binary Linux driver, much like Nvidia, for its newer cards. I decided to check it out. For the record, the open source drivers can probably also do dual-head, and I might try them out sometime. This wiki entry claims that 3D acceleration is currently better for the proprietary driver, but 2D acceleration is worse. I haven't found an explicit comparison with regard to dual-head setups, but I haven't looked that hard.
Anyway, the first promising document I found was an ATI FAQ, which started off by saying that
"The ATI Proprietary Linux driver currently provides hardware acceleration for 3D graphics and video playback. It also includes support for dual displays..."

Exactly what I want! There is also mention of some sort of ``control panel''. Even better. Unfortunately, the FAQ was a bit vague about where I would get said wonderful driver.
It turns out that the magic word is fglrx, which stands for ``FireGL and Radeon for X''. Debian's non-free section has packages for the proprietary driver, and installation is described in the wiki, which does say something a little scary:
"For 3D acceleration, fglrx requires an associated kernel module; its compilation can be automated via module-assistant or DKMS."

I started off by installing fglrx-driver (and its dependencies) and fglrx-control, which was suggested by the former and apparently contained the aforementioned control panel. I was mentally preparing myself for a brief tussle with module-assistant, but the installation step automatically downloaded fglrx-modules-dkms, compiled something, and seemed to be happy with the results. I assumed (correctly) that no further kernel module compilation was required.
The recent move of X to a xorg.conf-less setup caught me by surprise a little bit, and I'm not yet sure when an explicit configuration file is necessary. So, as my next step, I rebooted my computer (I could have just restarted X, but the wiki made noises about unloading kernel modules, so I didn't risk it). Naturally nothing interesting happened, and I ended up still using the "radeon" driver. I think X simply has a default driver for each card, irrespective of what drivers are available on the system (which is sensible).
I decided to try out the control panel. The package was fglrx-control, and the web had references to an executable called fglrxconfig, but there was no such thing on my system. I ended up calling
wajig list-files fglrx-control
to see what it had installed, and guess what? The executable I was looking for was amdcccle (the CCC stands for Catalyst Control Center; Catalyst is the name of the configuration software). I started it up and got a not entirely unexpected message:
"There was a problem initializing Catalyst Control Center Linux edition. It could be caused by the following.

No ATI graphics driver is installed, or the ATI driver is not functioning properly. Please install the ATI driver appropriate for your ATI hardware, or configure using aticonfig."

So I ran aticonfig (supposedly a utility to ``parse an existing X-Server configuration file and modify it to operate with ATI products''):

$ sudo aticonfig --initial=dual-head
[sudo] password for deepayan:
Could not find configuration file, so create an empty one
Uninitialised file found, configuring.
Using /etc/X11/xorg.conf
Saved back-up to /etc/X11/xorg.conf.original-0

Rebooting again, I got one screen (the other went blank), with apparently no 3D acceleration, but with the "fglrx" driver now in use. By the lack of 3D acceleration, I actually mean that KDE's compositing effects were disabled. The KDE control panel said "Compositing is temporarily disabled" without any further explanation. It turns out that 3D acceleration was working fine, as evidenced by running fgl_glxgears.
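The file aticonfig writes is an ordinary xorg.conf. For orientation, the sketch below shows the general shape of the relevant sections; the identifiers, BusID, and depth are illustrative guesses, not a copy of my generated file:

```
Section "Device"
	Identifier   "aticonfig-Device[0]-0"
	Driver       "fglrx"
	BusID        "PCI:1:0:0"
EndSection

Section "Screen"
	Identifier   "aticonfig-Screen[0]-0"
	Device       "aticonfig-Device[0]-0"
	DefaultDepth 24
EndSection
```

The important bit is simply that Driver is now "fglrx" instead of "radeon".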
Instead of describing the next phase of discovery in excruciating detail, I will just summarize the results. amdcccle worked perfectly this time around, and once I got used to it, turned out to be everything I could have wanted it to be (except that it required an X restart every time you changed something, which is not really its fault). There were two approaches that could give me the dual-head setup I wanted. One was Xinerama (a term I remembered as being associated with dual-head setups, though I didn't remember the details); the other was what ATI refers to as Big Desktop mode (similar to Nvidia's TwinView).
Enabling Xinerama using amdcccle was a two-step process. First, go to Display Manager and choose "Single Display Desktop (multi-desktop)", restart X, then go to Display Options and choose Xinerama (the Xinerama option would be disabled without the first step). This was otherwise nice, but KDE's display control module now said that "Compositing is not supported on your system. Required X extensions (XComposite and XDamage) are not available."
It turns out that Xinerama is incompatible with the X extensions required for compositing. Xinerama predates the days of ubiquitous dual-head graphics cards, and was designed for sharing a desktop across multiple monitors connected to multiple graphics cards. For proper dual-head cards, that is, when both monitors are connected to the same GPU, the RandR extension provides a functionally similar alternative that does not suffer from this drawback. Xinerama is apparently now deprecated in favour of RandR. This is (I think) what underlies ATI's Big Desktop mode.
To enable this using amdcccle, one needs to choose "Multi-display desktop [with display...]" from Display Manager and restart X. (If Xinerama was active, that needs to be disabled first, requiring an extra X restart.) And that was it.
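On drivers with native RandR support, a similar layout can be requested from the command line instead of a vendor control panel. A hedged sketch follows; the output names DVI-0/DVI-1 and the 1280x1024 resolutions are assumptions about the hardware, not details from my setup:

```shell
# Setting the layout needs a running X session, so the xrandr call is
# shown as a comment rather than executed:
#   xrandr --output DVI-0 --auto --output DVI-1 --auto --right-of DVI-0
# Either way, "big desktop" means one framebuffer spanning both monitors;
# two 1280x1024 screens side by side give:
W1=1280; W2=1280; H=1024
echo "big desktop: $((W1 + W2))x${H}"   # → big desktop: 2560x1024
```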
Interestingly, although not surprisingly, fgl_glxgears gives a framerate of ~1200 without compositing, and ~300 with. So, disable compositing if you want high OpenGL performance in an application.
Scrapbook - 2
Deepayan went to get a haircut in the University District. Not much hair, a balding look... the person cutting his hair spotted a few gray hairs on Deepayan's head and warned him: "You have got some white hair!" Then, very sympathetically, he added: "But every last one helps"!
Scrapbook - 1
On the occasion of our first anniversary, we went to eat at 'The Brooklyn', a fairly well-known restaurant in Seattle. When the waiter came to take our order, we asked him, "What do you suggest? Quail or Halibut?" He said, "Quail is excellent! Halibut is also great!" We ordered the Quail and a New York Strip Steak, along with red wine. The waiter was very friendly, and went off happily with our order.
When he came back with the food, he whispered in Deepayan's ear: "The Quail is just great! And the best part is you didn't have to go and hunt it!"