
Sunday, August 19, 2012

Geek Heaven, Part II: Hardware

But I don't DO hardware ...

I'm predominantly a software guy, not hardware. I know this from my worst failure ever, when I tried to build my own PC from a motherboard, CPU, and graphics card, and ended up shorting them all out and turning them into a recycling project. Since then I've done nothing more challenging than swapping out a hard drive, adding memory, or replacing an expansion card -- tasks that anyone who can follow some simple instructions could handle.

I've especially steered clear of daring to open a laptop computer case. Those things feel as if the components packed inside them are going to spring out like Slinky toys when you lift the lid, never to be reassembled. Don't you have to have the careful fingers of East Asian assembly line workers to deal with all those fragile connectors and tiny screws?

Necessity, the Mother

But necessity, as they say, is a mother. I was driven to the attempt by my inability to find a netbook computer with the exact specs I wanted for a price I was willing to pay. I wanted a solid-state drive instead of a hard drive, for the durability and extended battery life, but for marketing reasons manufacturers only sell them preinstalled in high end "ultrabooks" priced from $700 on up. This is due to the cost of the SSDs, which currently runs around $1 per gigabyte, compared to only 10 cents per gig for a hard drive. (I remember when hard drives cost $16 per gig, so progress is being made!)

Instead I bought a low end netbook for $279 and added my own 128 gig SSD for an additional $119, saving half the money while getting what I wanted in a smaller package. (Ultrabooks seem to need larger keyboards and screens to justify their price.) For an extra $20 I also doubled the memory capacity from 2 to 4 gigs. Worth a bit of effort.

The case in point is an Acer Aspire, 722 series, which came with an AMD dual-core 64-bit low-power processor, Radeon dedicated graphics, and a 250 gig hard drive running Windows 7. Performance out of the box was surprisingly good, even with only 2 gigs of RAM. I played with it a bit just to be sure everything was working before I proceeded to jeopardize my warranty.

Compatibility Testing

Assured that the hardware was OK, I next tried running Linux on it to check compatibility before committing to an install. This is another huge advantage that Linux has to offer -- the Live disk. You can boot up a fully operational desktop directly from a CD or DVD without doing any installation whatsoever, so you can find out in advance if there will be any problems with your hardware. Contrast this with a typical Windows install, with its multiple reboots and requests for drivers that you may or may not be able to find. (Most people never make this comparison because they never had to install Windows; it just came with their computer.)

I used Linux Mint 13, KDE version, which is based on Ubuntu 12.04. The netbook did not come with an optical drive, but the Mint ISO files are "hybrid" images that can be written directly to a USB flash drive and will happily run from there. I just had to go into the BIOS settings to be sure the Acer would try to boot from any USB drive before falling back to the hard drive, then plugged in the flash drive that lives on my key chain.
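If you've never made one of these sticks, the whole trick is a single dd command. Here's the idea demonstrated on scratch files so nothing real gets overwritten; for an actual stick you'd point the output at the device node itself (say /dev/sdb, found via lsblk -- the whole device, not a partition like /dev/sdb1) and run it with sudo. The file names below are made up for the demo.

```shell
# Demo of writing a hybrid ISO to a USB stick, with scratch files
# standing in for the real ISO and the real device so this is safe to run.
ISO=/tmp/demo.iso          # stand-in for the Mint hybrid ISO
STICK=/tmp/demo-stick.img  # stand-in for the raw device, e.g. /dev/sdb

head -c 1M /dev/urandom > "$ISO"   # fake 1 MB "ISO"

# The actual write: raw image to raw device, nothing mounted,
# no filesystem involved. conv=fsync flushes to "disk" before dd exits.
dd if="$ISO" of="$STICK" bs=4M conv=fsync status=none

# Sanity check: the stick should now be byte-identical to the ISO.
cmp -s "$ISO" "$STICK" && echo "write verified"
```

With a real device the only differences are sudo and the scary part: dd will overwrite whatever you point it at without asking, so triple-check that device name.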

The Hack

As expected, almost everything was properly detected and "automagically" configured in the boot process. Within a minute I was looking at the familiar Mint desktop. The only exception was the wifi connection, but I had looked this up in advance and knew how to work around it. The system would detect available wifi networks -- even the very weak ones emanating from my neighbors' houses -- but if I tried to connect to mine it would freeze the whole computer. Not a problem, thanks to an inspired hack published online.

The Acer netbook also has the option to boot from a network server, and if you place that option first in the boot hierarchy it will initialize the wifi and look for a server to connect to. In a matter of moments, not finding one, it gives up and proceeds to boot from the next available device. But now, with the wifi properly initialized, it works perfectly under Linux.

True, it may not be pretty to have to look at a routine error message each time you start up, but it seems a small price to pay for the ability to run my software of choice. Likely this shortcoming will be addressed in future releases anyway, and I'll be able to eliminate the hack. One of the things said about open source is, "We'll fix it for you while you sleep!" Thanks to the tireless efforts of programmers around the globe, in the fullness of time most everything is eventually resolved.



Attack of the Clones

Finally ready to begin, I started by "cloning" the Windows installation from the hard drive onto the new SSD. I could have just installed Linux onto the blank drive, but I hate to throw anything away. Somewhere in that $279 price tag was a fee I had paid for Windows, so why not keep it?

There was one difficulty: the SSD had about half the capacity of the hard drive, which complicates the process because the partitions have to be resized to fit. You also need to attach both drives at the same time so you can copy from one to the other. I opted for the easy way out by getting an SSD from Crucial that comes packed with a SATA to USB cable and software that does the cloning for you in one simple step. (They sell the kit separately for $18, and Corsair sells one too.) I thought that cable might come in handy for something else some day, so it was worth paying a bit extra for it.
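For the curious, here's the gist of what that one-step cloning software is doing under the hood, sketched with plain dd on scratch files. (A real DIY clone would point at the actual device nodes, e.g. /dev/sda and /dev/sdb, after first shrinking the oversized partitions with a tool like gparted so they fit on the smaller target.)

```shell
# Cloning sketch: byte-for-byte copy, then verify with checksums.
# Scratch files stand in for the source hard drive and target SSD.
SRC=/tmp/old-hdd.img
DST=/tmp/new-ssd.img

head -c 4M /dev/urandom > "$SRC"   # pretend source drive contents

# The clone itself: a dumb, faithful byte copy.
dd if="$SRC" of="$DST" bs=1M conv=fsync status=none

# Matching checksums mean the copy is exact.
sha256sum "$SRC" "$DST"
```

The vendor kits add the partition resizing and a nice progress bar, but the copy itself is no more magical than this.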

One of the photos shows my keychain drive plugged into one USB port so I could run the cloning software from it, and the SSD plugged into another. The copying process took about 35 minutes at around 20 MB/sec. I might have been able to speed it up by monkeying with the settings, but hey, I only had to do this once. With that complete I finally came to the hardware portion of the evening.



Outpatient Surgery

All it took was removing the battery and a single tiny screw to slide the back cover off. Underneath, the hard drive and memory module were readily accessible. Also visible were the wifi adapter and cooling fan. Releasing the metal clips on each side of the module, I popped it out and replaced it with the new one. Then I had to coax the hard drive out of its snug fit between the rubber cushions that serve as shock absorbers, and remove two more tiny screws to unhitch it from the connecting SATA cable.

The photo shows the new memory installed and the empty hard drive bay. Truly, the hardest part of all this was making sure not to lose those teensy screws or drop them into the innards of the computer. A magnetic Phillips screwdriver was helpful for that.



After attaching the cable and screws to the new SSD and fitting it into place, I replaced the cover and rebooted to make sure it still worked. When you change the size of a partition that Windows lives on, it insists on "checking the disk for errors." You just have to let it do its thing so it will stop complaining, then all is well again. No errors were found, and I noted that the bootup process was noticeably smoother and a bit quicker than before, though I didn't time it.

Almost done -- all that remained was to boot up again from the USB drive with the Mint install disk on it. Once you're up and running from one of these Live disks, you just double-click the Install icon to launch the installation process. If you're setting up a dual boot with Windows the only tricky part is deciding how to partition your drive.

A Drive Divided

One annoyance was that the Windows installation itself, with hardly any extra software added, was occupying almost 30 gigabytes. Compare that to my 1 gig Mint install disk that was going to end up at about 4 gigs after installation. 30 gigs is only about 12% of a 250 gig drive, but almost 25% of the SSD I put in its place. On top of that I was going to cut the drive in half to accommodate the two operating systems. And on top of that, the hard drive had come with two additional partitions that were taking up some of the remaining space. I was running out of room, and I hadn't even started yet!

But I hit on a simple and elegant solution. Those two extra partitions were not necessary. One was a system diagnostic area for testing hardware issues. The other was for restoring Windows in case it became hopelessly scrambled, a substitute for providing a Windows install disk. The diagnostic area didn't matter since it was only a few measly megabytes. But I didn't need the Windows restore because I had the entire hard drive I had removed as a backup. So I used the 10 gig restore partition for my Linux root -- where everything would be installed except my own personal files -- leaving the rest up for grabs.



I told the installer to resize the Windows partition to about 40 gigs, which left me 10 gigs for file storage under Windows. That small limit was no problem because I only need it for the occasional Windows-only program that I have to run, while most of my files live under Linux. This left me with the lion's share, about 66 gigs, to devote to my /home folder, plus the satisfaction of seeing my Linux root come first in the list of partitions, with Windows sandwiched between that and /home. I created a swap partition too, equal to the 4 gig size of my RAM as recommended.

But don't let all this scare you off. If you're not as picky as I am you have the option of letting the installer resize your Windows partition for you and assign the rest of the space automatically. Or, just wipe the whole drive and use it all.

[And before anyone points out that the 100 megabyte diagnostic area would have been big enough for a Linux /boot partition, I considered it. But again, why throw anything away?]

Life is Good

The Mint install went smoothly and created the dual boot loader for me, naturally respecting my wishes to keep Windows available. So life is good. While happily installing my favorite software -- Chromium, LibreOffice, Dropbox, et al. -- I paused to marvel. Balanced on my lap in this 2 pound (1 kilo) book-sized package was more computing power than I would have dreamed of in a desktop unit just ten years ago. Software has advanced, but hardware is sure coming along too.




Monday, August 13, 2012

Life in Geek Heaven, Part I: Software

I don't usually get all geeked out here, but indulge me for once ...

In the Beginning

Way back around 1996 I first tried installing Red Hat Linux on my home computer. At that time Windows 95 had recently replaced Windows 3.1 and Windows for Workgroups 3.11, while Microsoft server functions, such as they were, were still being addressed by Windows NT 3.51. The ever more bewildering mishmash of versions yet to come (Windows NT 4.0, 98, 2000, Me, XP, 2003, Vista, and 7) was still a pipe dream in the heads of Bill Gates and his numerically challenged marketing staff.

After years of watching DOS boot on my screens, and then various versions of the Windows GUI, it was unbelievably cool to see a totally alien set of status messages scrolling across my 12" CRT monitor and terminating in a cryptic command prompt. I felt as if in the middle of the Cold War I'd been given a computer from the Soviet Union, something with an entirely different heritage. From reading the documentation I knew enough to type "startx" to launch a graphical desktop. Unlike in Windows, I learned, you had a choice of such environments in Linux. One of them, called FVWM, bore a resemblance to Windows 95 with its flat teal background and bottom task bar. It was a new project, a bit half-baked, but seemed like a way for the programmers to say, "You want something that looks like this? We can do that."

In the years since then Linux has grown by leaps and bounds. FVWM never really went anywhere, but soon enough there were not one but two premier user environments to choose from -- KDE and Gnome -- with still more to try if you cared to. On the server side, Linux and the related open source software that ran on it, like the Apache web server and MySQL database, were largely responsible for the explosive growth of the Internet. Now, 15 years later, it powers the likes of Google, Amazon, Facebook, and Skype, not to mention everything from Android phones to wireless routers to more than 95% of the fastest supercomputers in the world -- even the Large Hadron Collider. Don't you want some of that?

Progress

In those early days it was often a struggle to get Linux running with your particular brand of hardware -- especially on laptops with their custom touchpads, graphics, and network interfaces. Linux user groups would hold "install fests" to help people over the technical hurdles, just for the fun of it. But once you were up and running, there wasn't a lot of software available for the desktop user. Unless of course you wanted to build a web or file or database server, in which case your tool set was second only to the big commercial versions of Unix. That too has changed. So much free software now comes along with most Linux distributions that someone has claimed, "If Microsoft put all this stuff on one disk the Department of Justice would be all over them for monopolistic practices."

Over the years I've installed various flavors of Linux on a succession of ever more powerful hardware, starting with an old 486 processor with 16 megabytes of RAM and a 10 gig hard drive, and ending up with a dual core beast with 8 gigabytes of RAM and a full terabyte drive. This thing has enough horsepower to run one or more extra operating systems in their own virtual machine windows while hardly breaking a sweat. (Of course, dual core processors are now so last year. Any self-respecting video gamer is running an i7 with 8 threads.)

The Challenge

Until recently, though, I had only tried installing Linux on one laptop. This was a reconditioned machine a friend had purchased for cheap, delivered without an operating system. Rather than have him blow over a hundred bucks to add Windows, I found it a simple matter to get a free Linux installation working on it. There was only one challenge. The friend wanted to use a cellular data card to access the Internet, and the system refused to recognize it as a valid network connection. Finally I realized that it worked like a dialup modem -- it was basically making a cell phone call to get online, like in the old AOL days. Naturally, Linux has had dialup support for years, so all I had to do was configure pppd (the Point-to-Point Protocol daemon) to make the connection, then write a simple script to connect, and another to disconnect. Voila, le Google.
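My actual scripts are long gone, but the setup had roughly this shape. Everything here -- the device name, the dial string, the peer file name -- is illustrative rather than the original, and the dial string in particular varies by carrier:

```
# /etc/ppp/peers/cellcard -- pppd peer file (all names illustrative)

# serial port the data card showed up on, and its line speed
/dev/ttyUSB0
460800

# "dial" the card like a modem; the #777 string is carrier-specific
connect "/usr/sbin/chat -v '' AT OK ATD#777 CONNECT"

# route Internet traffic over the link, take DNS from the carrier,
# and don't demand authentication from the peer
defaultroute
usepeerdns
noauth
```

With a peer file like that, the "connect" script reduces to a one-liner running `pppd call cellcard`, and "disconnect" amounts to killing pppd -- the ppp package's pon and poff helpers wrap exactly this.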

This was not a task that an average user could be expected to perform. But luckily things have progressed even further now, to the point where I can pretty much recommend installing my favorite operating system on your laptop -- even the most basic netbook -- as something you should really try. I know because I just did it myself.

[Next time -- my own netbook experience.]

Sunday, August 05, 2012

The New Relativity

This happened WHEN?

Ever since Einstein convinced us that time is not the absolute thing it seems to be, but rather varies from one observer to another, it has continued to appear safe to ignore this fact at the level of everyday reality. As Woody Allen's mother told him, "Brooklyn is not expanding!" Nor is it accelerating to the speed of light. And throughout the 20th century our communications continued to grow ever more instantaneous and global. The assumption of simultaneity became second nature to us, as we came to expect to see and hear events from anywhere in the world "live" as they happened.

The telegraphy of the 19th century gave way to an interlude of radio, when we settled for people describing distant events to us while they happened. The Hindenburg went down in flames, and Edward R. Murrow got bombed in the London Blitz. But then suddenly we could see those events as their images were piped through transatlantic cables. Satellites went aloft and we could see the weather down below. TV brought the Vietnam War and the moon landing into our living rooms -- something that the most imaginative science fiction writers had never predicted. Now everyone with a cell phone can be a broadcaster, and revolutions half a world away have become instantly self-documenting.

But as our horizons continue to expand the mutable nature of time is beginning to appear at the level of everyday experience. We're all familiar with the lag in transatlantic TV interviews, due to the few seconds that it takes for the signals to make a 48,000 mile round trip to a satellite and through a battery of sending and receiving equipment. We accept, reluctantly, that we won't know if the Mars lander has landed until about 15 minutes after it has, or has not, occurred. But at least we'll hear about it at the same instant as everyone else on our planet, won't we?

Well, maybe not. And it won't be Einstein's fault, either. It's because a Pause button has been added into the mix of our communications gear. This came home to me recently when I was catching part of a basketball game (go Heat!). I did some personal exulting over an incredible 3-pointer. Then a couple of minutes later I heard a barrage of cheers coming from my next door neighbor's house. I wondered what other game they could be watching. Then it happened again. Another 3-point miracle shot ... wait a minute ... two minutes ... another chorus of hurrahs from the house next door.

The explanation dawned on me. They were watching via AT&T's U-verse service, which includes a DVR that lets you pause anything you're watching, even "live" TV, so you won't miss a second of it. Someone next door must have gone for a bathroom break, maybe earlier in the day so they forgot, and their service was dutifully delaying all subsequent shows so that not even a commercial interruption would be left out.

Which kind of brings us back to time being relative to the observer, but in a new way. My neighbor and his game-party certainly believed they were watching "live," and I'm sure their excitement was in no way diminished by the delay -- especially because they may have been unaware of it. But from my point of view they were getting crazy about a replay. And for all I know, AT&T might have been delaying the show at least a few seconds just in the process of pumping it through their vast network of copper and fiber, so my point of view might have been as divorced from real-time as theirs.

The situation is even more pronounced with the Olympics, normally taking place at a great distance -- the current games being 5 hours ahead of us, and the previous ones in Beijing about half a day out of sync. Since most of us have lives and can't watch during the day, we have to settle for digested replays in the evening hours. It's understood that some of us might like to watch without knowing the outcomes, so news reports come with spoiler alerts to warn you to turn them off, or at least plug your ears and go "LA-LA-LA-LA" for a few minutes.

The other night I watched some gymnastics with my grandchildren, and had to explain that what we were watching had happened hours ago and all the athletes were in bed by now. They readily accepted the explanation and were completely OK with the delay. All that mattered to them was that they were seeing it for the first time. Maybe they have been born into the age of relativity as much as we older folks were born into simultaneity. For better or worse, the nation will no longer cheer for anything at the same moment.

Just something to take note of, another fact of our ever-changing digital lives. If you want to put it all in perspective, all you have to do is go out and look up at the night sky: all those stars, looking just as they looked dozens, or hundreds, or thousands of years ago ... but not now.