Chip chat

By Don Granberry (lunohoco@lunohoco.com), MacEdition Contributing Editor, 20 November 2000

When it comes to brute horsepower developed for the energy consumed, the waste heat produced and the cost per chip, it is very hard to find a better bargain than the AIM alliance’s PowerPC family of CPUs. It matters little whether you consider IBM’s G3 or Motorola’s G4. Both chips are extraordinary performers. So why are so many people screaming for Apple to adopt another family of CPUs? The answer, as usual, is complicated. Let’s see if we can flush a few devils out of the details.

The perception of power

The most readily apparent little demon is the very real perception that IBM and Motorola have been slow to increase their products’ clock rates relative to their competitors, Intel and Advanced Micro Devices (AMD). Scratch a PowerPC engineer about this and he will probably tell you that those other guys are struggling to keep up with the PowerPC. This is true. The whole idea behind the reduced instruction set approach to making CPUs is to make the internal workings of the chip more efficient. The RISC chips are more complicated than the early designers had originally intended, but the final design has worked out quite well. Indeed, the latest round of PowerPC upgrades by both Motorola and IBM was aimed primarily at improved efficiency and lower cost, not necessarily higher clock rates. Both chips are now produced using smaller interconnect technology. IBM’s G3 includes on-die L2 cache and Motorola’s G4 has a much-improved internal bus, making both CPUs far more efficient than previous versions.

Each variant now gets more work done per clock cycle than it did previously, while consuming less power. That means that both of the new PowerPC chips do more work while producing less waste heat, a good and admirable thing. One can easily understand why the smiths hammering away at the red-hot silicon in the IBM and Motorola foundries might be a little miffed at the “not enough megahertz” complaints coming from users. They have not been sitting around on their duffs. They have been working hard to improve their products.

The trouble is, most people fail to see things in the same light as the engineers. End customers were expecting increases in clock rate. IBM did increase the clock rate on the G3, albeit mildly compared to the Intel and AMD offerings, but not enough to impress the buying public at large. The problem is, technology has gone around a corner and the buying public has yet to catch up with the latest developments.

A great deal of this public misapprehension about clock rates stems from a perfectly reasonable cause. At one time, a CPU’s clock rate was a perfectly sound indicator of performance. It still is a fairly decent indicator of performance, all things being equal. But not all things are equal between the different chip architectures, and haven’t been for quite some time. A bit of history is required to understand this.
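
To put numbers on it: think of a computer’s throughput as roughly the clock rate multiplied by the work completed on each tick of the clock. The little Python sketch below uses invented figures – they are not benchmarks of any real chip – but it shows how a lower-clocked processor can finish more work than a higher-clocked one.

    # A back-of-the-envelope model: throughput as clock rate times work
    # done per tick. The numbers are invented for illustration only.

    def throughput_mips(clock_mhz, instructions_per_cycle):
        """Millions of instructions completed per second."""
        return clock_mhz * instructions_per_cycle

    high_clock_chip = throughput_mips(800, 1.0)  # more megahertz, less work per tick
    low_clock_chip = throughput_mips(500, 2.0)   # fewer megahertz, more work per tick

    print(high_clock_chip)  # 800.0
    print(low_clock_chip)   # 1000.0 -- the "slower" chip gets more done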

People have been sitting in front of television sets for years. Not only does this have an adverse effect on viewers, people’s experiences with television also lead to false expectations when they first take up computing. The computer looks like a television with a box hooked to it. This misapprehension is not improved by the “all-in-one” computer designs. Such hardware looks almost exactly like a television. As a consequence, people tend to expect the computer to behave just like their television. To a large part of the buying public, the computer is just a fancy TEEVEE.

It would be much better for them to approach the purchase of a computer with the same caution and technical expertise they exhibit when purchasing an automobile, but they don’t. They can’t ride in a computer. It isn’t likely to kill them if it fails. It sits on a desk or table in the house and it has a picture tube. To them, it’s a television that can do tricks.

Now, just you try explaining the subtleties of RISC versus CISC to this crowd. They don’t really want to pay attention. They want something that is easier to grasp. How fast is it? Oh, well, this computer has an 800-megahertz processor and that one has a 500-megahertz processor. Well, almost none of them know what a megahertz is, but one of the machines has more megahertz, whatever that is, and it costs a little less and everyone says that Windoze is the way to go, because there’s more software for Windoze and it’s what we use at work ... need I go on? Why do you think the iMac was such a hit? Because it was neat, and it was pretty and it appeared to be just as fast as the Wintel hardware sitting in the next aisle!

It is very difficult to sell the superior engineering of IBM and Motorola in this kind of environment, especially when you consider that a CPU’s clock rate was, at one time, a valid indicator of the computer’s performance potential. Notice that I said potential, not actual performance. Clock rate alone has never been the sole feature to be considered when judging a computer’s performance, but most people do not realize this and they are unlikely to learn in time to help Apple Computer’s sales figures.

To be sure, not all of the buying public is this naive or ignorant. A great many of the more computer-savvy are calling for Apple to abandon PowerPC for some other chip architecture. Almost all of them are smart enough to recognize that the megahertz issue is primarily a marketing problem. But many are not aware that a few of the subtler demons causing all this trouble were conjured up by Apple, and there is at least one demon who torments every computer manufacturer and software publisher on earth. His name is Legacy. Legacy code, that is.

Legacy’s curse

In the early days of computing, every new computer required its own programs, uniquely written for it and it alone. Back then, the software was written “right to the metal”; there was no such thing as an “operating system.” Acquiring a new computer meant acquiring all new software to go with it. Early programmers had to concern themselves with every little piece of hardware in the computer. The advent of operating system software (yes, Virginia, operating systems are programs) alleviated this loss of code by providing a buffer between the programmer and the hardware. Most changes to the hardware could be accommodated by the operating system, often allowing applications to be reused with only minor modifications or even none at all.
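
A toy sketch may make the idea concrete. Everything below is hypothetical – the chip and function names are invented, and no real operating system API is being quoted – but the principle is the same: the application talks to one stable interface, and the operating system worries about the silicon.

    # A toy sketch of the OS as a buffer between programmer and hardware.
    # All names here are invented for illustration.

    class OldVideoChip:
        def poke(self, address, value):
            print(f"old chip: value {value} written at address {address:#x}")

    class NewVideoChip:
        def blit(self, x, y, color):
            print(f"new chip: pixel ({x}, {y}) set to {color:#06x}")

    class OperatingSystem:
        def __init__(self, chip):
            self.chip = chip

        def draw_pixel(self, x, y, color):
            # One stable call for the application; the OS translates it into
            # whatever the hardware of the day actually requires.
            if isinstance(self.chip, OldVideoChip):
                self.chip.poke(0xA000 + y * 320 + x, color)
            else:
                self.chip.blit(x, y, color)

    # The application code below never changes, even when the hardware does.
    for chip in (OldVideoChip(), NewVideoChip()):
        OperatingSystem(chip).draw_pixel(10, 20, 0xF800)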

Over the intervening years, this has proven to be mostly blessing with very little curse involved. Recently however, the situation has become just about all curse for hardware engineers. Legacy has become a demon, an impediment to the progress of hardware. Engineers are struggling to make better hardware while making the new hardware appear to be old hardware to existing code. New hardware design has been very much like demolishing and reconstructing the foundations beneath a major city without causing an office tower to fall over. Every popular computing platform has towering edifices of code already sitting upon it.

No manufacturer of any computer has immunity from the demon named Legacy. His works have well-recognized names, like Microsoft Office, Adobe Photoshop and QuarkXPress, not to mention the thousands of games whose authors insisted upon writing “to the metal.” Each of these programs is made up of millions of lines of code that have been finely crafted, debugged and tested by millions of users over years of use. They are too precious to give up without compelling cause.

Hardware engineers at Intel and AMD have handled the Legacy demon quite well, given that they launched a frontal assault upon him. Their CPUs have gotten much faster, albeit at the cost of protracted efforts on their part and increasing inefficiencies in the chips themselves. It is hard to estimate when the inefficiencies will become great enough to warrant the demolition and reconstruction of the immense Towers of Code in Legacy’s domain. Today, if you are a great lover of both Windows and high performance computing, you can actually purchase enclosures with built-in air conditioning units for your computer. This is tantamount to constructing a desktop refrigerator around your desktop computer, but who cares? Just so long as it performs, right?

Apple, IBM and Motorola adopted a less direct approach to defeating Legacy. Apple did not have the inventory of legacy code that Windows had accumulated. IBM’s market was built around AIX and was simply not as sensitive to the problem. Motorola was primarily interested in a market that was just beginning to fully blossom; thus, there was little of Legacy’s work to harry them. Between the three of them – Apple, IBM and Motorola – they could develop sufficient demand for a completely new, state-of-the-art chip and make it profitable. Thus the AIM alliance was born.

Apple tried to take Legacy on his flank by making the change to much more efficient hardware while forcing the operating system to trick applications into thinking that they were running on legacy hardware. While this has allowed the hardware to become much more efficient, there have been ongoing problems with inefficiencies in the operating system. The Mac OS has never taken full advantage of the new hardware upon which it runs, thus confusing users about the true abilities of the PowerPC architecture. While the new, PowerPC-based Macs performed better, often performing on a par with Wintel hardware, they have only recently begun tapping into the full potential of the PowerPC.
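
Here is a crude sketch of that trick, using a made-up two-instruction “legacy” machine. Real 680x0 emulation is vastly more involved, but the principle is the same: native code carries out each old instruction, one at a time, so an untouched legacy program believes it is still running on the hardware it was written for.

    # A crude sketch of interpretive emulation, with an invented
    # two-instruction "legacy" machine.

    legacy_program = [
        ("LOAD", 7),   # put 7 into the accumulator
        ("ADD", 5),    # add 5 to it
        ("LOAD", 2),
        ("ADD", 40),
    ]

    def emulate(program):
        accumulator = 0
        for opcode, operand in program:
            # Native code carries out each legacy instruction, so the old
            # program never notices the new hardware underneath.
            if opcode == "LOAD":
                accumulator = operand
            elif opcode == "ADD":
                accumulator += operand
        return accumulator

    print(emulate(legacy_program))  # prints 42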

Apple’s first attempt at authoring a fully PowerPC-native operating system, code-named Copland, was abandoned after years of effort. It was replaced by Rhapsody, and now Mac OS X. Thankfully, Mac OS X has finally shipped as a public beta. Initial enthusiasm has been quite high but, as might be expected, there are a number of things users strongly dislike. Still, Mac OS X holds out the first real promise of actually taking full advantage of the PowerPC.

A tragic end to a great argosy?

Sadly, just as all of this is finally taking place, Motorola seems to have elected to sit upon its laurels and look smug, not even troubling itself to issue a tentative upgrade schedule for its variant of the PowerPC, the G4. What makes this particularly annoying and alarming is that Apple has chosen Motorola’s G4 chip for the heart of its most powerful desktops. Now it looks as though we will finally get an operating system fully capable of exploiting the PowerPC, just as Motorola has decided to allow the chip to languish. Whether this appearance is true or false is immaterial; in a marketing environment, no one has any evidence to suggest otherwise. There is nothing for anyone to point to and say, “See? Things are going to continue improving.” All we have is this pitiable, though beautifully rendered, bit of eyewash which Motorola refers to as a “roadmap”. Clearly, this is a case of life imitating the art of a Greek tragedy.

People everywhere will laugh at and occasionally scorn the bumbler, but they will also tolerate and even embrace him if he brings something useful to the market. Apple has, often with more than a little justice, been accused of bumbling and fumbling in as comical a way as could be expected. However, quitters are almost universally disliked. Arrogance is seldom forgiven, and in this case Motorola appears to be an arrogant quitter; not because Motorola’s engineers have done a bad job, but because Motorola has elected to ignore the expectations of the market and has done nothing to allay customer fears.

To many of us, it looks as though Motorola is pushing Apple over the edge and into the abyss, hence the hue and cry for Apple to seek processing power from another vendor. Some of this may well be the product of more fumbling on Apple’s part – who can tell? Apple itself has been considerably less than forthcoming with reassurances and may even be encouraging Motorola to hold its counsel, but the appearances are that Motorola has become a quitter. If it is not so, then the current situation is all the more tragic.

The question Apple users are asking themselves now is, “Where do we go from here?” There are many potential answers to that question and at MacEdition we will try to examine some of them for you in the coming weeks.
