Friday, December 31, 2004

Lego Logic

Linked from the title above: a page describing the construction of logic gates (the fundamental building blocks of all modern computers and digital circuitry) from Lego. It uses a lot of "unusual" components, such as rack gears, which I assume must be special-ordered from the company (at least, my kids' Legos don't have them, and I don't recall seeing them in the toy stores). The builder has NOT, OR, NOR, AND, and NAND gates, which is more than sufficient to implement any logic function. To build a reasonable computer model, one would also need a tri-state device, in which one input connects or disconnects the other input from the output. I don't think that would be a problem to implement in Lego. Of course, the other problem in building a complete Lego computer would be the increase in force needed at the input as more and more gates are added --- eventually, the force required would be greater than what could be applied without popping the mechanism apart. To get around that, you might be able to make some of the gates powered by motors and couple those gates via switches (rather than mechanically). Oh, and you'd need to spend a lot of money.
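As a quick aside on why a handful of gate types is more than enough: NAND alone is functionally complete, so every other gate can be built from it. Here's a minimal Python sketch of that idea (mine, not the Lego builder's --- it models only the Boolean logic, not the mechanics), including a crude stand-in for the tri-state element mentioned above:

    # Boolean-logic sketch: NAND is functionally complete, so the other
    # gates can be expressed entirely in terms of it. Inputs/outputs are 0 or 1.

    def NAND(a, b):
        return 0 if (a and b) else 1

    def NOT(a):
        return NAND(a, a)               # NAND with both inputs tied together

    def AND(a, b):
        return NOT(NAND(a, b))

    def OR(a, b):
        return NAND(NOT(a), NOT(b))     # De Morgan: a OR b = NOT(NOT a AND NOT b)

    def NOR(a, b):
        return NOT(OR(a, b))

    def XOR(a, b):
        n = NAND(a, b)                  # the classic four-NAND XOR
        return NAND(NAND(a, n), NAND(b, n))

    def TRISTATE(enable, a):
        # Crude model of a tri-state element: pass the input through when
        # enabled, otherwise "disconnect" (represented here as None).
        return a if enable else None

    if __name__ == "__main__":
        for a in (0, 1):
            for b in (0, 1):
                print(a, b, "AND:", AND(a, b), "OR:", OR(a, b),
                      "NOR:", NOR(a, b), "XOR:", XOR(a, b))

Printing the truth tables confirms that the composite gates behave exactly like their primitive counterparts, which is all "functionally complete" means.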

Thursday, December 30, 2004

The Re-Ascendancy of Hardware

We live in an era where "commodity hardware" is king. For most people, price is the deciding factor when buying a PC (or, for that matter, a DVD player, etc.). Hardware is relevant merely as a platform for software. That wasn't always the case, and in this article, I'd like to argue that it won't be the case in the future.

I am fortunate (?) to remember the days when desktop computing was all about hardware. The user interface was a hex keypad and a small LED display. Serious folks (who had the money) built their own S-100 systems, toggling in the few instructions needed to load the first sector from a floppy and then transfer control to that boot loader code.

Today, unless you have specialized requirements, it pretty much doesn't matter what PC you buy. Yes, there are differences in terms of speed, storage, graphics, etc., but at every performance level there is a set of essentially interchangeable machines from different manufacturers. Consumers for the most part buy based on price. The result has been that hardly any (none?) PC vendors make money selling them. PCs have become disposable razors (albeit expensive ones), the only problem being that the companies selling the blades (Microsoft and Intel) aren't the ones selling the razors. They're even more like razors now that PCs are basically as powerful as necessary for almost any application for the foreseeable future (or at least the next several years) --- if you can edit digital video on a machine, that machine can probably do whatever you need to do. Will people trade in their old computers for twin-bladed ones? Ones with built-in skin care products?

This got me to thinking about the success of the iPod. While digital music players aren't yet ubiquitous (due to price), other types of music players are. So, why doesn't the iPod suffer from being an expensive product in a commodity marketplace? In my opinion, the answer is user interface, and in particular hardware. We've come to think that the real story these days is software's ability to produce any sort of user interface for a given piece of hardware, but in the case of the iPod, the real story is that the software is invisible --- the iPod is successful because it works like hardware, not software.

Let's face it, most software user interfaces suck. Most try to make a program look like a physical device, but that "device" is a nightmarish, unusable thing. The best user interfaces foster the development of useful mental models by their users [1]. These don't have to be accurate models of how the system works, merely models useful for the task at hand. A good example of this is a car's steering wheel. My mental model for steering a car doesn't involve how the steering mechanism really works, but rather how the angle and rate of turn of the wheel relate to changes in the car's direction.

Compare this with setting the clock on a VCR (depending on the VCR, almost certainly an example of either a bad hardware or a bad software user interface). Few VCRs have decent user interfaces, which is why so many are flashing "12:00". Why? Because they don't conform to the well-entrenched mental model associated with manipulating time --- the clock. On the other hand, the iPod's interface, particularly the scroll wheel, is excellent in the same way that an analog control is: it doesn't just allow one to select an item from a list, it also fosters a mental model that includes the rate of scrolling and the relative distance from either end of the list.
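To make that last point concrete, here's a hypothetical little sketch (my own, not Apple's actual firmware) of rate-sensitive scrolling: slow turns of the wheel move the selection one item per click, while fast turns jump proportionally farther, which is what lets your thumb feel both the rate of travel and how far you are from the ends of the list:

    # Hypothetical sketch of rate-sensitive scrolling, loosely in the spirit
    # of a scroll wheel; not a description of Apple's actual algorithm.

    def scroll(position, direction, clicks_per_second, list_length):
        """Advance the selection by an amount that grows with rotation speed.

        direction is +1 or -1; clicks_per_second is how fast the wheel is
        being turned. Slow turns move one item per click; fast turns skip
        proportionally more items.
        """
        step = 1 if clicks_per_second <= 2 else int(clicks_per_second)
        new_position = position + direction * step
        return max(0, min(new_position, list_length - 1))   # clamp to the ends

    # Example: turning quickly (10 clicks/second) from the top of a
    # 500-item list jumps 10 items at once.
    print(scroll(0, +1, 10, 500))   # -> 10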

So, where is all this leading? The lesson to me is that, to make software more usable, we must do more than make it more like (good) hardware --- we must make it indistinguishable from hardware. Good software must be invisible. Note that this is not the same as "ubiquitous computing", where computers are so numerous and embedded in so many everyday items that we don't notice them (because they're everywhere). What I mean is that user interfaces should not "just" look like hardware (buttons, switches, etc.), but should actually act like hardware, to the extent that casual users will think of them as such. This is probably only possible for manufacturers that control both hardware and software design, since software can only "fade into the background" if hardware provides the user interface. Since this is pretty much the end of the year, this leads me to some "predictions":

  • It seems to me that Apple is probably best able to execute on this type of strategy, given its control over both hardware and software. Apple has already shown that this works for device categories in which there are few software compatibility issues (data compatibility only being an issue for copy-protected music). Can Apple extend this to desktop computing? Once upon a time, Sun Microsystems and Silicon Graphics dominated the engineering workstation market (yes, I'm neglecting Apollo, HP, etc.). While their software may have had issues (Sun, for example, let their graphical "desktop" stay unchanged for so long that hardly anyone I know actually used it), it was Unix, it ran all the GNU software, and that was the most important thing. Yes, the machines were more expensive than Intel boxes running Linux, and that cut into sales. But what really killed those two companies as major players was the fact that they couldn't keep their hardware's performance ahead of Intel boxes. Linux was merely a facilitator for the switch to cheaper, sometimes better-performing hardware. Sun and SGI tried to fight back by offering their OSes for Intel (and, in SGI's case, by running Windows on their hardware), but a company can't stake its survival on selling a product on which it can't make a profit. (Yes, I know, both companies still exist, but neither is a major desktop player anymore.) By focusing on the overall experience of using a Mac, Apple can survive (and possibly grow its user base) while making a profit on its computer sales (which likely makes it unique). To an engineer, Mac OS X is Unix, and Apple hardware is pretty high performance (and not more expensive than Wintel, unless we're comparing bottom-end machines). The one thing that Apple should not do is try to compete on price. Instead, it should focus on new hardware that transforms aspects of people's lives (like listening to music) and creates synergies among its product lines. (Wow, that sounds like it was produced by a buzz-word generator!)
  • Though one might not think it, Microsoft is in kind of a pickle. Yes, I'd like to have Bill Gates' problems, too. However, Microsoft's only real unique advantage is its monopoly status. This allows it to act essentially like a government, extracting a "tax" on each Wintel computer sold. This is not a recipe for future success. Incremental gains in market share by Apple and Linux will inevitably lead to the porting of specialized software to those platforms, which will lead to increasing market share, and so on. Microsoft will have great trouble moving into consumer products because it doesn't control the hardware --- witness the lack of enthusiasm for Windows Media Center machines. When Microsoft does control the hardware --- the Xbox --- it runs into price competition and shrinking margins. So, Microsoft can use its financial muscle to diversify into markets in which it will be marginally profitable at best, or it can continue to live off of an eroding market.
OK, so not so many predictions for a year-end article. But this leads to one counter-intuitive observation: not only is market share not the most important thing, sometimes it is actually disadvantageous. Apple is profitable with roughly 3% of the PC market. How many other computer manufacturers are sustainably profitable, even at much larger market shares? Do any of those companies support significant research? If your company is in an industry where innovation is key to new products and profits, would you rather have a small market share and healthy margins, or a large market share and razor-thin margins?

[1] Hrebec, D. G. and M. Stiber, "A survey of system administrator mental models and situation awareness", ACM SIGCPR, San Diego, pp. 166--172, April 2001.