In Search of Stupidity: The Lost Chapters

First Edition: 2002, hard cover jacket
Second Edition: 2006
Third Edition: 2020, front and back cover

Read chapters and sections from the first and second edition

Introduction 

In 1982, Harper & Row published In Search of Excellence: Lessons from America’s Best-Run Companies by Thomas J. Peters and Robert H. Waterman. In Search of Excellence quickly became a seminal work in the category of business management books and made its authors millionaires. Although it’s no longer the literary obsession of freshly minted MBAs that it was back in the 1980s, the book’s distribution and influence have proven long-lasting and pervasive. After its introduction, the book stayed on best-seller lists for almost 4 years and sold over 3 million copies. A survey by WorldCat, an electronic catalog of materials from libraries in the United States and other countries, ranks In Search of Excellence as being on more library shelves than any other book in the world. With 3,971 libraries listing it in their collections, the book tops the list of the 100 books most widely held by libraries, a position it has held since 1989.

In Search of Excellence, when it first came out, applied soothing balm to the raw nerves of the American psyche, and this helped account for its tremendous success. The 1970s had been a gloomy time for U.S. businesses. The Japanese had run American companies out of consumer electronics; Japanese cars lasted 100,000 miles, while American cars started breaking down at 20,000; and as the 1980s began, Japanese companies had just started making memory chips more cheaply than their American counterparts. The Japanese even announced they were starting a “Fifth Generation” project to build software that would make computers very, very smart indeed, leaving the poor old United States with software systems that would be the technological equivalent of Studebakers. (The project was a complete bust, like all the others emanating from the artificial intelligence hype machine of the 1980s, and it never developed much more than software capable of storing some nice recipes for sushi.) Yes, the United States was doing OK in this new market for little machines called microcomputers, but the pundits universally agreed that eventually the Japanese were going to move into that industry as well and that would be it for the Americans.[i] Maybe IBM would survive; after all, they did business like the Japanese anyway. For the ambitious young MBA, a start-up position in agribusiness, such as sheep herding, began to look like the fast track to the top.

In Search of Excellence helped buck everyone up. All the companies it profiled were American firms competing successfully in world markets. It seemed obvious that if you studied the organizations closely, learned the fundamental practices and techniques they used to achieve excellence, and then applied those practices and techniques to your business, it would become excellent too!

The basic thesis of In Search of Excellence isn’t complex and can be summed up succinctly: Excellent companies create corporate cultures in which success flourishes. (Yes, this is something of a tautology, but it’s a nice one and people always like reading it.) An excellent corporate culture is one that loves the customer, loves its employees, loves the company’s products, and loves loving the company. Once enough love is flowing through the corporate veins, a company will organically become excellent and in turn create excellent products and services. This will lead to more customer, employee, product, and corporate love, lifting all concerned to even greater heights of selling and purchasing ecstasy. The cycle becomes self-sustaining and a universe of almost sybaritic business success awaits those who master the Zen of Excellence.

Most of In Search of Excellence thus functions as the corporate equivalent of the Kama Sutra, profiling different companies as they bend and twist themselves into different postures and techniques designed to build customer desire for the company, increase customer love for the company’s products, and provide lasting satisfaction with the company’s service. The positions and techniques discussed vary widely and include being reliable, shooting for 100 percent, communicating intensely, being creative, talking about it, talking about it a lot, listening a lot, getting on with it, etc. High-tech firms are particularly well represented in the book, with IBM, Xerox, DEC, and many others serving as exemplars of how to seize the business world by the tail via the practice of excellence.

For the next several years, copies of In Search of Excellence flew off bookstore shelves. Thousands of companies, including most in the high-tech sectors, took its maxims to heart. People walked, talked, and communicated with incredible intensity. Peters became a widely sought-after speaker and business consultant (Waterman dropped out of public sight). He wrote more books, including A Passion for Excellence and The Pursuit of WOW!, all of which continued the earlier book’s quest for that ineffable corporate phlogiston that when ignited leads inexorably to success. America’s affair with excellence appeared to be endless.

Unfortunately, while U.S. businesses were vigorously applying excellence to every nook and cranny of their corporate bodies, a few people began to note that many of the firms listed in Peters and Waterman’s tome seemed to be, well, less than excellent. As early as 1984, Business Week published a cover story entitled “Oops!” that debunked some of the book’s claims. Most people dismissed these early criticisms as journalistic carping, but over time it became more difficult to ignore the fact that something was very wrong with the book’s concept of business excellence.

Take, for example, its examination of Lanier, a major competitor in what is now a vanished world, that of dedicated word processors. The market for these single-purpose computers had been built and defined by Wang. As the market grew, companies like Lanier, Xerox, IBM, and almost a hundred others competed fiercely for the privilege of selling $20,000.00 boxes that did what a $99.95 piece of software does today (actually, the software does much more). These dedicated devices were often the only experience many people had with computers throughout much of the 1970s, and to many people word-processing stations epitomized “high tech.”

In Search of Excellence thought Lanier was really excellent, a company that “lives, sleeps, eats, and breathes customers.” The book described how the company’s top executives went on sales calls once a month, how the president of the company personally handled service calls (and if you believed that, you probably also went out and bought a famous bridge in New York City), how their service was even better than IBM’s, and so forth and so on.

And Lanier was a sharp marketing bunch, too! The company knew that the term “word processor” put everybody “off.” That’s why Lanier called their word processors “No Problem Typewriters.” Sheer advertising genius.

The only problem with all of this was that Lanier wasn’t an excellent company; they were a dead company, a shot-through-the-head dinosaur whose sluggish nervous system hadn’t yet gotten round to telling the rest of its body to lie down and die. In 1981, an Apple II+ running AppleWriter or ScreenWriter[ii] did everything a Lanier word processor did, never mind an IBM PC with WordStar. By 1985, the market for dedicated word processing was as extinct as the Tyrannosaurus Rex, but Peters and Waterman seem not to have noticed they were profiling a walking corpse.

Now, you can argue that market shifts can catch companies unawares and that Lanier was a victim of the unexpected. This, however, cannot be true. In Search of Excellence was written in 1981 and published in 1982. By 1981, thousands of Apples, Radio Shack TRS-80s,[iii] Commodore PETs, and a wide variety of CP/M systems were selling monthly. The IBM PC was also launched that year. WordStar, AppleWriter, and Scripsit (popular on the Radio Shack systems) had been available for years. Hundreds of ComputerLand stores, one of the first national franchises dedicated to selling desktop computer systems, were doing business nationwide and dozens more were opening on a monthly basis. Yet somehow Lanier, the company that apparently did everything but have sexual relations with their customers, never found out from a single one of them that they were interested in buying an IBM PC or an Apple with a good word-processing program that did everything a Lanier word processor did at a fraction of the cost and did other things as well, like run a nifty new type of program called a spreadsheet. You would think an excellent company would have caught on much sooner.

It only became worse as time passed and you kept track of the book’s list of “excellent performers,” particularly the high-tech ones. For instance, Data General: Gone into oblivion.[iv] Wang: Moribund by 1987. DEC: PC roadkill. NCR: A mediocre performer bought up by AT&T that passed into extinction without leaving a trace. Texas Instruments: The company that coinvented the microprocessor saw its TI-99/4A tossed out of the computer market by 1984. IBM: In 10 years they went from an American icon to an American tragedy.

Xerox, on the ropes by the late 1990s, was on the book’s list of hero companies. By the mid-1980s, industry mavens were already puzzling over how a company could develop the graphical user interface (GUI), mouse, object-oriented programming, and Ethernet and fail to make a single successful product from any of these groundbreaking innovations. Instead, Xerox made its debut in the PC market with an obsolete-before-its-release clunker of an 8-bit CP/M machine with the appetizing name of Worm that sold just about as well as you would expect.

Atari, for God’s sake, even made it to the book’s Hall of Excellence. In 1983, the year after In Search of Excellence’s publication, the company was close to death after releasing the worst computer game of all time, E.T. (based on the movie). Before their product hit the store shelves, an “excellent” company would have used the plastic cartridges that contained this all-time turkey to club to death the parties responsible for producing the game that ruined the Christmas of 1982 for thousands of fresh-faced video game junkies.[v]

It wasn’t simply the companies profiled in In Search of Excellence that proved to be disappointments. During the 1980s it was impossible, especially in high tech, to escape the training seminars, book extracts, and corporate programs that sprang up dedicated to ensuring everyone was excellent all the time and every day. Yet, despite all the talking, walking, and communicating, high-tech firms kept doing stupid things. Again and again and again. And every time they did they paid a price. Again and again and again.

One key to the problem may lie in the fact that in 2002, Tom Peters announced that the data used to “objectively” measure the performance of the companies profiled in the book had been faked. Oops. Well, remember, excellence means never having to say you’re sorry.

But despite this little faux pas, a more important answer lies in the type of companies analyzed in In Search of Excellence. With only a few exceptions, they were large firms with dominant positions in markets that were senescent or static. IBM ruled the world of mainframe computers. DEC and Data General had carved out comfortable fiefdoms in minicomputers. Xerox reigned over copiers. Wang and Lanier both possessed principalities in dedicated word processing.

In these types of business environments, affairs proceed at a measured pace and there’s plenty of time available for navel gazing. Their vision clouded by all that lint, companies such as IBM and DEC decided that it was their natural goodness that made them successful and therefore they were successful because they were naturally good. By the time Peters and Waterman got around to interviewing them, most of these firms were ossifying, their internal cultures attempting to cement employee mindsets and processes in place in a futile attempt to freeze the past so as to guarantee the future. These firms weren’t excellent, they were arthritic.

For high-tech companies, navel gazing is a particularly inappropriate strategy as markets tend not to stay stable very long. In 1981, for example, distinct markets for spreadsheets, word processors, databases, and business presentation products existed in the software industry. Word processing alone was a $1 billion category. By 1995, all of these categories had been subsumed by the office suite (particularly Microsoft’s).

What, therefore, accounted for the success of companies such as Microsoft, Oracle, and Symantec and the failure of other firms such as Novell, MicroPro, and Ashton-Tate? Was it Microsoft’s “respect for the individual,” something In Search of Excellence told us IBM had in abundance? Well, Bill Gates once stood up at the start of a presentation being given by a new product manager, fixed the unfortunate fellow with a cold stare, and asked, “Where the fuck did we hire you from?” before leaving the room.

Hmm. Perhaps not.

Perhaps it was a “seemingly unjustifiable overcommitment to some form of quality, reliability, or service”? IBM had that in abundance also. Well, Dell Computer is currently the reigning king of PC hardware, not IBM. Although Dell’s service is OK, the company is not “unjustifiable” about it. Oh, Dell pays lip service to the concept of great customer service, and within the constraints of their business model, they do the best they can. If you don’t like your PC, they’ll probably take it back if you’re within the warranty period and you scream loudly enough and pay for the shipping and maybe fork over a restocking fee. If your PC breaks, they’ll do their best to get you to fix the thing. But Michael Dell, unlike the excellent CEO of Lanier, won’t be calling your house to handle affairs personally.

That’s because Dell has figured out that what people really care about these days in a computer is high performance at a low price. They’ve learned over the years to build such machines. IBM hasn’t. Computers are very reliable and on a statistical basis don’t break down often. If the ones made by your company do, it is possible to sell a great many of them if you price them cheaply enough, as in the case of Packard Bell, a company that briefly became a powerhouse in PC retailing. Alas, the machines were of poor quality, broke often, and few people ever bought a second Packard Bell computer.

On the other hand, Dell computers rarely break. You, the customer, know that. You’re willing to buy a Dell PC because you’ve made a bet in your mind that the risk that the computer you buy won’t work isn’t worth the extra money it would cost to have your fanny kissed in the event of a breakdown. People who buy desktop PCs aren’t a high-roller audience and it makes no sense to treat them like one.

Let’s move on.

Or perhaps it was “autonomy and entrepreneurship”? Motorola, with a history of allowing different autonomous groups within their phone division to tear at each other’s throats while firms like Nokia tore away their market share, surely has that in abundance. In the entrepreneurial spirit of “up and at ’em,” these groups managed to build what is perhaps the coolest-looking cell phone of all time, the StarTAC. The only problem was that the StarTAC was introduced as a very cool analog phone at the precise moment everyone wanted digital phones.

And it was certainly entrepreneurship that led Motorola to launch their Iridium project. Motorola spent $5 billion to put 66 low-earth-orbit satellites into space so that anyone could phone anytime from anywhere with a Motorola phone. Unfortunately, the satellites spent 70 percent of their time over our planet’s oceans and were thus unusable much of the time (unless perhaps you were adrift in the middle of the Atlantic); the phones, though they may have worked from the top of Mount Everest, didn’t work indoors, in the shadows of buildings, or under trees (early demos of the system enjoined purchasers to “make sure the phone is pointed at the satellite”).[vi] In other words, there was no market for Iridium. After the last satellite was launched, the system quickly went bankrupt. Despondent Motorola stockholders, watching the value of their shares plummet as Iridium crashed and burned, suggested sending up the project’s marketing and engineering team in rockets without spacesuits to join their orbiting financial debacle, but current law forbids this. You would think an excellent company with entrepreneurial instincts would notice that 70 percent of the earth’s surface is water.

Uh huh. Maybe that isn’t it.

In fact, if you examine high-tech companies, only one factor seems to constantly distinguish the failures from the successes. This factor is stupidity. More successful companies are less stupid than the opposition more of the time. As Forrest Gump astutely noted, “Stupid is as stupid does.”

One of stupidity’s most endearing traits is its egalitarian nature. Its eternal dull lamp beckons endlessly to those dim bulbs who seek to rip open the hulls of successful companies and ideas on the sharp rocks of bad judgment and ignorance. With stupidity, your reach never exceeds your grasp; any company, no matter how large or small, can aspire to commit acts of skull-numbing idiocy and have a hope of success.

Take, for example, the creation of the worst piece of high-tech marketing collateral ever developed, the brainchild of the founder of a small company, Street Technologies. The front page of Street Technologies’ expensive four-color, 8 1/2 x 11 corporate opus posed the following challenge:

“How to eliminate half your work force.”

The inside of the brochure provided the means to rise to the task:

“Get the other half to use your software!”

When it was pointed out to the president of Street Technologies that a marketing campaign designed to create mass unemployment and spark a brutal Darwinian struggle for personal survival in its target audience might not be the most effective of all possible approaches, he airily dismissed the issue with the observation that “the piece was not aimed at the employees but their bosses.” He’d apparently not considered the issue of who was going to be opening the mail.

Creating silly collaterals is not a task reserved solely for high tech’s small fry. The second worst piece of marketing collateral ever created was a noble effort by software giant Computer Associates. This was a brochure designed to be included in a direct marketing campaign for a bundle of OS/2 business software. The piece trumpeted the presence of a free goodie that buyers of the bundle would receive upon purchase—a package of canned sounds you could use to liven up your OS/2 desktop. Sounds highlighted in this amazing bit of literature included “farting,” “pissing,” and “orgasm.” One can only mourn the fact that the package didn’t include the noise made when a marketing manager is summarily decapitated for committing an act of boneheaded silliness, such as developing and printing thousands of patently tasteless and offensive four-color brochures.

The reason for the absence of stupidity can vary. In some cases, firms avoid stupidity because the company’s culture creates more intelligent behavior. In other cases, it’s because a company’s personnel are smarter than the competition’s and thus avoid making stupid mistakes. In yet others, it’s because a business’s leadership is smarter than the competition’s and thus tends not to behave stupidly. Usually, it’s a varying mix of all three. In a sense, the reason for not acting stupidly doesn’t matter—the avoidance of it does. By reducing the number of stupid actions you take vis-à-vis your competition, you’re more likely to outcompete them over time.

Some may object that stupidity isn’t quantifiable but, in point of fact, the opposite is true. Stupid behavior is both quantifiable and identifiable. For example, it’s stupid to create two products with the same name, price point, functionality, and target audience and attempt to sell them at the same time. This may seem stunningly obvious, but somehow one of the world’s largest software companies, MicroPro, publisher of WordStar, a product that once ruled the word-processing market, did precisely that. A few years later, Borland repeated very much the same mistake with very much the same results. Then Novell. After you read Chapter 3 and learn precisely why this is a stupid thing to do and what the likely outcome is, you’ll be less likely to make this mistake in your own marketing and sales efforts. That puts you one up on your competition who, unless they’ve also read this book, are far more likely to repeat MicroPro’s fatal blunder.

Nitpickers like to claim that context often changes the nature of what is stupid behavior, but this principle is vastly overstated. For instance, if you spend many millions of dollars successfully creating a consumer brand, and then, when your most important product is revealed to be defective, stupidly attempt to blow off the public (as I describe Intel attempting to do in Chapter 5), you’ll suffer. It really doesn’t matter what industry you’re in or what product you’re selling. Expect to be immolated.

Or take the example of Syncronys, publisher of the immortal, never-to-be-forgotten SoftRAM “memory doubling” utility for Windows. Introduced in May 1995 with a list price of $29.95, SoftRAM was designed to “compress” your computer’s memory using your computer’s memory to give you, effectively, twice the memory you had physically installed (the problem with this concept should be apparent once you think about it). SoftRAM was quite the bestseller upon its release, with the Windows 3.x version selling more than 100,000 copies and the Windows 95 version more than 600,000. The company’s president, Rainer Poertner, was dubbed Entrepreneur of the Year by the Software Council of Southern California. Syncronys stock jumped from $0.03 per share in March 1995 to a high of $32.00 per share in August 1995.

SoftRAM was a handsome-looking piece of software that after installation presented buyers with a snazzy dashboard that supposedly let them increase their PC’s RAM with the touch of a button. Unfortunately for both purchasers of SoftRAM and Syncronys, the software didn’t actually do that. In fact, it didn’t really do anything except change a configuration setting in Windows 3.x that increased the amount of memory that could be swapped to disk, an operation a Windows user could perform him- or herself in under a minute for free.

It turned out that SoftRAM was an example of what Syncronys coyly called “placeboware,” the software equivalent of a deed to the Brooklyn Bridge. The concept greatly annoyed the spoilsports at the Federal Trade Commission (FTC), who forced the company to stop selling the package and promise to give everyone their money back. (Interestingly enough, no one was prosecuted for fraud in the case, the FTC apparently having bought the argument that the difference between computer salespeople and car salespeople is that car salespeople know when they’re lying.) It would seem obvious to anyone with even half an uncompressed brain that no one would ever buy a product from Syncronys again, but in an act of supreme idiocy the company actually tried to sell other software packages[vii] after the SoftRAM debacle. Sheer imbecility, as Syncronys promptly went out of business.

However, more than just a few trenchant examples of stupidity are needed to support a substantive examination of the subject, which brings me to the point of this book. In Search of Stupidity was written to provide you with a more comprehensive look at the topic. Within these pages are documented many of high tech’s worst marketing and development programs and strategies, as brought to you by some of its most clueless executives. In my quest to bring you the best of the worst, I made my selections from a wide range of companies, from arrogant smaller hot shots on the path to meltdown to sluggish giants too muscle bound to get out of their own way.

In the interest of fairness, I haven’t included hard-luck stories. No natural disasters, plane crashes, or tragic deaths played a part in any of the disasters discussed. All of the blunders, snafus, and screw-ups described in this book’s pages were avoidable by the individuals and companies that made them and are avoidable by you and your company. After reading this book, you’ll know what they are and you’ll be in a position to act less unintelligently. For you, history won’t repeat itself.

Of course, it is possible that you’ll make other stupid mistakes, ones not chronicled in these pages, but not to worry. If your competition is making the mistakes I describe in these pages, as well as all the others, you’ll still probably prevail. Remember, the race goes not to the strong, nor swift, nor more intelligent, but to the less stupid.

Besides, I’m planning a sequel.

Best of luck!

[i] In point of fact, the Japanese did introduce a plethora of CP/M and MS-DOS “clones.” Like many other companies, the Japanese firms failed to understand the impact of the IBM standard on the industry and none of the machines made a significant impact on the market. In Japan, NEC and Fujitsu attempted to establish independent hardware standards, but their efforts were eventually overwhelmed by the silicon beast. The most important long-term impact the Japanese had on computing technology was Sony’s successful introduction of a standard for 3.5” floppies.

[ii] An early attempt at a true WYSIWYG (“What You See Is What You Get”) word processor. The product displayed your text on a bitmapped screen and could show italicized and underlined text. On a 1MHz Apple II it also ran veery slooowly.

[iii] The first computer I ever owned was a Radio Shack TRS-80 Model One, semi-affectionately known by its owners as “Trash One.” The reliability of early models was less than stellar, and the paint tended to rub off their keyboards, leading older systems to develop a rather decrepit appearance.

[iv] Data General made its own contribution to stupidity with the introduction of the Data General One in 1985. This was the first “clamshell” portable and, in terms of weight and functionality, a breakthrough. A fully loaded system cost about $3,000, weighed about 12 pounds, supported up to 512KB of RAM, could hold two 3.5” double-sided 700KB floppies, and featured an LCD screen capable of displaying a full 80 columns by 25 lines of text, an unusual feature for a portable in that era. It also had enough battery life to allow you to get some work done from your airplane seat. Unfortunately, the LCD screen also sported a surface so shiny and reflective you could literally comb your hair in it, making it almost impossible to view the screen for everyday computing chores. No one could ever quite figure out what had possessed Data General to release a system that basically functioned as a $3,000 personal grooming system. I still own one of these systems and once tried to sell it at a garage sale for $25. I am happy to discover they’re currently worth about $500 in the collectibles market.

[v] It has been my privilege to meet the person who holds the world record for the highest score ever achieved on this game, a young man who worked for me in the late 1990s. (The E.T. game and the original Atari 2600 game system are somewhat collectible and still used by those interested in retro gaming. If you wish to experience the horror that was E.T., you can download the game and a 2600 emulator for your PC from various Internet sites.) I won’t reveal the name of this stalwart gamer because my revelation might permanently damage his career. When I knew him, he suffered from insomnia, and after playing many hours of E.T. I can understand why.

[vi] I was present at such a demo. I interrupted the demonstrator to inquire “Which one?”

[vii] For instance, another utility called “Big Disk.”

Foreword to the First Edition

In every high-tech company I’ve known, there’s a war going on between the geeks and the suits. Before you start reading a book full of propaganda from software marketing wizard and über-suit Rick Chapman, let me take a moment to tell you what the geeks think.

Play along with me for a minute, will you? Please imagine the most stereotypically pale, Jolt-drinking, Chinese-food-eating, video-game-playing, Slashdot-reading, Linux-command-line-dwelling dork. Because this is just a stereotype, you should feel free to imagine either a runt or a kind of chubby fellow, but in either case this isn’t the kind of person who plays football with his high-school pals when he visits mom for Thanksgiving. Also, because he’s a stereotype, I shouldn’t have to make complicated excuses for making him a him.

This is what our stereotypical programmer thinks: “Microsoft makes inferior products, but it has superior marketing, so everybody buys its stuff.”

Ask him what he thinks about the marketing people in his own company. “They’re really stupid. Yesterday I got into a big argument with this stupid sales drone in the break room, and after ten minutes it was totally clear that she had no clue what the difference between 802.11a and 802.11b is. Duh!”

What do marketing people do, young geek? “I don’t know. They play golf with customers or something, when they’re not making me correct their idiot spec sheets. If it was up to me I’d fire ’em all.”

A nice fellow named Jeffrey Tarter used to publish an annual list, called the Soft*letter 100, of the 100 largest personal computer software publishers. Table 1 shows what the top ten looked like in 1984.

Table 1. Top Software Publishers in 1984

Rank   Company                  Annual Revenue
1      MicroPro International      $60,000,000
2      Microsoft Corp.             $55,000,000
3      Lotus                       $53,000,000
4      Digital Research            $45,000,000
5      VisiCorp                    $43,000,000
6      Ashton-Tate                 $35,000,000
7      Peachtree                   $21,700,000
8      MicroFocus                  $15,000,000
9      Software Publishing         $14,000,000
10     Broderbund                  $13,000,000

OK, Microsoft is number 2, but it’s one of a handful of companies with roughly similar annual revenues. Now let’s look at the same list for 2001 (see Table 2).

Table 2. Top Software Publishers in 2001

Rank   Company                  Annual Revenue
1      Microsoft Corp.         $23,845,000,000
2      Adobe                    $1,266,378,000
3      Novell                   $1,103,592,000
4      Intuit                   $1,076,000,000
5      Autodesk                   $926,324,000
6      Symantec                   $790,153,000
7      Network Associates         $745,692,000
8      Citrix                     $479,446,000
9      Macromedia                 $295,997,000
10     Great Plains               $250,231,000

Whoa. Notice, if you will, that every single company except Microsoft has disappeared from the top ten. Also notice, please, that Microsoft is so much larger than the next largest player that it’s not even funny. Adobe would double its revenue if it could just get Microsoft’s soda pop budget.

The personal computer software market is Microsoft. Microsoft’s revenue, it turns out, makes up 69 percent of the total revenue of the top 100 companies combined. This is what we’re talking about here.

Is this just superior marketing, as our imaginary geek claims? Or is it the result of an illegal monopoly? (Which begs the question, How did Microsoft get that monopoly? You can’t have it both ways.)

According to Rick Chapman (he’s formally known as Merrill, but everyone calls him Rick), the answer is simpler: Microsoft was the only company on the list that never made a fatal stupid mistake. Whether this was by dint of superior brainpower or just dumb luck, in my opinion the biggest mistake Microsoft made was the talking paperclip. And how bad was that, really? We ridiculed the company, shut off the feature, and went back to using Microsoft Word, Excel, Outlook, and Internet Explorer every minute of every day.

But for every other software company that once had market leadership and saw it go down the drain, you can point to one or two giant blunders that steered the boat into an iceberg. MicroPro fiddled around rewriting printer architecture instead of upgrading its flagship product, WordStar. Lotus wasted a year and a half shoehorning 1-2-3 to run on 640KB machines, and by the time it was done, Excel was shipping and 640KB machines were a dim memory. Digital Research wildly overcharged for CP/M-86 and lost a chance to be the de facto standard for PC operating systems. VisiCorp sued itself out of existence. Ashton-Tate never missed an opportunity to piss off dBASE developers, poisoning the fragile ecology that’s so vital to a platform vendor’s success.

I’m a programmer, of course, so I tend to blame the marketing people for these stupid mistakes. Almost all of them revolve around a failure of nontechnical business people to understand basic technology facts. When Pepsi-pusher John Sculley was developing the Apple Newton, he didn’t know something that every computer science major in the country knows: Handwriting recognition isn’t possible. This was at the same time that Bill Gates was hauling programmers into meetings, begging them to create a single rich-text edit control that could be reused in all their products. Put Jim Manzi (the suit who let the MBAs take over Lotus) in that meeting, and he would be staring blankly and thinking, “What’s a rich-text edit control?” It never would have occurred to him to take technological leadership because he didn’t grok the technology. In fact, the very use of the word “grok” in that sentence would probably throw him off.

If you ask me, and I’m biased, no software company can succeed unless there’s a programmer at the helm. So far the evidence backs me up. But many of these boneheaded mistakes come from the programmers themselves. Netscape’s monumental decision to rewrite its browser instead of improving the old code base cost the company several years of Internet time, during which its market share went from around 90 percent to about 4 percent, and this was the programmers’ idea. Of course, the nontechnical and inexperienced management of that company had no idea why this was a bad idea. There are still scads of programmers who defend Netscape’s ground-up rewrite: “The old code really sucked, Joel!” Yeah, uh-huh. Such programmers should be admired for their love of clean code, but they shouldn’t be allowed within 100 feet of any business decisions, because it’s obvious that clean code is more important to them than shipping, uh, software.

So I’ll concede to Rick a bit and say that if you want to be successful in the software business, you have to have a management team that thoroughly understands and loves programming, but they have to understand and love business, too. Finding a leader with strong aptitude in both dimensions is difficult, but it’s the only way to avoid making one of those fatal mistakes that Rick catalogs lovingly in this book. So read the book, chuckle a bit, and if there’s a stupid head running your company, get your resume in shape, and start looking for a house in Redmond.

Joel Spolsky

http://www.joelonsoftware.com

http://www.fogcreek.com

Afterword: Stupid Development Tricks

The complete title of In Search of Stupidity includes the phrase “High-Tech Marketing Disasters,” and from these words you might conclude that it’s a firm’s marketers who usually bear the chief responsibility for major corporate catastrophes. This isn’t true. To be worthy of mention in this book, it took the combined efforts of personnel in upper management, development, sales, and marketing, all fiercely dedicated to ignoring common sense, the blatantly obvious, and the lessons of the past. Major failure doesn’t just happen: To achieve it, everyone must pull together as a team.

Chapter 4 of In Search of Stupidity helps drive this point home. For MicroPro to plummet from the software industry’s pinnacle to permanent oblivion took A) upper management’s mishandling of development and market timing, B) the marketing department’s idiotic decision to create a fatal product-positioning conflict, and C) the development team’s dimwitted decision to rewrite perfectly good code at a critical time because it wanted to write even better code that no one really needed. A magnificent example of different groups within a company all cooperating to ensure disaster.

In this spirit, I’ve decided to include selected portions of an interview with Joel Spolsky that ran on SoftwareMarketSolution (http://www.softwaremarketsolution.com), a Web site sponsored by the author of this book that provides resources and information on products and services of interest to high-tech marketers. (By the way, this interview was “picked up” by Slashdot [http://www.slashdot.org], a Web site dedicated to all things open source. It generated a considerable amount of comment and controversy. You can search the Slashdot archives to read what other people thought and gain further insight into Joel’s opinions.)

I regard Joel Spolsky, president and one of the founders of Fog Creek Software (http://www.fogcreek.com), as one of the industry’s most fascinating personalities. He worked at Microsoft from 1991 to 1994 and has over 10 years of experience managing the software development process. As a program manager on the Microsoft Excel team, Joel designed Excel Basic and drove Microsoft’s Visual Basic for Applications (VBA) strategy. His Web site, Joel on Software (http://www.JoelonSoftware.com), is visited by thousands of developers worldwide every day. His first book, User Interface Design for Programmers, was reviewed on SoftwareMarketSolution, and I regard it as a must-have for anyone involved in developing and marketing software.

Why this interview? If you’ve ever worked on the software side of high technology you’ve probably experienced the following. After a careful analysis of your product’s capabilities, the competition, and the current state of the market, a development and marketing plan is created. Release time frames are discussed and agreed to. Elaborate project management templates are built and milestones are set. You post the ship date up on a wall where everyone in your group can see it and begin to work like crazed beavers to meet your target.

Then, as the magic day looms nearer, ominous sounds emit from development. Whispers of “crufty code” and “bad architecture” are overheard. Talk of “hard decisions” that “need to be made” starts to wend its way through the company grapevine. People, especially the programmers, walk by the wall on which you’ve mounted the ship date, pause, shake their heads, and keep walking.

Finally, the grim truth is disgorged. At a solemn meeting, development tells everyone the bad news. The code base of the current product is a mess. Despite the best and heroic efforts of the programmers, they’ve been unable to fix the ancient, bug-ridden, fly-bespeckled piece of trash foisted on them by an unfeeling management. No other option remains. The bullet must be bitten. The gut must be sucked up. The Rubicon must be crossed. And as that sinking feeling gathers in your stomach and gains momentum as it plunges toward your bowels, you realize that you already know what you’re about to hear. And you already know that, after hearing it, you’ll be groping blindly back to your cubicle, your vision impeded by the flow of tears coursing down your face, your eyes reddened by the sharp sting of saline. And you’ve already accepted it’s time to get your resume out and polished, because the next few financial quarters are going to be very, very ugly.

And then they say it. The product requires a ground-up rewrite. No other option exists.

Oh, you haven’t been through this yet? Well, just wait. You will. However, as you’ll learn, what you’re going to be told may very well not be true. After reading this interview, you’ll be in a better position to protect your vision and your career in the wonderful world of high tech.

And now . . .

An Interview with Joel Spolsky

SoftwareMarketSolution: Joel, what, in your opinion, is the single greatest development sin a software company can commit?

Joel Spolsky: Deciding to completely rewrite your product from scratch, on the theory that all your code is messy and bug-prone and is bloated and needs to be completely rethought and rebuilt from ground zero.

SMS: Uh, what’s wrong with that?

JS: Because it’s almost never true. It’s not like code rusts if it’s not used. The idea that new code is better than old is patently absurd. Old code has been used. It has been tested. Lots of bugs have been found, and they’ve been fixed. There’s nothing wrong with it.

SMS: Well, why do programmers constantly go charging into management’s offices claiming the existing code base is junk and has to be replaced?

JS: My theory is that this happens because it’s harder to read code than to write it. A programmer will whine about a function that he thinks is messy. It’s supposed to be a simple function to display a window or something, but for some reason it takes up two pages and has all these ugly little hairs and stuff on it and nobody knows why. OK. I’ll tell you why. Those are bug fixes. One of them fixes that bug that Jill had when she tried to install the thing on a computer that didn’t have Internet Explorer. Another one fixes a bug that occurs in low-memory conditions. Another one fixes some bug that occurred when the file is on a floppy disk and the user yanks out the diskette in the middle. That LoadLibrary call is sure ugly but it makes the code work on old versions of Windows 95. When you throw that function away and start from scratch, you are throwing away all that knowledge. All those collected bug fixes. Years of programming work.

SMS: Well, let’s assume some of your top programmers walked in the door and said, “We absolutely have to rewrite this thing from scratch, top to bottom.” What’s the right response?

JS: What I learned from Charles Ferguson’s great book (High St@kes, No Prisoners) is that you need to hire programmers who can understand the business goals. People who can answer questions like “What does it really cost the company if we rewrite?” “How many months will it delay shipping the product?” “Will we sell enough marginal copies to justify the lost time and market share?” If your programmer insists on a rewrite, they probably don’t understand the financials of the company, or the competitive situation. Explain this to them. Then get an honest estimate for the rewrite effort and insist on a financial spreadsheet showing a detailed cost/benefit analysis for the rewrite.
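
To make the kind of analysis Joel describes concrete, here is a minimal back-of-the-envelope sketch in Python. Every number in it is a hypothetical assumption chosen for illustration, not a figure from the interview; plug in your own burn rate, revenue, and estimates.

    # Back-of-the-envelope cost/benefit model for a ground-up rewrite.
    # All inputs are hypothetical assumptions for illustration only.
    def rewrite_cost_benefit(months_delayed=18,         # honest rewrite estimate
                             monthly_burn=250_000,      # payroll + overhead per month
                             monthly_revenue=400_000,
                             share_lost_per_month=0.02, # revenue eroding while frozen
                             post_rewrite_uplift=0.10): # assumed gain once it ships
        dev_cost = months_delayed * monthly_burn
        # Lost revenue compounds: each month of delay erodes a bit more share.
        lost_revenue = sum(monthly_revenue * share_lost_per_month * m
                           for m in range(1, months_delayed + 1))
        total_cost = dev_cost + lost_revenue
        payback_months = total_cost / (monthly_revenue * post_rewrite_uplift)
        return total_cost, payback_months

    cost, payback = rewrite_cost_benefit()
    print(f"Total cost ~ ${cost:,.0f}; payback ~ {payback:.0f} months")

With these made-up inputs the rewrite costs nearly $6 million and needs more than a decade of the assumed uplift to pay for itself, which is exactly the sort of number that should end the meeting.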

SMS: Yeah, great, but, believe it or not, programmers have been known to, uh, “shave the truth” when it comes to such matters.

JS: What you’re seeing is the famous programmer tactic: All features that I want take 1 hour, all features that I don’t want take 99 years. If you suspect you are being lied to, just drill down. Get a schedule with granularity measured in hours, not months. Insist that each task have an estimate that is 2 days or less. If it’s longer than that, you need to break it down into subtasks or the schedule can’t be realistic.
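
Here is a sketch of how that drill-down might be mechanized, assuming a simple list of task estimates; the tasks and hours are invented:

    # Flag any task estimated above two working days (16 hours); per the
    # rule above, anything coarser must be decomposed into subtasks.
    # The task list is purely hypothetical.
    MAX_HOURS = 16
    tasks = [
        ("port file parser to new API", 12),
        ("rewrite rendering engine", 480),  # red flag: a month-sized guess
        ("migrate settings dialog", 10),
    ]
    for name, hours in tasks:
        if hours > MAX_HOURS:
            print(f"Break down '{name}' ({hours}h) into tasks of <= {MAX_HOURS}h each")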

SMS: Are there any circumstances where a complete code rewrite is justified?

JS: Probably not. The most extreme circumstance I can think of would be if you are simultaneously moving to a new platform and changing the architecture of the code dramatically. Even in this case you are probably better off looking at the old code as you develop the new code.

SMS: Hmm. Let’s take a look at your theory and compare it to some real-world software meltdowns. For instance, what happened at Netscape?

JS: Way back in April 2000, I wrote on my Web site that Netscape made the single worst strategic mistake that any software company can make by deciding to rewrite their code from scratch. Lou Montulli, one of the five programming superstars who did the original version of Navigator, e-mailed me to say, “I agree completely, it’s one of the major reasons I resigned from Netscape.” This one decision cost Netscape 3 years. That’s 3 years they spent with their prize aircraft carrier in 200,000 pieces in dry dock. They couldn’t add new features, couldn’t respond to the competitive threats from IE, and had to sit on their hands while Microsoft completely ate their lunch.

SMS: OK, how about Borland? Another famous meltdown. Any ideas?

JS: Borland also got into the habit of throwing away perfectly good code and starting from scratch. Even after the purchase of Ashton-Tate, Borland bought Arago and tried to make that into dBASE for Windows, a doomed project that took so long that Microsoft Access ate their lunch. With Paradox, they jumped into a huge rewrite effort with C++ and took forever to release the Windows version of the product. And it was buggy and slow where Paradox for DOS was solid and fast. Then they did it all over again with Quattro Pro, rewriting it from scratch and astonishing the world with how little new functionality it had.

SMS: Yeah, and their pricing strategy didn’t help.

JS: While I was on the Excel team, Borland cut the MSRP on Quattro Pro from around $500.00 to around $100.00. Clueless newbie that I was, I thought this was the beginning of a bloody price war. Lewis Levin,[1] the Excel BUM (Business Unit Manager), was ecstatic. “Don’t you see, Joel, once they have to cut prices, they’ve lost.” He had no plan to respond to the lower price. And he didn’t need to.

SMS: Having worked at Ashton-Tate, I have to tell you the dBASE IV code base was no thing of beauty. But, I take your point. Actually, I saw this syndrome at work in Ashton-Tate’s word-processing division. After they bought MultiMate, they spent about 2 years planning a complete rewrite of the product and wasted months evaluating new “engines” for the next version. Nothing ever happened. When a new version of the product was released, it was based on the same “clunky” engine everyone had been moaning about. Of course, in those 2 years WordPerfect and Microsoft ate Ashton-Tate’s word-processing lunch.

JS: Ashton-Tate had a word processor?

SMS: Yes, but nothing as good as WordStar, mind you!

JS: Hmm. That reminds me that Microsoft learned the “no rewrite” lesson the hard way. They tried to rewrite Word for Windows from scratch in a doomed project called Pyramid, which was shut down, thrown away, and swept under the rug. Fortunately for Microsoft, they did this with parallel teams and had never stopped working on the old code base, so they had something to ship, making it merely a financial disaster, not a strategic one.

SMS: OK, Lotus?

JS: Too many MBAs at all levels and not enough people with a technical understanding of what could and needed to be built.

SMS: And I suppose building a brand-new product called “Jazz”[2] instead of getting 1-2-3 over to the Mac as quickly as possible, thus staking Microsoft to a 2-year lead with Excel, is an example of the same thing?

JS: Actually, they made a worse mistake: They spent something like 18 months trying to squeeze 1-2-3/3.0 into 640KB. By the time the 18 months were up, they hadn’t succeeded, and in the meantime, everybody bought 386s with 4 megs of RAM. Microsoft always figured that it’s better to let the hardware catch up with the software rather than spending time writing code for old computers owned by people who aren’t buying much software any more.

SMS: WordPerfect?

JS: That’s an interesting case and leads to another development sin software companies often make: using the wrong-level tools for the job. At WordPerfect, everything, and I mean everything, had to be written in assembler. Company policy. If a programmer needed a little one-off utility, it had to be hand-coded and hand-optimized in assembler. They were the only people on earth writing all-assembler apps for Windows. Insane. It’s like making your ballerinas wear balls and chains and taping their arms to their sides.

SMS: What should they have been coding in?

JS: In those days? C. Or maybe Pascal. Programmers should only use lower level tools for those parts of the product where they are adding the most value. For example, if you’re writing a game where the 3D effects are your major selling point, you can’t use an off-the-shelf 3D engine; you have to roll your own. But if the major selling point of your game is the story, don’t waste time getting great 3D graphics—just use a library. But WordPerfect was writing UI code that operates in “user time,” and doesn’t need to be particularly fast. Hand-coded assembler is insane and adds no value.

SMS: Yes, but isn’t such code tight and small? Don’t products built this way avoid the dreaded “bloatware” label?

JS: Don’t get me started! If you’re a software company, there are lots of great business reasons to love bloatware. For one, if programmers don’t have to worry about how large their code is, they can ship it sooner. And that means you get more features, and features make users’ lives better (if they use them) and don’t usually hurt (if they don’t). As a user, if your software vendor stops, before shipping, and spends 2 months squeezing the code down to make it 50 percent smaller, the net benefit to you is going to be imperceptible, but you went for 2 months without new features that you needed, and that hurt.

SMS: Could this possibly account for the fact that no one uses WordStar version 3.3 anymore despite the fact it can fit on one 1.4 meg floppy?

JS: That and Control-K. But seriously, Moore’s Law makes much of the whining about bloatware ridiculous. In 1993, Microsoft Excel 5.0 took up about $36.00 worth of hard drive space. In 2000, Microsoft Excel 2000 takes up about $1.03 in hard drive space. All adjusted for inflation. So stop whining about how bloated it is.
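
The arithmetic behind those figures is simply install size times the going price of disk space. The sizes and per-megabyte prices below are assumptions chosen to reproduce the quoted totals, not Joel’s actual inputs:

    # Disk cost of an install = size in MB x price per MB at the time.
    # Hypothetical figures: ~$1.00/MB in 1993, ~$0.01/MB in 2000.
    def disk_cost(install_mb, dollars_per_mb):
        return install_mb * dollars_per_mb

    print(f"Excel 5.0 (1993):  ${disk_cost(36, 1.00):.2f}")   # ~$36.00
    print(f"Excel 2000 (2000): ${disk_cost(103, 0.01):.2f}")  # ~$1.03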

SMS: Well, we’ve had much personal experience with the press slamming a product we were managing. For example, for years reviewers gave MicroPro hell over the fact it didn’t support columns and tables. Somehow the fact that the product would fit on a 360KB floppy just didn’t seem to mean as much as the idea that the reviewer couldn’t use our product to write his or her resume.

JS: There’s a famous fallacy that people learn in business school called the 80/20 rule. It’s false, but it seduces a lot of dumb software start-ups. It seems to make sense. Eighty percent of the people use 20 percent of the features. So you convince yourself that you only need to implement 20 percent of the features, and you can still sell 80 percent as many copies. The trouble here, of course, is that it’s never the same 20 percent. Everybody uses a different set of features. When you start marketing your “lite” product and you tell people, “Hey, it’s lite, only 1MB,” they tend to be very happy, then they ask you if it has word counts, or spell checking, or little rubber feet, or whatever obscure thing they can’t live without, and it doesn’t, so they don’t buy your product.
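
A toy simulation makes the “never the same 20 percent” point vivid. Assume, purely for illustration, 100 features, a “lite” build that ships the first 20, and users who each need 5 features drawn at random:

    # Toy model of the 80/20 fallacy: how many users find *every*
    # feature they need in a build containing 20% of the features?
    import random

    random.seed(1)
    N_FEATURES, NEEDS, USERS = 100, 5, 10_000
    LITE = set(range(20))  # the 20% of features you chose to ship

    covered = sum(
        all(f in LITE for f in random.sample(range(N_FEATURES), NEEDS))
        for _ in range(USERS)
    )
    # Uniform needs give roughly (0.2)**5, i.e., about 0.03% full coverage.
    print(f"{covered / USERS:.2%} of users fully served by the lite build")

Real users’ needs aren’t uniform, of course, but the collapse is the point: almost nobody’s personal 20 percent matches the 20 percent you shipped.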

SMS: Let’s talk about product marketing and development at Microsoft. How did these two groups work together?

JS: Well, in theory, the marketing group (called “product management”) was supposed to give the development team feedback on what customers wanted. Feature requests from the field. That kind of stuff. In reality, they never did.

SMS: Really?

JS: Really. Yes, we listened to customers, but not through product management—they were never very good at channeling this information. So the program management (design) teams just went out and talked to customers ourselves. One thing I noticed pretty quickly is that you don’t actually learn all that much from asking customers what features they want. Sure, they’ll tell you, but it’s all stuff you knew anyway.

SMS: You paint a picture of the programmer almost as a semideity. But in my experience, I’ve seen powerful technical personalities take down major companies. For instance, in The Product Marketing Handbook for Software, I describe how the MicroPro development staff’s refusal to add the aforementioned columns and table features to WordStar badly hurt the product’s sales.[3] How do you manage situations like these?

JS: This is a hard problem. I’ve seen plenty of companies with prima donna programmers who literally drive their companies into the ground. If the management of the company is technical (think: Bill Gates), management isn’t afraid to argue with them and win—or fire the programmer and get someone new in. If the management of the company is not technical enough (think: John Sculley), they act like scared rabbits, strangely believing that this one person is the only person on the planet who can write code, and it’s not a long way from there to the failure of the company.

If you’re a nontechnical CEO with programmers who aren’t getting with the program, you have to bite the bullet and fire them. This is your only hope. And it means you’re going to have to find new technical talent, so your chances aren’t great. That’s why I don’t think technology companies that don’t have engineers at the very top have much of a chance.

SMS: Joel, thank you very much.

[1] Lewis Levin got his start in the industry as the product manager for MicroPro’s PlanStar.

[2] Jazz was intended to be the Macintosh equivalent of Symphony for the PC. Like most integrated products of the era, it managed to do too much while not doing anything particularly well.

[3] Over time, the programming staff noted that requests for this feature from users were dropping. This was absolutely true, as people who wanted this capability in a word processor bought other products.

Foreword to the Second Edition

I love this book. When telling stories about some of the finest fiascos in our industry, the author offers unique insight and humor. The result is a book that is both readable and worth reading. That’s a powerful combination that I find increasingly uncommon. I was a fan of the first edition of In Search of Stupidity, and I am honored to be writing this foreword for the second edition.

I am particularly fond of the title of this book. Taken completely out of context, it suggests that if you want to find stupidity in our industry, you have to search for it. I envision a typical person who wanders accidentally into the Software and Computers section of his local bookstore. He sees this book on the shelf and believes that stupidity in high tech is difficult to find.

Aw, never mind that. People are not so easily fooled. Anybody who reads the newspaper can easily look at our industry and see that stupidity is like beer at an NFL game: Half the people have got plenty of it, and they keep spilling it on the other half.

As of August 2006, here is what the average person knows about the world of high-tech products:

    *    The FBI just spent $170 million on a software project that completely failed and delivered nothing useful. Most of us would have been willing to deliver them nothing useful for a mere $85 million or so.

    *    We each get 50 e-mails a week from eBay, none of which actually came from eBay. So we find somebody who knows about computers and ask why, and he starts spewing stuff that sounds like Star Trek technobabble.

    *    The movie industry wants us to buy all our DVDs again so we can see them in “high definition,” but it can’t decide which new format it wants to support. Either way, this comes in the nick of time, because as we all know, the central problem with DVD technology is the atrocious picture quality.

    *    The time between the initial release of Windows XP and Windows Vista is roughly the life span of a dog, and apparently the main new feature is that it will be harder to use digital music and video content. Oh yeah, and it looks prettier.

The world of high tech is fouled up beyond all recognition, and everybody knows it.

But everybody loves reading about it. When it comes to failed software projects or dumb marketing mistakes, the mainstream news media is eager to print anything they can get their hands on. Nobody writes stories about software projects or marketing efforts that succeed.

The funny part is that most of the stupidity never makes it into print. Those of us in the industry know that things are actually even stupider than the perspective in the press. For example, most people know that whenever Microsoft announces a new product, it gives it a really boring name that nobody can remember. But those of us in the industry know that the boring name was immediately preceded by a “code name” that was memorable or even clever. It’s almost like Microsoft has a department whose mission is to make sure their public image always looks lame and pedestrian compared to Apple.

And let’s not forget that stupidity can show up in success as well as failure. Do you know the inside story of the Motorola RAZR? In the original plan, the powers-that-be at Motorola were convinced that the RAZR would be a “boutique phone,” a niche product that would appeal to only a small segment of the market. The company ordered enough components to make 50,000 of them. In the first quarter of production, the wireless companies placed orders for more than a million units. Motorola had the most popular cell phone on the market, and it was completely unprepared for it. It took the company a year to get production capacity up to meet the demand. Today, Motorola is shipping RAZR phones at a pace that is equivalent to selling 50,000 of them every day before lunch.

In the news media, on the message boards, and here in this book, stories about product disasters in our industry are a lot of fun to read.

That’s why the first edition of this book was great, and this one is even better. I applaud the author for the changes he has made in the second edition, giving more specific attention to the matter of learning from the marketing mistakes made by others. I imagine lots of people will enjoy that kind of thing.

But truth be told, not all of us aspire to such a high and noble station.

If you are like me, you probably lied to yourself about why you wanted to read this book. You told yourself how great it would be to learn from the mistakes of others. In reality, we don’t want to learn—we want to gloat. We like to watch things crash and burn. This book is the marketing equivalent of the car chase scene from Terminator 3.


Wielders of clichés would say that misery loves company. Call it what you will, but let’s just admit it together: We like to read about products and marketing efforts that exploded in balls of flame. It helps us feel better about our own stupidity.

And in my opinion, that’s OK. In the vast constellation of unhealthy vices and guilty pleasures, this book isn’t really all that harmful.

Eric Sink, SourceGear
http://software.ericsink.com/

In 1987, while working at MicroPro as WordStar product manager, I was assigned to participate in one of high tech’s hoariest rituals: a media tour. Despite the passage of years, media tours remain a PR staple to this day and are usually undertaken by mid-level firms. A tour consists of arranging for members of your senior management team to meet with key members of the press, the blogging empires, and the analyst firms who write about and cover your market. The hope is that once you’ve established a backslapping, hail-fellow-well-met relationship with an editor from Wired Magazine or a guru from Gartner, they’ll be more inclined to write nice things about your company and its products. Sometimes it works out that way. The quid pro quo driving the tour is that in return for putting up with you disturbing their day, you’ll provide fresh news for the blog and buy research from the analysts. Sometimes it works out that way as well.

Tour personnel usually consist of at least one member of upper management, one member of middle management capable of giving a comprehensive product demonstration (informally, this person is referred to as “the demo dolly”), and a PR person. For this tour, upper management was represented by Leon Williams, then president of MicroPro; I appeared in the role of the demo dolly; and rounding out the group was a sad little PR type who confessed at the end of our trip that she really didn’t like working with members of the media. Once you’ve been on one or two media tours, you tend to regard them with the same affection most people reserve for a root canal. Most tours consist of a trip to New York, Boston, and San Francisco, the three major hubs for high-tech media and analysis.

Our itinerary included a side trip to Austin, Texas (itself now becoming a major technology hub) to meet Jim Seymour, long-time editor and columnist for the then mighty Ziff publishing empire. On the day of our appointed meeting, we trekked out to Seymour’s house in the Austin hills, where I dutifully demonstrated the latest, greatest version of WordStar 5.0, the one that couldn’t print. Luckily for me, Seymour, engrossed by the Macintosh (as were most members of the media at the time), paid only cursory attention to the demo and instead insisted on demoing his latest Mac toys for us. Once everyone was done showing off, we settled down for the obligatory period of chitchat before we headed off to the airport and our next stop on the never-ending tour.

Heart of Darkness

For no particular reason that I can remember, the topic turned to Ashton-Tate, publisher of the wildly popular dBASE database program. Seymour started talking about a meeting he’d attended with other members of the media at which Ed Esber, CEO of the database giant, addressed the group. As Seymour began talking about Esber, his face suddenly developed an expression of contempt. He told us how, during the speech, Esber had stated at one point that he wasn’t necessarily the smartest guy in software. Seymour paused, then looked at our group and said, “We were all thinking, boy, you’ve got that right, Ed.” The venom in his voice was surprising.

I didn’t pay much attention to the exchange at the time, but after leaving MicroPro to become a product manager at Ashton-Tate, I realized I’d had my first glimpse into the dark heart of one of software’s biggest and most unexpected meltdowns. As events in the industry progressed, it became clear that as far as the PC media was concerned, it was “Ed Esber. He dead.” They wanted his head on a stake.

At its height in the 1980s, Ashton-Tate was one of software’s “Big Three,” the other members of the triumvirate being Microsoft and Lotus. Microsoft had DOS, Lotus ruled spreadsheets, and Ashton-Tate was the database king. The lucrative word-processing franchise was being fought over by MicroPro, WordPerfect, MultiMate, Microsoft with its Word product, and a host of smaller players.

dBASE, the creation of Wayne Ratliff, a contract programmer at NASA’s Jet Propulsion Laboratory, was originally designed to help place winning bets in football pools. Although Ratliff didn’t get rich on sports betting, he did decide his new software program had commercial potential. He named it “Vulcan” in honor of the home planet of Star Trek’s Mr. Spock and placed his first ad for the product in the October 1979 issue of BYTE magazine. At its release, Vulcan was priced at $50, and though there was a flurry of initial interest,[1] the stress of trying to ship, support, and manage a one-man company was overwhelming. Ratliff was on the verge of ceasing operations when software reseller George Tate contacted him.

Tate and his partner, Hal Lashlee, took a look at Vulcan, quickly realized its potential, and bought exclusive distribution rights. At the time of the deal they were running a not-very-successful mail-order software company called Software Plus. Believing that Vulcan would turn things around, they renamed the firm Ashton-Tate to give it a more “upscale” image. (A great deal of speculation has centered on where Tate came up with “Ashton”; no one who worked at the company had that name. The general belief is that it was picked because it sounded “British.” It should be noted, however, that Tate had a pet parrot named Ashton.)

After a quick trademark search uncovered potential problems with the name Vulcan, the product was rechristened dBASE II. There was no dBASE I, but even in the early 1980s people were reluctant to buy 1.0 releases of software products. The company upped the price of dBASE II to $695, a very competitive figure for a product in its class and with its capabilities, and placed full-page magazine ads featuring a picture of a sump pump and the proclamation that while the pump might suck, dBASE didn’t (or words to that effect). Sales took off, and by 1985 Ashton-Tate’s revenues were over $100 million a year and climbing, mostly from sales of dBASE II and its successors, dBASE III and III+. The company also enjoyed modest sales success with its Framework integrated product. Integrated products attempted to combine word processing, database management, a spreadsheet, and graphics all within a single program. Framework was considered the best of breed in this market segment, but the integrateds, which included titles such as Lotus Symphony and Ability,[2] never sold in the numbers projected, and the category largely disappeared in the early 1990s.

In addition to ads featuring plumbing, dBASE’s quick rise to prominence owed much to the company’s constant emphasis on the fact that dBASE was a relational database management system (RDBMS). The relational model was first introduced in 1969 in a paper by Dr. E. F. Codd, an English computer scientist working for IBM. More flexible and expandable than competing technologies, relational products were over time adopted by most DBMS developers and users.

In addition to a table-oriented paradigm, Codd’s definition of an RDBMS also incorporated several key capabilities and functions a product needed to possess before it could be called a “truly” relational system. None of the early RDBMS products for the PC incorporated all of Codd’s requirements, and religious arguments raged constantly over which product was “more” or “less” relational than another. dBASE II was probably “less” relational than some of its competitors, but that also meant it could run on systems with less memory and reach a broader audience. Despite the pooh-poohing of purists, for several years dBASE was almost synonymous with the relational concept.
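
For readers who have never bumped into the term, it may help to see the relational idea in miniature. What follows is a toy sketch in modern Python, with lists of dictionaries standing in for tables; dBASE’s own command language looked nothing like this, and the table and field names are invented purely for illustration.

```python
# Toy illustration of the relational idea: data lives in separate
# tables (modeled here as lists of dictionaries) that are related
# by a shared key, rather than by pointers hard-wired into the data.

customers = [
    {"cust_id": 1, "name": "Acme Corp"},
    {"cust_id": 2, "name": "Consolidated Widgets"},
]

orders = [
    {"order_id": 100, "cust_id": 2, "item": "dBASE II"},
    {"order_id": 101, "cust_id": 1, "item": "Framework"},
]

# A simple "join": match rows across the two tables on cust_id.
for order in orders:
    buyer = next(c for c in customers if c["cust_id"] == order["cust_id"])
    print(f'{buyer["name"]} ordered {order["item"]}')
```

Because the tables are linked only by keys, either one can be extended or reorganized without rewriting the other, which is the flexibility that made the relational pitch so effective.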

 

In 1984, Tate died unexpectedly of a heart attack at the age of 40, and Esber, his second in command, took over the leadership of Ashton-Tate. Esber was a Harvard-trained MBA and a former product manager at VisiCorp, the company that had seen its VisiCalc spreadsheet eclipsed by Lotus 1-2-3. Esber announced he was going to bring a more professional management style to Ashton-Tate, replacing Tate’s more hands-on and emotional approach. Esber didn’t have a reputation for being technically astute, though this was misleading; he held a bachelor’s degree in computer engineering and had programmed extensively earlier in his career.

Esber did fancy himself something of a business guru, and one of his favorite quotes was “A computer will not make a good manager out of a bad manager. It makes a good manager better faster and a bad manager worse faster.” He had something there. It had taken Tate about five years to build Ashton-Tate to software giant status; it would take Esber only two-and-a-half years to put the company on the road to ruin. And Esber had a PC on his desk the entire time.

Introduction to the Korean Edition

Welcome to the Korean edition of In Search of Stupidity: Over 20 Years of High-Tech Marketing Disasters! South Korea is a country that knows something about stupidity, having suffered since the 1950s from the ongoing effects of what may be the 19th and 20th centuries' stupidest idea, Communism. But we in the US are well aware that South Korea is also a first-world country with a first-class, high-tech infrastructure. This accounts for the fact that Koreans are known to be addicted to playing, and dominating the rest of the world in, massively multiplayer online games (the descendants of the old MUDs, or Multi-User Dungeons), such as World of Warcraft, Starcraft, and MU. In the US, if your son or daughter looks up from their PC sometime and mutters something about how their shiny World of Warcraft castle has been sacked by rampaging Orcs, evil knights, or pitiless demons, the odds are often quite good it's a Korean doing the sacking and rampaging!

This means the time is right for a Korean edition of Stupidity. As Korea takes its place amongst the world's high-tech elite, it faces an important choice! Will Korean companies (A) repeat the mistakes made by others and suffer repeated financial losses, layoffs, and much angst and personal woe as the country's software and hardware industries attempt to grow? Or will Korea (B) gracefully avoid repeating the past disasters made by idiots in other countries as it grows to unparalleled high-tech greatness?

Sorry, I'm voting for "A." Human nature is universal, and human stupidity is an incredibly powerful force capable of ignoring history, common sense, and practical experience (if you doubt this, just spend a minute looking north). But you, lucky reader, are equipped with a valuable antidote to high-tech stupidity. You hold in your hands not just a book, but an institutional memory that will pour foresight and advanced knowledge into your brain. Soon, you will be equipped to avoid the mistakes of the past and will be able to march forward into the future secure in the knowledge that if you do indeed screw up, your mistakes will probably be original ones.

And this, at the very least, will put you one up on the industry's biggest software company, which, with its latest release, managed to repeat positioning mistakes made almost a quarter century ago by MicroPro, a firm that has earned its own inglorious chapter in this tome. I'm speaking, of course, of the launch of Windows Vista. Several months after the release of Vista to businesses (the product was released to consumers in January 2007), the consensus of the market is that Vista is a flop. It’s a peculiar type of flop. Financially, Vista is making a great deal of money for Microsoft. No surprise there; after all, the company has an almost complete monopoly in the desktop OS and main business applications markets and the dominant position in the server OS segment. OEMs are in complete thrall to Microsoft; if you don’t offer Windows on your laptop, you’ve got an unsellable silicon brick in your warehouse.

But that said, Vista has failed to meet any of the lofty goals originally set for it. It has failed to excite any segment of the market, including key influencer groups such as the press and the gamers. It is not driving sales of new computers. At retail, the pull figures for Windows out of the channel are dreary and show no signs of changing (we’re researching these numbers and will be reporting on them in an upcoming issue of Softletter). The blogs are condescending, and even many Microsoft devotees are dismayed by what they see and hear. Legitimate copies of Windows XP (and even 2000!) have just become more valuable, and knuckles will have to be severely cracked before the hands gripping those precious old boxes relax and allow a fresh copy of Vista to be shoved into a reluctant grasp. The fact is, few people see any need or have any desire to buy Vista.

In all fairness, some of the problems that accompanied the Vista launch are beyond Microsoft’s control. As the Internet has matured as a development and infrastructure platform, the growth of SaaS and the advent of hybrid applications have led to an inevitable challenge to Microsoft’s monopolies in OSs and desktop applications. Over the next several years, Microsoft will need to execute the painful chore of chewing off its own two strong revenue arms (but not too quickly) and hope they regenerate into new revenue and profit builders. It’s not a tasty task, and you can’t really blame the company for avoiding it, necessary though it is.

But paradigm shifts aside, the biggest problem crippling the Vista rollout was Microsoft’s complete bungling of the critical task of properly positioning the product. Vista’s marketing identity is a dirty smear in the mind’s eye of the public; today, it’s almost impossible to find anyone (particularly anyone from Microsoft) who can cleanly and quickly describe why Vista is superior to Windows XP. And a confused buyer is a buyer who does not buy (or one who buys something they can understand instead).

Microsoft’s Positioning Sins

During the Vista rollout, Microsoft committed several positioning sins. Redmond’s mistakes begin with the deadly transgression of creating a positioning conflict within its own product lines. It’s a surprising mistake. During the history of the Windows 9.X vs. NT product lines, Microsoft was frequently tormented by customers confused about which product to buy, a problem highlighted by the firm’s creation of one of high-tech’s dumbest ads, the “two nags racing” piece, which you can see preserved on www.insearchofstupidity.com in the Museum of Stupid High-Tech Marketing. While 9.X and NT both existed, Microsoft frequently had to explain why a customer should buy one product over the other when both were called Windows, looked very much the same, did very much the same thing, and cost pretty much the same. But Microsoft was lucky in that during this period its chief jousting opponent was IBM and OS/2.

But with Vista, Microsoft pointed the lance at its own foot, kicked its marketing war horse into action, and firmly pinned its toes to the ground. There are no fewer than six (actually seven, counting the OEM version. Wait, isn’t that eight if you count academic packages? Are we missing some other variants? Probably. Does Windows CE count?) versions of Vista currently available for sale:

  • Vista Starter (which you can’t buy unless you live in the Third World, apparently.)
  • Vista Home Basic (which, amazingly, does not include the new UI.)
  • Vista Home Premium
  • Vista Business
  • Vista Enterprise
  • Vista Ultimate

This plethora of choices naturally leads customers to ask a deadly question: “Which one do I buy?” Before, a consumer had only to choose between Windows XP Home and Professional (Windows Media Edition came along too late in the product’s life cycle to become well-known enough to confuse anyone). To help customers, Microsoft has published a blizzard of collateral, comparison sheets, pricing matrices, etc., etc. Thinking about whether to buy Vista Home Premium vs. Vista Business? What’s “SuperFetch” worth to you? How about “Volume Shadow Copy”? But it’s good to know the Business version includes “Premium Games.” Just what a business person is looking for in their business version of Vista. Why not include applications that have some applicability to business concerns? Maybe stock analysis and reporting? Specialized business calculators? Something? Anything?

And making things even more interesting is that the EULAs accompanying each version are different. Want to create a virtual machine on your PC and run Vista Home in it? You can’t! How about Vista Business? You can! Why one and not the other? Who knows?

Moving along down the path of positioning damnation is Microsoft’s failure to attach any powerful or desirable image to Windows Vista as a product line. Try to picture the memorable image or capability associated with Vista. None comes to mind. The product does have a new interface, but Microsoft made no attempt to build a compelling image of usefulness around the Aero Glass UI. Yes, the icons are bigger and the screen savers are prettier, but so what? Microsoft might have discussed how the new desktop gave users “X-ray vision” like Superman, increasing their day-to-day productivity while working with Windows, but it didn’t. Vista is supposed to be far more secure than XP, and perhaps Microsoft could have talked up “Fort Knox,” an integrated group of technologies that allowed you to lock up your PC with bank-vault tightness, but it didn’t. (Given Microsoft’s past security woes, it may have lacked the stomach for this gambit.)

By contrast, when Apple released Tiger (OS X 10.4), the market and the press were bombarded with information spotlighting “Spotlight,” an integrated search utility baked into the new release. Desktop search was by no means new on either Macs or PCs, but the Mac campaign succeeded in making people aware of its usefulness and, more importantly, gave them a mental picture of why they might want to give Tiger a look. With Leopard (OS X 10.5), the emphasis was on “Time Machine” (integrated backup).

Another key task in launching Vista was to match features to pricing expectations, and here Microsoft also failed, particularly with respect to Vista Ultimate. Ultimate is the kitchen sink of Windows, the one with all the toys and whistles, and it’s expensive at $450 for a retail version (and pricey at $270 for the upgrade version). But not to worry! With your purchase of Ultimate you’re promised all sorts of goodies only you, our ultimate customer, can download. And what are these exciting ultimate premiums? Well, to date, they include fun things like Windows Hold ‘Em (a poker game), extra language packs for the Windows multi-language user interface (those squeals of delight are coming from buyers who just unpacked Finnish), Secure Online Key Backup, and the BitLocker Drive Preparation Tool. (The latter two are included in other, cheaper versions of Windows.) And oh, let’s not forget the new screen saver that lets you use videos. Alas, it’s in beta and not due to be finished for a while yet. Ultimate customers are, of course, delighted with all of this.

In its long and storied history, Microsoft has distinguished itself from its competition by its ability to avoid the self-inflicted wound of stupid marketing. With the release of Windows Vista, this changed. Microsoft repeated mistakes made by MicroPro (positioning conflict), Borland (positioning conflict, pricing/feature disparity), Novell (positioning conflict), Ashton-Tate (pricing/feature disparity coupled with inept PR), and itself (Windows 9X vs. NT), proving that the company now suffers from the same institutional memory hole that afflicts much of high tech. The Vista release now serves as a valuable and contemporary object lesson in how not to position and launch a new software product.

Best of luck!

The Strange Case of Dr. Open and Mr. Proprietary

As noted in Chapter 2 of this book, the release of the Altair microcomputer in 1975 heralded the beginning of the modern high-tech industry. But observers of the period also believe there was more to the Altair than just chips; the unit seemed to emit a mysterious elixir that entered the bodies of computer aficionados worldwide and sparked a strange war of the soul that has raged within computer geekdom for more than three decades. The war is between those who advocate free software and open, patentless technology available to all and those who believe in making substantial sums of money from selling proprietary software and in the vigorous protection of intellectual property. It’s the Kumbayahs vs. the Capitalists.

Other influences may be responsible for the ongoing struggle. Perhaps Star Trek bears some of the blame. Few in microcomputing hadn’t watched the series, and as Captain Kirk, Mr. Spock, Bones, Scotty, and their innumerable successors went gallivanting through the galaxy, they seemed to have no visible means of financial support. No one in the Star Trek universe wearing green eyeshades ever appeared to worry about the propensity of the various casts to blow up what you’d think were undoubtedly very expensive spaceships, given their ability to violate the laws of physics while transporting the crew to numerous planets inhabited by women who spent most of their time wearing lingerie and dodging ray gun fire from angry races of aliens who kept screaming “kaplok!” (and who also seemed to have no monetary worries). Perhaps the reason for Captain Kirk’s insouciance lay in the fact that everyone in Star Trek had access to “transporters,” magical devices that could whisk you from the starship Enterprise down to a planet without having to pay a toll. Later in the series’ development, transporters could be used to create chocolate milk shakes, drinks, and even the occasional boyfriend or girlfriend via simple voice commands. And all for free!

Of course, no computer has a Star Trek–like transporter system built into it, but from the standpoint of people interested in obtaining software without forking over monetary compensation, it has something almost as good. That good thing is the “copy” command. And since software, unlike milk shakes, drinks, and boyfriends, is already digitized, just about anyone can execute this wondrous command and enjoy a cornucopia of software in an environment free of the distasteful economic friction of “paying.”

Technology’s interest in the concept of free software was demonstrated almost simultaneously with the release of the Altair, in the events surrounding the “liberation” of the first BASIC for this pioneering machine. When first available, the Altair had no useful software, and the market was eagerly awaiting the release of Altair BASIC (waiting was something Altairians were very good at doing, because Altair maker MITS was legendary for announcing new products it couldn’t deliver, a habit the rest of the industry soon learned to emulate). The product had been developed by a small software firm, Micro-Soft, run by two people no one had ever heard of, Paul Allen and Bill Gates. Micro-Soft had cut a deal with MITS to receive a royalty on every sale of Altair BASIC and was eagerly waiting for a stream of revenue to flow into the tiny firm’s coffers upon the official release of the new product to a market eager to buy it.

Unfortunately for Gates’s and Allen’s short-term plans, someone had appropriated an early version of Micro-Soft’s BASIC, stored on paper tape, at a small MITS trade show held in Palo Alto in 1975. The tape was promptly reproduced and then handed out at such venues as the Homebrew Computer Club, a semilegendary group of computer hackers and enthusiasts who met regularly in Silicon Valley to share information, gossip, advice, and other things, such as “liberated” chips and especially liberated Altair software. Soon, paper tapes containing an early, buggy version of Altair BASIC were in wide use and oddly enough, no one offered to pay Micro-Soft a dime for the product.

In 1975 there was very little that was kumbayah about Bill Gates, and he responded to the purloining of Micro-Soft’s BASIC by writing an open letter to the software liberators, published in the Homebrew Computer Club’s newsletter (and in similar publications), chiding them for their thieving ways and asking them to voluntarily pay for the privilege of using his BASIC. His letter made the logical point that if people weren’t recompensed for all the time and hard work they spent creating new and better software products, they would have no incentive to do so, and the software industry would wither and die.

Gates’s pleas for financial remuneration went largely unheeded. The very act of releasing the letter generated generous amounts of sneers and opprobrium from software’s kumbayahs, three hundred or four hundred letters addressed to Gates chastising him for his greed, and about three or four voluntary payments for Altair BASIC. Ruined by the premature widespread release of Altair BASIC and the financial loss this entailed, Micro-Soft went out of business, and Gates and Allen were never heard from…aga…errr…no. That’s not what happened.

What actually happened was that the widespread release of Altair BASIC established the product as the de facto standard for microcomputers. Despite some idiosyncrasies, Micro-Soft’s BASIC was regarded as an engineering triumph: lean, loaded with features, and, in comparison with the mainframe and minicomputer BASICs most programmers worked with, incredibly fast. Although no one wanted to pay for Altair BASIC (which later became Microsoft BASIC, with no hyphen), everyone wanted to use it. Since Microsoft’s deal allowed the company to license the product to other firms, Microsoft was soon enjoying a tidy business licensing its BASIC to a plethora of other computer companies. In point of fact, it was the industry’s high regard for Microsoft’s BASIC that led IBM to Bill Gates’s door and enabled him to take advantage of the biggest business opportunity of the 20th century.

Nonetheless, as the industry began its rapid development, resentment on the part of software entrepreneurs grew as software piracy spread. And make no mistake, spread it did. Copying a software program worth hundreds, or even thousands, of dollars was as easy as inserting a blank floppy disk into a disk drive and typing in your system’s version of the “copy” command. Games in particular were the target of frequent liberation efforts, with user groups for systems such as the Amiga and Atari ST sponsoring “swap nights” at which members were encouraged to bring in their software collections for communal sharing. Many businesses entered into the kumbayah spirit of things; it was a common occurrence for a company to buy one copy of a business software package such as WordStar and distribute it to every member of the staff.

To counter the practice of software liberation, now usually called “piracy,” a whole host of what were eventually called “copy protection” systems and techniques were developed. Most of these focused on protecting Apple software because this computer system attracted the bulk of new software development until the release of the IBM PC. Some of the techniques employed included things such as forcing a disk drive to write to locations on a floppy nominally off limits to the hardware; “Spiradisk,” a system that wrote data to the disk surface in a big spiral; hardware “dongles,” plastic keys that contained a chip with a software key embedded into it; and so on.

In response to the efforts of one part of the software industry to prevent pirating software, another part promptly launched an effort to thwart the protectors (this had the happy effect of employing more programmers). Anticopy protection systems included software products such as Locksmith, copy-cracking boards that sucked an entire software product into memory and spit it out to disk, products that were capable of reading dongle keys, and so on, and so on, and so on. As soon as one copy protection scheme was introduced, it was immediately under attack by resourceful folks following in the glorious tradition of Altair BASIC and the Homebrew Computer Club.

In the early 1980s, IBM entered the market with its own microcomputer, and the focus of the endless cat-and-mouse game between the Capitalists and Kumbayahs shifted to the PC. The software industry’s reaction to rampant software piracy was the general introduction of copy protection for many of the major software packages. WordStar 2000, Lotus 1-2-3, dBASE, and other packages incorporated elaborate schemes meant to halt, or at least slow, the piracy tide. For a brief period in the 1980s, almost a dozen software companies were pitching other software companies on the effectiveness of their respective protection systems.

I initially had a great deal of sympathy for the effort. As a field software engineer for MicroPro, I had become quite accustomed to walking into a customer’s location and seeing multiple copies of WordStar (which was not copy protected) installed on every computer in the place but being able to spot only one set of manuals available to the “user” base. Some simple math seemed to indicate a lot of bread was being snatched from my mouth, or at least from the mouth of the company paying my salary.

It was also annoying to find myself spending time providing technical support to people who were clearly flying the software Jolly Roger. One of my responsibilities was to take local technical support calls while in the office from people who were having difficulty with our word processor. A disturbingly high number of my calls went something like this:

Me: Hi! This is MicroPro technical support. How can I help you?

The “customer”: I need help installing my NEC 3550 printer.

Me: No problem! Please pull your installation manual out, and turn to page 256. (This was an age when users were a manly bunch, with thumbs thickly muscled from paging through software documentation similar in size and comprehensiveness to small encyclopedias. Not like the effete perusers of PDFs and HTML you find today.) I’ll be glad to walk you through the process.

The “customer”: Uh, I don’t have a manual in front of me.

Me: No problem. I’ll hold on the phone until you can get it.

The “customer”: Uh, I don’t have a manual.

Me: Can I ask what happened to it?

The “customer”: Uh, the dog ate it. (Other popular claims focused on thieving kids, roaring fires, and torrential flooding).

The computing press (the members of which were used to obtaining all the free software they wanted) was, as you might imagine, generally unsympathetic to the plight of the software firms. While giving perfunctory lip service to the idea that software companies had a right to protect their property from theft, the press constantly lectured the companies on not “treating their customers like thieves,” despite the indisputable fact that large numbers of those customers were (and are) exactly that. In 1984, MicroPro estimated that eight pirated copies of WordStar were in use for every one sold. In 2005, estimates put software piracy rates in China at more than 90 percent.

And yet, by the end of the 1980s, practically every software company that had implemented copy protection dropped it. Several factors were driving this trend. One was that many companies resisted buying copy-protected software because it added complexity and instability to desktop computing systems and strained the resources of IT departments. Another was that copy protection added considerably to the software industry’s support burden, because users called up to complain about systems that wouldn’t install because of hardware peculiarities, lost or damaged “key” disks, arguments about the number of “valid” installs, and so on. And, although our feelings undoubtedly weren’t the strongest factor driving corporate decisions, most software firms were hearing whines and groans from their field sales and support personnel about the difficulty of dealing with protected products. WordStar 2000, for example, at one time used a copy protection system that limited users to three installations of the software on different systems. This meant that whenever I or another person had to install WordStar 2000 on a demo system at a remote location, we had to go through a wearying install/deinstall routine while listening to outraged disk drives go AAAHHHHKKKK SKRRRIIIKKK WAAKA WAAKA WAAKA in order to keep our quiver full of demo installs for future use. (Field personnel weren’t initially given non-copy-protected products. When we were, the practical facts we created “on the ground” provided another reason to drop copy protection.)

And finally, despite the theoretical losses software companies were suffering from piracy, it was hard to see how piracy was actually hurting them. As the decade progressed, many software companies did indeed stumble and fall, but in no case was it possible to pin the blame on piracy. Also, it started to become apparent to software firms that piracy had a definite upside, as Microsoft had discovered years earlier with the Altair. As the number of people using your software increased, your perceived status as the market leader increased as well. And pirated software functioned as a sort of marketing kudzu, tending to choke out the competition as use of your product spread throughout the computing populace. Once you had displaced the competition, it was possible to convert X percent of the pirates to paid users via various inducements and offers. Corporations, worried about legal liability, were also usually not reluctant to pay to legitimize purloined software if the price was right.

Becoming the market leader also opened up opportunities for bundling and original equipment manufacturing (OEM) deals. At MicroPro, WordStar’s early ubiquity made it the favored word-processing product to include with such systems as the Osborne, the Kaypro, and many others. While OEM products were sold at a considerable discount from the software’s retail price, in most cases all the software publisher had to do was provide licenses and serial numbers to its customers; the OEM customer usually was responsible for manufacturing and supporting the product. One MicroPro OEM salesman referred to the firm’s OEM business as a “money-printing operation.” This model worked in the case of such products as WordStar, dBASE, WordPerfect, and most notably, Microsoft Windows. Today, Microsoft’s Windows OEM business is the most profitable component of the company’s bottom line.

In the meantime, while the proprietary software companies were garnering all the attention (and making all the money) from the market, the kumbayah forces, led by an interesting fellow by the name of Richard M. Stallman, were keeping the dream of free software alive. Stallman had entered computing by way of MIT in 1971, where he worked as a systems programmer in the university’s AI lab, at that time a hotbed of innovation in LISP and related languages. Stallman developed a reputation as an ace programmer and, while at MIT, created the legendary program Emacs, a text editor backed by a powerful and extensible macro system. Stallman was a militant believer in what was then called the “Hacker Ethic,” a belief system that preached that software and the information it represented should be open and available for all users to change and modify as they saw fit. Stallman was fervent in his belief about the evils of charging for software, at one time proclaiming that “the prospect of charging money for software was a crime against humanity.”[1]

Unfortunately for RMS, as his friends called him, by the 1980s the MIT lab was becoming corrupted by the sirens of commerce, who asked why geeks couldn’t also have fancy cars, big homes, and gorgeous girlfriends. Two AI companies (both ultimately unsuccessful), one building LISP interpreters and the other specialized LISP machines, spun out of the MIT lab, taking with them many of the lab’s best programmers and all, in the opinion of RMS, of the lab’s kumbayah mojo.

After a period of mourning, Stallman left the lab with a vision fixed firmly in his imagination. He would create a powerful, free, and open software environment that would allow programmers to create new and wondrous products. This environment would be modeled on the popular (but proprietary) UNIX operating system and, in a display of geek wit, would be called GNU (“GNU’s Not UNIX”; we’re sure you appreciate the recursion). And to ensure that what had happened at MIT could never happen again, he’d protect this environment with a new and innovative concept, a “copyleft” agreement: programmers who used his software to build new software were required to make the original GNU code, along with any changes or improvements they had made, available for free to anyone who wanted it under the GNU General Public License (GPL). When the GPL was introduced, Stallman became software’s Dr. Open, the civilized, reasonable, humanitarian advocate of all that was good and pure in the world. (Bill Gates has traditionally played the role of Mr. Proprietary, but since he’s supposed to be leaving Microsoft to cure diseases worldwide, Steve Ballmer will be appearing in the part going forward.)

This was a sharp and revolutionary contrast with the typical end-user license agreement (EULA) that accompanied most proprietary software. Most EULAs allowed “licensees” of software only the right to copy “their” software onto a limited number of computers. In fact, by 2006 the Microsoft retail EULA for Windows allowed you to copy your $100+ copy of Windows XP onto only one computer, regardless of how many computers you owned. And boy oh boy, you’d better make sure you never, ever put a four-core processor in your computer, because that seemed to violate the Microsoft EULA. And if you read the rest of the EULA, it warned of all kinds of other things you couldn’t do, and all the warnings were written in the Scary Lawyer dialect of the English language. In fact, most EULAs are full of scary language and all kinds of implied legal threats. Interestingly enough, despite the fact that software companies have been using EULAs for decades, it is unclear whether they have any legal validity.[2] Fortunately for the industry, no one actually ever reads a EULA; if they did, everyone would probably use only free software.

Given the current excitement over open source software and technology, it would be easy to think that Stallman’s GPL took the industry by storm, but this was not the case. The first version of the GPL was released in 1989, and the second, the one in current use in high technology, in 1991. At the time of their issuance, few people paid them the least bit of attention. One reason may be that while Stallman thought charging for software was wrong, almost no one else did, especially the many programmers who were making good money selling software and didn’t want to give up their new cars, houses, and girlfriends. Another was that Stallman’s rantings about the evils of for-sale software and his rationale for giving it away sounded a bit too close to Karl Marx’s formulation of “from each according to his abilities; to each according to his needs.” In an era when the Soviet dinosaur was noisily clanking and shaking its way to extinction, Stallman’s zeitgeist seemed off to many.

It’s Finally GNU for You

But perhaps the biggest obstacle to the widespread acceptance of Stallman’s credo was that although he was preaching about the glories of free software created with GNU, he hadn’t actually sat down and finished the project. Stallman had built a series of software utilities that could be used to create software (an activity beloved of many coders) but had neglected, years after the proclamation of GNU, to provide the system with its key component, an operating system. Instead, it was left to a 21-year-old Finnish student at the University of Helsinki by the name of Linus Torvalds to create a working implementation of Stallman’s dream. UNIX, Linux’s distinguished father, had slowly been withdrawn from the programming community and had become increasingly proprietary and fragmented. Dozens of companies took their versions of UNIX and built custom extensions and walls around the software. This had the effect of raising UNIX prices (and allowed these companies to do a nice business selling their specialized UNIX versions). Dissatisfied with the UNIX clone he was then using and unable to afford a proprietary version, Torvalds decided to take a stab at writing his own operating system using the GNU tools.

Linux 0.01 was released in September of 1991. Shortly after its introduction, Torvalds invited anyone interested in the OS to contribute to the development of the next release. Many people did, and the most significant open source project in the industry’s history was born.

Driven by the enthusiasm of what would become known as “the open source community,” Linux made great strides over the next few years, its progress assisted by Torvalds’s decision to release it under the GPL. By the late 1990s, Linux had begun to do serious financial damage to companies such as SGI, Sun, and SCO, all of which soon saw their business models being ravaged by the new upstart.

But while Linux was steadily eating away at the profits of the UNIX firms, the Windows world, for the most part, safely ignored Torvalds and his OS. A few hobbyists played with the system,[3] and Microsoft’s behavior toward Netscape and the government’s antitrust case raised the blood pressure of free software advocates worldwide; however, that was about it. After all, Windows was very, very cheap. Most people received the product for “free” with their hardware and ignored the fact that the purchase price reflected the cost of Windows, something that was easy to do when computers cost $2,000 to $3,000. And even if you bought it, once you factored in inflation and the ability to install it on every machine you owned (and a few you didn’t), the cost per computer seemed very reasonable for an operating system that ran a huge amount of software and supported just about every peripheral you owned.

Also, what many have called “the open source paradox” began to rear its ugly economic head (and still does). The paradox was that while GNU, Linux, and other open source software had been written ostensibly to liberate programmers from a world of evil capitalists, ultimately it was the evil capitalists who seemed most likely to benefit from the whole movement. After all, while it was nice that car companies, oil companies, lawyers, grocery stores, Burlington Coat Factory, and businesses of all types were saving money on purchases of software, there was no proof that programmers were sharing in the bounty from all these expenditure reductions. And if you looked at some of the companies that expounded the use of Linux the loudest, such as IBM, you couldn’t help but wonder. After all, IBM had become America’s most prominent business colossus by building the most proprietary of proprietary software and hardware. IBM had been driven from its perch of preeminence by tiny start-up Microsoft, which had then gone on to enrich more geeks than any other company in history. Microsoft had created thousands of millionaire programmers; how many millionaire programmers had IBM ever created? For that matter, if Linux was so great, where were all the Linux millionaires?

Some Hot Tunes

In the meantime, while everyone was focusing on software, no one was paying any attention to the music business. There didn’t seem to be any reason to do so. After all, we all knew how the music business basically worked. Every few years the youth of the world generated yet another raft of disaffected grungesters, cute girls, cute boys, some performers of indeterminate sex, ghetto rappers, hip hop blasters, soul throbbers, chanteuses, lounge acts, and so on, and so on, all of whom were signed to contracts by large, institutionally corrupt music companies. These in turn distributed cash, girls (or boys), and cocaine (or the drug of your choice) to the band while paying off music stations to play the songs of the performers under contract to the company. When the current crop of crooners aged and lost their appeal or overdosed, they were promptly replaced by a new generation of cute girls, cute boys, and so on, and the cycle continued.

The distribution model was also well understood. Music was sold to the public via albums on records, cassette tapes, and later, almost exclusively, CDs. Most of the music on an album was filler, designed to surround the one or two good songs with enough extra musical noise to justify charging $20 per CD, a price that annoyed people who remembered that before the switch to the new technology in the early 1990s, a record had cost about eight bucks. The companies raised prices because they could, but justified the new price tags to the public by talking about the expense of producing CDs (despite the fact that they cost less to mass-produce than vinyl) and to industry insiders by noting that the price of drugs had skyrocketed over the years.[4]

The music industry had known for years that public dissatisfaction with the current state of affairs was high and that people were highly interested in mixing and matching songs to create custom listening sets that matched their interests and moods (I cover this point in greater detail in Chapter 14), but no one in the business cared. The music companies had the entire distribution system, the artists, and the technology under control. In fact, in the early 1990s, the industry was able to strangle a potential threat to its domination, consumer digital audio tape players, by loading them with enough integrated copy restrictions that no one was interested in buying the units. Although some music executives were dimly aware of the problems software companies had with piracy, none thought they had any lessons to learn from high tech’s digital travails.

While the music industry was ignoring both the desires of its customers and the advance of technology, software geeks worldwide were busily working on making the life of the jingle moguls miserable. First came the development of MP3 compression, a technology that allowed software to take any music recording and compress it to about a 12th of its original size with very little loss in sound quality. Work on the MP3 format had begun in 1987, and final specifications for the technology were released to the public in 1994. Once a song had been “MP3’d,” it was small enough to be easily and quickly transmitted electronically. The next step was the spread of cheap read/write optical disk drives in the mid-1990s. This in turn drove the development of software that could “rip” (copy) music from CDs to the new MP3 format. The fourth and final piece of the puzzle dropped into place with the adoption of the Internet by the public. A complete solution for bypassing the music industry’s lock on the distribution system had come into existence.
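
If you’re curious where “about a 12th” comes from, a bit of back-of-the-envelope arithmetic makes it plausible. This is only a sketch, assuming CD-quality audio and a 128 kbps MP3 bit rate, a common choice of the era:

```python
# Back-of-the-envelope check on the "about a 12th" claim.
# CD audio: 44,100 samples/sec x 16 bits/sample x 2 stereo channels.
cd_bits_per_sec = 44_100 * 16 * 2     # 1,411,200 bits per second
mp3_bits_per_sec = 128_000            # a common MP3 bit rate of the era

ratio = cd_bits_per_sec / mp3_bits_per_sec
print(f"CD audio shrinks by a factor of about {ratio:.1f}")  # about 11.0
```

Encode at a slightly lower bit rate and the factor climbs past 12; the exact figure always depends on the settings the ripping software chooses.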

The first major company to explore the possibilities the Internet opened up for music distribution was MP3.com. The service was founded in 1998 and offered downloadable music for free (the artists were compensated via a system that gave them a small royalty payment based on the number of times their songs were downloaded). MP3.com was not a music piracy site; a trained staff winnowed through the uploads and stripped out copyrighted material. Everyone thought the site was wonderful, it grew rapidly, and in 1999 MP3.com launched an IPO that netted the company $370 million.

The good times ceased to roll at MP3.com when, in January 2000, it launched the My.MP3.com service. This enabled customers to securely register their personal CDs (you had to actually stick the CD in your PC so that MP3.com could scan it) and then stream digital copies of their music from an online “locker” hosted by the My.MP3.com service. At this point, the intelligent thing for the music industry to have done was to have studied MP3.com, partnered with it, and “trained” the public to interact with the site and ones similar to it for the benefit of all concerned. Instead, the music moguls, in an act of classic and far-reaching stupidity worthy of such famous moments in rock star history as Alice Cooper tossing a hapless chicken to its death into a crowd in Toronto or Ozzy Osbourne masticating an innocent bat,[5] sued poor MP3.com for copyright infringement and found a judge dim-witted enough to agree with them. Rather than appeal the case, MP3.com handed over the bulk of its IPO money to the recording industry. Fatally weakened, the service gave up the ghost during the dot-com meltdown, to the music industry’s immense satisfaction.

The smirking and high-fiving came to an abrupt end with the appearance of a new service, Napster. Based on a peer-to-peer network system that allowed computers to transfer MP3 files directly across the Internet, Napster made little effort to prevent software piracy, and the site soon became one of the most popular on the planet. The music industry, having learned absolutely nothing from the MP3.com incident, sued Napster as well and eventually was able to shut it down. As already noted in Chapter 11, Napster’s great vulnerability lay in its use of centralized servers to store the names of the files being offered to other Napster users. With Napster out of business, smart programmers quickly developed new software that didn’t require centralized servers but instead relied on individual computer systems located worldwide to manage the task of file coordination. The recording industry’s intelligent response to this development was to sue 19,000 parents, children, dead Vietnam vets,[6] and others for copyright infringement, an act that had absolutely no impact on the widespread practice of downloading free MP3-compressed music. The industry also began suing the individual peer-to-peer networks such as LimeWire and Kazaa, but as soon as one network disappeared, another promptly appeared. The music industry now existed in a Greek hell of its own creation, doomed, like Sisyphus, to push the rock of copyright litigation up and down a terrain of endless hills of peer-to-peer networks.
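
To make the architectural point concrete, here is a minimal sketch of a Napster-style centralized index. To be clear, this is a conceptual illustration, not Napster’s actual protocol, and the peer names are invented. The files live on the peers, but the lookup table lives in exactly one place:

```python
# Conceptual sketch of a centralized file index (not Napster's real protocol).
class CentralIndex:
    def __init__(self):
        self.index = {}                  # song title -> set of peer addresses

    def register(self, peer, titles):
        # A peer announces which files it is sharing.
        for title in titles:
            self.index.setdefault(title, set()).add(peer)

    def search(self, title):
        # Returns the peers holding a song; the download itself is peer-to-peer.
        return self.index.get(title, set())

server = CentralIndex()
server.register("peer-a:6699", ["freebird.mp3"])
server.register("peer-b:6699", ["freebird.mp3", "stairway.mp3"])
print(server.search("freebird.mp3"))     # both peers are offered as sources
```

Shut that one server down and every search fails at once, which is precisely the legal pressure point the recording industry exploited. Napster’s decentralized successors passed queries from peer to peer instead, leaving the lawyers with no central machine to unplug.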

Getting to the Root of the Problem

The industry’s stupidity reached a dizzying crescendo with Sony BMG Music Entertainment’s 2004 release to its customers of something that proved to be far more exciting than any music video ever produced—a “rootkit.” A rootkit is perhaps the most dangerous of all malware, a vicious piece of Borgware that absorbs your computer’s operating system into a vast, evil collective over which you have no control. Rootkits integrate themselves so deeply into a computer’s innards that even high-quality antivirus and antispyware products often cannot detect them. The Sony rootkit, targeted primarily at Windows (though it also infected Macs, to a lesser extent), was loaded onto 52 of Sony’s music CDs, and when someone put a rootkit-infected CD into their computer, Sony’s malware was surreptitiously installed onto the system. Once the rootkit was in place, any attempt to remove it, if it was detected at all, resulted in severe damage to Windows and a nonworking computer. Once hidden on your PC, the rootkit prevented you from copying songs from the CD to another CD or to the MP3 format (though this protection was almost instantly circumvented).

The Sony rootkit spread to more than half a million machines and networks, including those in the Department of Defense and other government agencies, before writer and Windows expert Mark Russinovich discovered its existence in October of 2005. He posted his discovery online, and news of the rootkit spread worldwide in a matter of hours. (Companies such as Symantec and McAfee were heavily criticized for failing to develop software that detected Sony’s malware until Russinovich’s disclosure of its existence.)

Sony’s handling of its self-inflicted PR nightmare showed that the company’s collective intelligence was on par with that of the wretched headless bat publicly decapitated by Ozzy Osbourne. As outrage about the rootkit grew, Sony embarked on a damage control effort that included the following:

    *    Claiming the rootkit didn’t surreptitiously “phone home,” that is, use your Internet connection to contact Sony, when it did just that every time you played a song.

    *    Not realizing that the installation of the rootkit left every computer on which it had been installed with a giant security hole that any hacker with knowledge of the rootkit’s behavior could exploit.

    *    Releasing an update that supposedly fixed the security hole created by the rootkit but that required you to provide your name, e-mail address, and other personal information to Sony. After installation, the software continued to send information about your choice of music to Sony, but now Sony had a name to match up with your playlist.

    *    Allowing Sony’s president of global digital business, Thomas Hesse, to go on National Public Radio and conduct an interview in which he told the listening audience that “Most people don’t even know what a rootkit is, so why should they care about it?” The hapless Hesse was apparently too stupid to realize that Sony was in the process of educating most of humanity on the dangers of rootkits.

    *    Not knowing that the company supplying its rootkits, software firm First4Internet, was using an open source encoder in the rootkit.[8]

Class action lawsuits against Sony were launched in California, New York, Texas, Italy, and lots of other places. Twelve days after the discovery of the rootkit, Sony announced it would no longer sell its self-infected CDs. Then it announced it was recalling all of the infected CDs and replacing them with non-copy-protected disks. Estimates of the eventual financial damage to Sony ran from $50 million to $500 million. (One of the reasons for the uncertainty was that thousands of Sony-infected PCs remained in use and vulnerable; as late as June of 2006, three virus creators were arrested for exploiting the security vulnerability created by the rootkit.[9])

More to the point, the entire fiasco helped convince millions of potential buyers of online music that the easiest, cheapest, and safest thing you could do was log onto one of those nice peer-to-peer networks where the music selection was wide, the price was zero, and the number of rootkits you could expect to encounter was low.

Back to the Future with WGA

The year 2000, when most of the world was looking forward, saw Microsoft looking back to the 1980s and copy protection. That year Microsoft announced its new “product activation” program. The new copy protection system worked by tethering your copy of Microsoft Office 2000 to a key held on Microsoft’s servers. The process required you to first install Office and then allow the product activator to snoop through your computer, send a profile of your hardware to a Microsoft server, and receive in return a downloaded product key that allowed you to actually use the software you had bought. After initial trials, the scheme was extended to Windows XP when it was released in 2001. Soon, the entire copy protection system became known as Windows Product Activation (WPA).
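Microsoft never published the details of the WPA hardware hash, so what follows is only a rough sketch, in Python, of how hardware-tied activation schemes work in general; the component list and the change tolerance below are illustrative assumptions, not Microsoft’s actual algorithm. The overall shape is the point: fingerprint the machine, register the fingerprint during activation, and demand reactivation when too many components change, which is why a motherboard swap could wake the thing up.

    # A minimal sketch of hardware-tied activation, in the spirit of
    # WPA. The component set and tolerance threshold are illustrative
    # assumptions; Microsoft never documented the real scheme.

    import hashlib

    COMPONENTS = ["cpu_id", "mac_address", "disk_serial",
                  "ram_size", "graphics_card"]  # hypothetical set

    def hardware_hashes(profile):
        """One short hash per component, so single swaps can be forgiven."""
        return {c: hashlib.sha256(profile[c].encode()).hexdigest()[:8]
                for c in COMPONENTS}

    def needs_reactivation(registered, current, tolerance=2):
        """Demand reactivation when more than `tolerance` parts change."""
        changed = sum(1 for c in COMPONENTS if registered[c] != current[c])
        return changed > tolerance

    if __name__ == "__main__":
        machine = {"cpu_id": "GenuineIntel-6-8",
                   "mac_address": "00:0c:29:aa:bb:cc",
                   "disk_serial": "WD-1234", "ram_size": "512MB",
                   "graphics_card": "GeForce2"}
        registered = hardware_hashes(machine)

        machine["cpu_id"] = "AuthenticAMD-7-4"    # new processor...
        machine["graphics_card"] = "Radeon 9700"  # ...new video card...
        machine["disk_serial"] = "ST-9876"        # ...new hard disk
        print(needs_reactivation(registered, hardware_hashes(machine)))
        # -> True: time to call the 800 number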

There were, as you can imagine, some delightful aspects to WPA. If, for instance, you decided to change the motherboard, processor, graphics card, or similar hardware on your system, you ran the risk of waking up WPA and having it nag you to reinstall Windows and your other WPA-protected programs, despite the fact that the copy you were using was perfectly legal. Reinstalling Windows sometimes meant calling up a special 800 number and sitting through a long and wearying session that required you to speak every last number of the CD key that came with your copy of Windows in the hope that the phone god with whom you were communing would deign to give you a new key. If that didn’t work, you could look forward to spending some time with someone named “Ramesh” or “Gupta,” normally sitting in a call center in India or some similarly exotic location, explaining why you needed a new key that allowed you to actually use the software you’d bought…errr…“licensed.”

Freedom from Choice Is What You Want

Most people looked at WPA with the same affection shown a turd dropped in a punch bowl at a wedding, but in the main, Microsoft was able to finesse its introduction. There were several reasons for this. One was that many people received Windows bundled in with their computer and, as already noted, didn’t really think about what they had paid for the product. Another was that, as had happened before, the WPA copy scheme was quickly cracked, and many people simply bypassed it. A third was that Microsoft had given “universal keys” to many of its corporate customers; these allowed them to do mass installs of Windows at their business locations without having to waste time going through hundreds or thousands of activations. These keys quickly leaked to the general public and were employed by many people to use Windows in pretty much the same way they had for more than a decade. All in all, it turned out that most people could ignore WPA most of the time.

This seemed, to most people, fair. Microsoft now had legally sanctioned monopolies in desktop operating systems and office suites (but no mauling of the competition allowed)! The company seemed on its way to establishing a similar monopoly in network operating systems, had a strong position in the enterprise database market with its SQL Server product, was selling a great deal of Microsoft Exchange, had a nice business in mice, and by 2002 enjoyed the luxury of having approximately $49 billion in cash sitting in the company’s piggy bank. Why would any company in its right mind disturb such a wonderful status quo?

Of course, the open source and free software folks took a great deal of enjoyment in pointing out that Linux, which had steadily increased in functionality and ease of use, was free and never required you to talk to Ramesh when you changed a motherboard. And in the meantime, an interesting product, called first StarOffice and then OpenOffice, had appeared on the scene. StarOffice began its life as an OS/2 office suite developed by a German company in the early 1990s. After the collapse of OS/2, the software morphed into a Windows product that was bought by Sun, ostensibly because it was cheaper for the company to buy its own office software than to buy Microsoft’s. The real reason was the desire of Sun CEO Scott McNealy to give Bill Gates and his company a case of heartburn, which he attempted to do by open sourcing most of StarOffice’s code; the code was then transformed into OpenOffice by a series of programmers dedicated to open source ideals (they didn’t become millionaires, though). Sun still sells a version of StarOffice, though there’s little compelling reason to buy it considering that OpenOffice is free.

On the other hand, although Linux was free, installing it was a royal pain that the vast majority of people had no desire to experience. The price of freedom included the privilege of choosing which Linux you would pick from dozens of different packages, called “distros,” and then attempting to install your choice on your hardware. This was made more interesting by the fact that although the core Linux operating system was usually (though not always) the same from distro to distro, the various Linux bundles often used different install procedures, had different user interfaces, looked for key files in different places, included different utilities, and so on, and so on. And although it was nice that OpenOffice was free and that StarOffice was cheap, once one had copied Microsoft Office to all the computers it needed to be on, the price wasn’t really that bad after all.

All this changed in 2004 when Microsoft introduced, with an Orwellian fanfare of misleading language, its new Windows Genuine Advantage (WGA) program. Windows users were prompted (under threat of losing access to all updates other than ones deemed critical to security) to download a program that checked their product key for authenticity. If Microsoft determined you were indeed “Genuine,” you could continue to receive all Windows XP updates. If you weren’t, well, no updates for you, at least until WGA was cracked by hackers (it took about a week). Everything seemed to continue on much as it had before, though the I-told-you-so cackling from the free software crowd grew louder, and people started becoming a little annoyed with Microsoft. It bordered on terminal chutzpah to threaten to cut people off from such things as the latest version of Internet Explorer, a product that had been allowed to rot for five years after Microsoft dispatched Netscape. It was nice that Internet Explorer 7 would have tabbed browsing and all, but Firefox and Opera had been offering such features for years.

The rootkit hit the fan in July 2006 when Microsoft unleashed part deux of WGA, called “WGA notifications.” WGA notifications was a nifty bit of code that reminded everyone very much of a certain music company’s recent malware. Making utterly sure that WGA notifications would be instantly loathed by humanity, Microsoft misled the world by tucking the program onto its servers and transmitting it across the wires in the company of security patches under the appellation of a “critical update.” (WGA had nothing to do with security.) Once installed, the WGA program revealed the following charming characteristics:

    *    It phoned Microsoft every time you logged into Windows to tattle on you if it thought your install of Windows wasn’t valid, proving that Microsoft had learned absolutely, positively nothing from the Sony rootkit disaster. (A sketch of this phone-home pattern appears after this list.)

    *    WGA now forced Windows to display an unending series of nagware messages urging you to get “Genuine,” that is, to fork over more money to Microsoft’s giant cash hoard.

    *    The EULA that came with WGA notifications was misleading and didn’t properly request the user’s consent to install the software.

    *    If you wanted to “Get Genuine,” WGA didn’t make it easy for you to see options other than giving $149 to Microsoft. And there were other options. For example, if a repair shop had loaded an invalid copy of Windows onto your system during an overhaul but you had bought a legal copy that was sitting on your bookshelf somewhere, you could restore your legitimate key to your system in a process that appeased WGA. But it was a genuine pain to find information about this process through all the “Genuine” nag screens.

    *    WGA was misidentifying hundreds of thousands, maybe millions, of legitimate installs as “nongenuine.” Exactly how many was somewhat mysterious, since Microsoft was not very forthcoming on the issue. The company did say that of the 60 million machines its checks had flagged, 80 percent were using invalid keys. That left about 12 million “others.” High levels of complaints were coming from a wide spectrum of users, particularly people who’d had Windows preinstalled on their laptops. As one blogger asked, “Is Dell a pirate?”

    *    If you read the EULA that came with WGA notifications, you realized you were being asked to download a beta product that had the potential to cripple your copy of Windows.

    *    WGA provided no advantages at all to the user (but plenty to Microsoft). The program was simply a copy protection/antipiracy scheme, and people weren’t stupid.
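Here, as promised, is a toy model of the login-time check described in the first bullet. The endpoint and payload fields are entirely hypothetical, since Microsoft never documented WGA’s wire protocol, but the pattern explains why the lawyers reached for the word “spyware”: the software contacts a remote server at every login and reports on your machine before deciding whether to nag you.

    # A toy model of WGA notifications' login-time check. The URL and
    # payload fields are entirely hypothetical; the point is the
    # pattern: phone home at every login, then nag if the server says
    # you aren't "Genuine."

    import json
    import urllib.request

    VALIDATION_URL = "https://genuine.example.com/validate"  # hypothetical

    def phone_home(product_key_hash, machine_id):
        """Report this machine and ask the server for a verdict."""
        payload = json.dumps({"key": product_key_hash,
                              "machine": machine_id}).encode()
        req = urllib.request.Request(
            VALIDATION_URL, data=payload,
            headers={"Content-Type": "application/json"})
        with urllib.request.urlopen(req) as resp:
            return json.load(resp).get("genuine", False)

    def on_login(product_key_hash, machine_id):
        """Runs at every Windows login."""
        if not phone_home(product_key_hash, machine_id):
            print("This copy of Windows is not genuine. Get Genuine today!")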

Reaction to the whole WGA mess was exactly what you would expect. Several class action lawsuits were launched against Microsoft claiming the company had violated several states’ antispyware laws. Microsoft promptly replaced the big tattler in WGA with a littler tattler, one that would only “periodically” call home to tell on you. Microsoft also changed the EULA to inform you more clearly about its informant. A French company quickly released a program called RemoveWGA that kicked the Jewish mother (WGA notifications) out of your computer, though the basic WGA system remained intact. Several Windows pundits, such as Brian Livingston, began to recommend that people not use Windows Update but instead rely on third-party services.[10]

Fresh from its initial success, Microsoft announced that the joys of WGA would soon be extended to all the products in its line. And to ensure that there were no embarrassing ambiguities in the future, WGA in all its glory would be directly integrated into Vista, the designated heir to XP whose father may have been Bill Gates but whose mother was clearly Steve Jobs. In the meantime, the chortles and snickers from the open sourcers turned to guffaws and screams of laughter as they fell to the floor holding their ribs from an excess of merriment.

Rumors then quickly began to spread that part three of Microsoft’s spyware system would introduce a new friend to WGA’s tattler and Jewish mother: an executioner. This would come in the form of a “kill switch” that would allow Microsoft to remotely disable your nongenuine Windows at the behest and whim of Redmond. (Industry wits noted that given the number of security attacks and virus infections afflicting Windows, most people might not notice any difference in operations.) In response to a query from Ziff-Davis columnist Ed Bott, a Microsoft PR representative, speaking in Modern Flack, provided the following chunk of verbiage:

No, Microsoft anti-piracy technologies cannot and will not turn off your computer. In our ongoing fight against piracy, we are constantly finding and closing loopholes pirates use to circumvent established policies. The game is changing for counterfeiters. In Windows Vista we are making it notably harder and less appealing to use counterfeit software, and we will work to make that a consistent experience with older versions of Windows as well. In alignment with our anti-piracy policies we have been continually improving the experience for our genuine customers, while restricting more and more access to ongoing Windows capabilities for those who choose not to pay for their software. Our genuine customers deserve the best experience, and so over time we have made the following services and benefits available only to them: Windows Update service, Download Center, Internet Explorer 7, Windows Defender, and Windows Media Player 11, as well as access to a full range of updates including non-security related benefits. We expect this list to expand considerably as we continue to add value for our genuine customers and deny value to pirates. Microsoft is fully committed to helping any genuine customers who have been victims of counterfeit software, and offer free replacement copies of Windows to those who've been duped by high quality counterfeiters. There is more information at our website http://www.microsoft.com/resources/howtotell.

A careful reading of this statement revealed plenty of ambiguities (no one had asked whether Microsoft was going to turn off your computer; the question was whether it was going to turn off Windows), but Microsoft’s PR people clammed up and refused to talk further. Not making people feel any better was an online article by respected security analyst Bruce Schneier in which he reported that a Microsoft representative had told him:

In the fall, having the latest WGA will become mandatory and if it’s not installed, Windows will give a 30 day warning and when the 30 days is up and WGA isn't installed, Windows will stop working, so you might as well install WGA now.[11]

At this point, the open source people were snorting liquids through their noses as they rolled around the floor laughing hysterically, but Windows people were depressed. Forums and blogs exploded with comments from users saying that now was the time to finally take a look at Linux, OpenOffice, and other open source alternatives to Windows.[12] It made sense. While Microsoft was spending time and energy figuring out ways to torture many of its customers, new versions of Linux had just about caught up to Windows in terms of ease of install, functionality, and peripheral support. There were still problems, but at least you could be sure that if anyone in the open source community attempted to put something like WGA into Linux, Richard Stallman would personally throttle them. No one was enthusiastic about the prospect of allowing Bill Gates and Steve Ballmer to hold a loaded pistol to their PCs on a 24/7 basis. Given past experiences with WGA, just how could you be sure that some idiot at Microsoft wouldn’t inadvertently do something that crippled your system at just the wrong time? Certainly some people thought the possibility existed. Before finishing this book, I spoke to an acquaintance at Microsoft who told me this:

I recommend to my friends that they always keep a copy of OpenOffice on their systems in the event that MS Office’s activation system locks up the software when they’re not expecting it and they can’t reach a phone or the Internet to reactivate it. Interoperability is excellent and you can usually get something done. It’s good protection against our copy protection.

It appeared that open source had a friend in Redmond after all!

[1] Free as in Freedom: Richard Stallman's Crusade for Free Software by Sam Williams (O’Reilly Media, 2002)

[2] http://en.wikipedia.org/wiki/EULA

[3] I purchased a retail copy of Red Hat Linux in the 1990s and attempted to install it on my PC. The install promptly failed when Linux didn’t know what to do with my then state-of-the-art Adaptec SCSI interface card. A plaintive inquiry sent to the famed Linux community was answered by a condescending message explaining that since Adaptec wasn’t releasing its drivers under the GPL, I shouldn’t expect Linux to work. I promptly gave up on Red Hat and Linux and continued using and buying Windows.

[4] This sounds like a facetious statement. It’s not. The field sales office I worked in was located in Secaucus, New Jersey. The MicroPro offices were down the hall from the studios of one of the region’s most popular Top 40 radio stations at the time, Z-100, and I became used to seeing a limo periodically drive up to our forsaken location and drop off such music stars as Cyndi Lauper, Bob Geldof, Madonna, and so on, for on-the-air PR appearances. I struck up an acquaintance with one of the DJs who worked there, and he explained in loving detail how the industry worked.

[5] Rock Stars Do the Dumbest Things by Margaret Moser (Renaissance Press, 1998). A long-buried classic worth your time!

[6] “The Shameful Destination of your Music Purchase Dollars” by David Berlind (http://blogs.zdnet.com/BTL/?p=3486), August 14, 2006

[7] The Borg are Star Trek’s baddest bad guys, a race of cyborgs ruled by queens who run around the galaxy in large cube-style ships assimilating other races while announcing “resistance is futile.” In high-tech, Bill Gates is usually assumed to be the chief Borg queen. However, given Steve Jobs’s recent penchant for suing everyone, Apple’s increasing monopoly in the music world, and the suspicious design of the Apple Cube and the NeXT computer, many people think Apple’s CEO may be auditioning for the role.

[8] LAME, an open source MP3 encoder licensed under the Lesser GPL

[9] “Virus Suspects arrested in UK and Finland” by Quentin Reade (Webuser, http://www.webuser.co.uk/news/87558.html?aff=rss), June 27, 2006

[10] Windows Secrets Newsletter, issue 78 (http://windowssecrets.com/comp/060629/)

[11] http://www.schneier.com/blog/archives/2006/06/microsoft_windo_1.html

[12] I have. I’m tired of talking to Ramesh every time I swap a motherboard, something I do fairly frequently.

Digital DNA: A Day in the Life of Alfred E. Motorola

It’s a hard fact of life for the hardware guys and gals of high tech that it’s usually the software geeks who get most of the glory. When software people code a flop, they usually look like their reach exceeded their grasp; when hardware types build one, they look like dorks. With software, a timely patch can often erase the ugliest blemish; with hardware, mistakes are set in silicon, so to speak.

The Loneliness of Being Hardware

A fairly recent example of this principle in action occurred with the release of Palm Inc.’s m130 handheld computer. Before releasing the personal digital assistant (PDA) in March 2002, Palm bragged that the device’s 16-bit screen could display more than 64,000 different colors, but it turned out the m130 could actually show far fewer. Exactly how many fewer was a matter of some dispute. A spokesperson for the company was quoted as saying that by using “blending techniques,” such as combining nearby pixels, the m130 could display 58,000 “color combinations,” which isn’t quite the same thing as 64,000 colors. Palm profusely apologized for its mistake but made no offer to take its drabber-than-expected PDAs back despite the screams of some annoyed buyers. It did tell everyone it was busy thinking about some way to make it up to its disappointed customers. Industry wits immediately suggested that every m130 be shipped with a big box of Crayola crayons.

No, it’s not fair, but that’s the way it is.

Oh, there are a couple of exceptions. A few people know who Michael Dell is, though most people think he’s that young guy who says “Dude!” in all those TV commercials. But Dell is really a boring company once you get to know it. Its main business is selling large numbers of square beige computers shipped in square white boxes. It’s a great business, and Dell is a very, very successful company, but there’s not much glamour there. Dell isn’t cool, and it isn’t glorious.

Then there’s the guy (Ted Waitt of Gateway) who talks to the cow, but cows aren’t very cool (though the cow is kind of funny). And his company is losing a ton of money. That’s not very glorious.

And maybe Scott McNealy of Sun Microsystems? Well, that’s a tough one. He spends most of his time talking about Java and the Internet, though the company actually makes its money selling expensive computers running some incomprehensible OS called UNIX. Isn’t Java software?

There is Steve Jobs of Apple. Jobs has a genius for hiring people who can design wonderfully colored and shaped computers that about 4 percent of the market wants to buy. He’s the guy who brought us the movie Finding Nemo and Buzz Lightyear, and he also looks pretty sharp in Nehru shirts. Some guy from the television show ER even played him in that interesting but completely inaccurate movie, Pirates of Silicon Valley. Yeah, Steve Jobs is pretty cool. Too bad more people don’t use his computers.

But after that it all becomes kind of fuzzy. Who’s the father (or mother) of the PalmPilot? Who’s the Disk Drive King? The God of Monitors? The Queen of Keyboards? The Prince of Uninterruptible Power Supplies? The Master of Removable Media?

No one knows. No one cares. It’s tough to be in hardware.

On the software side, however, superstars abound. There’s Bill Gates. Paul Allen. Steve Ballmer. Larry Ellison. Marc Andreessen. Steve Case. Peter Norton. Dan Bricklin. Ray Noorda. That Linux guy from Sweden—or is it Norway?—Linus Torvalds? Some incomprehensible Englishman named Tim Berners-Lee whom everyone calls “the Father of the Web.” Heck, even Gary Kildall is famous just for failing big time. Michael Cowpland of Corel used to be pretty well known too (though most people remember him for that wedding-day picture of his trophy wife draped across a Lamborghini[1]).

There is, however, one hardware company that has some major media mojo attached to it. After years of dancing Bunny People, the Blue Man Group, and hyperactive space aliens, that company is Intel. Microprocessors are the hardware heart of the technology revolution, and Intel makes them. Most people aren’t exactly sure how a microprocessor works, but they do know Intel produces a lot of them and many know they have an Intel in their computer. Intel is the semiconductor industry’s ultimate glamour boy, hardware’s Ken doll.

But as we all know, envy exists in this world. Our Ken has a jealous rival, someone who looks at our clean-cut builder of CPUs from the periphery of the admiring throng and grinds his teeth in frustration. “Why is everyone so crazy about him?” our hardware Iago wonders. “I make CPUs too. I’m a multibillion-dollar company. My technology helps drive commerce and industry worldwide. Why doesn’t anyone care about me?”

Too frustrated to watch anymore, the observer turns away and strides by us. A quick glance at his countenance confirms his identity. Who else possesses that peculiar combination of dull stare, pockmarked skin, sandy hair, prominent dental gap, and eternally vacant expression?

Yes, that’s him all right. Alfred E. Motorola.

Memories of a Crushing Blow

Motorola has envied Intel’s marketing prowess since the companies first clashed in the early 1980s during the rollout of their respective 16-bit microprocessors. Motorola had the better chip, but Intel had “Crush,” a prototypical kill-the-competition campaign put together by William H. Davidow. Described in Davidow’s book, Marketing High Technology (Free Press, 1986), Crush integrated PR, marketing communications, and advertising in a comprehensive effort to convince customers that Intel’s ability to outdevelop, outsupport, and outsell the competition made an investment in Motorola’s technology a bad bet regardless of technical merit. Motorola was caught flat-footed by Crush and could never develop a credible response. The company ended up ceding the bulk of the glamorous and profitable market for general-purpose microprocessors to Intel.

Motorola has never forgotten Crush, and the success of Intel Inside only rubbed salt in the wound over the years. In 1999, the company decided it couldn’t stand it anymore and that it too needed to have a big corporate branding program. Thus was born Motorola’s “Digital DNA” program, a waste of $65 million that demonstrated the company had learned little from the body slam Intel dealt it years before.

Bad, Bad Genes

The first problem with Digital DNA was that Motorola never deigned to pay anyone to stick the Digital DNA logo, a sticker that read “Digital DNA from Motorola,” on their hardware. This alone was enough to doom the program. Motorola didn’t want to pay out market development funds (MDF) because of the expense, but it was missing the point. Intel’s MDF campaign allowed it to sell more chips, and charge more for them, than rivals such as Motorola and AMD. The calculation was simple: for every dollar spent on MDF, Intel saw two dollars back via chip sales and profitability. The lack of an MDF component also robbed Motorola of the ability to direct the marketing and advertising efforts of Digital DNA participants à la Intel Inside.

The second problem was the program’s target audience—Motorola’s customers, not the customers of their customers. Motorola’s advertising for the program was thus aimed at phone makers, car manufacturers (big buyers of embedded computer systems), and electronics makers, not the buyers of phones, cars, and electronics. This strategy ensured that no consumer demand for products with Digital DNA inside them would be generated. It also put Motorola in direct competition with the companies to which it supplied chips, such as cellular phone manufacturers. Companies such as Nokia and QUALCOMM regarded the prospect of putting a Motorola logo on their phones with little enthusiasm. Again, Motorola had completely missed the point of the Intel approach, which was to make consumers demand computers with Intel inside, thus pressuring manufacturers to buy more Intel chips.

The third problem was schizoid execution. Having made the decision to target its customers, the company nonetheless diverted precious advertising budget dollars to print-based consumer advertising as well. This wasn’t money intelligently spent; an effective corporate branding effort requires a massive and extensive media blitz carried out over an extended period of time. A few million dollars spent on newspaper and magazine ads wasn’t going to create any significant consumer interest in Digital DNA.

After a couple of years of wasted time and money, it became clear that Digital DNA was genetically defective. The program generated no end-user demand for Motorola products, no increased awareness of Motorola, and no increase in demand among Motorola’s direct customers beyond that dictated by normal business necessity. Digital DNA was allowed to quietly wither away into obscurity.

The last time Ken passed Alfred on the beach, he kicked sand in his face.

[1] She looked marvelous.

The Main Causes of Failure

The main reasons for company failure can be broken down into four basic types:

    *    Your company is based on fraud and/or the sale of illegal products and services.

    *    Your company is built around an unrealistic or ridiculous business assumption.

    *    Your company does not have a strategic vision and plan for success.

    *    Your company has failed to execute business basics in the course of selling its products and services.

In regard to the first two types of failure, I don’t have much advice to give. If, like Enron, ZZZZ Best, and thousands of other companies over the course of the 20th century and continuing into the 21st, your underlying business model is a Ponzi scheme, your business will fail, and maybe you will go to jail. If your company plans to market heroin or cocaine in the United States, you will fail and probably go to jail, if someone doesn’t shoot you or decide to dismember you with a chainsaw first during a dispute about optimal distribution strategies and reseller margins. If, like a very bright gentleman I spoke to during the course of writing the second edition of Stupidity, you intend to bring to market a new word processor for Windows, you will fail, and no one will even pay attention to you. If you’re the new SoftRam, please send me a shrink-wrapped version of your product for my collection.

The third class of failure, lack of strategic vision and planning, is, as you’ve seen, the one most business writers like to write about, primarily because books about this topic tend to make the most money. Of course, all successful companies have to develop and sell products and/or services people will pay for; this is the essence of modern commerce, and it is here that a “strategic” plan is both useful and necessary. But few companies and theorists are content to leave it at just that. For the past several decades, American business has been obsessed with the idea that somewhere out there exists a grand unified theory of business that explains once and for all how success can be guaranteed, if only the theory can be uncovered and explained. For a time, “excellence” seemed to provide the Great Answer. Then the path to proven success appeared to encompass leaping like gazelles over chasms. Now, many believe institutionalizing innovation (something of an oxymoron) into the corporate genome is the first step on the path to enlightenment.

American companies are obsessed with this concept of strategy and are always recasting and reorganizing themselves so as to realign with the latest, greatest strategic vision as brought to them by a newly minted business guru or a shiny new CEO. Caught in a tautological loop, they develop far-reaching business plans that are excellent or that leap far enough or that are innovative, because excelling or leaping or innovating is what we do. Things usually go well at first. Americans, despite romantic self-images of rebel cowboys and sturdy nonconformists, are actually pretty good at organizing themselves and taking orders. We’re not as antlike as the Japanese, of course, but we do stand patiently in lines, stop at red lights, wait our turn, and take orders from authority with a fair degree of alacrity. So when plans and dictates come down from on high, companies can usually be whipped into fighting trim in fairly short order. A laser-like focus is brought to the creation of new marketing and sales campaigns. The competitive terrain is analyzed thoroughly via market research and focus groups. The distribution channel is primed with promotional money, advertising, and collateral. New products and services are manufactured in record time with maximum efficiency. The company maneuvers with stereotaxic directness toward its launch point and pauses in readiness, waiting for the command to move out. When the clarion call comes, the entire firm surges forward in lockstep unison, eyes set straight ahead on the prize, moving in a determined sweep to clear from the field of battle any obstacle that stands in the way of ultimate victory.

But then things start to go badly wrong. You unleash a new word processor, and your entire company falls into the mud of massive confusion and market resistance because you’ve made a fundamental positioning mistake (MicroPro). You launch a hot new microprocessor, and because you’ve failed to realize that now that you’re a consumer brand, the PR rules have changed, the charge forward is halted by incoming fire from the press (Intel). You become the Internet’s most strategic site by dint of a marketing campaign that secures every corner of the globe with platoons of floppies and CDs, and then find yourself in full retreat as everyone stops using phones to connect to the Internet and starts using high-speed connections (AOL). You attack the market for digital content in 1998 with a can’t-miss device, an MP3 player the size of today’s iPod…and hardly anyone pays attention (Saehan/Eiger Labs).

And then an awful realization bursts upon you. Business is not…war, at least not a conventional one. Innovation doesn’t always lead to success; failure to innovate sometimes leads to disaster. Markets are not terrains that can be swept clear of enemies in all-conquering waves. What is excellence in the context of one industry is a waste of money in another. If business is war, then it’s an odd sort of ongoing guerilla conflict in which the enemy can be an opposing company one day and a division or business unit at your own firm the next. Markets are swampy, Escheresque lumps of chaos, studded with redoubts and obstacles that disappear and reappear from any direction, crisscrossed with over-and-under ramparts onto which confused invaders stumble and then stagger off from view. And even when you succeed in your objectives and take the field, sometimes the field disappears beneath you, and you find yourself slogging about in a pale foam that obscures your vision and leaves you wandering directionless in a vast wilderness.

And sometimes you get amazingly lucky.

For instance, let’s take the success of Microsoft Windows, to date high-tech’s most dizzying product triumph. Overcoming its humble roots as a clumsy imitation of the far more sophisticated Macintosh operating system, Windows’s success from 1990 onward drove Microsoft by 2005 to more than $40 billion in revenue and 60,000 employees, with 2005 profits exceeding $12 billion. Windows was first announced in 1983, when the GUI wars were taking shape in the wake of Xerox’s pioneering work in the field, and the first version was released in 1985. Over the years Windows bested GEM, VisiOn, GeoWorks, the Mac OS, and, most notably, OS/2 in the war for supremacy. What clearer example could exist of a company having a strategic vision for a product and then pursuing that vision to ultimate success?

But for Windows to achieve its current monopoly position, the following events had to occur:

    *    Xerox, the original inventor of what we now call the graphical user interface, had to remain clueless about how to commercialize most of the groundbreaking developments that came out of its PARC labs.

    *    Digital Research had to blow off IBM when it came calling for an operating system for the original IBM PC.

    *    IBM, which during the early years of its relationship with Microsoft could have crushed the company like a bug, had to behave as if prefrontally lobotomized from 1985 to 1995 as the gruesome OS/2 saga ground on.

    *    Apple had to decide not to license the Macintosh operating system, a decision that led to the company going from approximately 30 percent market share in the early 1980s to 4 percent by 2006.

Other events that contributed to the eventual success of Windows also encompassed the following:

    *    The failure of industry pioneer VisiCorp to release a successful version of VisiOn, an early graphical OS for the PC that scared Bill Gates into almost shampooing his hair.

    *    Apple suing Digital Research over GEM, Digital Research’s DOS-based GUI shell, shortly after the product’s release. GEM was a direct Windows competitor and far more sophisticated than early releases of Windows in its look and feel (it looked and felt like a Mac). Before the Apple suit crippled the product, GEM was on the verge of achieving widespread adoption in the PC market.

    *    An unexpected run-up in the cost of memory chips (and temporary violation of Moore’s law), which helped cripple the release of OS/2 1.0.

Now, how does one fashion a credible strategic plan that assumes your competition will agree to collectively shoot itself in the forebrain while unpredictable market forces break in such a way as to help ensure your eventual success?

The answer is that you can’t. Microsoft’s success with Windows, which, depending on how you count these things, ranges from $60 billion to $100 billion (and still counting!), is as much a result of good luck and stupidity on the part of its competition as of any vision on the part of Microsoft. No strategic plan that anyone would take seriously could include the actual events as they unfolded over the decades. Whatever strategic plans Microsoft had for Windows in 1983 were obsolete by the product’s release in 1985. Whatever plans Microsoft had in 1985 were obsolete by 1987, the year of OS/2’s release. And certainly by 1990 everyone’s plans for Windows were obsolete, as a technically inferior but useful DOS shell swept to market supremacy over rivals that were far more sophisticated and feature rich but with which, in practice, you couldn’t actually do much.

But in the meantime, as I’ve already pointed out, while Microsoft’s competition was engaged in various sorts of self-immolation, the company was continually executing business basics effectively. From the early 1980s through the 1990s the company entered the word processing, spreadsheet, and business presentation markets with good products that sold well and received generally favorable reviews. During this same period, Microsoft was creating a PR campaign that effectively developed a pleasing persona around Bill Gates, one that supported Microsoft’s marketing and sales efforts. The company also continuously improved and refined its development products, releasing new IDEs, languages, and tools that were well received by developers. In 1993 the company fortuitously stumbled onto the Office concept and rode its success to even larger profits. It also figured out how to make profits during the Internet bubble by selling products such as FrontPage. In the aggregate, all these events contributed to Microsoft’s success, and little strategic planning was involved. Microsoft simply gravitated to good opportunities, executed well (or at least better than its competitors), and reaped the rewards.

You’re not convinced? OK, let’s look at another seminal company in the industry, one undergoing a seemingly miraculous rebirth in high tech. Let’s look at Apple, a company I had quite a bit of fun with in the first edition of Stupidity.

Now, before we go further, I’m going to give you a test. Let’s imagine, for a few minutes, that you have gone down to the mall to visit your local Apple store in order to peruse its wares and decide whether you’re going to buy a sleek, dazzling new Intel-based Apple laptop or save a few hundred bucks and buy a boring but decent Dell. As you fight your way into the place past hordes of crazed shoppers battling to scarf up the latest iPod, a dazzling light suddenly appears from nowhere in the middle of the store’s ceiling. The light grows brighter and more intense, and everyone in the place, except you, falls into a deep sleep and slumps gently to the store’s floor, still clutching their iPod boxes. As you watch in amazement, the light contracts into a glowing orb that descends to the floor and coalesces into a beautiful girl. (I feel these Disney trappings are most appropriate in light of Steve Jobs’s ascension to the Disney board of directors as a result of the Pixar buyout.) This dazzling apparition is dressed in a gown of diaphanous gold filigree and wafts a wand so white it almost hurts to look at it. As you gape in amazement, the wand glows and shimmers while emitting magical sparks that seem to distort reality itself! You reach out in delight to touch this marvelous instrument, but the vision in front of you quickly yanks it away with a warning that the thing scratches like heck. Tucking the wand safely away in a silicone rubber holster, the magical lady explains that she is your Apple Fairy Godmother and that she has come to ask you to develop an enchanted strategic business plan.

You are, she explains as you listen with rapt attention, to help Good King Steve Jobs come up with a wondrous way to return Apple to the Glory Days of the late 1970s and early 1980s, when Apple was the predominant player in the nascent microcomputer industry. It shouldn’t be too difficult, she says, for someone as brave and handsome as you. And, after all, she says with a lustrous smile on her face, Apple has exquisitely designed and colored computers on which resides the industry’s slickest and most intuitive GUI, OS X, version Panther, or Tiger, or KittyKat, or something, all running on top of a rock-solid, open source foundation called Darwin, a derivative of the widely praised FreeBSD. OS X Server, OS X’s bigger, brawnier brother, is a snap to set up and maintain. And the incredible success of the iPod has put Apple’s name on every consumer’s tongue and in just about every music lover’s pocket.

Now, what’s your plan? How do you plan to succor Good King Jobs? We’ll stop the book for a bit and give you some time to think through what you’re going to do.

OK, time is up.

What you do, of course, is smile regretfully and explain to the hallucination in front of you that you intend to quickly recover from the slight concussion you suffered when a shopping-hardened yuppie sprinting up the aisle in pursuit of the last white 6 gig Nano accidentally hit you upside the head with a purse loaded with a PDA, a cell phone, and her current 4th-generation 60 gig iPod. As you shake your head vigorously, the fairy disappears with a *POOF* and the shoppers resume their mad scrambles. Then, after browsing quickly through the software displayed on the shelves and spending some time on the store’s web kiosk, you bail out of the place. You see, you’re a finance guy with an accounting degree working on your CPA, and one day you plan to be a CFO somewhere. You’re looking for a specialized package that can roll up budgets across different company divisions and business units and create a unified financial model of the entire company, something you really can’t do with plain old Microsoft Excel. No one offers such a program for the Mac, so it will have to be the Dell.

Now, why didn’t you let the magic linger a little longer? Why not take a stab at planning to put Apple back on the throne from which it reigned over microcomputing 25 years ago? After all, everyone is bored with Windows and hates its copy protection. Linux, the only other possible competitor, has all the computing charm of a diesel truck and requires a degree in computer science to install. And everything the Apple Fairy Godmother said is true, and she left out some hard revenue facts besides. In 2003, Apple’s annual revenue hovered around $6 billion. In 2005, Apple sold more than 32 million iPods, and more than one billion songs had been downloaded from its iTunes service by the winter of 2006. Yearly revenue for 2005 was almost $14 billion, with more than a billion of that being profit.

Because such a plan is as impossible to write as was a 1983 strategic plan for Windows that possessed any credibility. In 2003, when writing the first edition of In Search of Stupidity, I noted that Apple had about 3 percent to 4 percent market share of new computers sold worldwide (an observation that carries over to the Apple OS, which still runs only—officially—on Apple boxes). Actually, I was generous; by the time the book went to print, Apple’s share had slipped to less than 3 percent in some analyses. And today, after the iPod’s stunning success, Apple’s worldwide market share of PCs/operating systems is now about…3 percent to 4 percent.

It isn’t as if Apple hasn’t tried to change this. Since Steve Jobs returned to Apple, the company has launched several “switch to the Mac” campaigns, all of which have had little impact on the market. (Apple doesn’t even pretend to try hard in the server market, despite its product’s excellent performance.) Apple has been able to hold onto its installed base, but little more. People seem quite content to connect their Apple iPods to their Wintel machines. Teenagers, always harbingers of new trends and fads, seem happy to rely primarily on Windows-based peer-to-peer networks to “liberate” music via the Internet and break the RIAA’s heart. And many I speak to seem quite put out by iTunes’s digital rights management (DRM) schemes. Apple’s growth is coming from consumer electronics, not computers, and no one on this planet has ever figured out how to take a company from 4 percent market share to industry dominance in the face of an entrenched competitor determined to defend its turf. Apple came close to industry dominance in the late 1970s and early 1980s, but that was before IBM woke up. And despite Microsoft’s creeping development of the senescence that inevitably afflicts all megasized corporations, unless a big meteor hits Redmond and Bellevue, Apple cannot hope that Steve Ballmer and Bill Gates are going to stand idly by while it lops off significant amounts of market share and money from Microsoft.

Does this mean Apple will eventually leave the PC business? Maybe. One possible scenario is that the company focuses on building more consumer devices, using the Apple OS as an embedded operating system to run ever more sleek and scratch-prone proprietary gadgets. Perhaps Apple eventually merges with Sony or another major consumer electronics giant and folds its technology into the combined company. Apple has already provided its Intel-based computers with an easy way to run Windows, so the company could exit the market gracefully, without leaving its customers stuck running only soon-to-be-obsolete software. Given the pace of hardware advancement and evolution, the entire affair would take only two to three years.

Or maybe the market is changing under Microsoft, and Apple is in position to take advantage of the chaos that will ensue. The iPod’s success is ushering in a new era of content in which music, film, and, eventually, literature cast off their ties to the physical. Say a permanent good-bye to liner notes and beautiful album covers (two institutions already wounded by the move to CDs). Today’s new music consumers expect to take their music with them, be it on an airplane, at home, or in a hotel room. iPods are just way stations, disposable transmitters that facilitate the job of providing personalized content 24/7/365 to consumers. And if you want cover art with that music, well, that’s what websites and screen savers are for. And isn’t it nice those pretty images are also available anytime from anywhere?

In this milieu, what’s needed is a beautifully designed and easy-to-use system that seamlessly handles the task of creating, providing, and managing content for both professionals and the masses, a plan that calls for a hardware platform with plenty of oomph. It’s called convergence, and high tech has been waiting years for it to occur. For Microsoft, the problem is that Windows doesn’t seem suited to the task; the system is feature laden but hard to use, loaded with extrusions and encrustations that make the heads of people already defeated by the remote control ache. But anyone who has used an iPod knows Apple can build lean, elegant, easy-to-learn interfaces people like. And its computers are certainly powerful enough to handle content management and transmission. So perhaps it’s Apple that dominates this new world, leaving Windows to its fate as a backroom grease monkey doing the grimy, dirty work of chugging through spreadsheets and grinding out yet more business memos. The consumer market is where it’s at now, after all, with COMDEX replaced by CES as high-tech’s major show. And now that Steve Jobs is on the board of Disney, where obviously he plans to sit quietly in the background and provide some helpful advice to the new CEO, we can hope the video iPod and its successors will at least provide us with a steady diet of nice cartoons and the latest Pixar/Disney movies.

There are many other possible scenarios. Perhaps Microsoft buys into several key markets and stitches together a convergence solution that, although not as elegant as Apple’s, has enough functionality, price advantage, and nonproprietary appeal to succeed in extending Windows into the living room. After all, who wants to bet against Microsoft and all those billions? And Microsoft has already executed such a strategy once, with considerable success.

Of course, if you write enough business plans, I suppose one of them will be the right one. But this smacks of hiring a roomful of chimps to sit in front of a group of terminals and hack randomly at a business plan software package in the hope they’ll crank out the next Netscape IPO. The last time that worked was during the Internet bubble, and I think you’ll have to wait a few more years before anyone can get away with it again.
