The Third Edition of In Search of Stupidity: Over 40 Years of High-Tech Marketing Disasters is Here!


Purchase In Search of Stupidity: Over 40 Years of High-Tech Marketing Disasters Today

Pay only $9.99 and receive with your purchase the following bonuses:
  • A copy of Selling Steve Jobs' Liver: A Story of Startups, Innovation, and Connectivity in the Clouds, a $9.99 value
  • Access to In Search of Stupidity: The Lost Chapters, Cartoons, and the Gallery of Bad High-Tech Marketing Collateral
  • Access to the In Search of Stupidity blog with new explorations into bad high-tech marketing

What's New in the Third Edition?

In the Third Edition, you’ll:

Experience the Horror of Hell on Your Desktop as Steve Ballmer unleashes Windows 8  on an unsuspecting world. Gasp as Microsoft’s entire mobile platform (and Ballmer’s executive career) is destroyed!

Shiver in awe as Jeff Bezos channels Steve Jobs’ spirit  during Amazon’s Fire Phone development but is possessed by Alfred E. Neuman instead!


Gaze in amazement as high-tech CEOs keep self-destructing on a planet where everyone has a smartphone and everything you say goes viral immediately!


Scream in disbelief as Hollywood makes a movie about Steve Jobs that doesn’t contain a single actual fact!


Shriek in despair as Disrupted author Dan Lyons buries employment opportunities in high-tech for everyone over 40!

Watch in disbelief as Hugh Howey, Joe Konrath, David Gaughran and the rest of AAAG (Aggregated Amazon Ankle Grabbers) transform 8,661 independent writers into mutant sheep who leap mindlessly to their financial doom while clutching their Kindles!

Feel the fear as Amazon, Google, Facebook, Twitter, YouTube, Instagram, LinkedIn, and other social networks launch mass censorship attacks on everyone to the right of Karl Marx, disintermediate their subscribers from their personal networks, and help China enslave over 1 billion people via the Great Social Credit System!


Not to mention new cartoons, expanded analysis of how every disaster in Stupidity could have been avoided, an updated glossary of terms, and much more! Enjoy the quick visual tour below.

[Image gallery: I Identify, Windows 8, Poster, SSJL, Backcover, Dive to Glory]

Read excerpts from each chapter

A Fistful of Chips

Of all the entities converging on the world of microcomputing during its early formation, none was more dominant than IBM. To many, IBM wasn’t simply a high-tech company, IBM was high tech, other companies being simply minor stars in an IBM firmament. By 1981, admiration, reverence, and fear of IBM had reached neo-cult status. IBM was “Big Blue,” and its chief competitors in the mainframe business were referred to as “The Seven Dwarfs.”

 

Nonetheless, IBM, almost against its will, was increasingly drawn to examine the unknown force that was driving people to go and buy hundreds of millions of dollars worth of “toy” computers. By the early 1980s, IBM had come to the realization it needed to understand this force, participate in its growth, and control it. The IBM PC was IBM’s first bid to achieve these ends.

 

A great deal of mythology surrounds the introduction of this now legendary system. The prevailing belief among many is that microcomputing before IBM’s arrival was a rough-and-tumble frontier, full of ornery software and colorful hombres tough enough to buy and tame herds of uncooperative boxes of lowing, obstreperous silicon. But as has so often been the case with historical events, truth and legend are often at odds.

 

The truth is that the microcomputer industry just before IBM’s appearance resembled not so much a rude cow-town as a spanking-new steam train, trimmed in polished brass and covered in fresh paint. Most of the passengers have boarded the train at Start-up Junction and are looking forward to the ride to Prosperity, the town just up the line. On board and seated in a fancy Pullman car are a diverse set of well-to-do-looking characters, all gussied up in store-bought clothes they’ve purchased from the proceeds of successful IPOs and healthy sales. These are the hardware dudes, who include Apple, Commodore, and Radio Shack, as well as a score of manufacturers of 8-bit computers running the widely used CP/M operating system. They’re a happy-looking lot—they’re shipping units to businesses as fast as they can manufacture them.

 

The home market is equally energetic, though not nearly as profitable, with every general store in town packed at Christmas and every other holiday with parents and their eager-eyed offspring snapping up every VIC-20, Commodore 64, Atari 800, Texas Instruments 99/4A, and Timex Sinclair they can lay their hands on. (In 1982, Macy’s,[1] at the time a power in consumer electronics, ran out of every home-oriented microcomputer at its flagship Herald Square store in New York City a week before Christmas.)

 

Riding in the car just in front of the hardware merchants are the software peddlers, and they look almost as content. They’re selling copies of VisiCalc, WordStar, and PFS File as fast as they can stuff them into cardboard boxes. In many cases, cardboard isn’t required; demand is so high that customers are willing to take their software home in plastic baggies. Boom times indeed!

Fabulous Fruit

Of all the characters waiting expectantly for the train to pull out of the station, Apple was probably the best positioned of the early denizens of Microcomputerville to become the town’s mayor. Apple’s mainstay system, the Apple II, and its immediate successor, the Apple II+, were triumphs of industrial design and utility. Sleek and low slung, the units provided an attractive contrast to the stark industrial designs common to business machines. The Apple was reasonably priced (a fully configured system with a whopping 64KB of RAM, color monitor, and dual floppies cost only about $4,000). Its integrated color graphics gave it crossover appeal to the home market, and the system was supported by a wide selection of business and entertainment software. A small company called Corvus had even developed a system for networking Apples together. All in all, it was a compelling, up-to-date package and buyers loved their Apples.

 

Yes, the system did have its idiosyncrasies. You had to buy a hardware upgrade to type in lowercase. Connecting your floppy drive to your Apple incorrectly caused the hapless disk unit to seemingly explode as an internal capacitor[2] blew with a loud pop and a rush of blue smoke out the drive door, but people were willing to overlook these little peccadilloes.

 

Just as important as its hardware design was the fact that Apple was the first system to run the first spreadsheet, VisiCalc, microcomputing’s first killer application. A killer application is defined as a product so compelling you’ll buy the necessary hardware just to run that particular piece of software. VisiCalc qualified for this rare and honored appellation—once an accountant or CFO saw rows of numbers rippling across a spreadsheet grid as it automatically updated, that person was hooked for life: She had to have the product. Management information systems (MIS, later to be called information technology or IT) departments may not have cared for the loss of centralized control that these little boxes represented, but it’s a well-known axiom of corporate life that “You don’t say no to the CFO.” And once the CFO’s secretary (now called an administrative assistant) tried out a word-processing program, that was it. Apples, along with any other computer that ran VisiCalc, or some of its early competitors, quickly proliferated across a business frontier that was grateful to get them.

 

Also contributing to the Apple II’s success was its relatively flexible and extensible hardware and software architecture. Unlike most of its competitors, Apple’s system was “open.” Popping off the cover of an Apple II revealed, courtesy of designer Steve Wozniak, slots, connectors into which it was possible to plug in a host of different accessories and upgrades, including memory extenders, accelerator cards, copy boards (hardware devices you used to help make bitmapped images of software for, err, “archival” purposes), extended graphics cards, CP/M boards that allowed you to run CP/M software on your Apple II, and so on. An extensive industry focused on providing third-party accessories and upgrades quickly coalesced around the Apple II, helping drive sales even further.

 

In fact, from Apple’s point of view, the system was entirely too open. By 1980, a burgeoning clone and “gray” market was developing around Apple’s flagship as units with names like the Pineapple[3] and the Orange started being shipped into the United States in growing quantities from Taiwan and other points east. Domestically, Apple even had its own Compaq, a New Jersey company called Franklin Computers, which offered a well-made Apple clone that even let you type in lowercase letters right out of the box.

 

Apple’s reaction to this turn of events foreshadowed its future behavior with respect to the Macintosh market. It summoned an army of attorneys who were given the mission of shutting the clone market down. The lawyers accomplished this by convincing the courts that it was illegal for companies to simply copy the Apple basic input/output system (BIOS), the built-in set of software instructions that enabled the system to communicate with its internal peripherals. Once this principle was established, the clone market quickly withered because the machines were built by simply replicating the Apple’s hardware chassis and equipping it with ROM chips that contained the now “pirated” BIOS code. (Most people obtained the Apple operating system, Apple DOS, by simply copying the floppy on which it came, though Franklin had gone to the trouble of creating its own version of the Apple operating system.) The Taiwanese all sailed back to their island to concentrate on building IBM clones, and the last time Franklin Computer made any noise was at the industry’s 1983 COMDEX trade show in Las Vegas. It hired the Beach Boys to regale attendees at a party that turned out to be a musical swan song to the company’s imminent wipeout.

 

At the time, CP/M (short for Control Program/Monitor or Control Program for Microcomputers) was considered by many to be Apple’s great rival (though both Commodore and Tandy systems had their devoted acolytes). Developed in 1974 by Gary Kildall, founder of the whimsically named Intergalactic Digital Research (later just Digital Research), CP/M was designed to run on Intel’s widely used 8-bit microprocessor, the 8080, and its several clones, most notably Zilog’s Z80 chip. Unlike Apple DOS and its other competitors, CP/M was less closely coupled to a particular microcomputer’s underlying hardware. Digital Research capitalized on this trait to build a profitable and growing business licensing CP/M to several dozen companies, such as NCR, Televideo, Processor Technology (maker of the Sol), Radio Shack (its variant was known as “Pickles and Trout” for some forgotten reason), and one of the industry’s earliest and most spectacular flameouts, Osborne Computing, creator of the first “portable” (at 25 pounds) computer.

 

CP/M suffered from one tremendous drawback, however. Although it could be easily adapted to run on a wide variety of computers, no de facto hardware standard for CP/M machines existed. Printer ports, monitors, and in particular floppy drives all differed from machine to machine. As a result, a person who purchased MicroPro’s WordStar word processor for his Vector system had no assurance the floppy on which the software was stored could be read by a Cromemco computer, despite the fact that both used the CP/M operating system. For a while, resellers such as Lifeboat Associates in New York City did a nice business simply supplying CP/Mers with the software of their choice on floppies their computers could read.

 

Exploding disk drives and noncompatible floppy formats aside, our train has built up a head of steam and begins to chug forward. But as the engine begins to pull out of the station, a lone rider appears suddenly in the distance, his horse galloping madly in pursuit. Reaching the last car before the train has come up to speed, the outlaw grabs hold of a railing and quickly swings himself up onto the rear platform. As he does, we can see the pursuer is a lean bandito wearing a tattered poncho, his features obscured by a tattered hat pulled low over his face. He enters the train and strides through it toward the special Pullman where our hardware merchants sit unsuspecting. When he reaches their car they turn to face the intruder, trepidation writ large on their faces. There’s a long moment of silence. Then the stranger lifts his hat to uncover ice-blue eyes that show no pity and throws back his poncho, revealing a three-piece suit matched with a white shirt and sensible tie. Strapped around the stranger’s waist are a pair of 8088s, deadly six-guns with the phrase “16-bit” inscribed on their chromed barrels. Pulling out these engines of destruction by their off-white pearl handles, the stranger mercilessly guns down the hardware dudes one by one. Only a handful escape the initial carnage.

 

The IBM PC has arrived on the scene.

Misfire: Jeff Bezos and the Fire Phone

Mana Man

In 2011, Steve Jobs passed away from pancreatic cancer. Like most “solid” organ cancers, it’s a tough customer. Your internal organs have few pain receptors, so the cancer normally metastasizes and spreads before it’s detected. Victims typically die months after the first diagnosis.

 

Jobs was lucky in that he was diagnosed with a rarer form of the disease, neuroendocrine islet cell carcinoma. Unlike the more common and lethal adenocarcinoma, islet cell cancer is often detected earlier and is more responsive to treatment. It can be cured if found early and treated promptly.

 

As befits the career of one of high-tech’s most paradoxical personalities, when Jobs was first diagnosed, instead of seeking the assistance of modern medicine and immediately undergoing surgery, he took a nine-month detour into various alternative treatments, including fad diets and “spiritual healing,” which science has demonstrated are useless. Jobs finally underwent the surgery, which prolonged his life, but he later regretted not acting immediately when there was perhaps a chance for a cure. During an interview about his biography of the Apple titan, Steve Jobs, author Walter Isaacson stated he’d felt Jobs had delayed the operation because:

 

“I think that he kind of felt that if you ignore something, if you don’t want something to exist, you can have magical thinking.”

 

While his behavior was foolish and brought sorrow to his friends and family, you almost can’t blame Jobs for believing he possessed some sort of magic. When he returned to Apple in 1996 and resumed full control of the company in 1997, Apple was a fading power in computing and seemed destined for irrelevance, if not erasure. From 1997 to the year of his death, he executed the most remarkable turnaround in business history, releasing three massive hits in a row—the iPod, iPhone, and iPad, not to mention iTunes and the Apple App Store. By the time he passed away, Jobs had achieved more than a business transformation and turnaround. He’d transformed into a totem, a mystical figure with a special mana that, if you could somehow tap into it, would grant you the power to bend marketing space and time to your own personal reality as well.

 

Inspired by Jobs and all that luscious iPhone revenue, Jeff Bezos began planning for Amazon to enter the smartphone market in 2013. On the face of it, it was a logical decision. In 2007, Amazon’s first hardware product, the Kindle ereader, had captured the ebook market. In 2011, Amazon introduced its first tablet, the Kindle Fire, and it too had been a solid success.

 

The function of both devices in Amazon’s sales strategy was to act as low-cost, loss-leader purchases designed to whisk you into Amazon’s virtual bookstore and warehouses, where you would hopefully buy enough merchandise and services to cover the costs incurred in building them. As such, the designs of both the ereader and the tablet were minimalist and as cheap to build as possible.

 

Spiritually Seeking Steve Jobs

For his smartphone, Jeff Bezos had a different plan entirely. By 2013, he no longer thought of Amazon as just a store, but also as a technology powerhouse, a unique amalgam of merchant and high-tech empire. Bezos felt he competed not only against Walmart, but also against Google, Microsoft, and, of course, Apple, the coolest brand on the planet, which until only recently had been led by the coolest guy in high-tech. Bezos thought he could be cool too, and like many other Silicon Valley CEOs, fired up a virtual Ouija board in search of some of that powerful Steve Jobs mana. Soon, he’d loaded up with spiritual power and the Fire Phone development effort began.

 

The project, code named “Tyto,” was set up in Amazon’s Lab126 under Bezos’ watchful eye. The facility is Amazon’s design studio and a conscious homage to Jobs and his immortal “pirate” building at Apple HQ where the original Macintosh was developed. Channeling Jobs further, Bezos appointed himself the device’s “super product manager” and made every major design and functionality decision for Tyto during its development cycle.

 

As the device took shape, Bezos became fixated on providing a unique new feature for the new phone, a 3-D effect for the screen called “Dynamic Perspective.” Implementing it required four power-sucking cameras, one installed in each corner of the phone. Dynamic Perspective’s only practical use was to make your lock screen look 3-D. Sort of. Which wasn’t practical. But in Bezos’ mind, this one feature branded the Fire Phone a high-end, luxury device.

 

From a design standpoint, the rest of the Fire Phone was an undistinguished black polycarbonate slab with a soft black edge, a design cue taken from entry-level Chevrolets. During the product’s live rollout, much was made of the “injection molded steel connectors,” but the slides also included the term “rubber frame.” The latex fetishists in the audience may have been excited, but the rest of the audience was unimpressed.

 

Yes, the Fire Phone had a 13 MP rear camera (the iPhone’s had only eight megapixels), a tangle-proof headphone cord, and a software bundle that sounded nice, but it looked like something you bought at the bargain end of the smartphone counter at Best Buy. And while Amazon wanted you to watch lots of videos on your Fire Phone, it lacked a full-HD screen.

Microsoft Behind the Eight Ball

Flirting with Disaster

After the tremendous success of Windows 3.0 in 1990, the market was clamoring for Microsoft’s next act, an OS that would enable PC owners to pull even with the Macintosh. Windows NT was announced, and for the next three years Microsoft spent considerable time proclaiming that this new version of the product, once known as OS/2 3.0, would be the 32-bit successor to the Windows 3.x product line. But as NT neared completion, complaints began to surface that the product was too big and resource-hungry to fit the market’s existing desktop profile. Microsoft had heard this before with other products, and in the past Moore’s Law (which, roughly paraphrased, states that computing capacity doubles every 18 months, and which was fully operative at the time) had bailed the company out. But this time, Microsoft blinked. NT was quickly hustled offstage, Microsoft cobbled together a 16/32-bit hybrid that would eventually be known as Windows 95, and the company switched promotional gears, telling everyone that this product was in fact the desktop upgrade it had been promising.
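For readers who want the arithmetic behind that paraphrase, here is a minimal sketch of the doubling math; the 18-month period is just the rough figure quoted above, not a precise engineering constant, and the numbers are purely illustrative.

```python
# Minimal sketch of the "doubles every 18 months" paraphrase of Moore's Law
# quoted above. The 18-month period is the rough figure from the text, not a
# precise constant, and the printed values are illustrative only.

def capacity_multiplier(months: float, doubling_period_months: float = 18.0) -> float:
    """How many times capacity has multiplied after `months` have passed."""
    return 2 ** (months / doubling_period_months)

if __name__ == "__main__":
    for years in (1.5, 3.0, 4.5):
        print(f"{years:.1f} years -> roughly {capacity_multiplier(years * 12):.0f}x the capacity")
```

Which is why, historically, a product that was "too big" for today's desktops often fit comfortably on the desktops shipping two or three years later.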

 

After Windows 95’s release, Windows 3.x’s huge installed base, IBM’s ineptitude in marketing the competing OS/2, and a massive promotional campaign all contributed to the product’s  tremendous sales success. But over time, the positioning problem grew in the critical desktop arena. Windows NT, then 2000 (the more things change . . .), had always been available in a “workstation” version that directly competed with the Windows 9x family. After all, both product lines were called Windows. They were both advertised as 32-bit operating systems. The desktop versions were comparably priced. They looked alike. So, which to buy?

 

Microsoft tried to help customers make the decision via a classically bad 1996 ad campaign many referred to as “Two Nags Racing.”  A two-page spread, it featured a picture of two horses running neck-and-neck with the caption “You See a Horse Race. We See Two Thoroughbreds.” Apparently no one at Microsoft had realized that, uh, yes, but the horses are racing. And as we all know, only one horse can win. So, which customer is going to ride the losing steed? Faced with such a choice, corporate America paused (and the ad was quickly yanked).

 

Luckily for Microsoft, a way out of the dilemma appeared. Windows NT was repositioned as a network operating system (NOS) and aimed at Novell, where it achieved massive success due to the Utah firm’s marketing ineptitude. Despite some ongoing speculation and criticism, this enabled Microsoft to successfully navigate away from the dark and stormy positioning clouds that loomed ahead of it and towards further product success. In 2001, the company grasped the Holy Grail of product line integration with the release of Windows XP, which allowed it to offer one product named Windows to the market. Despite the weather, everything became sunny every day in Seattle.

 

The Green Reign

By 2005, Microsoft was 10 years into its rule over computing and the view from Redmond was splendid. Microsoft Office’s grip on the desktop applications market had been locked down and seemed unassailable. Internet Explorer’s market share hovered close to 90%. The company enjoyed strong sales of its development, server and SQL database products. Microsoft set the standards now, not IBM.

 

Even better, a new market was opening up and Microsoft was sitting pretty there as well. A new term, “smartphone,” was entering computing’s lexicon. A smartphone was a hybrid device, a blend of a mobile phone and a personal digital assistant (PDA). While PDAs had never become the multi-billion dollar category foretold since the introduction of the Apple Newton, they’d carved out a foothold in the market and Microsoft’s Windows Mobile OS had made a nice transition from PDAs to hot new smartphones such as Samsung’s BlackJack.[1] By 2007, Windows Mobile enjoyed a 30% market share in smartphone OSs. To drive the point home, that year BlackJack owners were sweating out the possibility that Windows Mobile 6.0 would not be released for their devices.

 

There were a few splotchy clouds drifting across what was otherwise a crystalline horizon. Losing that anti-trust suit to the U.S. DOJ had been painful. Bill Gates had stepped down as CEO as a result and that was a blow, though his replacement Steve Ballmer remained in fine spirits throughout the ordeal. More inconvenient was that after the case had been settled, the DOJ draped an anti-trust anaconda around Microsoft’s windpipe with instructions to squeeze if Redmond started to show signs of reverting to its bad old ways. Fortunately, reptiles tend to sleep most of the time, though the snake did become agitated if Microsoft engineers started becoming too busy with browser technology.

 

And of course, as everyone knew they would after the case was settled, the Euros started suing the company left and right. Annoying, but when you enjoy two major high-tech monopolies, your profits can easily cover the costs of keeping legal vultures at bay. Just the cost of doing business.

 

Vista of Nowhere

It started becoming less good in 2006 with the release of what was supposed to be Windows XP’s successor, Windows Vista. Microsoft promised that the product, codenamed “Longhorn,” would be stuffed with all sorts of good things such as a journaling file system that, like Doctor Who’s TARDIS, could practically travel back in time to retrieve files, security protection that even supercomputers couldn’t crack, a gorgeous new interface, better printing, searching, networking, and on and on. Everyone in the industry had already heard this a million times in the past, and the company’s Marketspeak was translated into a standard statement that read: Longhorn would be late, it would have bugs, they would be fixed over the course of several incremental releases, most of the new features would eventually work, and Microsoft would sell a ton of it. What choice did anyone have?

 

In January 2007, Vista was duly released, and everyone stepped politely aside to avoid impeding the inevitable upgrade stampede. And waited. And waited some more. And then waited even more, until everyone realized the stampede wasn’t coming.

 

It was a puzzle. It couldn’t have been lack of awareness of the product; it had been covered relentlessly by the high-tech press for years. When it was released, Microsoft pulled out all the normal PR stops with an expensive launch event, extensive advertising, an on-stage show starring Bill Gates and Steve Ballmer, and much more. All the normal PR bases were covered, but Vista just refused to sell. Why?

 

The answer came back to a positioning failure. Vista didn’t sell because it violated one of three fundamental laws of product positioning—the product’s feature set as delivered (not promised) failed to convince Windows XP users, Vista’s primary competition, that it was worth the time and hassle of replacing a perfectly fine and working installation.

 

In the years between the product’s announcement and its delivery, Microsoft had steadily chopped away at the advanced features that were supposed to make Vista a compelling purchase. The advanced file system went to the same place that Doctor Who’s TARDIS does when it dematerializes. The “hacker proof” code was scrapped and your new Vista system could still be cracked open by the 15-year-old next door. The new “Aero” interface, which was supposed to be more beautiful than Hawaiian sunsets, drained power from your laptop at an alarming rate, leading people to turn it off and settle for a look more akin to Seattle during a drizzle.

 

Making it worse was that some of the things that Microsoft did add weren’t wanted or were annoying. For example, copy protection, and a “User Account Control” (UAC) notification system that, when you wanted to move or copy a file, asked, “Are you sure?” And when you said yes, asked, “Absolutely positive?” And when you said “YES,” asked, “Are you sure you don’t want to change your mind?” That was often the last thing it said before you put your fist through the screen.

 

In short, there was no compelling reason to buy Vista and people didn’t. Microsoft gamed the adoption figures, but no one was fooled. Windows XP remained the preferred version for several more years. It turned out that Windows users did have a choice after all.

 

Vista’s market flop shocked Microsoft and certainly didn’t help the company’s bottom line, but no one panicked. Microsoft was still powerful, still set the standards, and still owned two invaluable monopolies. Vista was a setback, not a catastrophe. Microsoft’s attitude was reinforced with the success of Windows 7 in 2009. The product was well reviewed, avoided many of the issues that plagued Vista, and was a solid sales success.

 

Despite Windows 7’s success, a danger sign presented itself after the launch. This was the continued unwillingness of many Windows XP users to give up the OS. Many people remained unconvinced that the features offered by Windows 7 justified the pain of switching.  In 2012, XP enjoyed a worldwide market share of 47% compared to Windows 7’s 36%.

 

Nonetheless, Windows 7 made many people feel better about Microsoft’s place in the world, and the company refocused its attention on nurturing its monopolies while also planning to conquer the remaining application markets it didn’t control. Its plans included:

 

  • Launching Silverlight and Sparkle, competitors to Adobe’s then-ubiquitous Flash technology.
  • Releasing Expression, a competitor to Adobe’s Macromedia web-building product line.
  • Launching the Zune, an iPod competitor.
  • Launching the Kin, a “social” smartphone.
  • After its merger with Intuit was rejected by the U.S. DOJ on anti-trust grounds, a continued push into the home/small business accounting market with Microsoft Money.
  • A new commercial anti-virus product.
  • New initiatives designed to supplant Lotus Notes and Novell’s Groupwise.
  • New efforts to compete in mobile email aimed at the Blackberry.

And obviously work on Jupiter (Windows 8), Windows 7’s planned replacement, went full steam ahead.

In 1987, while working at MicroPro as WordStar product manager, I was assigned to participate in one of high tech’s hoariest rituals: a media tour. Despite the passage of years, media tours remain a PR staple to this day and are usually used by mid-level firms. A tour consists of arranging for members of your senior management team to meet with key members of the blogging empires and analysts who write about and cover your market. The hope is that once you’ve established a backslapping, hail-fellow-well-met relationship with an editor from Wired Magazine or a guru from Gartner they’ll be more inclined to write nice things about your company and its products. Sometimes it works out that way. The quid pro quo driving the tour is that in return for putting up with you disturbing their day, you’ll provide fresh news for the blog and buy research from the analysts. Sometimes it works out that way as well.

 

Tour personnel usually consist of at least one member of upper management, one member of middle management capable of giving a comprehensive product demonstration (informally, this person is referred to as “the demo dolly”), and a PR person. For this tour, upper management was represented by Leon Williams, then president of MicroPro; I appeared in the role of the demo dolly; and rounding out the group was a sad little PR type who confessed at the end of our trip that she really didn’t like working with members of the media. Most people who have been on one or two media tours regard them with the same affection as a root canal. Most tours consist of a trip to New York, Boston, and San Francisco, the three major hubs for high-tech media and analysis.

 

Our itinerary included a side trip to Austin, Texas to meet Jim Seymour, long-time editor and columnist for the then mighty Ziff publishing empire. On the day of our appointed meeting, we trekked out to Seymour’s house in the Austin hills (which itself is becoming a major technology hub), where I dutifully demonstrated the latest, greatest version of WordStar 5.0, the one that couldn’t print. Luckily for me, Seymour, engrossed by the Macintosh (as were most members of the media at the time), paid only cursory attention to the demo, and instead insisted on demoing his latest Mac toys for us. Once everyone was done showing off, we settled down for the obligatory period of chitchat before we headed off to the airport and our next stop in the never-ending tour.

 

Heart of Darkness

For no particular reason that I can remember, the topic turned to Ashton-Tate, publisher of the widely popular dBASE database program. Seymour started talking about a meeting he’d attended with other members of the media where Ed Esber, CEO of the database giant, addressed the group. As he began talking about Esber, his face suddenly developed an expression of contempt. He told us how during the speech Esber had stated at one point that he wasn’t necessarily the smartest guy in software. Seymour paused, then looked at our group and said, “We were all thinking, boy, you’ve got that right, Ed.” The venom in his voice was surprising.

 

I didn’t pay much attention to the exchange at the time, but after leaving MicroPro to become a product manager at Ashton-Tate, I later realized I’d had my first glimpse into the dark heart of one of software’s biggest and most unexpected meltdowns. As events progressed in the industry, it became clear that as far as the PC media was concerned, it was “Ed Esber. He dead.” They wanted his head on a stake.

 

At its height in the 1980s, Ashton-Tate was one of software’s “Big Three,” the other members of the triumvirate being Microsoft and Lotus. Microsoft had DOS, Lotus ruled spreadsheets, and Ashton-Tate was the database king. The lucrative word-processing franchise was being fought over by MicroPro, WordPerfect, MultiMate, Microsoft with its Word product, and a host of smaller players.

dBASE was originally designed to help place winning bets in football pools and was the creation of Wayne Ratliff, a contract programmer at NASA’s Jet Propulsion Laboratory. Although Ratliff didn’t get rich on sports betting, he did decide his new software program had commercial potential. He named the program “Vulcan” in honor of the home planet of Star Trek’s Mr. Spock and placed his first ad for the product in the October 1979 issue of BYTE magazine. At its release, Vulcan was priced at $50, and though there was a flurry of initial interest,[1] the stress of trying to ship, support, and manage a one-man company was overwhelming. Ratliff was on the verge of ceasing operations when software reseller George Tate contacted him.

 

Tate and his partner, Hal Lashlee, took a look at Vulcan, quickly realized its potential, and bought exclusive distribution rights. At the time of the deal they were running a not-very-successful mail-order software company called Software Plus. Believing that Vulcan would turn things around, they renamed the company Ashton-Tate to give it a more “upscale” image. (A great deal of speculation has centered on where Tate came up with “Ashton”—no one who worked at the company had that name. The general belief is it was picked because Ashton sounded “British.” It should be noted, however, that Tate had a pet parrot named Ashton.)

 

After a quick trademark search uncovered potential problems with the name Vulcan, the product was rechristened dBASE II. There was no dBASE I; the “II” was there because even in the early 1980s people were reluctant to buy 1.0 releases of software products. The company upped the cost of dBASE II to $695, a very competitive price for a product in its class and with its capabilities, and placed full-page magazine ads featuring a picture of a sump pump and the proclamation that while the pump might suck, dBASE didn’t (or words to that effect). Sales took off and by 1985 Ashton-Tate’s revenues were over $100 million a year and climbing, mostly from sales of dBASE II and its successors, dBASE III and III+. The company also enjoyed modest sales success with its Framework integrated product. Integrated products attempted to combine word processing, database management, a spreadsheet, and graphics all within a single program. Framework was considered the best of breed in this market segment, but the integrateds, which included titles such as Lotus Symphony and Ability,[2] never sold in the numbers projected, and the category largely disappeared in the early 1990s.

 

In addition to ads featuring plumbing, another reason for dBASE’s quick rise to prominence was that the company made much of the fact that dBASE was a relational database management system (RDBMS). The relational model was first introduced in a paper published in 1969 by an English computer scientist, Dr. E. F. Codd, who worked for IBM. More flexible and expandable than competing technologies, relational products over time were adopted by most DBMS developers and users.

 

In addition to a table-oriented paradigm, Codd’s definition of an RDBMS also incorporated several key capabilities and functions a product needed to possess before it could be called a “truly” relational system. None of the early RDBMS products for the PC incorporated all of Codd’s requirements, and religious arguments raged constantly over which product was “more” or “less” relational than another. dBASE II was probably “less” relational than some of its competitors, but that also meant it could run on systems with less memory and reach a broader audience. Despite the pooh-poohing of purists, for several years dBASE became almost synonymous with the relational concept.

 

In 1984, Tate died unexpectedly of a heart attack at the age of 40, and Esber, his second in command, took over the leadership of Ashton-Tate. Esber was a Harvard-trained MBA and a former product manager at VisiCorp, the company that had seen its VisiCalc spreadsheet eclipsed by Lotus 1-2-3. Esber announced he was going to bring a more professional management style to Ashton-Tate, replacing Tate’s more hands-on and emotional approach. Despite having a bachelor’s degree in computer engineering, Esber didn’t have a reputation for being technically astute, though this was misleading; he had programmed extensively earlier in his career.

 

Esber did fancy himself something of a business guru, and one of his favorite quotes was “A computer will not make a good manager out of a bad manager. It makes a good manager better faster and a bad manager worse faster.” He had something there. It had taken Tate about five years to build Ashton-Tate to software giant status; it would take Esber only two-and-a-half years to put the company on the road to ruin. And Esber had a PC on his desk the entire time.

An OS Is Born

While IBM and Microsoft were involved in their negotiations, rumors of the new PC OS began to float throughout the industry. For a while it was called CP/DOS, or DOS 286, or DOS 5.0, and then finally Presentation Manager. It would have multitasking and multithreading (whatever that was) and semaphores and all manner of good stuff, but most important, it would have a GUI, just like the Mac! This interface would be based on Windows, only much better, obviously, and Windows and the new interface would be so similar that anyone who developed a Windows product could port it to Presentation Manager when it was ready to ship with a snap of the fingers and a twinkle of a compiler. Write two products for the price of one. And boy, that sounded really good to all the developers!

 

And though no one would actually confirm the date on which the new wonder OS would ship, everyone assumed it would be sometime in 1986 or at the very least 1987. And that sounded good, too, because by 1986, Atari 520 ST [1] owners had a pretty sophisticated GUI for their machines, for God’s sake, whereas PC types still had to clunk along in character-based DOS. And PC owners were sure getting tired of all those Mac snobs laughing at them and twirling those damned mice under their noses and getting all the girls because Macs were so cool. In fact, a fair number of them started buying up Macs so they could twirl mice and be cool, too. But most were still content to wait for IBM to ship a cool Mac-like OS so that they could twirl their mice while avoiding paying Apple those 50% profit margins it got on its systems. But they were sure eager to get their hands on that new OS and those mice.

 

Then IBM threw SAA into the mix and everything changed.

 

SAA, which stood for Systems Application Architecture, was an attempt by IBM to develop a cross-platform OS (or something close to it) that would run on all IBM mainframes, minicomputers, and PCs. (An inadvertently hilarious book about this heroic effort was written many years ago. It was referred to by industry wags as “The Soul of a Giant Three-Ring Binder.”) This initiative had been sparked by IBM’s annoyance at having to listen to its archrival in the lucrative minicomputer market, DEC, trumpet that it, DEC, “had it now.”

 

What DEC[2] had “now” was a unified OS and application environment for its entire product line. In theory, no matter what size computer you bought from DEC, they all used the same OS and ran the same software. (This wasn’t entirely true in practice, but DEC certainly was way ahead of IBM in this regard.) By contrast, IBM supported over a dozen incompatible hardware and OS platforms. Moving an accounting package from, for example, an IBM minicomputer system to a mainframe required an extensive rewrite of the software.

 

 SAA was designed to close this perceived competitive gap, but IBM was targeting a chimera. True, in the late 1980s, DEC’s profits and revenue presented the picture of a company in the pink, but this was an illusion. In reality, DEC’s appearance was more akin to the hectic flush a consumptive develops before death. DEC’s business model consisted of selling minicomputers and small mainframes to companies at the departmental and divisional level. This was precisely the market companies such as Novell and 3Com were targeting with their networking OSs. Herds of Silicon Beasts yoked together with NetWare were a fraction of the cost of DEC’s expensive and overengineered hardware products and required less support. LAN systems were also starting to offer a broader and cheaper selection of software to compete with the minicomputer market’s offerings.

 

Some ominous reports from the field began to filter into DEC headquarters over defections in the company’s customer base, but these were ignored until it was far too late. Over time, the concept of much cheaper and more choices beat “has it now” hands-down and DEC, along with most of the minicomputer market, disappeared in the late 1990s. For good measure, SAA proved to be a massive waste of IBM’s time and money and eventually sank without a trace as well.

 

Nonetheless, what IBM said still went, and work on integrating SAA technology into Presentation Manager moved forward. Much of this work involved building support for a whole host of IBM mainframe terminals into the new OS. This all took much time and effort, and it was soon apparent there would be no wonder OS in 1986. In fact, the new OS with the cool graphical interface would not be ready until 1988. Oh, and you know all that stuff about a simple recompile being all you needed to do to port your Windows product to Presentation Manager? Forget it. You were going to have to do a major code rewrite to get your product to run under the new OS after all. Which, by the way, was going to be called OS/2. A nice fit, IBM thought, with its new PS/2 line of microcomputers.

 

Then, in an act of supreme stupidity that would characterize IBM’s marketing of OS/2 for the rest of the product’s ill-starred existence, the company announced it would indeed ship the first version of OS/2 in 1987. Only it wouldn’t have a cool Mac-like GUI—just the same DOS-like character interface everyone was heartily sick of. Few cared that underneath the hood of the new OS was a quantum-leap improvement over DOS in functionality. With this single stroke, IBM had created TopView II.

 

IBM’s motive for this act wasn’t hard to discern. OS/2 had become a draftee in the company’s war on the hardware cloners, and it had been assigned to ride shotgun alongside its new computers into battle. By 1987, the company had woken up to the consequences of unleashing the Silicon Beast on the market and was looking to take it all back. The PS/2 line would ship with a new bus, the Micro Channel, that had more patents stuck to it than bugs on a fly strip hanging from the ceiling of a Texas gas station. The new BIOS chip, called Advanced BIOS or ABIOS, reared up and bit you on the finger if you tried to reverse engineer it. The units shipped in April 1987 with plain old DOS, and IBM badly wanted something that could better showcase their new darlings. OS/2 1.0 was it.

 

While this was all well and good for IBM, software publishers were less than thrilled. Companies were being asked to throw a considerable amount of time and money into supporting an OS version whose sales prospects were dubious. Making everyone feel worse was IBM’s pricing of OS/2 1.0: $340 for a retail copy, a price that generated sticker shock. IBM had established a low price point for desktop OSs with the introduction in 1981 of DOS 1.0 for $40, and no one thought the OS/2 pricing strategy was a smart move. Once a market’s pricing structure is established, it takes time and effort (and perhaps a helpful monopoly) to change it, if you ever do. Yes, many people would eventually obtain the product via bundling, but strong retail sales would help kick start acceptance of OS/2 and generate sales of OS/2-specific products. And that was unlikely to happen with a $340 desktop OS that lacked a GUI.

 

And speaking of pricing, IBM and Microsoft had placed a $3,000 price tag on the OS/2 software development kit (SDK). That was no problem for larger software companies, but smaller firms complained bitterly. Microsoft practically gave away its Windows development tools. Even Apple set more reasonable prices for its SDKs.

 

IBM also seemed oblivious to the need to provide marketing assistance to independent software vendors (ISVs) building OS/2 applications. The company had no direct marketing programs a third party could access that would help promote new OS/2 products. IBM had no expertise or influence in software distribution channels and seemed uninterested in developing any. IBM made no attempt to garner critical “shelf space” in major resellers, an important factor at the time. There were no co-op advertising programs. There were only a few scattered attempts to build a supporting infrastructure of books, publications, shows, and events that would stimulate interest in buying OS/2 and OS/2-related products. IBM’s attitude was that what had worked for the company since the Great Depression would work today. And, to an extent, it did. Several major publishers, including Ashton-Tate, Lotus, SPC, and MicroPro, as well as a few daring start-ups, committed themselves heart and soul to OS/2.

Barbarian Conquests

Borland made its first play for big-league status with its 1987 purchase of Ansa and its Paradox database. Now buried in the Corel Office suite, Paradox, first released in 1985, has never received the credit it deserves for its innovative design and breakthrough performance. The product’s initial claim to fame was its introduction of query by example (QBE) capabilities to PC relational databases. Instead of typing in long lines of obscure queries, a Paradox user could quickly recall records by simply checking off boxes from an onscreen image of the database, and then save these visual queries for future use. This capability, combined with powerful form creation and scripting features, made the product a viable competitor to Ashton-Tate’s dBASE and the various Xbase clones. The product often, if not always, came in first in reviews and competitive analyses, and by the 3.0 release Paradox was widely considered to be the “best of breed” in the DBMS desktop market.[1]
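To give a feel for what query by example meant in practice, here is a purely hypothetical sketch of the idea; the table, field names, and matching rule below are invented for illustration and have nothing to do with Paradox’s actual PAL language or file formats.

```python
# Hypothetical illustration of the query-by-example idea described above: a
# "query" is just a partially filled-in record, and the matching rows are the
# ones that agree on every field the user filled in. Conceptual sketch only,
# not Paradox's actual behavior.

customers = [
    {"Name": "Acme Corp",  "State": "CA", "Rep": "Jones"},
    {"Name": "Baxter Inc", "State": "NY", "Rep": "Smith"},
    {"Name": "Cargill Co", "State": "CA", "Rep": "Smith"},
]

def query_by_example(rows, example):
    """Return the rows that agree with every field filled in on the example."""
    return [row for row in rows
            if all(row.get(field) == value for field, value in example.items())]

# "Checking off" State = CA and Rep = Smith on the on-screen form:
print(query_by_example(customers, {"State": "CA", "Rep": "Smith"}))
# -> [{'Name': 'Cargill Co', 'State': 'CA', 'Rep': 'Smith'}]
```

The appeal was that the user never had to learn a query language at all; the filled-in form was the query, and it could be saved and rerun later.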

 

Even better from Borland’s standpoint was the fact that the product couldn’t be easily cloned. The Paradox scripting language manipulated “objects” such as queries, reports, and forms within the Paradox environment and resisted compilation technology. On the other hand, Paradox was accessible enough to allow third parties to develop utilities for and extensions to the product. The combination of power, price, and third-party push helped Paradox begin to make major inroads into a market formerly dominated by Ashton-Tate and the Xbase alternatives. By the time of Borland’s takeover of Ashton-Tate in 1991, Paradox owned about one-third of the market for PC desktop databases. Interestingly enough, Borland kept the price of Paradox at $695, then the median price for high-end database products. It seemed the software barbarian was willing to ape the ways of civilization when they suited his purposes.

 

After Ansa and Paradox, Borland purchased the Surpass spreadsheet from Seymour Rubinstein of WordStar fame,[2] renamed it Quattro, and entered the spreadsheet market with barbarian zest. As part of his slash-and-burn tactics, Philippe Kahn launched what became known as a competitive upgrade promotion against Lotus, which had lagged in releasing its new 3.0 version of 1-2-3. The competitive upgrade works by offering the user of another product your product at a reduced price in return for the user ostensibly “turning in” her current product—a desirable marketing “twofer” because the upgrade increases your installed base while simultaneously decreasing your competition’s. Quattro’s pricing was initially lower than the $495 median for spreadsheets but, as with Paradox, by 1990 it was repriced to match industry standards. The competitive upgrade was kept sharp and at hand in Borland’s promotional arsenal, and periodically the company launched one when it spotted an opportunity. Wielded in the hands of barbarians, cutthroat pricing and competitive upgrades were fearsome weapons, but they were ones that more civilized warriors could also employ, as Borland would one day discover.

 

In 1991, Borland reached over $200 million in annual revenue, mainly on the strength of growing Paradox sales. Kahn was now at the height of his ambitions and looking for new conquests. Casting his fierce gaze about, he let it come to rest on Ashton-Tate, a wounded company that seemed ripe for the picking. Negotiations commenced between the barbarian and his intended prey. Alas, a group of smooth-talking and decadent civilized men seems to have seen the savage coming and talked him into forking over the princely sum of $440 million in Borland stock for the privilege of raising the Borland tribal standard over once mighty Ashton-Tate.

 

Upon its purchase, Ashton-Tate proved to be a pretty warty property and by no means worth what Kahn shelled out. (A sum in the neighborhood of $200 million would have been more realistic. Maybe.) The company’s “crown jewel,” dBASE IV, was an ugly frog that showed no inclination to turn into a prince anytime soon, and the rest of Ashton-Tate’s software portfolio was pretty toad-like as well. Its MultiMate word processor was obsolete and sales were dying. Ditto for its ChartMaster family of products. There was also an unsellable desktop-publishing program, Byline. Framework was a fine little bit of code, but the brief day of the integrateds was almost over. Ashton-Tate’s Mac products weren’t bad, but by 1991 it was becoming clear that Windows was going to reduce Macintosh software to a niche market, and Kahn wasn’t interested in investing in it. Reduced to its essence, what Borland had bought for its $440 million was a mailing list of dBASE customers and an installed base that was quickly rotting away as developers fled dBASE into the arms of the Xbase alternatives or Borland’s own Paradox.

 

The Great Pentium Bunny Roast: Intel Inside

In this environment, semiconductor giant Intel spotted an opportunity. Earlier in its history, the company had launched a marketing campaign designed to convince IT types that they should be concerned about whether their computers were built around Intel’s 386 processor. The program had been fairly successful and now Intel believed it was time to be more ambitious and make Intel a household name. Though people increasingly cared less about what company manufactured their PC, they still wanted to compare their purchases and brag about them. As a consequence, computer owners had begun to worry about the specifications and speed of their processors[1] in much the same way that car owners obsess over the horsepower and cylinder specs of their respective buggies.

 

Intel reasoned that if people were going to worry about their microprocessors, the company might as well make them worry about not having one made by Intel. And while Intel was at it, the company should discourage computer manufacturers in the rapidly growing home market for PCs from using anything other than Intel microprocessors inside the boxes being purchased by Joe and Josephine America. These dual motivations gave birth to the Intel Inside program, the most massive consumer branding campaign high tech has ever seen. The Bunny People were released on an unsuspecting world.

 

Intel Inside consists of two key components. The first, and perhaps most significant on a long-term basis, is the marketing development funds (MDF) (or bribe, depending on your point of view)[2] aspect of the program. Largely hidden from public view, Intel’s MDF systems work by kicking back to manufacturers an average of 3%-6% of the total average selling price of the company’s worldwide monthly microprocessor shipments. In return, computer makers agree to display the Intel Inside label on their computers and in their advertisements.

 

The accrued MDF funds don’t go directly back to the vendors. Instead, Intel deposits the money in an Intel-managed MDF that the manufacturers must use to pay for print, Web, TV, or radio advertising for their Intel-based systems. If they don’t use the funding within 12 months, they lose it.
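As a rough illustration of how such an accrual works, here is a hypothetical sketch; the 5% rebate rate is simply a point inside the 3%-6% range quoted above, and the monthly purchase figures are invented for the arithmetic.

```python
# Hypothetical sketch of Intel Inside MDF accrual as described above. The 5%
# rebate rate is an assumed point inside the 3%-6% range quoted in the text;
# the monthly purchase figures are invented purely for the arithmetic.

REBATE_RATE = 0.05          # assumed rate, chosen from the quoted 3%-6% range
monthly_cpu_purchases = [   # an OEM's hypothetical monthly Intel processor spend, in USD
    2_000_000,
    2_400_000,
    1_800_000,
]

accrued_mdf = sum(spend * REBATE_RATE for spend in monthly_cpu_purchases)
print(f"Accrued co-op advertising funds: ${accrued_mdf:,.0f}")
# -> Accrued co-op advertising funds: $310,000
# The OEM never sees this as cash: it can only be drawn down against
# Intel-approved advertising, and it expires if unused within 12 months.
```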

 

All Intel Inside participants must submit every ad, regardless of medium, to Intel for approval. Ads are checked for compliance with Intel corporate identity standards for:

 

  • Size
  • Color
  • Prominence of Intel’s logo
  • Verbiage in the accompanying taglines
  • Click-throughs to Intel websites for Web advertising

 

Intel also “manages” the percentage of the funds that vendors must use for advertising in each medium. Helping Intel manage the process is a 100-page manual of regulations that even dictates how ad copy must be written. Failing to follow Intel’s guidelines and committing even a minor infraction can lead to all MDF funds being frozen. Adding a product that uses a non-Intel chip to an existing line leads to forfeiture of all Intel MDF for that line. The vendor must establish a new product line to maintain access to its Intel Inside funds.

 

Intel Inside proved for years to be very successful in locking Intel’s competitors out of the top end of the market. While major manufacturers such as HP, Dell, Acer, and Lenovo now offer systems with AMD processors, these are almost exclusively reserved for their low end and “budget” systems. Intel remains the premium offering, though the company is finally seeing some real competition from AMD’s new lines of Ryzen CPUs.

 

Invasion of the Bunny People

The second and far more visible aspect of Intel Inside was a massive media campaign consisting of a series of ads and commercials featuring all sorts of jiggly, jiving critters. The first generation of Intel media pitchmen were known as the “Bunny People”: dancing “technicians” who leaped around in the “clean suits” worn by the people who work in semiconductor fabrication plants. Just like real rabbits, the Bunny People have been supplanted by numerous descendants, including the Blue Man Group and animated aliens who look like Bunny People whose genes have been subjected to nuclear radiation in a hidden lab. In addition to the Bunny People, Intel also created a jingle (the company calls it a “signature ID audio visual logo”) placement program—that ubiquitous 3-second tad-dah-tad-DAH song snippet millions of Americans have had pounded into their subconscious during a Dell or Gateway TV ad.

 

The main thrust of Intel’s media campaign was to convince people that computers are more fun, exciting, and colorful if they have Intel inside. And after spending a great deal of money, Intel succeeded in doing just that. Millions of people knew about Intel (though many weren’t precisely sure what they knew), bought computers that had Intel inside, and were confident that in doing so they had assured themselves of the very best computing experience they could have. That’s because their computers had Intel inside and that was a good thing because . . . Intel had spent a lot of money to hire dancing Bunny People to say so . . . and because it costs a lot of money to hire dancing Bunny People, lots of people must be buying Intel . . . so Intel has lots of money to spend on dancing Bunny People and that’s . . . a good thing!

 

By 1994, the Intel Inside program had built up a full head of steam. And that was also a good thing, because Intel was about to introduce its Pentium chip, a major product and marketing milestone for the company. Prior to the Pentium, Intel had identified chips via a series of numbers that also corresponded to the chip’s ancestry. The 286 was the second generation of the 8086 line, the 386 the third generation, and so on. However, Intel had been told by a very unsympathetic trademark office that it wouldn’t be granted a trademark on a series of numbers, and that anyone could call their chip a “486” if they felt like it. Intel promptly renamed its 586 the “Pentium” and the Bunny People were instructed to leap about with enthusiasm to celebrate the event.

 

People responded favorably to all of this frantic dancing, and new Pentium-based computers flew off the shelves. The computers all seemed to work very well, undoubtedly because of the Intel inside them, and America was a happy, happy place. And then a disturbing serpent appeared in Intel’s sales paradise as a rumor spread through the internet and the media about a flaw in Intel’s latest microprocessor. It appeared the Pentium inside your computer couldn’t . . . well . . . count.

 

The Rabbits Fail Math

The problem was with the Pentium’s floating-point unit (FPU). An FPU speeds up the operations of software that does extensive calculations involving decimal-point math. Unlike most previous Intel microprocessors (the 486DX line being the exception), all new 58 . . . err . . . Pentiums . . . integrated an FPU directly into the chip itself. Prior to this, if you wanted the benefits of an FPU, you often had to purchase a separate chip, usually called a math coprocessor, and install it inside your PC. Most people didn’t bother; only a handful of software packages made much use of FPU math operations.[3] But the people who did care about math bought coprocessors or chips with built-in FPUs and, being math types, tended to be quite picky about the answers those chips provided.

 

One of these picky people was Thomas Nicely, a math professor at Virginia’s Lynchburg College. In the summer of 1994, while checking the sum of the reciprocals of a large collection of prime numbers on his Pentium-based computer, Nicely noticed the answers differed significantly from the projected correct values. He tracked the error down to the Pentium by running his calculations on an older system that used a previous generation 486 chip. This unit spit out the right answers.

 

Confirmation in hand, Nicely promptly sent off some inquiries to Intel about his results. Intel, wrapped up in the care and feeding of its Bunny People, ignored him. Nicely thereupon posted a general notice on the internet asking for others to confirm his findings. Intel, after realizing Nicely was not going away, talked of hiring the professor as a “consultant,” and Nicely signed a non-disclosure agreement that basically said he wouldn’t discuss further developments on the issue. The cat, however, was out of the bag—to Nicely’s, and Intel’s, great surprise.

 

What was actually happening inside the Pentium was fairly obscure (except to picky math people). The Pentium (and other chips) contains what are called lookup tables, rows of values embedded in the chip that speed up math calculations. When creating these tables, someone had put a zero in one of the columns. What should have looked something like this:

 

123456789

looked something like this instead:

 

123456089

The real-world result of that misplaced zero was that the Pentium could give incorrect answers to certain division problems, with the error showing up as early as the fourth decimal place. What should have read:

 

5505001/294911 = 18.666651973 (486 with FPU)

instead came out as:

 

5505001/294911 = 18.66600093 (Pentium)

 

Making matters worse for Intel was that as the investigation into the Pentium’s problems continued, other, even more obscure problems surfaced with the chip’s math processing.
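For readers who want to see the flaw rather than take my word for it, here’s a minimal sketch, in Python, of the kind of sanity check Pentium owners ran in 1994. It simply reuses the division above and compares the machine’s answer against the known-good value; the function name and tolerance are my own choices, not anything Intel ever published:

# Hypothetical sketch of a quick FDIV sanity check (not from the book).
# On a correct FPU the computed quotient matches the reference value;
# on a flawed Pentium the error appears around the fourth decimal place.

def fdiv_check(numerator=5505001.0, denominator=294911.0,
               expected=18.666651973, tolerance=1e-6):
    """Return True if this machine's FPU divides correctly."""
    computed = numerator / denominator
    error = abs(computed - expected)
    print(f"{numerator:.0f} / {denominator:.0f} = {computed:.9f} "
          f"(expected {expected:.9f}, error {error:.2e})")
    return error < tolerance

if __name__ == "__main__":
    if fdiv_check():
        print("FPU looks fine.")
    else:
        print("Bad quotient returned -- hello, FDIV bug.")

Nicely’s actual tests were far more elaborate, but the principle, compare a computed result against one you already trust, was the same.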

 

As Intel was quick to tell everyone, a bug in a microprocessor’s embedded code or data isn’t a new phenomenon. An errata sheet, a document listing known problems with a chip, accompanies practically every major CPU released by Intel, AMD, VIA, and so forth. Engineers are used to dealing with these problems and devising workarounds. Usually, the chip’s maker issues a software patch to deal with any programming or application issues, the fabrication plant makes an inline change to its manufacturing process, and that’s that. After all, these things happen and Intel had never promised you a rose garden.

 

Um, well, yes it had. Somehow, as the Bunny People had leaped and cavorted on the screens of America’s TVs, they had failed to mention errata sheets. Software patches. Workarounds. They hadn’t mentioned those at all! Millions of computer buyers were confused and amazed.

 

Intel’s actions subsequent to the disclosure of the Pentium’s FPU faux pas epitomized techno-geek stupidity at its worst. As news about the problem spread, Intel announced that “. . . an error is only likely to occur [about] once in nine billion random floating point divides . . . an average spreadsheet user could encounter this subtle flaw once in every 27,000 years of use.”

 

Critics responded by noting that although it might be unlikely you’d get a wrong answer, if your calculation met the right conditions you could be sure of getting a wrong answer. And worse, there was no way of knowing if you’d gotten a wrong answer. In the meantime, IBM halted shipment of Pentium-based computers (which wasn’t that big a deal because they were still selling more of the older 486 units) and told everyone that “Common spreadsheet programs, recalculating for 15 minutes a day, could produce Pentium-related errors as often as once every 24 days.” Wow! That sure sounded like a lot more often than once every 27,000 years!
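It’s worth pausing on how both headline numbers can be “true.” They come from the same trivial formula fed with very different assumptions about how many divides a user performs and how often a divide is assumed to go wrong. The figures in the sketch below are mine, chosen only to show the mechanics; they are not Intel’s or IBM’s actual models:

# Illustrative arithmetic only -- the inputs are assumptions of mine, not
# Intel's or IBM's published models. The point: "once in 27,000 years" and
# "once every 24 days" both fall out of the same formula.

def days_between_errors(divides_per_day, per_divide_error_rate):
    """Expected days between bad results, assuming independent random divides."""
    return 1.0 / (divides_per_day * per_divide_error_rate)

# Light spreadsheet use: ~1,000 divides a day at Intel's 1-in-9-billion figure.
light = days_between_errors(1_000, 1 / 9e9)
print(f"Light user: one error every {light / 365:,.0f} years")

# Heavy recalculation: millions of divides a day at a far gloomier error rate.
heavy = days_between_errors(4_000_000, 1 / 100e6)
print(f"Heavy user: one error every {heavy:,.0f} days")

Neither estimate, of course, tells you whether your particular calculation happens to hit one of the bad divisor patterns, which was the critics’ whole point.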

 

Then it was disclosed that Intel had known that the Pentium flunked math before it shipped and hadn’t bothered to tell the public. OK, it would have been odd to have the Bunny People dancing around with signs on their chests that proclaimed “1 + 1 = 3,” but still! We the people expected our Intels inside to be able to count, for God’s sake.

 

Not content to leave bad enough alone, Intel then compounded what was a rapidly growing PR nightmare by having then CEO Andrew Grove issue an apology over the internet while the company was simultaneously telling everyone it wasn’t planning a mass recall of the Pentium and intended to sell its existing inventory of math-challenged chips until it was exhausted. After which you could presumably buy a computer that counted correctly. At this point the Bunny People were leaping about to the point of cardiac infarct, but not many people were watching them anymore. People were starting to get really angry or were telling mean jokes about the Pentium. Jokes like this:

 

Question: How many Pentium designers does it take to screw in a light bulb?

Answer: 1.99904274017, but that’s close enough for nontechnical people.

 

Question: Complete the following word analogy: Add is to Subtract as Multiply is to

  a) Divide
  b) Round
  c) Random
  d) On a Pentium, all of the above

 

Top Ten New Intel Slogans for the Pentium

9.9999973251 It’s a FLAW, Dammit, not a Bug

8.9999163362 It’s Close Enough, We Say So

7.9999414610 Nearly 300 Correct Opcodes

6.9999831538 You Don’t Need to Know What’s Inside

5.9999835137 Redefining the PC—and Mathematics As Well

4.9999999021 We Fixed It, Really

3.9998245917 Division Considered Harmful

2.9991523619 Why Do You Think They Call It *Floating* Point?

1.9999103517 We’re Looking for a Few Good Flaws

0.9999999998 The Errata Inside

 

Intel didn’t think these jokes were funny at all, but the company wasn’t yet done exploring the depths of marketing stupidity. Shortly after Grove’s unconvincing internet mea culpa, Intel announced that yeah, OK, for all those whiners out there, yeah, the company will swap out your Pentium if you’re prepared to explain why you need a computer chip that can count right. And buddy, the explanation had better be good.

 

By now the Bunny People were achieving leaps of absolutely stratospheric heights, but no one was watching, no one at all. A new joke about the Pentium began making the internet rounds. It wasn’t that funny, but to a great many people, it sounded highly accurate:

 

Question: What’s another name for the “Intel Inside” sticker they put on Pentiums?

 

Answer: Warning label.

Most people have always had a sneaking love for those cheesy Japanese movies in which vast areas of Tokyo are subjected to large-scale urban renewal via the efforts of a huge, irradiated prehistoric creature with a bad attitude.

 

The undisputed king of the Japanese movie monsters was always Godzilla. Anyone who messed with this supercharged lizard that liked to spit radioactive phlegm was guaranteed a bad day. In various encounters he squashed Mothra the Giant Bug, blew away the Smog Monster, pulled the wings off of Rodan the Giant Pterosaur (or whatever), and kicked King Kong’s giant chimp butt. (Yeah, yeah, in the version shown in America the monkey wins, but in the real version shown in Japan, Kong gets a face full of nuclear halitosis and goes down for the count.) He never even paid attention to the plastic tanks and model airplanes the Japanese army threw against him.

 

Novell was once like that. The company started off its life in 1979 as Novell Data Systems, a Utah-based computer manufacturer and developer of proprietary disk operating systems. Its main offering, Sharenet, was a very expensive proprietary mix of hardware and software, and Novell had little success in selling it. By 1983, Novell was on the verge of collapse, but before the lights were turned off for the last time, the company brought in one of its investors, a gentleman by the name of Ray Noorda, to see if anything could be salvaged from the mess. Noorda was not a technologist, but he was a shrewd businessman with an eye for value and an almost pathological focus on keeping costs down.

 

After poking about a bit, Noorda focused on the network operating system (NOS) that was Sharenet’s software heart and decided Novell’s redemption lay in this product. The NOS, soon to be christened “NetWare,” was the pet project of the “Superset”: a small group of contractors led by Drew Major and hired by Novell. It would serve as the foundation for what would become, for over a decade, one of the industry’s most powerful and influential companies.

 

NetWare’s unique value was in how it allowed users to share files and resources on a network of connected PCs. Prior to NetWare’s creation, competing NOSs for the PC market from companies such as 3Com simply partitioned a server (the remote computer on which the NOS ran) into virtual drives. You could store files remotely, but they were inaccessible to others. NetWare was far more sophisticated. The remote hard disk was treated as a common resource available to all users of the network. Individual users were granted rights to subdirectories on the server, and if the user had permission from the network administrator, he could easily transfer files across the network to others. In addition to its file-sharing capabilities, NetWare also made it easy for multiple users to share printers, an important issue in an era when a primitive dot-matrix unit cost about $600.

 

Another strength of NetWare was its independence from any particular vendor’s hardware. NetWare could run over ARCnet, Ethernet, or Token Ring. Most of its competitors were tied to specific LAN types or LAN adapters. To communicate between the server and the desktop PC, the company relied on its proprietary Internetwork Packet Exchange (IPX) protocol. For a brief period in high-tech history, IPX became the de facto industry standard for network protocols. This would change as the internet and TCP/IP gathered momentum in the 1990s.

 

Noorda initially offered his new NOS to some of the major players in the industry, most notably 3Com. Headed up by Robert Metcalfe, the coinventor of Ethernet, 3Com was the early leader in the NOS and networking environment. In a meeting at COMDEX in 1982, one that typifies the friendly, hail-fellow-well-met attitude so prevalent in high tech, Metcalfe threw Noorda out of the 3Com booth.[1] Metcalfe’s reward for his intelligent behavior was to help ensure 3Com’s eventual departure from the NOS market.

 

From this inauspicious beginning, Novell soon bulked up to become the Godzilla of PC networking. In a brilliant marketing move that could have been thought up by Bill Gates, Novell bought several network interface card (NIC) vendors and helped drive hardware prices down. Having ruined margins for several companies but expanded the market for NetWare, Novell dumped most of its hardware business to focus on its OS.

 

By 1987, Novell was the baddest of the bad in the NOS arena. NetWare crushed Corvus, another early market leader in the industry. Plucked Banyan’s VINES. Body slammed 3Com. Kicked sand in the face of Microsoft’s LAN Manager and IBM’s LAN Server. From the early 1980s to the mid-1990s, Novell’s dominance in LANs and NOSs was unchallenged.

 

Playing to its strengths, Novell also established itself as a major player in the “groupware” category with GroupWise. Noorda also drove the development of Novell’s reseller education and certification programs, and made a Certified Novell Engineer (CNE) certificate the most valuable networking designation in the industry from 1985 to 1995. Novell’s CNE program was widely admired and copied in the industry. When Microsoft rolled out Windows NT, it made no secret of the fact that its Microsoft Certified Systems Engineer (MCSE) training regimen was based on Novell’s program. The company was big, powerful, and profitable. By 1994, annual revenue exceeded $2 billion at “Big Red.”

 

And then, just like in a Japanese movie, a nerd with glasses and a questionable haircut developed an incredible radioactive shrinking ray and turned it on the rampaging monster. When the ray had wreaked its incredible effect, the beast had been shrunk to gecko-like proportions.

 

To add insult to injury, the nerd didn’t even bother to reach for a tank or a missile or a jet to apply the coup de grace to our miniaturized monster.

 

He used a cereal box.

Contagion: Steve Jobs Disease Erupts!

A Legend Passes

On October 5, 2011, Steve Jobs, as we all must do, passed from this earth due to complications caused by pancreatic cancer. While he died much younger and sooner than he, his friends and family, and most of the general public would have wished, Jobs had the satisfaction of knowing he left this life at the very peak of his career and impact on high-tech. When he took complete control of Apple in 1997 (Jobs during his first turn at Apple was kept on a leash and was never the company’s CEO), Apple seemed to be past its peak and sales and revenue were sliding. In fact, under Jobs the slide continued, with revenue declining from around $11 billion in 1997 to $6 billion in 2001, despite some widely heralded marketing campaigns and huge media coverage of the return of Apple’s prodigal son. It seemed that Jobs was no magic man despite his outsized reputation.

Jobs’ failure to return Apple to computing’s pinnacle of power was not surprising, and I wrote about it fairly extensively in the second edition of Stupidity. Here’s what I said in 2005:

Why not take a stab at planning to put Apple back on the throne from which it once reigned over microcomputing 25 years ago? After all, everyone is bored with Windows. Linux, the only possible other competitor, has all the computing charm of a diesel truck and requires a degree in computer science to install. And everything the Apple Fairy Godmother said is true, and she left out some hard revenue facts besides. In 2001, Apple’s annual revenue hovered around $6 billion. In 2005, Apple sold more than 32 million iPods, and more than one billion songs were downloaded from its iTunes service by the winter of 2006. Yearly revenues from 2005 were almost $14 billion with more than a billion of that being profit…

… Apple’s growth is coming from consumer electronics, not computers, and no one on this planet has ever figured out how to take a company from 4% market share to industry dominance in the face of an entrenched competitor determined to defend its turf. Apple came close to industry supremacy in the late 1970s and early 1980s, but this was before IBM woke up. And despite Microsoft’s creeping development of the senescence that inevitably afflicts all megasized corporations, unless a big meteor hits Redmond and Bellevue, Apple cannot hope Steve Ballmer and Bill Gates are going to stand idly by while Apple lops off significant amounts of market share and money from Microsoft.

This was exactly right. Apple has never come close to regaining its former prominence in manufacturing and selling laptop, desktop, and server systems, nor does it any longer want to. Beginning with the iPod, Jobs “pivoted” (this is the new hot “wonder term” that’s replaced the more prosaic “do something else”) into consumer electronics, the iPod being succeeded by the iPhone, Jobs’ piece de resistance, and as his encore, the iPad, computing’s most successful tablet. The Macintosh is now an insignificant part of Apple’s revenues, and as I pointed out earlier, the company is in the process of preparing to leave that aspect of its business behind via a transition strategy.

The result of Jobs’ strategy took Apple to 2011 revenues of $108 billion. No turnaround of such a magnitude had ever been seen before in business anywhere and we may never see such an achievement again. The market made up its mind. Jobs was indeed magic, and high-tech CEOs everywhere decided they wanted to transfer some of his marketing and sales fairy dust to themselves.

There was of course no practical way this could be accomplished; while the Catholic Church and Buddhist denominations encourage the collection of physical remainders of saints under the label of “holy relics,” Jobs and his family had no interest in this sort of thing.

The Contagion Spreads

Another option was available, and that was “sympathetic magic.” Sympathetic magic works on the assumption that if you behave like, dress like, and even resemble a magical person, some measure of their spirit will transfer to you. Unfortunately, the process is fraught with challenges. We’ve already seen what happened when Jeff Bezos built a special workshop just like Steve Jobs’ original “Macintosh Pirate HQ” and attempted to recreate the magic of the original Mac rollout. Ugh. The failure came about because while Bezos had the lab, he completely lacked Jobs’ understanding of product positioning.

Then there was Travis Kalanick, one-time CEO of hot, hot, hot unicorn Uber. During his tenure as CEO, Uber became the number one ride-sharing company, and its eventual IPO left the company with a market capitalization close to $70 billion. However, in an attempt to reproduce the Jolly Roger atmosphere Jobs generated at Apple during the Mac’s development, Kalanick apparently amped things up to the point where he had to issue a memo asking company employees to stop having sex with their bosses, throwing beer kegs out of windows, and puking on the office furniture. Kalanick also over-channeled Jobs when he was recorded screaming at Uber drivers unhappy with the firm’s payment structure. This seemed a bit unfair because Jobs was also famous for publicly abusing people (smoothie ladies, Daniel Kottke, the MobileMe guy, random employees) and no one became that worked up about it. However, Jobs had the good luck to die before he could be hoist by the petard he’d created: a smartphone with a high-definition camera and video recorder in everyone’s hands. Kalanick didn’t and was escorted out of the CEO suite as his antics came to light and went viral across the globe.

The brief career of Seth Bannon, one-time CEO of Amicus, also deserves a quick examination. This isn’t the best-known case of Steve Jobs Disease, but it is the funniest. Amicus was a hot, hot, hot little company that had published an app that managed fund raising for left wing political causes and people. It was generating revenue and Bannon was on the way to becoming a New York City version of Kalanick. He was profiled in Forbes, Fortune, and all the best blogs and pubs.

Bannon had the look! Hoodie, textbook millennial laser-graded-one-inch-stubble, a “Make Mistakes” T-shirt.

He went to Burning Man!

He was accepted by Y Combinator!

He even was allowed to occupy the same Hacker House Mark Zuckerberg stayed in! You know, the one in The Social Network movie.

He actually set up his desk in the house in the same place as the Zuck!

Then, one day in 2014, he walked into Amicus HQ and laid off the whole company without warning a la Jobs walking into Pixar and summarily firing a group of employees. (When it was pointed out to Jobs that he’d offered no severance pay and was asked to give the unfortunate workers two weeks’ notice, he agreed but made the layoff date retroactive to two weeks earlier.)

Why had “Zuckerbannon” fired everyone? No money in the corporate piggy bank! You see, our young entrepreneur had failed to pay New York State and City payroll taxes for years. And no workers’ comp for months. When Amicus forked it all over to the state (including fines), the piggy bank was busted.

Why hadn’t Bannon paid his company’s taxes?

Let Seth explain it all to you:

I hired our first accountant this year. I knew our books hadn’t been well kept and was pretty sure we owed some state taxes. He determined we owed about $5,000 in state and local taxes, which we quickly paid. But shortly after, he noticed there hadn’t been any payroll tax payments coming out of our bank account.

Impossible, I thought, Bank of America’s payroll system automatically withholds payroll taxes every month, it’s all automated. But it wasn’t. There was a single “submit tax” button in a separate part of the Bank of America website that had to be clicked to actually pay the taxes.[1]

Bannon also liked letting people know that he’d dropped out of Harvard just like Zuckerberg and, of course, Bill Gates. (Jobs never went to college either.)

Except, he’d actually dropped out of Harvard’s continuing education school. You know, the one where you study things like macramé, driver’s ed, and how to obtain your real estate license?

Which is not quite the same thing as dropping out of Harvard.

When I was reading about Seth, I started cracking up, and was then inspired to write my second novel—Selling Steve Jobs’ Liver: A Story of Startups, Innovation, and Connectivity in the Clouds. (Yes, Bannon makes an appearance, lightly disguised, in the book and has the opportunity to establish a deep personal connection with Steve Jobs. You’ll have to read the book to find out more.)

When it was all over, Bannon was no longer a CEO and Amicus was out of business.

The most virulent case of Steve Jobs Disease was that of Elizabeth Holmes, one-time CEO of one-time bio-tech company Theranos. Holmes founded Theranos in 2003 when she dropped out of Stanford at 19 to develop a blood-draw technology that was going to disrupt phlebotomy. All of us have had to have our blood drawn, and many people find it an uncomfortable and disquieting process as vial after vial of red fluid is withdrawn from their bodies. Some people find it so disturbing that they require sedation before the procedure.

Magic Carpet Ride

The book that best describes the late 20th century’s dot.com bubble was written in the 19th century by Charles MacKay, thought by many to be the Nostradamus of marketing. MacKay’s opus is entitled Extraordinary Popular Delusions and the Madness of Crowds[1] and it chronicles a long series of manias and speculative booms that have afflicted Western society since the Crusades. Reading through this classic treatise, you can find descriptions of events and circumstances that both presage and prophesize the dot.com boom. For instance, consider alchemy, the centuries-old belief that different substances could be transmuted to gold. Many think that MacKay was trying to warn future generations of the folly of buying stock from TheGlobe.com. When the company went public in November 1998, the stock opened at an incredible $87 and reached a high of $97. The company’s assets? A not-very-quick nor comprehensive search engine, web pages filled with warmed-over links, and a revenue model that lost $4.00 for every $1.00 it earned. Fool’s gold, and TheGlobe.com turned back into lead with its shutdown in August 2001.

 

In a similar vein, others believe Extraordinary Popular Delusions and the Madness of Crowds’ description of the craze for exotic tulips that spread throughout 17th-century Holland was a warning of another sort. The chapter on “Tulipomania” describes how in the space of a few months the cost of a rare bulb skyrocketed from the merely expensive to the incredible. For a brief period, the sale of a single pretty flower was enough to enable a person to retire wealthy for life. Scholars believe this tale was intended to warn us about Amazon.com stock, which at the height of the bubble reached a high of $113 per share, while at the same time the company was bleeding copious amounts of red ink and company CEO Jeff Bezos was telling everyone he had no idea when Amazon would ever turn a profit (it would take 17 years). By August 2001, the stock was trading at around $9 per share, less than the single bulb price of many collectible tulips.

 

MacKay also cast a jaundiced gaze on witchcraft and the persistent human belief in spirits and the supernatural. When he wrote that chapter, he may have been inspired by a future vision of Kozmo.com, the best known of several dot.com delivery services that sprang up during the boom. Kozmo’s original business model consisted of sending people on bicycles to deliver videos, condoms, gum, and Twinkies to lazy New York City yuppies and hungry potheads. This particular venture sucked up over $250 million in investment capital and almost made it to an IPO. Because it cost the company an average of $10 in labor and overhead to fulfill the $12 average Kozmo order, a sum that didn’t account for the company’s cost of goods and marketing expenditures, it truly would have required supernatural intervention for Kozmo to have ever turned a profit. (Kozmo’s business model has recently risen from the dot.com vault in the form of DoorDash. The years have passed, but the financial challenges have not.)

 

One thing that Extraordinary Popular Delusions and the Madness of Crowds doesn’t do is explain the root causes of these speculative bubbles. No one ever has. They usually share factors such as good economic times, easy access to credit, and sometimes the development of new technology, but the combination of all these circumstances usually does not generate a bubble. Many reasons for the dot.com boom have been offered, but all are somewhat unsatisfactory. The most common explanations postulate the following:

 

Major technological breakthroughs often spark speculative fever. Perhaps, but the introduction of the airplane, TV, CB radios and, most recently, the personal computer did not. And it could be argued that the internet, although a significant advance in communications, was hardly a technology breakthrough even on the scale of refrigeration, which transformed the American South from a backwater into the country’s most vibrant economic region. On the other hand, railroads, the high-tech darlings of the 19th century, triggered a speculative mania that helped contribute to a depression, as Charles Kindleberger points out in Manias, Panics, and Crashes: A History of Financial Crises (the second best book ever written about the dot.com boom). But why trains and not TVs?

 

The number of people investing in the stock market had increased tremendously over the past 30 years, making the market more volatile. This contradicts mathematics and experience. The larger markets become, the more overall stability they tend to achieve. In the post–Civil War era, “robber barons” such as Jay Gould and Jim Fisk were able to attempt to corner the U.S. gold market (and were only prevented from doing so by President Grant’s direct intervention). The Hunt brothers would attempt to reprise this feat with silver in the 1980s. But by the early 1990s, public participation in the stock market put such an enterprise beyond the power of any individual or even any group of speculators.

 

The stock market had undergone continuous expansion since the Reagan recovery of 1982. Unfortunately for this theory, few stock market expansions spark speculative bubbles.

 

Everyone who had lived through the stock market crash of 1929 was now dead and the U.S. school systems do a rotten job of teaching children about important events in American history and why they occur. This theory makes a lot of sense.

 

The internet, a technology child of the 1960s, functioned not only as a communications medium but also as a virtual magic mushroom that clouded the brains of people worldwide. Though somewhat metaphysical, this theory also makes a lot of sense.

 

Wall Street is full of idiots. This theory is both popular and has a lot going for it.

 

The people who bought stock from the idiots on Wall Street were also idiots. What?! Are you implying that the American people, who, when confronted with IPOs that reeked of red ink and gobbled on about idiotic schemes to sell 30-pound bags of pet food directly to consumers at a guaranteed loss (Pets.com), failed to fall laughing hysterically to the floor before kicking these IPO turkeys out the door, were somehow responsible for their own losses? This sort of speculation isn’t even worthy of a reply!

 

Whatever the precise reasons for a particular speculative bubble, its life cycle follows a set course, though it’s difficult to predict the exact timing of the sequence of events. First, there’s an initial boom period during which insiders get rich and the public “discovers” what’s going on. The bubble then grows larger as early skeptics enter the maelstrom and make money while sensible people stand on the sidelines scoffing at their foolishness. The speculators then turn around and make fun of the naysayers, who, embarrassed at their failure to “get it,” rush in to scoop up their fair share of the plunder before the opportunity vanishes. At this point, the bubble reaches its maximum expansion and seems to pinch off from normal reality to become a universe of its own. The normal rules of profit and loss no longer apply in this alternate realm; all that matters is supply and demand.

 

At this point, the smart money tries to bail out before the inevitable crash. A few succeed in escaping with their riches, and their exit begins to deflate the bubble. The rest of the occupants become uneasy and begin to edge toward the exit as well. For a while, the bubble seems to reach a state of equilibrium as the last group of idiots on the outside rush in to search for now nonexistent profits within the doomed alternate universe. Then contraction occurs, suddenly, and the bubble bursts. Seemingly overnight, profit, wealth, and happiness are replaced by loss, poverty, and misery.

 

Reflecting the morality of another age, Extraordinary Popular Delusions and the Madness of Crowds doesn’t waste much pity on bubble participants. In his examination of the South Sea mania, a weird 18th-century scheme that involved buying shares in companies that were going to do business of some sort on the coasts of South America, Mackay writes:

 

“Nobody seemed to imagine that the nation itself was as culpable as the South-Sea company. Nobody blamed the credulity and the avarice of the people—the degrading lust of gain, which had swallowed up every nobler quality in the national character, or the infatuation which had made the multitude run their heads with such frantic eagerness into the net held out for them by scheming projectors. These things were never mentioned. The people were a simple, honest, hard-working people, ruined by a gang of robbers, who were to be hanged, drawn, and quartered without mercy.”[2]

 

Although such observations are not politically correct in an era of universal victimhood, they’re fair. As the dot.com bubble grew and swallowed increasing volumes of innocent cash, the fact that many of the original business assumptions associated with internet and online commerce were proving invalid was no secret. At least one prediction about the internet and e-business was proving true: When things happened, they happened quickly. The web’s hyperlinked architecture made measuring response and results fairly straightforward and ensured you could learn the depressing news quickly.

In the “Avoiding Stupidity” chapter of this book’s second edition, there is a section entitled “Never Trust (Or Hire) Anyone Over Thirty.” It directly addresses the fact that age discrimination is rife in high-tech and that in most cases your career is over at 40. I’ve retained that section with almost every word untouched because what was true then remains true today.

 

The current laws against age discrimination are almost never used against companies that blatantly age discriminate and are thus rather toothless. One of the most recent examples of mass age discrimination directed against high-tech’s one-foot-in-the-grave cohort was carried out by IBM, the company famous for decades for its no-layoff policy. All that changed in the 1990s, and today IBM carries out “reductions in force” with the best of them. In 2018, ProPublica released the results of an investigation[1] that documented how from 2012 to 2018 IBM shed more than 20,000 U.S. employees who were over 40, roughly 60% of its total U.S. job cuts during that period. The tactics used to bring down Generation Geritol included:

 

  • Firing employees because their skills were “out of date,” then hiring them back as contractors at lower pay, sans benefits and sans legal protection from age discrimination. This is a favorite technique in high-tech to ensure its employee base stays young and pimply while also guaranteeing that the oldsters keep the place actually running. It should be noted that there is no proof that, say, a competent C++ programmer can’t be quickly retrained to be a competent Python coder; the underlying basics of programming are common to all languages. However, the old coders will want to be paid more.
  • Terminating existing positions, then inviting the soon-to-be-unemployed workers to apply for internal jobs, which amazingly enough were never awarded to internal employees but instead awarded to new hires.
  • Blackmailing fired workers by threatening to withhold benefits and contest unemployment claims if the employee didn’t agree to waive their right to sue or join a class action lawsuit.

 

ProPublica is a left-leaning, non-profit organization, and I’ve read many works by them that display their inherent bias, but I’ve seen up close how high-tech acts towards over 40s and believe this report gets it right. Their investigation sparked a major class-action lawsuit against IBM that is working its way through the courts, and I’ll bet real American dollars that Big Blue settles before going to trial. (Reinforcing this belief is the fact that in August 2019, IBM’s former head of HR, Alan Wild, testified in court that IBM had fired over 100,000 employees in order to increase its appeal to millennials.)[2] The jobs will remain gone, however.

 

Thus, I have to admit that when I read in a 2013 PR release that Dan Lyons had been hired by SaaS firm HubSpot, I was gobsmacked. At the time, HubSpot had a reputation as being a hot, Boston-based SaaS startup, staffed primarily by under 30s. Lyons, a former technology editor for Newsweek, which started at the same time Moses was publishing the Ten Commandments, staff  journalist at PC Week (epic of Gilgamesh), tech writer at Forbes (before the Big Bang), and the author of what was once one of high-tech’s best known parody websites, “Fake Steve Jobs,” was old. How old? He was over 50. Do you know how old 50 plus is in high-tech?

 

Those dinosaur fossils at the natural history museums? Infants compared to Dan. The therapsids? Children. Diploceraspis? (They look exactly like they sound). Teenagers. The only prehistoric life contemporaneous with Dan is maybe the archaea, Latin for "Live Goo in a Jar." When he showed up at HubSpot for his first day of work, he didn't need a spot at an open desk. A warm petri dish would have done just fine.

 

I gave HubSpot a great deal of credit for hiring Lyons. I mean, if a triceratops showed up for a job interview at your shiny startup, would you hire it? I don't think so. Now, maybe you should hire that dinosaur. This giant reptile might be the hottest Node.js coder out there and can even prove it! But then you look down at its resume and spot that once upon a time, it wrote a program in Visual Basic. A little red flag goes up. After asking the requisite number of questions to avoid the appearance of violating U.S. age discrimination laws, you smile brightly at the ceratopsid in front of you, help direct its 12-ton body out of your office, and promptly consign the resume to the virtual circular file.

 

But that didn’t happen at HubSpot. They hired Lyons at an upper-management salary, provided him with fully paid health benefits, granted him a pre-IPO stock-option package, and, per company policy, gave him access to the company’s candy wall and free beer. HubSpot announced that Lyons would be working in the marketing department, where I assumed he’d put his writing skills and technology acumen to good use.

 

The next time I heard about Dan, he’d left HubSpot and a book detailing his experiences and adventures at the company was announced in 2015 for a 2016 release. I admit, I was excited by this. Given HubSpot’s courage in defying industry norms, I thought the book would be a heart-warming story of two different worlds that meet and, despite the inevitable misunderstandings and obstacles that occur in such circumstances, ultimately find understanding and affection in each other’s arms. A sort of high-tech When Harry Met Sally romance.

 

When Startup Met Velociraptor

I misjudged.

 

Before the book’s release, word on the street was that Dan Lyons’ book was going to be a tell-all take down of HubSpot. Things became even stranger when reports surfaced that the FBI had opened an inquiry into events surrounding the book’s release. Not much information was provided on why the nation’s largest criminal investigation force was involved in book publishing, but it certainly piqued everybody’s interest. Lyons’ book immediately moved to must-read status for many in high-tech.

 

His tome was released in April 2016 and immediately jumped to top positions in several Amazon categories, as well as enjoying excellent sales in what was left of the bookstores. In due time, I obtained my copy, but when I dived in, I was immediately brought up short by several discordant elements.

 

The first was the title—Disrupted: My Misadventure in the Start-Up Bubble. It wasn’t true. Lyons couldn’t have lived through a start-up bubble because there wasn’t one. Not when he started his job at HubSpot in 2013, not in 2014 while he was working there, and not when he wrote his book in 2015. Nor had there been a bubble in 2016, 2017, 2018, or 2019.

 

This doesn’t mean that there won’t be a start-up bubble, but in the long run, we are all dead. One day there will be a high-tech bubble, but predicting this is akin to predicting that soon there will be a major earthquake near the San Andreas Fault. It will happen, but saying so won’t earn you gold-star membership in the Gypsy Crystal Ball Club. Don’t misunderstand me. I love a good high-tech bubble as much as the next pundit and lovingly described the dot.com and ASP meltdowns in the first edition of Stupidity, but those bubbles actually happened.

Let’s deal with some movie review basics right now. First, the acting. Is it good? Yes, it is. The film’s lead, Michael Fassbender, an Irishman of German background, is currently one of the cinema’s finest thespians. I know this because Fassbender from time to time shows up in X-Men movies playing a young version of Magneto, King of the Evil Mutants. (Ian McKellen, when he’s not palling around with Hobbits as Gandalf, normally plays Magneto, the Post Social Security Years.) When I first experienced the actor emoting during an X-Men film, he was staring at a green screen, grimacing powerfully while pretending to lift a submarine out of the ocean via the power of mutant magnetism, with a tin pot on his head. It is an enduring tribute to Fassbender’s acting chops that I did not immediately fall out of my seat laughing out loud. And though he looks nothing like Jobs, and at 38 is too old for the role, the actor nonetheless allows Jobs to crawl into his body much like the monster in all those The Thing movies and possess him. It’s quite a remarkable performance.

 

Seth Rogen plays Steve Wozniak. As in the case of Fassbender, he doesn't much resemble the Apple co-founder, but they do share body types and that helps. Also, Rogen was not allowed to direct himself or be the film’s main star, and this makes him tolerable to watch (to see what happens when he’s allowed to direct and star, watch This Is the End or The Green Hornet). The problem with his performance is not how he delivers his lines, but the context in which he says them. More on this later.

 

Jeff Daniels plays John Sculley and it’s difficult to know what to say about his performance. He doesn't attempt to resemble the actual Sculley, but that’s OK because no one knows what the former Apple CEO looks or sounds like anymore. Still, Daniels is an acting professional who can deliver lines in just about any movie and sound credible. The problem is that while the movie calls his character "John Sculley," he’s tasked with playing someone who never existed on this planet, and that leaves him somewhat adrift and looking highly generic. You could drop David Duchovny into the film, have him utter Daniels' dialog, and it would make no difference. You could even argue Duchovny would be an improvement because he once played opposite a character with a name that is at least a homophone of “Sculley.”

 

The final lead is Kate Winslet. I like her despite Titanic because she starred in Sense and Sensibility, a movie I think is the best adaptation of any of the books written by my favorite author, Jane Austen. She plays Joanna Hoffman, an Apple marketer and Jobs loyalist who followed him from Apple to NeXT. Winslet has to generate an Eastern European accent, an attempt to reflect Hoffman’s childhood spent in the USSR and communist Poland before immigrating to America, and it wobbles a bit. Despite this, her performance is fun and brisk and most people who know about Hoffman’s working relationship with Jobs have judged it to be fairly accurate in tone.

 

Now, how about the directing? Steve Jobs is directed by Danny Boyle, a successful director with hits and interesting films such as Trainspotting, 28 Days Later, and Slumdog Millionaire (for which he won the Oscar for Best Director) under his belt. The cast was in excellent hands.

 

The screenplay for the film was written by Aaron Sorkin, the man who wrote Moneyball, won the Oscar for best screenplay for The Social Network, probably the most engrossing film made about high technology to date, and who also brought us the hit TV show The West Wing. Sorkin knows how to turn a phrase and spin up a verb. (The screenplay says it’s based on Walter Isaacson’s biography, but I was unable to detect a single fact from the book in the movie.)

 

In addition to acting, directing, and writing, the film's marketing was also in place, supporting Steve Jobs with $30 million in media advertising and celebrity appearance programs. Early word of mouth and reviews were good, and Jobs remains a figure of fascination for millions worldwide. The film cost $30 million to make, a pittance by today’s Hollywood standards, and was projected to be solidly profitable.

 

So, how’d the flick do?

Lead Balloon

It bombed. Badly. Steve Jobs performed at the box office about as well as the original Macintosh Portable did when it was released to the market in 1989 with a lead-acid-battery-just-like-the-one-in-your-car-and-about-as-heavy.[1] The film was pulled from general distribution before the end of its scheduled original run and grossed about $39M between international and domestic box office and DVD sales. It needed to make $120 million to break even. Steve Jobs did garner two Oscar nominations, but goose-egged on awards night.

 

What went wrong?

 

One major problem is that Steve Jobs is a well-acted, scripted, and directed piece of bullcrap. The movie is to the facts surrounding Jobs’ life and relationship to Apple what the robot in the original Lost in Space is to modern day robotics and AI.

 

I’m not naive. I fully understand that in Hollywood, truth is always going to be the handmaid’s tale to entertainment. Movies take liberties with the facts. But it’s one thing to expect truth to prostitute itself to the box office, quite another to take it up to your hotel room, roughly sodomize it, slap it in the face, and exit without paying the bill and leaving a tip.

 

And I’m not going to accept the excuse that Sorkin told everyone his movie is not a documentary or even a biopic. There are rules when you use the names of actual people. If you decide to make a film focusing on the life of Henry Ford, famous early 20th-century entrepreneur, engineer, and raging anti-Semite, don’t add a scene in which he founds B’nai B’rith. That’s not kosher.

 

OK, let’s review the movie. Steve Jobs is constructed in the traditional Greek model with three acts, each taking place before Jobs is scheduled to step in front of a wondering world to unveil his latest wonder device. Kate Winslet’s Joanna Hoffman plays the part of a one-woman chorus, admonishing, hectoring, and advising Jobs as he prepares to take the stage.

 

The first act takes place in the minutes running up to the unveiling of the original Macintosh and it’s the best part of the movie. The camera work and dialog accurately convey the nervous energy and tension that build before a significant high-tech launch (I speak from personal experience.) There’s an ongoing bit of dramatic shtick revolving around Andy Hertzfeld[2] and whether the Macintosh is going to be able to perform its famous “It sure is great to get out of that bag” patter at its unveiling. (This didn't happen, but it is amusing.)

 

There’s also a scene with Jobs grousing at great length about Daniel Kottke[3] and his perfidy in spoiling Jobs’ chance to be Time’s “Man of the Year” (the award went to a paper mache dummy). This injustice has occurred because Kottke has confirmed to a reporter that Jobs indeed has a daughter. (This didn't happen at the launch, though it’s true Kottke did tell the truth about Lisa.) Just to help out all you budding young Steve Jobs wannabes out there, siring or birthing a child is inherently a public act. You have zero, zilch, nada, no rights to privacy in this regard. Still, at this point, the film is moving along nicely.

 

Unfortunately, the movie magic is short lived. Midway through act one, Jobs’ former girlfriend, Chrisann Brennan, and daughter Lisa (Jobs) show up at the launch. (This didn't happen.) Lisa, of course, is so sweet and adorable she makes your teeth ache from sugar shock. While Brennan and Jobs begin to squabble about paternity and child support, Jobs sneaks in an opportunity to introduce little Lisa to the Mac and the mouse and I bet you just can’t guess what happens next. Well, I’ll tell you. She draws a picture and is enchanted by the internal creativity she’s just released through the power of the computer! Who would have guessed it? (This didn't happen. At the time of the Macintosh launch, Lisa was five and Jobs didn't step up to the plate as her father until she was nine, just around the time DNA testing was coming into widespread use.)

 

That Old Lifetime Magic

At this point, I realized I’d stumbled into Steve Jobs: The Lifetime Movie and was in trouble. To my relief, Lisa soon exited the scene stage right, but I knew she’d be back. I was right.

Print on Fire: The Kindle

For a missile aimed at blowing apart a centuries-old nexus of culture, commerce, politics, and artistic creation, the Kindle ereader was somewhat underwhelming. The device was neither a piece of machine art nor an example of cutting edge design. But as we’ve already seen with Microsoft Windows, if there’s enough pent-up demand for a technology, even the ordinary and adequate can spark a marketing stampede.

 

This was the case with the Kindle. As I wrote in the second edition of In Search of Stupidity, the only factor preventing the widespread adoption of ebooks was that the quality of experience was poor because all of the devices sucked. New pages took several seconds to display. The cell phones of 2005 were inadequate for prolonged reading. The tablets and laptops of the period were too heavy and clunky.

 

The Kindle didn’t suck, though its appearance was a bit awkward. It was light. Its low-power E Ink display matched the experience of reading paper. Pages displayed at an acceptable speed. It could hold 200-plus books and publications in its 250 MB of internal storage. Integrated 3G wireless let you easily download books from Amazon’s online store. The unit was comfortable to hold.

 

The Kindle wasn’t inexpensive at $399, but when you considered the convenience and the fact that ebooks could be expected to be cheaper than their print equivalents, the price was acceptable. And soon after the Kindle’s release, Kindle reader apps appeared for iOS, Android, PCs, and Amazon’s own Fire tablets. Any device could now be a Kindle. And Amazon had spent much time and effort ensuring that close to 100,000 titles were available for sale on day one of the device’s release.

 

The quality of experience threshold had been met.

 

The Kindle was not a technology surprise. While I take some pride in correctly diagnosing why the 2001 ereader initiative failed and predicting the release of a successful device, I certainly wasn’t the only person to believe ereaders and ebooks were inevitable. To most of us pundits, the only question was when a successful unit would be released and what it would look like. The subsequent disruption to the book business would simply be an extension of the digital wave already overtaking newspapers, magazines, paper mail, fax, and other print technologies.

 

What we should have focused on more closely was who would release the breakthrough gizmo. In 2006, if you had asked technologists to guess who would create the first successful ereader, companies such as Microsoft, Apple, maybe Adobe because of its PDF technology, and perhaps HP because of its long association with digital printing, would have made the list of logical candidates. Some people might have pointed to this or that hot new startup. Few people would have guessed Amazon, thought of at the time as a mere merchant, not a member of the high-tech elite.

 

The first Kindle sold out in five hours and began to disrupt the book industry, to the publishers’ horror (though since Amazon had shown them prototypes of the device months before its release, they shouldn’t have been that surprised). Yet somehow the industry had persuaded itself that print was unassailable. Shortly before the launch, Barnes & Noble CEO Stephen Riggio was quoted saying, “The physical value of the book is something that cannot be replicated in digital form.” This was the same company that in its 2001 annual report proclaimed that it would “seize our future on all fronts: retail, online and digital.”

 

Making things worse for the publishers was that Amazon had failed to inform them that it intended to price all of its ebooks at $9.99, price-rigging the ebook market in a single shot. They found out about it halfway through Jeff Bezos’ launch speech for the Kindle and its supporting platform, Kindle Direct Publishing (KDP), on November 19, 2007, at the Waldorf Hotel in New York City. Reportedly, they were stunned by the announcement, but it’s fair to ask why none of the major publishers thought to ask about Amazon’s pricing model while Amazon was assembling all that new content, and insisted on an answer before the event.

 

Demonstrating that 2007’s initial demand was no fluke, by 2009 ereader purchases, almost all of them Kindles, had reached 3 million. In 2010, 8 million. By 2013, ebooks accounted for 23% of trade publishing revenues, with 90% of those sales flowing through the Kindle platform.

 

Agency of Hope

The trade publishers hated the $9.99 model. They were worried the one-price-fits-all straitjacket would devalue books, and they were right (though this problem would plague self-publishers far more than the trade houses). They thought ebooks would cannibalize sales of print books, and they did, though it was stupid to think that raising the price of ebooks would have much impact on the long-term future of print. They were afraid the Kindle was going to destroy the brick and mortar stores, and by 2010 superstore Borders was on the verge of collapse and would die the next year.

 

The publishers now faced apocalypse, while doomsday loomed ahead for the brick and mortar resellers. The pricing of the fastest growing segment of the book market was controlled by Amazon, and in the world of vendors and channels, the ability to dictate a market’s pricing model is to own the market. Soon, Jeff Bezos ruled the trade publishers. It was time to band together and sail forth to, if not kill, at least teach the Channel Shark who was making their life miserable and threatening to eviscerate their businesses a lesson he wouldn’t forget. To tame the beast, a shiny new harpoon was brought on board their virtual Orca[1]. This fearsome weapon was named “Agency Pricing” and the publishers knew that once launched, a fierce channel war would erupt. These struggles are the business equivalent of domestic disputes, where the parties involved know each other, are co-dependent, and are intimately aware of the other side’s weak and pain points. They are always nasty and bloody, but usually not fatal.

 

Not the Orca but the Indianapolis

The reason the publishers were determined to move to agency pricing was that it would enable them to regain control of their ebook destiny. Previous to the Kindle’s introduction, the publishers had sold their books to the resellers on a wholesale basis. In this model, the publisher assigns a book a “cover” or list price, and sells it to a reseller at a discount off list. The bookseller then marks up the title to whatever the market will bear. The table below provides a basic example of wholesale in action.

 

Table 8. Wholesale book pricing model

  • List or Cover Price: $20
  • Wholesale Price to Reseller: $10 (a 50% discount off list)
  • Store Price to Buyer: $15 (a 25% discount off list for the buyer; the reseller earns a 50% gross markup on its $10 cost)

Wholesale pricing is the most popular channel financial model and it is used in many industries. It provides a great deal of flexibility to the reseller, who is free to charge above list price if demand makes it possible.

 

By contrast, in agency pricing, the vendor sets an item or service’s price and requires the reseller to adhere to it. In return, the reseller receives a set percentage of the retail sale.

 

Table 9. Agency book pricing model

  • List or Cover Price: $20
  • Agency Fee to Reseller: $6.00 (a 30% commission on the list price)
  • Store Price to Buyer: $20

 

Despite some misconceptions, agency pricing is legal and widely used in different industries. Apple runs its App Store on agency pricing, and every purchase you make sends a 30% commission straight to Apple’s bottom line.
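To make the two models concrete, here’s a minimal sketch that walks through the $20 example from Tables 8 and 9. The figures simply restate the tables; the function names and the assumption that the wholesale reseller discounts the book to $15 are mine, not terms from any actual publisher contract:

# Minimal sketch of the wholesale and agency models from Tables 8 and 9.
# Figures mirror the $20 example above; nothing here comes from an actual
# publisher contract.

def wholesale(list_price=20.0, wholesale_discount=0.50, store_price=15.0):
    """Publisher sells at a discount off list; the reseller sets the shelf price."""
    publisher_take = list_price * (1 - wholesale_discount)   # $10
    reseller_margin = store_price - publisher_take           # $5 on a $15 sale
    return publisher_take, store_price, reseller_margin

def agency(list_price=20.0, commission=0.30):
    """Publisher sets the shelf price; the reseller keeps a fixed commission."""
    reseller_fee = list_price * commission                   # $6
    publisher_take = list_price - reseller_fee               # $14
    return publisher_take, list_price, reseller_fee

for name, (publisher, shelf, reseller) in [("Wholesale", wholesale()),
                                           ("Agency", agency())]:
    print(f"{name:9s}: publisher ${publisher:.2f}, "
          f"shelf price ${shelf:.2f}, reseller ${reseller:.2f}")

Note that at the same $20 list price the publisher’s per-copy take is actually higher under agency; the fight described in this chapter was about who controls the shelf price, not about squeezing another dollar out of each sale.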

 

As the publishers prepared to leave port, joining them onboard the Orca was a powerful new friend, Steve Jobs. In 2010, Jobs was ready to release his last opus, the iPad. He was interested in participating in the ebook boom and thought his new tablet was just the device to compete with the Kindle. He approached the major publishers and offered them the opportunity to sell their books through Apple’s new iBookstore via the agency model, with a 30% commission accruing to Apple on every sale.

 

The Big Six houses, Hachette, HarperCollins, Macmillan, Penguin, Random House, and Simon & Schuster, met among themselves and with Apple, and with the exception of Random House, thought this was a good idea. At one stroke they’d break out of $9.99 and help create a major competitor for Amazon. The bargain was struck.

 

The new Apple deal included a most favored nation (MFN) proviso. MFN meant that if another reseller undercut Apple’s ebook price, the company was free to match it. This encouraged the publishers to maintain control over their pricing and preclude Amazon from using the loss leader strategy it had used in other markets.

 

The Guys Over in the Corner No One Pays Attention To

While preparations for the coming channel war were underway, there was one group of people in publishing’s firmament who remained ignored and forgotten, the self-published authors. But if everyone was ignoring them, they weren’t ignoring the Kindle. The introduction of the first ebook platform to achieve widespread acceptance meant that the old barriers that had stood between them and mass numbers of readers could be bypassed. Amazon’s new Kindle platform allowed anyone to upload their book and sell it. There were no agents to contact and beg to submit your book to an editor, no slush piles, no bored and uninterested editors, and no endless stream of rejection letters. Heaven seemed to have descended to earth.

 

Unfortunately for self-publishers' dreams, they were offered no immunity from Jeff Bezos' belief that their margin was his opportunity. When the Kindle store opened for business, Amazon imposed a huge 70% MDF stocking fee on every upload. In other words, if you sold a book on Amazon for $9.99, you received $2.99 (less download fees) and Amazon received $6.99 (plus download fees).
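As a minimal sketch, assuming the 70/30 split and the $9.99 price point cited above (and ignoring the per-download fees, which vary with file size), the arithmetic looks like this:

```python
# The split described above, using the text's own figures:
# a 70% "stocking fee" to Amazon, the remaining ~30% to the author.
# Per-download delivery fees are left out because they vary by file size.

def kindle_split(sale_price, amazon_share=0.70):
    """Divide a sale between Amazon and the author under the fee
    structure described above."""
    amazon_cut = sale_price * amazon_share
    author_cut = sale_price - amazon_cut
    return round(author_cut, 2), round(amazon_cut, 2)

print(kindle_split(9.99))  # roughly the $2.99 / $6.99 split cited above
```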

 

This didn't fit anyone's vision of a writer/reseller split, but Amazon deflected criticism via a brilliant marketing coup. In an example of using Newspeak to change the popular narrative that would have had Eric Blair[2] rising from his grave to shout "Bravo!", Amazon labeled the $2.99 a "royalty." (The person at Amazon who came up with this idea should have received a nice bonus and free Prime shipping for life.)

 

This is ridiculous. Amazon has never paid a self-published author a dime of royalties. Ever. The author/publisher royalty system has been in place for 200+ years and the parameters are well understood. Amazon does not:

 

  • Provide editing assistance.
  • Copyedit the work.
  • Produce it, which includes layout, illustration placement, typesetting, printing, etc.
  • Translate and localize it for international markets.
  • Ship the print version of a book to resellers.
  • Price the book.
  • Promote and market the book.
  • Manage copyrights and licenses and surrender them based on contractual provisos.
  • Generate the ISBN (an international ID assigned to books).
  • Provide advance payments against projected sales (advance against royalties).

Amazon does charge you to:

 

  • Print your book (formerly via CreateSpace, now via KDP).
  • Download books from its servers.
  • Promote your book on its website via various marketing programs.

 

What Amazon provides is access to your book in its store. That's why channels exist. In no industry I've studied does a channel attempt to extort a 70% stocking fee from its vendors. Such fees, where they exist at all, typically range from 2% to 3% of list price.

 

To appreciate how well Amazon's gambit has worked for the company, read the two sentences below. The math works out the same in each, but which sounds better?

 

  1. Amazon charges each author who uses its publishing platform a 70% stocking fee.
  2. Amazon pays authors 30% royalties on sales of their books.

 

Yes, exactly. And while statement two is untrue, the fact that almost the entire industry now uses Amazon's vocabulary has enabled the company to grab the high ground in discussions of its financial treatment of self-publishers. Remember, it's up to you to create demand for your book. It was satisfying to hit the upload button on the Kindle system and be informed your book was available for sale, but you were now one of 46 million titles. To stand out from the horde, you were going to have to produce and promote your book. You would have to solicit reviews, pay for promotions, put up a website to help publicize your book, spend time and money building an email list, and on and on and on.

 

And you were going to have to fund all that with the $2.99 left over from every $10 sale.

 

Once you had cleared Amazon’s “royalty” nonsense from your head, it became evident how crushing Amazon’s stocking fee was to a self-published author’s bottom line. While it was true the publishers no longer stood between your book and the market, Amazon had replaced them with formidable time and money constraints.

The Ministry of Love

“Making progress on diversity, equity, and inclusion.”

 

“Google is committed to creating a diverse and inclusive workforce. Our employees thrive when we get this right. We aim to create a workplace that celebrates the diversity of our employees, customers, and users. We endeavor to build products that work for everyone by including perspectives from backgrounds that vary by race, ethnicity, social background, religion, gender, age, disability, sexual orientation, veteran status, and national origin.”

The above is text taken directly from the Google website at https://diversity.google/commitments/. Note that the list of protected perspectives omits political beliefs.

On November 9, 2016, the greatest triggering event in the history of political correctness and the social justice warrior (SJW) movement took place. That night, Donald J. Trump, a New York City real estate developer, marketer, the star of the reality TV show The Apprentice, and very recent Republican, defeated Hillary Clinton, also from New York, a former First Lady, U.S. senator, and U.S. Secretary of State, in a free and open election for the 45th presidency of the United States. Despite the predictions of politicians, pundits, and pollsters, Trump won a solid Electoral College victory of 306 to 232 while simultaneously losing the popular vote 46.1% to Clinton’s 48.2%. It was the most surprising and unexpected upset in American political history.

The reaction of many Democrats upon the announcement of Trump’s victory was also unprecedented. Across the country, Clinton voters wailed, gnashed their teeth, screamed into the sky, and fell to the ground crying. On television, moisture forced itself visibly out of the tear ducts of some anchorfolks, while others became emotionally distraught. Breaking with all precedent, candidate Clinton did not appear on screen to gracefully concede the election, congratulate the winner, and roll out the eternal clichés about how she would continue the fight, the future for America remained bright, and how we should come together as a country until four years from now, when the American people would undoubtedly correct the mistake they’ve just made. Instead, she reportedly threw the mother of all temper tantrums[1] and was not fit to appear in public until the next morning, at which time she dutifully performed the expected ritual.

California, which had recently instituted an electoral system designed to suppress conservative and Republican turnout, shared in the progressive sorrow. A nascent secessionist movement revved up and the state did its best to imitate South Carolina circa 1861. A completely unconfirmed rumor spread that so many Golden Staters threw themselves to the ground to pound the tips of their patent leather shoes against the earth in protest against the cosmic injustice of it all that there were fears the vibrations would trigger the San Andreas Fault.

The One Hour Sob at the Googleplex

And then there was Google's all-hands corporate meeting at the Googleplex, held shortly after the election. The meeting ran just over an hour and was filmed for internal use only. In September 2018, the video was leaked anonymously and can be viewed in its entirety online at the link in the footnote.[2]

These TGIF (thank god it's Friday) meetings are held regularly at the world's dominant search engine firm and allow employees to comment and grouse to upper management. Many other companies have their own version of these get-togethers. In every company I've ever worked for or consulted with, political discussions were either brief or not welcome. Business was business, and a customer's ideology and whom they voted for were not relevant to the bottom line.

Not so at this meeting. If a video editor had removed every mention of the word Google and its derivations (googly, Googler, googleness) from the footage, a neutral observer would think they were watching an election post mortem held at the HQ of the California branch of the Democratic National Committee.

The video’s atmosphere is grim, moist, and huggy. Sergey Brin, Google co-founder, kicks off the proceedings by telling everyone that “most people are pretty upset and pretty sad.” But then he pivots to happier news by mentioning that California has legalized marijuana. This lifts the gloom a bit as the happy Googlers cheer the good news.

Celebrating their virtual contact high, everyone hugs each other. Then Eileen Naughton, Google VP of People Operations, promises to help fearful Googlers relocate to Canada. Sundar Pichai, Google CEO, drops broad hints that Google will be “adjusting” its search engines to deal with “filter bubbles.” No one comes right out and says Trump voters are Paleolithic neo-Nazis, but lots of hints are dropped— “tribalism that is destructive in the long term,” “voting is not a rational act,” “low-information voters,” and similar banal observations typically applied by the left to the right. VP for Global Affairs Kent Walker informs the rapt audience that Google’s “job is to educate policy makers.” (I’d always thought the company’s job was to build a search engine that gave the most accurate and neutral results possible, thus enabling people to educate themselves.)

Periodically, the prevailing narrative is interrupted by assurances that Google understands conservatives may feel uncomfortable disagreeing with the opinions held by every member of upper management and by what appears to be 100% of the meeting's attendees. No one during the session stands up to offer a viewpoint that differs in any way from the room's prevailing zeitgeist.

The session reaches its peak when the moistest white guy in the audience stands up and reads from a little script urging everyone to go through Google's "unconscious bias" training (these programs are scams), watch a left-wing documentary currently playing at Google, and start political arguments during Thanksgiving.

As a Google stockholder, I was appalled. I found Brin’s delight over his employees’ easier access to a drug that makes you stupid and laugh at things that aren’t funny inappropriate. Were people who rely on Google’s much heralded navigation software going to have to worry that their self-driving car would one day run a solid red light and T-bone a minivan full of kids because a stoned Google programmer compiled the wrong code into the system while fumbling for a Twinkie to quell the munchies? That the people responsible for Google Maps may one day tell me to go left into a ravine instead of right onto a road because they were all sitting around giggling and not paying attention to the process of inputting accurate satellite data into their GPS mapping software?

The whole session was a PR disaster. It is very bad business to insult 50% of your potential customer base. I kept hoping that at some point a Google board member, perhaps one who’d attended the meeting on an impromptu basis, would leap on stage, throw a bag over Brin’s head, and drag him away before he could talk further. I started to speculate the whole get together was some sort of practice run for a Saturday Night Live satire about a group of rich, clueless, California high-tech nerds sitting around congratulating themselves for their inclusiveness while all believing and saying the same thing. But alas, no.

When the sniffles had dried and the last hug unwound, I stared at the screen and wondered what business would trust Google to provide reliable search results on any topic touching politics, conservative demographics, or thousands of related data points. Google was asked exactly this question when the video leaked. Its response was:

“At a regularly scheduled all hands meeting, some Google employees and executives expressed their own personal views in the aftermath of a long and divisive election season. For over 20 years, everyone at Google has been able to freely express their opinions at these meetings. Nothing was said at that meeting, or any other meeting, to suggest that any political bias ever influences the way we build or operate our products. To the contrary, our products are built for everyone, and we design them with extraordinary care to be a trustworthy source of information for everyone, without regard to political viewpoint.”

Despite this cheery assurance, many people had their doubts. In 2019, those doubts would be confirmed.

The Damore Defenestration

In July 2017, 28-year-old James Damore, a Google employee, made one of the biggest mistakes of his young career. He took seriously Google's claims that it sought diversity in its workplace and that "everyone at Google has been able to freely express their opinions." He discovered his mistake after he posted a 10-page document entitled "Google's Ideological Echo Chamber" on one of the company's internal forums for discussion and debate.[3] The focus of the paper was Google's efforts to correct what it perceived as too few women working as software engineers. Posting the document didn't violate Google's workplace regulations, nor did Damore misuse the company's business resources. In fact, his behavior had at least implicitly been endorsed by Google in June, when a stockholder at a company meeting asked Eric Schmidt, Chairman of Alphabet (the holding company controlling Google and several other spin-offs), if conservatives were welcome at Google. Schmidt responded:

“The company was founded under the principles of freedom of expression, diversity, inclusiveness and science-based thinking. You’ll also find that all of the other companies in our industry agree with us.”

Damore’s monograph is written in an earnest, post-graduate style, includes several charts and graphs buttressing his statements, and makes the following arguments:

  • Google’s corporate culture is mainly a self-referential, progressive bubble.
  • Conservatives are uncomfortable expressing views that contradict or disagree with progressive thought and theory.
  • Google’s own internal, invisible biases have led to the company being blind to its lack of intellectual diversity.
  • He had recently attended a Google diversity program that was not very diverse in its explanation of what diversity is.
  • The company was practicing reverse discrimination in an attempt to enlist more women into software engineering (programming), potentially violating the law.
  • A partial reason fewer women become programmers is innate differences in the job preferences of the sexes.

The paper lists Google's discriminatory practices with some specificity. Damore states:

“I strongly believe in gender and racial diversity, and I think we should strive for more. However, to achieve a more equal gender and race representation, Google has created several discriminatory practices, including:

  • Programs, mentoring, and classes only for people with a certain gender or race.
  • A high priority queue and special treatment for “diversity” candidates.
  • Hiring practices which can effectively lower the bar​ ​for “diversity” candidates by decreasing the false negative rate.
  • Reconsidering any set of people if it’s not “diverse” enough, but not showing that same scrutiny in the reverse direction (clear confirmation bias).
  • Setting org level OKRs [Objectives and Key Results] for increased representation which can incentivize illegal discrimination.”

The paper makes suggestions for how Google could improve the situation. They include:

  • Stop alienating conservatives.
  • Confront Google’s biases.
  • Stop restricting programs and classes to certain genders or races.
  • Have an open and honest discussion about the costs and benefits of our diversity programs.
  • Reconsider making Unconscious Bias training mandatory for promo committees.

Damore argues that one of the reasons for the disparity in male/female staffing in software engineering is that women, based on inherent preferences, choose not to enter coding as a career. He also asserts that women are more prone to “neuroticism” and “anxiety,” which in turn can generate workplace stress and perhaps a desire to avoid programming. He suggests that Google seek to make the engineering environment less stressful, thus encouraging more women to become programmers.

After Damore uploaded his paper, a couple of weeks passed while Googlers interested in the topic read it. Some people were impressed by his arguments and thanked him for having the courage to bring the matter forward, though they were a decided minority.

On August 5, 2017, the paper was leaked to Gizmodo, a website left over from Gawker that covers science, science fiction, and technology. The headline accompanying the release read "Exclusive: Here's The Full 10-Page Anti-Diversity Screed Circulating Internally at Google."

Google Diversity Auto-da-Fé

The paper quickly went viral, and SJW eyes widened and forehead veins bulged. A Google engineer held her breath on Twitter and threatened to quit if the company didn't "take action." Another emailed James Damore the message "You're a misogynist and a terrible person. I will keep hounding you until one of us is fired. F**k you." Another large-hearted champion of diversity posted on a Google forum that "If Google management cares enough about diversity and inclusion, they should, and I urge them to, send a clear message by not only terminating Mr. Damore, but also severely disciplining or terminating those who have expressed support for his memo." (Irony is apparently an unknown element at Google.)

Large bundles of wood were piled up around a stake set up in the middle of the Googleplex in preparation for the public burning of Damore's career. On August 7, the career was marched out, tied to the stake, and immolated when Google announced the hapless engineer's firing (done via email). The stated reason was that he had violated the company's code of conduct by "advancing harmful gender stereotypes in our workplace." Damore sued Google; the case went to arbitration and is still pending.

There were some problems with this public incineration. Eric Schmidt had told everyone that Google was all in on science and Damore seemed to have it on his side. Below are some results of a quick search, using Google, for the terms women, anxiety, and neurotic:

The Anxiety and Depression Association of America

Women and Anxiety

From puberty to age 50, women are nearly twice as likely as men to develop an anxiety disorder.

Women are nearly twice as likely as men to be diagnosed with an anxiety disorder in their lifetime. In the past year, prevalence of any anxiety disorder was higher for females (23.4%) than for males (14.3%). The term "anxiety disorder" refers to specific psychiatric disorders that involve extreme fear or worry, and includes generalized anxiety disorder (GAD), panic disorder and panic attacks, agoraphobia, social anxiety disorder, selective mutism, separation anxiety, and specific phobias.

https://adaa.org/find-help-for/women/anxiety

This was just the start of the search:

  • Medical X Press: “Draft recommendation promotes screening women for anxiety”[4]
  • US National Library of Medicine National Institutes of Health: …“Women reported themselves to be higher in Neuroticism, Agreeableness, Warmth, and Openness to Feelings…”[5]
  • Psychology Today: Are Women More Emotional Than Men? "For one to expect absolutely no sex differences in human emotion, one would have to believe in a god/goddess-like creature, Androgyna…"[6]
  • Wikipedia: Neuroticism: "A 2013 review found that groups associated with higher levels of neuroticism are young adults who are at high risk for mood disorders and women."[7]
  • Medical Daily: Women are far more anxious than men – here’s the science[8]

(By the way, guys, don't use the above to get cocky. We're more likely to be serial killers.)

I periodically give lectures at such venues as technology trade shows, product managers’ groups, and high-tech councils, and before beginning my talk, I always ask this question: “How many people have read the following books?” I then list some of the seminal publications on the history of high tech. These include such books as Apple by Jim Carlton, Gates by Steve Manes, and Hackers by Steven Levy. Invariably, only one or two hands go up; often, none do.

Then, I ask how many people have read the newest, hottest, the-promised-land-is-within-your-reach-if-you-just-follow-the-diktats-of-this-newest-business-guru wonderbook. The latest wonderbook differs from year to year and decade to decade. In the 1980s it was, of course, In Search of Excellence by Thomas J. Peters and Robert H. Waterman and its myriad of profitable spin-offs (all based on an original foundation of bogus data). In the early-to-mid 1990s it was often Crossing the Chasm by Geoffrey Moore and its myriad of spin-offs (all based on product life cycle models first introduced in the 1950s). By the late 1990s and early 2000s it was often The Innovator's Dilemma by Clay Christensen, which discusses how established companies have a hard time dealing with new ideas, and its very successful follow-on, The Innovator's Solution, which proposed an answer to the problem that the book's author admitted no one has ever actually used (a rather innovative way to end a business book series, when you come to think about it). Unfortunately, when his concepts were applied to Apple, Christensen came up snake eyes and his credibility suffered mightily. These days it's the Built to Last series by Jim Collins and Jerry Porras that generates the most encomiums. I started to read the series and realized the '80s were calling to ask for their business models back.

Now, in all fairness, many of these business books offer practical, if often generic, advice about how to run a business and the best things to do while you’re doing it. Like most exercise machines, many of these books “work” if you rigorously follow their commonsense advice. In most cases, thinking up new or improved products or services to sell to people (this process is currently being lionized as “innovation,” and businesses have been doing it since the pyramids, but apparently the new label makes everyone feel even better about the process), being open to new ideas, treating customers well, organizing your data, hiring good employees, not committing accounting fraud, and so on, and so on, will certainly improve your chances of success. But this is rather like saying that breathing increases your chance of competing in the 100-yard dash. It will, but mere respiration is not what separates winners from losers in a race.

The danger comes when you dig into the specifics and try to apply the generic to your specific business and its challenges. Most writers of “theory” books can’t overcome the tendency to fit the facts into their grand frameworks, leading to a lot of misleading and contradictory advice. You already know the problem with Excellence. The Chasm books sometimes work well when talking about enterprise markets with fairly well-defined buying processes, but if you’d relied on their advice during the microcomputer market’s early growth spurt in the 1980s, you would have been caught utterly flat footed by the rapid pace of events (Moore tries to deal with this problem with a later book that acts as a retrofit to the original theory, but it’s unconvincing). The Innovator books, written by an academic who has never worked in business, proffered a solution that failed when applied in the real world.

It's not just high-tech firms that get themselves into hot water in this regard. Super-duper consulting firm McKinsey first wrote about and then introduced the concept of "eagles flying high" at Enron, one of the dot-com collapse's most storied debacles. Enron's theory was based on the belief that by hiring lots of smart people and letting the wind beneath their super-intelligent wings push them into the stratosphere, Enron profits would soar to ever loftier heights, clutched safely in the talons of all these Einstein flyers. Unfortunately, the theory didn't take into account that really, really smart people might, in the interests of self-enrichment, create myriads of business deals and projects that, objectively evaluated, had little or no chance of turning a profit, and then create a dizzying array of interlocking shell companies where the accumulating debt could be buried, all at the expense of stockholders and company employees' retirement funds.

Another problem with all business books that focus on grand theories of business success is that, in a very real sense, no such theory can ever exist. To help understand this concept further, take a quick look at a popular Hollywood fantasy, that of the young go-getter who develops a surefire way to “beat” the stock market. Now, suppose this fantasy could be translated into reality. Imagine that through the use of quantum computers and sheer genius programming, you create a stock-picking system that infallibly predicts which stocks will go up and down and then write a book releasing this information to the world.

What would happen?

What would happen is the stock market would immediately congeal into immobility and would have to be rejiggered to work in such a way that all your good advice and smart programming would be rendered useless. This goes back to the fundamental reality underlying all market-driven systems: there must be a winner and a loser in every transaction for the system to work. (It’s a grim fact, but before you run shrieking into the comforting arms of Marx and Lenin, the empirical evidence suggests that communism simply creates losers all around.)

This carries over to the competitive environment all companies must endure in market-driven economies. Competition must winnow the myriad of firms over time to ensure the market can function. Failure must happen. But failure must also always have a cause.

The Main Causes of Disaster

There are four main reasons for company failure and marketing disasters. They are:

  • Your company is based on fraud and/or the sale of illegal products and services.
  • Your company is built around an unrealistic or untenable business assumption.
  • Your company does not have a strategic vision and plan for success.
  • Your company has failed to execute business basics in the course of selling its products and services.

In regard to the first two types of failure, I don't have much advice to give. If, like Theranos, Enron, and thousands of other companies over the course of the 20th century and continuing into the 21st, your underlying business model is a Ponzi scheme or is based on impossible-to-implement technology, your business will fail, and maybe you will go to jail. If your company plans to market illegal drugs in the United States, you will fail and probably go to jail, if someone doesn't shoot you or decide to dismember you with a chainsaw first during a dispute about optimal distribution strategies and reseller margins.

The third class of failure, lack of strategic vision and planning, is, as you've seen, the one most business writers like to write about, primarily because books about this topic tend to make the most money. Of course, all successful companies have to develop and sell products and/or services people will pay for; this is the essence of modern commerce, and it is here that a "strategic" plan is both useful and necessary. But few companies and theorists are content to leave it at just that. For the past several decades, American business has been obsessed with the idea that somewhere out there exists a grand unified theory of business that explains once and for all how success can be guaranteed, if only the theory can be uncovered and explained. For a time "excellence" seemed to provide the Great Answer. Then the path to proven success appeared to encompass leaping like gazelles over chasms.

Next came the idea of large corporations continuously innovating, even extending the bureaucracy to formalize the approach with a corporate innovation officer! It's his job to preside over the many innovation meetings that will bring exciting new products and services to market, once buy-in from all stakeholders is obtained and the CEO signs off. The Built to Last books snake ouroboros-style back to Excellence, and so the cycle continues.

Book Details and History

  • Format: Print and ebook. Ebook editions include Kindle (mobi), Apple, and epub.
  • Paperback: 460 pages; includes glossary of terms and index
  • Publisher: Softletter; third edition (December 2019)
  • Language: English
  • ISBN: 978-0-9672008-4-2
  • LOC: 2019915097
  • Product Dimensions: 6 x 0.9 x 9 inches
  • Shipping Weight: 1.5 pounds

The first edition of "In Search of Stupidity: Over 20 Years of High-Tech Marketing Disasters" was released in 2002, the second in 2006. Stupidity has sold over 100,000 copies worldwide and has been translated into several languages, including Chinese, Italian, German, Hebrew, Korean, Polish, and Japanese.

Italian Edition of In Search of Stupidity
Japanese Edition of In Search of Stupidity. We have no idea what they were up to with this cover!

Readers Say...