Apple’s Depressing Denouement

In 2004, Apple co-founder Steve Jobs asked famed author Walter Isaacson to write his biography. It’s a mark of Jobs’s hallowed place in the pantheon of American corporate titans that Isaacson, whose other subjects included Henry Kissinger, Benjamin Franklin, and Albert Einstein, would eventually say yes. While best-selling books about successful business leaders represent a popular niche, most specimens are fawning airport reads that combine hagiography with self-help advice for aspiring entrepreneurs. Isaacson’s Steve Jobs (2011), by contrast, was a serious work of literary non-fiction that exalted its subject as a once-in-a-generation technological savant, while also showing him to be a callous parent and scathing boss, not to mention a proponent of loopy “fruitarian” medical theories. (Much has been made of Jobs’s use of fringe therapies to treat the pancreatic cancer that killed him in 2011, but he also entertained the bizarre belief that his vegan diet allowed him to avoid bathing for days on end without developing body odour, a proposition vigorously disputed by co-workers.)

Tripp Mickle, a Wall Street Journal technology journalist who covered the Apple beat for five years, isn’t Walter Isaacson (few of us are), and, to his credit, doesn’t try to be. Nor does he seek to present his primary subjects—former lead Apple designer Jony Ive and incumbent chief executive Tim Cook—as world-changing visionaries on par with their departed boss. Indeed, the very title of his book—After Steve: How Apple Became a Trillion-Dollar Company and Lost Its Soul—presents Apple as existing in a state of creative denouement since Jobs’s death: a bloated (if massively profitable) corporate bureaucracy that increasingly feeds shareholders’ demands for quarterly earnings by milking subscription services such as Apple Music and iCloud instead of developing new products.

The first five chapters of After Steve are structured as a twinned biography, following the lives of Ive and Cook from their precocious childhoods (in England and Alabama, respectively), and on through the 2010s, when the pair jointly ran Apple (in function, if not in title) following Jobs’s death.

Timothy Donald Cook grew up in Robertsdale, a farming community located roughly halfway between Mobile, Alabama, and Pensacola, Florida, the middle child of a Korean War veteran and a pharmacist’s assistant. In high school, Cook was named “most studious” and served as the business manager for the school yearbook. “In three years of math, he had never missed a homework assignment,” reports Mickle, who also notes that one teacher remembers him as “efficient and dependable.” Cook also happens to be gay, a subject that caused some awkwardness for his Methodist parents, even though Cook wouldn’t come out publicly until later in life. To deflect questions, Mickle reports, Cook’s mother told drug-store coworkers that her son was dating a girl in Foley, a nearby town.

Following high-school graduation, Cook went on to study industrial engineering at Auburn University and business administration at Duke. He then gravitated to the then-burgeoning field of personal computing, quickly carving out a niche within its production and supply-management back office. At IBM and Compaq, Cook turned himself into a sort of human abacus, ruthlessly bringing reduced costs, increased efficiencies, and smaller inventories to every assembly line he set eyes on. By the time he arrived at Apple in 1998, Mickle reports, Cook was completely neurotic about keeping any stocked materials off the books, calling inventory “fundamentally evil.” In time, he pioneered a process by which yellow lines were painted on the floor of Apple’s production plants, with materials on the storage side of the line remaining on suppliers’ books until the very moment they were brought to the other side for assembly.

Like Ive, Cook declined to be interviewed for After Steve. And so it is entirely possible that the man has a rich inner life that remains opaque to Mickle and the outside world more generally. But the portrait that emerges in this book is one of a fanatically dedicated workaholic who rises before 4am to begin examining spreadsheets, and thinks about little else except the fortunes of Apple Inc. during the waking hours that follow. Mickle reports a sad scene in which Cook is spotted by sympathetic strangers at a fancy Utah resort, dining alone during what appears to be a solitary vacation. We also learn that Cook’s Friday-night meetings with Apple’s operations and finance staff were sometimes called “date night with Tim” by attendees, “because it would stretch for hours into the evening, when Cook seemed to have nowhere else to be.”

Jony Ive—Sir Jonathan Paul Ive, as of 2012—grew up in east London, a silversmith’s son whose genius flowered at an early age. As a teenager, he became enraptured with principles of Bauhausian design, producing marvels that stunned his tech-school teachers. But he quickly learned that his reverence for purity of form might not lend itself to workaday life at a mainstream design agency. In one instance related by Mickle, a young Ive was asked to design a sink for a bathroom-fixtures company, and produced a half-oval basin resting on a column that angled away from the wall. It was a beautiful piece, but the client rejected it on the (not unreasonable) basis that such a cantilevered structure might break its moorings and crush a small child.

An iBook and an iMac G3, featuring the translucent casing designed by Jony Ive.

Not surprisingly, Ive bonded quickly with Jobs when the pair began collaborating at Apple in the late 1990s. The two shared a similar taste for the minimalist curvilinear aesthetics that would subsequently be embedded in the iMac, iPod, iPhone, and iPad (as well as some noble early flops, like the Apple Newton MessagePad). They also exhibited a common disdain for bean counters who focused on production costs at the expense of product purity, a trait that got the company into trouble before Cook was able to properly whip operations into shape. In 1997, for instance, Ive created a Twentieth Anniversary Macintosh (the infamous “TAM”) featuring leather accents and Bose speakers swathed in bespoke fabric. It ended up selling for $7,499, and—surprise, surprise—almost no one bought it. (At the time, a pair of New York Times journalists called it a “Ferrari on a desktop.” The Homer might have supplied a more fitting analogy.)

The 1997 TAM, as preserved at the Apple Museum in Prague.

Mickle writes in a neutral and unaffected journalistic style, and lets readers form their own opinions about Ive’s recherché professional methods. But at many points, I still found it hard to suppress an eye roll. While designing the Apple Watch, we learn, Ive and his design team studied “how the British had miniaturized towering grandfather clocks to power the rise of the empire with chronometers that enabled sailors to plot their location at sea,” and “heard from horologists about how … wristwatches had become fashion pieces in the early 1900s after Louis Cartier had developed the iconic Tank watch.” Another Ive-led project was Apple’s new US$5 billion headquarters in Cupertino—a 3/4-mile-diameter corporate coliseum of metal and crystal called Apple Park, featuring walls made of glass so perfectly transparent that newly arrived employees began walking into them. One employee broke his nose, and another received a possible concussion. “Staff began walking around the building with their arms held out like zombies, hoping their fingers would hit the glass before their faces did,” Mickle reports. In time, employees started putting black stickers on the glass, which were informally referred to as “Jony’s tears.”

Apple Park, whose architectural style is described by some Apple employees as “space prison.”

That said, there is no question that Ive’s work was absolutely fundamental to Apple’s renaissance following Jobs’s return to the company’s helm two and a half decades ago. And his sense of form is embedded in literally billions of devices used every day by people around the planet. From his earliest days at Apple, he displayed an uncanny knack for understanding the way an object’s colour or curve might arouse a certain kind of thought or feeling among consumers. Mickle, for instance, recounts the story of the handle that sat at the back of the first iMacs. Product engineers protested that this feature added significantly to the product’s cost, but had no real useful function, since this was a desktop computer designed to sit in the same place for months or years on end. Ive countered that these pocket-protector types were missing the point: The purpose of the handle wasn’t to lift the computer, but rather to invite users to lift it. As with his other i-prefixed creations, Ive designed the iMac as a product to be touched (caressed even) for the act’s own sake, a temptation that would be unthinkable when it came to those generic-looking rectangular beige PCs that office workers hid under their desks.

Whether in the political or corporate sphere, succession plans that require multiple protégés to share power rarely prove workable, as one aspirant typically manages to quickly muscle out the others. Yet Apple proved an exception to this rule, with Ive and Cook getting along serviceably for most of the 2010s, and even generating a bona fide post-Jobs product blockbuster in the form of the Apple Watch. When Ive left Apple in 2019 after 27 years of service, the move seems mostly to have been a result of his own restlessness and boredom, not the displeasure of others. There were, after all, only so many ways Ive could redesign the Bézier-defined contours on the various black and silver slabs that constitute the core Apple product line. And in any case, his mind seemed to have drifted toward promoting Apple’s brand (and his own) at one-off celebrity-studded events such as the Manus x Machina gala at The Met.

In his managerial capacity at Apple, Ive doesn’t seem to have been anywhere near as insensitive as his departed mentor. In fact, he insisted that his design-team subordinates exhibit respect for one another at meetings (a rule that would have disqualified participation by Jobs himself, whose signature line for indicating disagreement was “You suck”). Yet in the pages of After Steve, Ive still often comes off as condescending and haughty. At an Apple parts plant in Shenzhen, Ive allegedly called out individual factory workers whom he accused of being insufficiently delicate with his company’s shimmering products. In another case, he “bristled” at getting into a Mercedes S-Class sedan because its curves offended his sense of industrial design. (He preferred his own chauffeur-driven Bentley Mulsanne.) And then there’s the time that—according to one of the 200-odd current or former Apple employees whom Mickle interviewed for his book—Ive had an Apple engineer spend weeks retrofitting the soap dispensers on his private Gulfstream V jet because he’d found some imperfection that displeased him. To put it mildly, it’s not a good look.

But Ive attracts our pity, too, as he appears to have some brain quirk that causes him to constantly—even involuntarily—fret over the flaws of every nook and cranny of his physical environment. After Steve contains speculation that he might be gifted (perhaps afflicted is the right word) with tetrachromacy, a genetic condition by which one’s mutated retinas provide massively enhanced spectral sensitivity—such that while a normal person might see a product as being a uniform shade of white, a tetrachromat might instead see a hundred shades of eggshell, vanilla yogurt, and antique lace. In the early 2010s, Mickle reports, Ive was driven to distraction by the fact that Apple’s software engineers designed home-screen app buttons with curved corners modelled on standard circular geometry, while his own product-design team instead opted for a more sophisticated geometric algorithm known as the Bézier curve. In another story, Ive complains to a co-worker about microscopic defects he observes in the stainless-steel bar they’re drinking at in an airport, to which his colleague (who detects no imperfections whatsoever) aptly replies, “Your life must be fucking miserable.”
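For readers wondering what the fuss over corner geometry amounts to: a cubic Bézier curve is defined by four control points, and the standard quarter-circle approximation places the inner control points at a fraction κ ≈ 0.5523 of the radius. A minimal sketch, using illustrative values rather than Apple’s actual icon geometry:

```python
# Evaluate a cubic Bézier curve approximating a rounded corner.
# p0..p3 are control points; t runs from 0 to 1 along the curve.

def cubic_bezier(p0, p1, p2, p3, t):
    """Evaluate a cubic Bézier at parameter t via the Bernstein form."""
    u = 1.0 - t
    x = u**3 * p0[0] + 3 * u**2 * t * p1[0] + 3 * u * t**2 * p2[0] + t**3 * p3[0]
    y = u**3 * p0[1] + 3 * u**2 * t * p1[1] + 3 * u * t**2 * p2[1] + t**3 * p3[1]
    return (x, y)

# Quarter-circle corner of radius 1, using the standard kappa constant,
# chosen so the curve's midpoint lands exactly on the unit circle.
KAPPA = 0.5522847498
p0, p1 = (0.0, 1.0), (KAPPA, 1.0)
p2, p3 = (1.0, KAPPA), (1.0, 0.0)

mid = cubic_bezier(p0, p1, p2, p3, 0.5)
print(mid)  # roughly (0.7071, 0.7071), i.e. on the circle
```

The practical difference designers care about is that Bézier control points can be nudged so curvature builds gradually from the straight edge, whereas a plain circular corner joins the edge with an abrupt curvature jump.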

Apple’s critics have long pointed out the hypocrisy of a company that offers the public progressive political postures on gay rights, undocumented immigrants, and environmental sustainability, even as it’s built out an East Asian supply chain with workplace standards that fall short of those in Western countries. Skewering Apple’s recent expansion into the field of television production, Ricky Gervais famously described The Morning Show as “a superb drama about the importance of dignity and doing the right thing, made by a company that runs sweatshops in China.”

But After Steve highlights a less obvious contradiction. For more than two decades, Apple has marketed itself as a revolutionary force for professional satisfaction, creative liberation, and personal connection. Think, for instance, of all those classic ads from the early iPod years, featuring silhouetted figures lost in the rapture of music, or the iPad Air commercial channelling Robin Williams in Dead Poets Society, depicting the iPad as a gateway to “poetry, beauty, romance, love.” Sadly, none of the men who defined the company’s brand during its great glory run—Jobs, Cook, and Ive—seem to have been visited by that joyous spirit their own gadgets were designed to summon.


Speaking of classic Apple ads… From challenging Big Brother to wanting to be Big Brother.


I’m not sure I will ever live long enough, or have such a sad notion of life that I could somehow lament the fate of a corporation…

The Apple ethos of the Walled Garden, where no one, not even the owner, can modify anything, was never attractive. Since Apple products are 20 to 50 percent more expensive than comparable alternatives, I’ve never bought into the Apple crapple.


Great essay. I particularly liked the section on Ive. It also does a lot to explain the inevitability of the Pareto distribution, and more importantly Price’s Law, which dictates that 50% of the value is created by the square root of the total number of employees involved in a project. Obsession has a huge market value, provided it’s able to tap the almost intangible aspirational values of the market.
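Price’s Law is easy to sanity-check with a few lines of arithmetic: for N contributors, roughly √N of them produce half the output, a share that shrinks fast as headcount grows. A quick sketch (the headcounts are purely illustrative):

```python
import math

def prices_law_core(n_employees: int) -> int:
    """Per Price's Law, ~sqrt(N) people produce roughly 50% of the value."""
    return round(math.sqrt(n_employees))

for n in (10, 100, 10_000):
    core = prices_law_core(n)
    print(f"{n:>6} employees -> ~{core} produce half the value "
          f"({100 * core / n:.1f}% of headcount)")
```

So at 100 employees about 10 people carry half the load, but at 10,000 employees it's only 100, i.e. one percent of headcount, which is why obsessive individual contributors matter so much.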

Here’s an interesting piece of trivia- Apple was never as big in the UK as it was in America, at least in the personal computing market (although Apple’s superior software compression for areas like publishing made it a shoo-in for certain professional environments). In the UK, Steve Wozniak’s vision of personal computing prevailed, with the desktop PC reigning supreme- perhaps because Sir Clive Sinclair inspired a generation of enthusiastic computer hobbyists, more than willing and able to build their own PCs.

As usual, my essays are to be found on my Substack, which is free to view and comment:


We have used Apple computers since the days of the Lisa (1983), and at one point had a great deal of product loyalty. Why? For years, Apple computers were better at graphics (our field), had superior operating systems, and a more elegant user interface. It did not hurt that Steve Jobs was the only computer company head who had an awareness of design (our field, again).

Apple lost us before Jobs passed on. At a certain point, Apple switched from a company that focused on graphics and educational computing to a company that sold luxury computing products. To us, their products were no longer worth the money.

Secondly, Apple started locking down their operating system. Most users did not notice a difference, but we had to find workarounds to make the new systems work the way we wanted. It was a bit like buying a car, and discovering the hood welded shut.

Finally, we found Apple’s monetizing of their products to be excessive. The App Store, which monopolized Mac OS software, is a big example. It was a bit like driving down your street and discovering tollbooths.

We don’t begrudge Apple their success, but to paraphrase Ronald Reagan: we never left Apple– Apple left us.


Interestingly, both Microsoft and Apple technology have always been more or less equally represented in my own home. They target separate markets. I still recall the mainframe systems being touted by IBM, until Apple came along with the Mac; IBM soon changed tack and came to dominate the market with the trademarked PC.

The battle was on, and the mainframe lost the war between the two portable options linked via the internet. Was this by design, or pure luck? I suspect it was the latter.

Microsoft led the way initially with the Windows platform, which allowed anyone to build the hardware and anyone to write software. I rode the wave of open-source software and cloned PCs with soldering iron in hand, and had an incredible time, an experience that I still cannot get from the “Walled Garden” environment provided by Apple.

But we now live in a world where smartphones and laptops are essentially pieces of basic equipment needed by every member of my family. Here the “Walled Garden” has been a Godsend. I just could not move from the playful experimental world that I grew up in to one of technical support for all the electronics deployed by my family. And with Windows based products, technical support was a full time occupation.

Windows has recently tried to emulate Apple’s safe “Walled Garden” approach with its Windows S versions, but it is so restrictive, running only a limited set of Microsoft-approved software, that I had to break down the walls the day I purchased my Lenovo PC.

I for one am grateful that both approaches have so far survived.


I agree. Having both ecosystems thriving is a net benefit for the end-user/consumer. Like you, I use both. I’m PC all the way for “stationary” usage, Apple all the way for mobile/portable, and a mix of both for tweener hardware (i.e., laptops).

I see no point, benefit, or justification for the higher cost of stuff that, as the article says, sits under a desk. Having broad software options and compatibility, not to mention work-related applications, takes precedence.

But the design elegance, “how it feels and looks in your hand”, and qualitative appeals of that kind are worth the extra cost to me, for stuff you schlep around every waking moment. Although it still boggles my mind that I essentially carry around $1500 in my pocket and take it out and put it on a table like it’s nothing. Pre-2007, that would’ve been unthinkable.


Buy some individual company stocks; trust me, you’ll learn to lament a lot about corporations very quickly.


Yes, I certainly understand your thoughts there. I have a desktop and a cellphone, and my wife has a laptop and a cellphone. I love my wife, but she is simply not good with technology, to the point that she does not know how to find stuff she has downloaded. So I am always telling her stuff I have already told her multiple times. I don’t always teach nicely, to my discredit. So, yes, to whatever degree this is non-modifiable and works out of the box, good.

I’ve never built a PC from a board, but I am getting more interested (at 70) in one of those Raspberry Pi devices.


Definitely a good place to start! Raspberry Pis are perfect for all sorts of fun projects. I built a small arcade gaming console which ran old 1980s TurboGrafx-16 games. Tons of fun, and most of the scripts are easily found online, so you do not even need to know any coding. If you do have a bit of coding experience, they are all in Python, which is very easy to learn. It practically looks like pseudo-code, it’s so simple.
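To the point about Python reading like pseudo-code, here’s a tiny made-up example in the spirit of that arcade project, filtering a game list down to its 1980s titles (the years are illustrative):

```python
# A few TurboGrafx-16-era titles as (name, year) pairs; years illustrative.
games = [
    ("Bonk's Adventure", 1989),
    ("R-Type", 1987),
    ("Ninja Spirit", 1988),
    ("Bomberman '93", 1992),
]

# Keep only the 1980s releases, oldest first. Reads almost like plain English.
eighties = sorted(
    [(title, year) for title, year in games if 1980 <= year < 1990],
    key=lambda game: game[1],
)

for title, year in eighties:
    print(f"{title} ({year})")
```

Even someone who has never written a line of code can usually guess what the `if 1980 <= year < 1990` filter and the `sorted` call are doing.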


Yeah, Python is a great language. I’ve done a lot of coding, as an applied statistician and as a person who had a small grant to modify GIMP, the open-source image-manipulation tool. I learned COBOL in high school in 1968.


I think that Steve Jobs was a force of nature.

Ive and Cook are just a couple of real smart guys. Nothing wrong with that. But usually you need to be a force of nature to change the world.


The irony occurred to me that the company that announced the Macintosh with the wonderful “1984” advert has now come up with SharePlay, so that we may synchronise the way we stare slack-jawed in our grey slops at screens together. The Two Minutes Thoughtlessness. Who shall throw the hammer now?