Steve Jobs, 1955–2011: Mourning Technology’s Great Reinventor

Harry McCracken

Steve Jobs, whose death was announced Wednesday night, Oct. 5, 2011, wasn’t a computer scientist. He had no training as a hardware engineer or industrial designer. The businesses Apple entered under his leadership — from personal computers to MP3 players to smart phones — all existed before the company got there.

But with astonishing regularity, Jobs did something that few people accomplish even once: he reinvented entire industries. He did it with ones that were new, like PCs, and he did it with ones that were old, like music. And his pace only accelerated over the years.

He was the most celebrated, successful business executive of his generation, yet he flouted many basic tenets of business wisdom. (Like his hero and soul mate, Polaroid founder Edwin Land, he refused to conduct focus groups or other research that might tell him what his customers wanted.) In his many public appearances as the head of a large public corporation, he rarely sounded like one. He introduced the first Macintosh by quoting Bob Dylan, and he took to saying that Apple sat “at the intersection of the liberal arts and technology.”

Jobs’ confidence in the wisdom of his instincts came to be immense, as did the hype he created at Apple product launches. That might have been unbearable if it weren’t the case that his intuition was nearly flawless and the products often lived up to his lofty claims. St. Louis Cardinals pitching great “Dizzy” Dean could have been talking about Jobs rather than himself when he said, “It ain’t bragging if you can back it up.”

Jobs’ eventual triumph was so absolute — in 2011, Apple’s market capitalization passed that of Exxon Mobil, making it the planet’s most valuable company — that it’s easy to forget how checkered his reputation once was. Over the first quarter-century of his career, he was associated with as many failed products as hits. Having been forced out of Apple in 1985, he was associated with failure, period. Even some of his admirers thought of him as the dreamer who’d lost the war for personal-computer dominance to Microsoft’s indomitable Bill Gates.

Until the iPod era, it seemed entirely possible that Jobs’ most lasting legacy might be the blockbuster animated features produced by Pixar, the company he founded after acquiring George Lucas’ computer-graphics lab in 1986. Instead, Pixar turned out to be, in Jobs’ famous phrase, just one more thing.

Born in 1955 in San Francisco to an unmarried graduate student and adopted at birth by Paul and Clara Jobs, Steven Paul Jobs grew up in Silicon Valley just as it was becoming Silicon Valley. It proved to be a lucky break for everyone concerned.

He was only 21 when he started Apple — officially formed on April Fool’s Day, 1976 — with his buddy Steve “Woz” Wozniak, a self-taught engineer of rare talents. (A third founder, Ron Wayne, chickened out after less than two weeks.)

But Jobs had already done a lot of living, all of which influenced the company he built. He’d spent one unhappy semester at Reed College in Portland, Ore., and 18 happy months of “dropping in” on Reed classes as he saw fit. He’d found brief employment in low-level jobs at Silicon Valley icons HP and Atari. He’d taken a spiritual journey to India and dabbled with both psychedelic drugs and primal scream therapy.

Woz wanted to build computers to please himself. Jobs wanted to sell them to make money. Their first creation, the Apple I, was mostly a warm-up act for 1977’s Apple II. The insides of the II were the product of Woz’s technical genius, but much about it — from its emphasis on ease of use to its stylish case design — reflected Jobs’ instincts in their earliest form. In an era when most computers still looked like nerdy scientific equipment, it was a consumer electronics device — and a bestseller.

In 1981, Woz crashed his V-tail Beechcraft and spent months recuperating, returning to Apple only nominally thereafter. From then on, Jobs was the Steve who shaped Apple’s destiny. In 1979, he visited Xerox’s PARC research lab in Palo Alto, Calif., and was dazzled by what he saw there, including an experimental computer with a graphical user interface and a mouse. “Within 10 minutes … it was clear to me that all computers would work this way someday,” he later said.

At Apple, PARC’s ideas showed up first in the Lisa, a $10,000 computer that flopped. They then reappeared in improved form in 1984’s Macintosh, the creation of a dream team of gifted young software and hardware wizards led by Jobs. Launched with an unforgettable Super Bowl commercial that represented the IBM PC status quo as an Orwellian dystopia, the $2,495 Mac was by far the most advanced personal computer released to date. Jobs said it was “insanely great,” a bit of self-praise that became forever associated with him and with Apple, even though he retired that particular phrase soon thereafter.

The Mac was insanely great — but it was also deeply flawed. The original version had a skimpy 128 KB of memory and no expansion slots; computing pioneer Alan Kay, who worked at Apple at the time, ticked off Jobs by calling it “a Honda with a one-gallon gas tank.” In a pattern Jobs would repeat frequently in the years to come, he had given people things they didn’t know they needed while denying them — at least temporarily — ones they knew they wanted.

Just as Jobs intuitively understood, PARC’s ideas would have ended up on every computer whether or not the Mac had ever existed. But there’s no question that he accelerated the process through sheer force of will.

“He wanted you to be great, and he wanted you to create something that was great,” said computer scientist Larry Tesler, an Apple veteran, in the PBS documentary Triumph of the Nerds. “And he was going to make you do that.” Whether Jobs was coaxing breakthroughs out of his employees or selling a new product to consumers, his pitches had a mesmerizing quality. Mac software architect Bud Tribble gave it the name it would be forever known by: the Reality Distortion Field.

Jobs may have been inspiring, but he was also a high-maintenance co-worker. He dismissed people who didn’t impress him — and they were legion, inside and outside of Apple — as bozos. He was not a master of deadlines. He tormented hapless job candidates and occasionally cried at work. And he was profoundly autocratic. (Jef Raskin, the originator of the Macintosh project, said Jobs “would have made an excellent King of France.”)

Among the people whose buttons he increasingly pushed was Apple’s president, John Sculley, the man he had famously berated into joining the company with the question, “Do you want to sell sugared water for the rest of your life, or do you want to come with me and change the world?” Frustrated with Jobs’ management of the Macintosh division and empowered by the Mac’s sluggish sales, Sculley and Apple’s board stripped him of all power to make decisions in June 1985. In September, Jobs resigned.

Decades later, the notion of Apple deciding it would be better off without Steve Jobs is as unfathomable as it would have been if Walt Disney Productions had sacked Walt Disney. In 1985, though, plenty of people thought it was a fabulous idea. “I think Apple is making the transition from one phase of its life to the next,” an unnamed, overly optimistic Apple employee told InfoWorld magazine. “I don’t know that the image of a leader clad in a bow tie, jeans and suspenders would help us survive in the coming years.”

Using his Apple millions and funding from Ross Perot and Canon, Jobs founded NeXT, a computer company that was even more Jobs-like than Apple had been. Built in a state-of-the-art factory and sporting a logo by legendary designer Paul Rand, the NeXT system was a sleek black cube packed with innovations. Unfortunately, it was aimed at a market that turned out not to exist: academic types who could afford its $6,500 price tag. After selling only 50,000 units, NeXT refocused on software.

For a while, Jobs’ second post-Apple venture, Pixar, also looked like a disappointment. Its $135,000 image-processing computer was a tough sell; Jobs kept the company alive by pumping additional funds into it. As a sideline, however, it made computer-generated cartoons that started winning Oscars. In 1995, Disney released Pixar’s first feature, Toy Story; when it became the year’s top-grossing movie, it gave Jobs his first unqualified success in a decade. (By the time he sold Pixar to Disney for $7.4 billion in 2006, his career had reached such dizzying heights that the deal was merely a delightful footnote.)

Jobs later called the NeXT-Pixar years “one of the most creative periods of my life” and said his dismissal from Apple had been “awful-tasting medicine, but I guess the patient needed it.” It was also the time when he went from high-profile bachelorhood — he had fathered a daughter out of wedlock and dated Joan Baez — to family man. He married Laurene Powell in 1991; by 1998, they were the parents of a son and two daughters.

Meanwhile, Apple sans Jobs was failing on an epic scale. Sculley had given way to a vision-free German Apple executive named Michael Spindler, who was replaced by Gil Amelio, a veteran of the computer-chip industry who was spectacularly unsuited to run Apple. He presided over $1.8 billion in losses in Apple’s 1996 and ’97 fiscal years and failed to sell the company to interested white knights IBM and Sun Microsystems. The possibility of Apple running out of cash and ceasing to exist was not unthinkable.

Amelio did make one smart move during his 500 days at Apple. Just before Christmas in 1996, he paid $430 million to buy NeXT, thinking its software could serve as the foundation of a next-generation Mac operating system. It did. (Every operating system Apple created from 2001 onward, including the one on the iPhone and iPad, is a direct descendant.)

NeXT’s software came with a bonus: Steve Jobs. In a touching sign of naiveté, Amelio apparently thought Jobs would cheerfully serve as a figurehead for the company he had co-founded. Instead, six months after the merger, Jobs orchestrated Amelio’s ouster and accepted the position of interim CEO — iCEO for short — splitting time with his Pixar duties. “I’m here almost every day,” he told TIME in 1997, “but just for the next few months. I’m really clear on it.” He finally ditched the i in iCEO in 2000.

Jobs’ return cheered up beleaguered Apple fans, but few industry watchers expected miracles. “[T]he odds aren’t good that he can do more than slow the fall, perhaps giving Apple a few more years before it is either gobbled up by a bigger company or finally runs out of customers,” wrote Jim Carlton in 1998 when he updated his 1997 book Apple: The Inside Story of Intrigue, Egomania, and Business Blunders to reflect Jobs’ comeback.

During his first months back at Apple, Jobs dumped board members, cut staff, slashed costs, killed dozens of products and accepted a $150 million lifeline from perennial bête noire Microsoft. (When Bill Gates made a remote guest appearance at the 1997 Macworld Expo keynote, looming on a video screen over Jobs, the audience booed.)

Jobs rolled out an advertising campaign — “Think Different” — that got people talking about the company again. And he presided over the release of the striking all-in-one iMac, which came in a translucent case crafted by Jonathan Ive, the British industrial designer who would be responsible for every major Apple product to come. In 1998, it became the best-selling computer in America.

Little by little, Jobs started acting less like a turnaround artist and more like a man who wanted, once again, to change the world. “Victory in our industry is spelled survival,” he told TIME in 2001, when Apple was still on the rebound. “The way we’re going to survive is to innovate our way out of this.”

In May of that year, Apple had opened retail locations in McLean, Va., and Glendale, Calif., the first of hundreds it would build. Big-box merchants rarely did a good job of explaining to consumers why they should choose a Mac over a cheaper Windows computer; now Apple could do the job itself, in the world’s least cluttered, most tasteful computer stores.

The single most important moment in Apple’s and Jobs’ redemption came six weeks after the 9/11 attacks. At a relatively low-key press event at Apple’s Cupertino, Calif., headquarters, Jobs explained that the company had decided to get into the MP3-player business. Then he pulled the first iPod out of his pocket. All of a sudden, Apple was a consumer-electronics company.

Soon it was an exceptionally successful consumer-electronics company. The iPod wasn’t much more than a tiny hard drive with a headphone jack and slick software, but it became a cultural touchstone, especially after Apple made it work with Windows PCs as well as Macs. Even its white earbuds became iconic. iPods gained the lion’s share of the digital-media-player market and never lost it.

At first, iPod owners got music by ripping their own CDs or sharing tracks via peer-to-peer networks like Kazaa. Apple, seeing a need for a simple, legal source of music, introduced the iTunes Music Store in 2003. Unlike earlier music services, iTunes offered a proposition of Jobsian elegant simplicity: songs were 99 cents apiece, and you could play them on up to three devices and burn them to CD. Music companies weren’t thrilled — they would have preferred higher prices and more restrictions — but consumers bought a million songs in the first week, and by 2008 they had purchased 4 billion of them.

Five years after Apple entered the music business, it surpassed Walmart to become the U.S.’s largest music retailer. By that time, iPods had screens capable of displaying video, and Jobs’ company was a major distributor of movies and TV shows as well.

As important as the iPod was, it was ultimately just a high-tech Walkman. The iPhone, unveiled at a Macworld Expo keynote in 2007, was something far more: a powerful personal computer that happened to fit in your pocket. “Every once in a while, a revolutionary product comes along that changes everything,” Jobs said in introducing it, a statement that — unlike some of the claims he’d been known to make at keynotes — turned out to be factual rather than fluffy. It instantly made every other smart phone on the market look antique.

For Jobs, it was a do-over: a chance to prevail in the PC wars that Microsoft had won the first time around. Typically, he responded not by aping the strategy that had worked so well for Microsoft but by being even more like Steve Jobs. Like the first Mac, the first iPhone had obvious deficiencies. For instance, it shipped with a poky 2G wireless connection just as 3G was becoming pervasive. But its software was so radically better than anything anyone had ever seen that it didn’t really matter.

In 2008, Apple introduced the App Store, which seamlessly delivered programs created by third-party developers for iPhones, giving Apple a 30% cut of all developer revenue along the way. The App Store was the only authorized way to get programs onto an iPhone; Apple regularly rejected programs that it deemed unsafe, offensive or disturbingly competitive with its own efforts. And yet the iPhone ended up with both the most apps and the best apps, making it hard to argue that Jobs’ tight control had stifled the creativity of app developers.

The iPhone had serious competition, especially from handsets that used Google’s Android operating system. But the iPhone ecosystem — phone plus apps, movies and music delivered through Apple services — contributed to Apple’s success in a way no other company could match. By 2011, Apple was selling more than 220,000 iPhones a day and, according to one analyst, capturing two-thirds of the industry’s profits.

In 2010, Apple followed up the iPhone with the iPad, its first effort in a category — tablet computers — that had existed for two decades without a single hit product. Apple sold 14.8 million iPads in 2010, a number that dwarfed the predictions of Wall Street analysts. (It also flummoxed competitors, who rushed into the market with rival tablets that were far less appealing, and sometimes much more expensive, than the real thing.) By then, it wasn’t surprising that Steve Jobs had surpassed almost everyone’s expectations; it would have been more startling if he hadn’t.

Apple’s business model at this point bore little resemblance to those of other computer makers. The rest of the industry was deeply decentralized: a consumer went to Best Buy to purchase an Acer computer running Microsoft software and then used it with Rhapsody’s music service and a SanDisk MP3 player. Tech support was typically outsourced to some nameless firm halfway around the world.

Apple had long ago stopped building its own stuff — one of its contract manufacturers, China’s Foxconn, earned its own measure of celebrity — but otherwise, it controlled what Steve Jobs called “the whole widget.” It wrote its own software, designed its own hardware and delivered such services as iTunes. It sold Macs, iPods and other products at its own stores, where face-to-face support was available for free at a “genius bar.” Once you owned an Apple device, you filled it with movies, music and apps from Apple’s online stores. The company even started designing its own processors for the iPhone and iPad. In short, it came as close as it possibly could to fulfilling the Jobs vision down to the last detail.

Jobs remained the difficult, demanding, sometimes unreasonable perfectionist Apple had thought dispensable a dozen years earlier. But the NeXT and Pixar experiences had instilled in him new discipline. He still pushed boundaries but in ways that more consistently worked in Apple’s favor. And working with chief operating officer Tim Cook, later to succeed him as CEO, he turned the company into a wildly profitable exemplar of efficiency.

More than any other major Silicon Valley company, Apple kept its secrets secret until it was ready to talk about them; countless articles about the company included the words “A spokesperson for Apple declined to comment.” It wasn’t able to stamp out all rumors, and in 2010, gadget blog Gizmodo got its hands on an unreleased iPhone 4 that an Apple engineer had left at a beer garden in Silicon Valley. Even if Apple detested such leaks, they became part of its publicity machine.

Minimalism came to typify Jobs’ product-launch presentations in San Francisco and at Apple headquarters as much as the products themselves. Jobs 1.0 was known for his bow tie and other foppish affectations. Jobs 2.0 had one uniform — a black mock turtleneck, Levi’s 501 jeans and New Balance 992 sneakers. With kabuki-like consistency, his keynotes followed a set format: financial update with impressive numbers, one or more demos, pricing information and one more thing. Even the compliments he paid to Apple products (“Pretty cool, huh?”) rarely changed much.

He generated hoopla with such apparent effortlessness that many people concluded he was more P.T. Barnum than Thomas Edison. “Depending on whom one talks to,” Playboy said, “Jobs is a visionary who changed the world for the better or an opportunist whose marketing skill made for an incredible commercial success.” It published those words in the introduction to a 1985 Jobs interview, but they could have been written last week.

Still, even Jobs’ detractors tended to think of him and his company as a single entity. Apple was demonstrably full of talented employees in an array of disciplines, but Jobs’ reputation for sweeping micromanagement was so legendary that nobody who admired the company and its products wanted to contemplate what it might be like without him. Shareholders were even more jittery about that prospect: a stock-option-backdating scandal that might have destroyed a garden-variety CEO barely dented his reputation.

Increasingly, though, the world was forced to confront the idea of a Jobs-free Apple. In 2004, he was diagnosed with pancreatic cancer and told he had months to live; further investigation showed it was a rare form of the disease that could be controlled. Jobs turned day-to-day control of Apple over to Cook, underwent surgery, recovered and returned to work. During a 2009 medical leave, he received a liver transplant. He went on another medical leave in 2011 that became permanent when he resigned as CEO on Aug. 25, assuming Apple’s chairmanship and handing off CEO duties to Cook.

Jobs was so obviously fundamental to Apple’s success that many feared the company’s amazing run would end the moment he was no longer calling every shot. Instead, Apple prospered during his illnesses and absences. By 2011, the vast majority of its revenues came from products that hadn’t existed when Jobs took his first medical leave. He had accomplished one of his most astounding feats: teaching an entire company to think like Steve Jobs.

Always happier praising a new Apple product than talking about his private life, Jobs said little about his struggles with ill health. He did, however, address them briefly in the Stanford commencement speech he gave in 2005. And as commencement speakers are supposed to do, he gave the students — most of whom were about the same age he was when he co-founded Apple — some advice. “Your time is limited, so don’t waste it living someone else’s life,” he said, sounding as if the very thought of living someone else’s life infuriated him. “Don’t be trapped by dogma, which is living with the results of other people’s thinking. Don’t let the noise of others’ opinions drown out your own inner voice. And most important, have the courage to follow your heart and intuition. They somehow already know what you truly want to become. Everything else is secondary.”

Steve Jobs’ heart and intuition knew what he wanted out of life — and his ambitions took him, and us, to extraordinary places. It’s impossible to imagine what the past few decades of technology, business and, yes, the liberal arts would have been like without him.
