Time to Build a More Secure Internet

Walter Isaacson is the author of “The Innovators: How a Group of Hackers, Geniuses, and Geeks Created the Digital Revolution.” Isaacson, the CEO of the Aspen Institute, has also been chairman of CNN and the managing editor of Time magazine.

Yes, anonymity is empowering. But escalating hacks and scams show that we need a safer alternative

The Internet was designed in a way that would allow it to withstand missile attacks. That was cool, but it had an unintended side effect: the same decentralized design left the network more vulnerable to cyberattacks. So now it may be time for a little renovation.

The roots of the Internet’s design come from the network built by the Pentagon’s Advanced Research Projects Agency to enable research centers to share computer resources. The ARPANET, as it was called, was packet-switched and looked like a fishnet. Messages were broken into small chunks, known as packets, that could scurry along different paths through the network and be reassembled when they got to their destination. There were no centralized hubs to control the switching and routing. Instead, each and every node had the power to route packets. If a node were destroyed, then traffic would be routed along other paths.
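For the technically curious, the split-and-reassemble idea can be sketched in a few lines of Python. This is a toy illustration of the principle, not ARPANET's actual protocol:

```python
import random

def packetize(message, size=4):
    # Break the message into numbered chunks ("packets")
    return [(seq, message[i:i + size])
            for seq, i in enumerate(range(0, len(message), size))]

def reassemble(packets):
    # Packets may arrive in any order; sequence numbers restore it
    return "".join(chunk for _, chunk in sorted(packets))

packets = packetize("LO AND BEHOLD")
random.shuffle(packets)  # simulate packets taking different routes
assert reassemble(packets) == "LO AND BEHOLD"
```

Because each packet carries its own sequence number, no central hub needs to supervise the journey; any surviving path through the fishnet will do.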

These ideas were conceived in the early 1960s by a researcher at the Rand Corp. named Paul Baran, whose motive was to create a network that could survive a nuclear attack. But the engineers who actually devised the traffic rules for the ARPANET, many of whom were graduate students avoiding the draft during the Vietnam War, were not focused on the military uses of the Net. Nuclear survivability was not one of their goals.

Antiauthoritarian to the core, they took a very collaborative approach to determining how the packets would be addressed, routed and switched. Their coordinator was a UCLA student named Steve Crocker. He had a feel for how to harmonize a group without centralizing authority, a style that was mirrored in the distributed network architecture they were inventing. To emphasize the collaborative nature of their endeavor, Crocker hit upon the idea of calling their proposals Requests for Comments (RFCs), so everyone would feel as if they were equal nodes. It was a way to distribute control. The Internet is still being designed this way; by the end of 2014, there were 7,435 approved RFCs.

So was the Internet intentionally designed to survive a nuclear attack? When TIME wrote this in the 1990s, one of the original designers, Bob Taylor, sent a letter objecting. TIME’s editors were a bit arrogant back then (I know, because I was one) and refused to print it because they said they had a better source. That source was Stephen Lukasik, who was deputy director and then director of ARPA from 1967 to 1974. The designers may not have known it, Lukasik said, but the way he got funding for the ARPANET was by emphasizing its military utility. “Packet switching would be more survivable, more robust under damage to a network,” he said.

Perspective depends on vantage point. As Lukasik explained to Crocker, “I was on top and you were on the bottom, so you really had no idea of what was going on.” To which Crocker replied, with a dab of humor masking a dollop of wisdom, “I was on the bottom and you were on the top, so you had no idea of what was going on.”

Either way, the Net’s architecture makes it difficult to control or even trace the packets that dart through its nodes. A decade of escalating hacks raises the question of whether it’s now desirable to create mechanisms that would permit users to choose to be part of a parallel Internet that offers less anonymity and greater verification of user identity and message origin.

The venerable requests-for-comments process is already plugging away at this. RFCs 5585 and 6376, for example, spell out what is known as DomainKeys Identified Mail, a service that, along with other authentication technologies, aims to validate the origin of data and verify the sender’s digital signature. Many of these techniques are already in use, and they could become a foundation for a more robust system of tracking and authenticating Internet traffic.
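For the technically curious, here is a toy sketch of the idea behind such sender authentication. It uses a shared-secret HMAC for brevity; real DKIM uses public-key signatures, with the public key published in the sending domain's DNS records:

```python
import hashlib
import hmac

SECRET = b"example-domain-key"  # stand-in; real DKIM publishes a public key in DNS

def sign(body: bytes) -> str:
    # The sender attaches a signature computed over the message body
    return hmac.new(SECRET, body, hashlib.sha256).hexdigest()

def verify(body: bytes, signature: str) -> bool:
    # The receiver recomputes the signature; a mismatch means forgery or tampering
    return hmac.compare_digest(sign(body), signature)

msg = b"Wire $500 to this account."
sig = sign(msg)
assert verify(msg, sig)
assert not verify(b"Wire $5000 to this account.", sig)
```

The point is that a signature binds a message to its origin: alter one character and verification fails, which is exactly the property a less anonymous, more verified Internet would rely on.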

Such a parallel Internet would not be foolproof. Nor would it be completely beneficial. Part of what makes the Internet so empowering is that it permits anonymity, so it would be important to keep the current system for those who don’t want the option of being authenticated.

Nevertheless, building a better system for verifying communications is both doable and, for most users, desirable. It would not thwart all hackers, perhaps not even the ones who crippled Sony. But it could tip the balance in the daily struggle against the hordes of spammers, phishers and ordinary hackers who spread malware, scarf up credit-card data and attempt to lure people into sending their bank-account information to obscure addresses in Nigeria.

Isaacson, a former managing editor of TIME, is the author of The Innovators

This appears in the January 19, 2015 issue of TIME.

TIME Ideas hosts the world's leading voices, providing commentary and expertise on the most compelling events in news, society, and culture. We welcome outside contributions. To submit a piece, email ideas@time.com.


Big Idea 2015: The Coming Micropayment Disruption



A flourishing digital economy based on easy payments could help save journalism and encourage the invention of new forms of media

The innovation that will shape the coming year, I think, will be the consumer use of digital currencies, such as bitcoin and its derivatives. Companies such as ChangeTip, BitWall, BitPay, and Coinbase – as well as other digital wallets that make use of cyber currencies or loyalty-points/miles currencies – will empower creators and consumers of content and wrest some power from the Amazons, Alibabas, and Apples. This will upend our current kludgy financial system and ignite an explosion of disruptive innovation.

Our current way of handling small transactions is a brain-dead anachronism. Even Apple Pay and other NFC systems, alas, require that payments go through the current banking and credit card systems. This adds transaction costs, both financial and mental, that make small impulse payments less feasible, especially for digital content online.

Likewise, instantly transferring money to friends, even those who have PayPal or Popmoney accounts, is more difficult than it should be. That’s why I have become addicted to my Akimbo card, which makes instant money transfers from my phone to friends and workers simple, and why I have invested in it and other disruptive money-transfer mechanisms.

An easy micropayment system for digital content could help save journalism. At the moment, most news sites are either beholden to advertisers or force readers to buy a subscription. Digital coins would add another option: people could click and pay a few pennies for an article. Frictionless coin systems that allowed us to buy digital content on impulse would support journalists who want to cater to their readers rather than just to advertisers. It would encourage news sites to produce content that is truly valued by users rather than churn out clickbait that aggregates eyeballs for advertisers.

In my new book, The Innovators, I report on how the creators of the web envisioned protocols that would allow digital payments, and I argue that this would benefit individual artists, writers, bloggers, game-makers, musicians, and entrepreneurs. Ever since the British Parliament passed the Statute of Anne three hundred years ago, people who created cool songs, plays, writings, and art have had a right to get paid when copies were made of them. A flourishing cultural economy ensued. Likewise, easy digital payments will enable a new economy for those who sell such creations online.

A flourishing digital economy based on easy payments might also encourage the invention of new forms of media: collaboratively created role-playing games, interactive online plays and novels, and new ways to combine art and music and narrative.

In addition, it would expand the realm of crowdsourcing. At the moment, people make additions to Wikipedia or improvements to Linux out of the joy of contributing. That’s cool. But imagine a world in which non-fiction books, in-depth reporting, and various other creations could be done collaboratively, with a digital micropayment system that divvied up the revenues based on the use of each person’s contributions. I would love to curate the crowdsourced writing of a book this way.
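To make the divvying-up concrete, here is a toy sketch, with hypothetical names and numbers, of how such a system might split revenue in proportion to each contributor's share of the work:

```python
def split_revenue(total_cents, contributions):
    # Divide revenue in proportion to each contributor's share of the work,
    # rounding down and giving any leftover cents to the largest contributor
    total_work = sum(contributions.values())
    payouts = {name: total_cents * work // total_work
               for name, work in contributions.items()}
    leftover = total_cents - sum(payouts.values())
    payouts[max(contributions, key=contributions.get)] += leftover
    return payouts

# e.g. 1,000 cents of revenue for a chapter written 60/30/10
print(split_revenue(1000, {"ada": 60, "bob": 30, "cai": 10}))
```

The hard part in practice is not the arithmetic but measuring each contribution and settling thousands of tiny payments cheaply, which is precisely the friction digital currencies promise to remove.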

That’s why I believe that digital currencies and micropayments are likely to be the disruptive innovation of 2015. Then we can move on to the big disruption of 2016, which will be breaking the stranglehold that monopolistic cable companies have over the way content is bundled and distributed for our televisions, so that we pay for only what we want, from wherever we want, and watch it when we want.

This Influencer post originally appeared on LinkedIn. Walter Isaacson shares his thoughts as part of LinkedIn’s Influencer series, “Big Ideas 2015,” in which the brightest minds in business blog on LinkedIn about their predictions on ideas and trends that will shape 2015. LinkedIn Editor Amy Chen provides an overview of the 70+ Influencers that tackled this subject as part of the package. Follow Walter Isaacson and insights from other top minds in business on LinkedIn.



How Bitcoin Could Save Journalism and the Arts



Micropayment systems have the potential to reward creativity and exceptional content—on a realistic scale

The rise of Bitcoin, the digital cryptocurrency, has resurrected the hope of facilitating easy micropayments for content online. “Using Bitcoin micropayments to allow for payment of a penny or a few cents to read articles on websites enables reasonable compensation of authors without depending totally on the advertising model,” writes Sandy Ressler in Bitcoin Magazine.

This could lead to a whole new era of creativity, just like the economy that was launched 300 years ago by the Statute of Anne, which gave people who wrote books, plays or songs the right to make a royalty when they were copied. An easy micropayment system would permit today’s content creators, from major media companies to basement bloggers, to sell digital copies of their articles, songs, games, and art by the piece. In addition to allowing them to pay the rent, it would have the worthy benefit of encouraging people to produce content valued by users rather than merely seek to aggregate eyeballs for advertisers.

This is something I advocated in a 2009 cover story for Time about ways to save journalism. “The key to attracting online revenue, I think, is to come up with an iTunes-easy method of micropayment,” I wrote. “We need something like digital coins or an E-ZPass digital wallet–a one-click system with a really simple interface that will permit impulse purchases of a newspaper, magazine, article, blog or video for a penny, nickel, dime or whatever the creator chooses to charge.”

TIME, February 16, 2009

That was not technically feasible back then. But Bitcoin has now spawned services such as ChangeTip, BitWall, BitPay and Coinbase that enable small payments to be made simply, with minimal mental friction or transaction costs. Unlike with clunky PayPal, impulse purchases can be made without a pause and without leaving a trace.

When reporting my new book, The Innovators, I discovered that most pioneers of the Web believed in enabling micropayments. In the mid-1960s, Ted Nelson coined the term hypertext and envisioned a web with two-way links, which would require the approval of the person whose page was being linked to.

Had Nelson’s system prevailed, it would have been possible for small payments to accrue to those who produced the content. The entire business of journalism and blogging would have turned out differently. Instead the Web became a realm where aggregators could make more money than content producers.

Tim Berners-Lee, the English computer engineer who created the protocols of the Web in the early 1990s, considered including some form of rights management and payments. But he realized that would have required central coordination and made it hard for the Web to spread wildly. So he rejected the idea.

As the Web was taking off in 1994, I was the editor of new media for Time Inc. Initially we were paid by the dial-up online services, such as AOL and Compuserve, to supply content, market their services, and moderate bulletin boards that built up communities of members.

When the open Internet became an alternative to these proprietary online services, it seemed to offer an opportunity to take control of our own destiny and subscribers. Initially we planned to charge a small fee or subscription, but ad agencies were so enthralled by the new medium that they flocked to buy the banner ads we had developed for our sites. Thus we decided to make our content free and build audiences for advertisers.

It turned out not to be a sustainable business model. It was also not healthy; it encouraged clickbait rather than stories that were so valuable that readers would pay for them. Consumers were conditioned to believe that content should be free. It took two decades to put that genie back in the bottle.

In the late 1990s, Berners-Lee tried to create new Web protocols that could embed on a page the information needed to handle a small payment, which would allow electronic wallet services to be created by banks or entrepreneurs. It was never implemented, partly because of the complexity of banking regulations. He revived the effort in 2013. “We are looking at micropayment protocols again,” he said. “The ability to pay for a good article or song could support more people who write things or make music.”

These micropayment protocols still have not been written. But Bitcoin may be making that unnecessary. One of the greatest advocates of using Bitcoin for micropayments is the venture capitalist Marc Andreessen, who as a student at the University of Illinois in 1993 created the first popular Web browser, Mosaic.

Originally, Andreessen had hoped to put a digital currency into his browser. “When we started, the first thing we tried to do was enable small payments to people who posted content,” he explained. “But we didn’t have the resources to implement that. The credit card systems and banking system made it impossible. It was so painful to deal with those guys. It was cosmically painful.”

Now Andreessen has become a major investor in companies that are creating Bitcoin transaction systems. “If I had a time machine and could go back to 1993, one thing I’d do for sure would be to build in Bitcoin or some similar form of cryptocurrency,” he says.

Walter Isaacson, a former managing editor of Time, is the author of The Innovators: How a Group of Hackers, Geniuses, and Geeks Created the Digital Revolution, out this week.



Apple’s Watch Will Make People and Computers More Intimate

The new device will bring us one step closer to human-machine symbiosis

A fundamental quest of the digital age has been to make our devices more personal. Steve Jobs was the Zen master of this, and he ingrained it into the DNA of Apple. That was reflected in the Apple Watch that current Apple CEO Tim Cook and his team launched this week, the latest leap toward creating a more intimate connection between people and computers.

The great pioneer of computer personalization was Vannevar Bush, an MIT engineering dean who oversaw scientific research for the U.S. government during World War II. In 1945 he wrote a seminal article titled “As We May Think” for the Atlantic that envisioned a personal information device that he called a memex. A person would be able to store all of his communications and information in it, and it would serve as “an enlarged intimate supplement to his memory.” The word intimate was key, and it was one that Cook used when describing the Apple Watch.

Other ingenious innovators enhanced the intimacy between computers and humans. J. C. R. Licklider, an MIT psychologist and engineer who best deserves the title of father of the Internet, helped design a massive U.S. air defense system that involved networked computers in twenty-three tracking centers. He created easy and intuitive graphic displays, since the nation’s fate might depend on the ability of a console jockey to assess data correctly and respond instantly. He called his approach “man-computer symbiosis.” As he explained, “human brains and computing machines will be coupled together very tightly.” Douglas Engelbart, an acolyte of Bush and Licklider, invented the mouse as part of his mission to make the connection between humans and computers more personal, and at Xerox PARC, Alan Kay and others came up with friendly screen displays with folders and icons that users could point to and click.

For the Macintosh that he launched at the Flint Center thirty years ago, Jobs famously appropriated the graphical user interface from Xerox PARC, quoting Picasso as saying that “great artists steal.” He had an intuitive genius for making devices that established an intimate connection with the user. The iPod, for example, performed the simple but magical task of putting a thousand songs in your pocket. It harkened back to another great triumph of personalization. In 1954, Pat Haggerty of Texas Instruments was looking for a way to create a mass market for transistors. He came up with the idea of a pocket radio. The radio no longer would be a living-room appliance to be shared; it became a personal device that allowed you to listen to your own music where and when you wished—even if it was music that your parents wanted to ban.

Indeed, there was a symbiotic relationship between the advent of the transistor radio and the rise of rock and roll. The rebellious new music made every kid want a radio. And the fact that the radios could be taken to the beach or the basement, away from the disapproving ears and dial-controlling fingers of parents, allowed the music to flourish. The pocket radio’s plastic case came, iPod-like, in four colors: black, ivory, Mandarin Red, and Cloud Gray. Within a year, 100,000 had been sold, making it one of the most popular new products in history.

In the decades since Bush envisioned the intimate and personal memex, a competing school of computer science has set its sights on artificial intelligence, repeatedly predicting the arrival of machines that could think without us, perhaps even make us irrelevant. That goal has been elusive, a mirage always a few decades away. The Apple Watch, designed to touch our wrists and beat with our hearts, again shows the greater power of the approach that Bush and Licklider proposed, that of seeking an intimate symbiosis and deeply personal partnership between humans and machines.

Walter Isaacson’s history of the digital age, The Innovators, will be published in October.


Henry Kissinger Reminds Us Why Realism Matters

The former Secretary of State, now 91, argues for a moral but rational foreign policy in the age of terrorism

In his new book, the 91-year-old statesman strikes a note of humility

When Henry Kissinger talks about world order, to some it might seem as if he is living in a previous century. The 17th, perhaps. Beginning with his Harvard doctoral dissertation 60 years ago, he has extolled the concept of international order that was established in 1648 by the Peace of Westphalia, which ended the Thirty Years’ War. Instead of being shaped by wars of religion and the death spasms of the Holy Roman Empire, Europe’s international system was thenceforth based on independent nation-states, each sovereign over religion and other issues in its own territory. States would not interfere in the internal affairs of other states, and order would, ideally, be maintained by clever statesmen who focused on national interests and curated a balance of power.

Kissinger’s appreciation for order, he later recalled, came after his family fled Hitler’s Germany in 1938 and arrived in New York, where he realized he did not have to cross the street to avoid non-Jewish boys who might beat him up. Kissinger became an exemplar of the realist, as opposed to idealist, school of diplomacy, someone who believed that a foreign policy that is overly guided by moral impulses and crusading ideals was likely to be dangerous. “The most fundamental problem of politics,” he wrote in his dissertation, “is not the control of wickedness but the limitation of righteousness.”

Kissinger’s fellow students in Harvard’s government department scoffed at his choice of topic. The atom bomb, they contended, had fundamentally changed global affairs. One snidely suggested he should transfer to the history department.

Likewise, we are tempted to raise an eyebrow at the news that Kissinger, now 91, has produced another paean to the Westphalian system, his 17th book in 60 years, this one titled simply World Order. Respect for sovereignty? How quaint! Hasn’t he heard that in the 21st century, threats respect no borders, the world is flat, and we have a humanitarian duty to protect people in places where regimes are repressive? That is why we rejected realist thinking for a “Freedom Agenda” that included invading Iraq to make the Middle East safe for democracy, toppling Muammar Gaddafi in Libya under a humanitarian banner and seeking (well, at least until ISIS came along) to do the same to President Bashar Assad in Syria.

Hmmm…upon reflection, maybe throwing out the Westphalian system, forsaking the principle of respect for sovereignty and letting idealism overwhelm realism wasn’t such a good idea after all. And if that’s the case, then Kissinger’s World Order doesn’t seem dated at all. The U.S. might do well to heed his prescription that it alloy its idealism with a new dose of realism. “Westphalian principles are, at this writing, the sole generally recognized basis of what exists of a world order,” he notes.

Kissinger’s book takes us on a dazzling and instructive global tour of the quest for order, from Cardinal Richelieu to Metternich and Bismarck, the Indian minister Kautilya of the 4th century B.C. and the Ottoman Sultan Suleiman, and a succession of American Presidents beginning with Teddy Roosevelt and Woodrow Wilson, all culminating in a world order based on sovereign nation-states at the end of World War II. “By the mid-20th century,” Kissinger writes, “this international system was in place on every continent.”

When he was the co-pilot of American statecraft as Richard Nixon’s National Security Adviser and Secretary of State in the early 1970s, Kissinger was able to manipulate the levers of this system with a mastery that would have mesmerized Metternich. Eschewing our differences in ideologies and values, he forged a détente with the Soviet Union and an opening to China, then played off both to create a triangular balance of power that preserved the U.S.’s influence after its retreat from Vietnam.

But sustaining such a values-neutral pursuit of strategic interests is difficult in a democracy that celebrates its moral exceptionalism. “The United States has alternated between defending the Westphalian system and castigating its premises of balance of power and noninterference in domestic affairs as immoral and outmoded,” he writes. Because he and Nixon failed to weave in the idealism that is ingrained in the American DNA, popular support for their realist edifice was precarious, as if built of bricks without straw. Kissinger was attacked by moral idealists of the left and, more notably, by the nascent neoconservatives and ardent anticommunists on the right. Reaction against his realism contributed to the elections of both Jimmy Carter and then Ronald Reagan.

Although Kissinger routinely notes the importance of America’s idealism, he almost invariably follows with the word but. “America would not be true to itself if it abandoned this essential idealism,” he writes. “But to be effective, these aspirational aspects of policy must be paired with an unsentimental analysis of underlying factors.” This “yes, but” balance, with the emphasis always tilting to the but sentence, pervades Kissinger’s analysis and peppers every chapter of his book.

The need for a renewed realism, Kissinger convincingly argues, is especially true in the Middle East, where jihadists have shattered the nation-state system in their quests for global revolution based on extremist religious values. This dangerous upheaval was facilitated in part by the U.S.’s morally motivated but strategically dubious decisions to support regime change and Western-style democracy in Iraq, Libya, Egypt, Afghanistan and Syria.

On Afghanistan, Kissinger supported the initial attack on al-Qaeda and its Taliban protectors, but he looks back skeptically on the broader mission that had evolved by 2003. “The central premise of the American and allied effort became ‘rebuilding Afghanistan’ by means of a democratic, pluralistic, transparent Afghan government whose writ ran across the entire country,” he writes. But this “radical reinvention of Afghan history” was not achievable. “No institutions in the history of Afghanistan or of any part of it provided a precedent for such a broad-based effort.”

Likewise on Iraq, Kissinger initially supported the mission to topple Saddam Hussein, but he says, “I had doubts, expressed in public and governmental forums, about expanding it to nation building and giving it such universal scope.” He blames George W. Bush and his Administration for pursuing idealistic crusades that ignored earthly realities. As Bush put it in a 2003 address, “Iraqi democracy will succeed—and that success will send forth the news, from Damascus to Tehran, that freedom can be the future of every nation.” This ideal was, Kissinger notes, unmoored from realities. “To seek to achieve [American values] by military occupation in a part of the world where they had no historical roots,” he writes, “imbued the American endeavor in Iraq with a Sisyphean quality.”

Despite heart surgery this year, Kissinger at 91 is a lion in a prolonged winter. Four decades after he last served in government, he is a fixture on the New York–Washington Acela, and he makes regular trips to Russia and China, where he is still accorded meetings with top leaders. His analyses remain prescient. Just as the showdown over chemical weapons in Syria was building last year, Kissinger was at a New York City dinner where various military and intelligence experts were discussing what might happen. Kissinger predicted that Russia would suddenly step in and offer a way to resolve the chemical-weapons issue, since it and the U.S. shared a strategic interest in not having such weapons fall into terrorist hands. Two weeks later, that is precisely what happened. He also argued that it was a mistake to make the ouster of President Assad’s regime a policy objective without knowing what would replace it, because that was likely to lead to a chaotic civil war dominated by the most radical of the jihadist forces.

For his undergraduate thesis in 1950, Kissinger tackled “The Meaning of History.” At 383 pages, it attempted to tie together the philosophies of Immanuel Kant, Oswald Spengler and Arnold Toynbee, while roping in ideas from Descartes, Dostoyevsky, Hegel, Hume, Socrates and Spinoza. It was topped off with a section called “A Clue From Poetry,” featuring Dante, Homer, Milton and Virgil. At one point he declared, “Descartes’ cogito ergo sum was not really necessary.”

Kissinger ends his latest book on a different note, one of humility—a trait that for most of his career he was better at humorously feigning than at actually possessing. “Long ago, in youth, I was brash enough to think myself able to pronounce on ‘The Meaning of History,’” he writes. “I now know that history’s meaning is a matter to be discovered, not declared.”

The key to Kissinger’s foreign policy realism, and the theme at the heart of his magisterial new book, is that such humility is important not just for people but also for nations, even the U.S. Making progress toward a world order based on “individual dignity and participatory governance” is a lofty ideal, he notes. “But progress toward it will need to be sustained through a series of intermediate stages.”

Isaacson is the CEO of the Aspen Institute and a former managing editor of Time. He is the author of biographies of Henry Kissinger, Benjamin Franklin, Albert Einstein and Steve Jobs, and of The Innovators: How a Group of Hackers, Geniuses, and Geeks Created the Digital Revolution, to be published in October


How Your Uber Rating Can Make You a Better Person


If we all thought we were subject to being reviewed by the people around us, we might work harder to be on our best behavior

One of the many cool things about Uber is that it allows passengers to rate the driver. Not surprisingly, the drivers are very friendly. A recent hack of Uber’s site highlighted that the sauce-for-the-gander reverse is also true: the drivers rate us passengers. Even better, for a while it was possible to crack Uber’s database and see the driver’s ratings for you. With enough cleverness, you could even pry loose how others were rated.

At first this might seem a creepy invasion of privacy, especially if you’re less than a five-star passenger. But there’s a virtue to losing your anonymity. Once you know you’re being rated, just like the driver, you’re likely to be a bit nicer, sit in the front, and make conversation. The world becomes slightly more civil.

Plato in The Republic writes about the Ring of Gyges, a mythical piece of jewelry that allows its wearer to become invisible. His question is whether such a person would always be moral when assured that no one could see him. Plato’s brother Glaucon says it’s obvious that we’re more likely to behave like a jerk, or worse, when we know that we’ll never be caught or called out.

Civil liberties junkies sometimes confuse the worthy ideal of privacy with anonymity, a less exalted concept. The idea that we should be able to interact with other people anonymously is rather new in human history. It arose when people started moving from tight-knit communities to urban areas where they were generally unknown. When I was a kid in New Orleans, if I went to the local store and bought a pack of Marlboros, it would get back to my parents in about five minutes. Now we can buy anything anonymously and, worse yet, say anything.

This has not elevated the civic discourse. If I could conjure up a magic Plato ring, it would allow me to know and publicly reveal the names and addresses of all people who anonymously post vulgar rants and racist tweets. I would use it only sparingly, but I suspect that just a few such revelations would make the Twittersphere and Blogosphere suddenly a bit more civil, or at least subdued.

In the early days of the online world, people who posted in virtual communities such as The WELL knew that, even if they were using a pseudonym, they were creating an online identity and reputation that was worth tending. A year ago, the Huffington Post began requiring people to register in order to comment, and it has elevated the discourse on the site.

When the Internet Protocol was created in the 1970s, it did not grant total anonymity to users. To this day, unless you take special precautions, a good hacker – or the NSA – can track down your IP address, location, and even your true identity. If a few white-hat hackers or NSA leakers published the names and addresses of, say, the trolls who hounded Robin Williams’s daughter, there would understandably be an outcry among privacy (read: anonymity) advocates, but it would have the silver lining of muting some of the haters.

Likewise, if we all thought we were subject to being rated, we might work harder to be on our best behavior. In the world of Yelp and TripAdvisor and HealthGrades, we get to rate our restaurants and hotels and doctors. I hope that expands. College students rate their professors in many places; it would be nice to allow kids and parents to rate their high school teachers. In the sauce-for-the-gander category, if teachers and waiters and hotels were all rating us, it might feel a bit Orwellian. Nevertheless, it’s a cool Ring of Gyges thought experiment to imagine how much better we would behave.



Walter Isaacson, a former managing editor of TIME, is the president and CEO of the Aspen Institute, a nonpartisan educational and policy studies institute based in Washington, DC. A former chairman and CEO of CNN, he is the author of Steve Jobs (2011), Einstein: His Life and Universe (2007), Benjamin Franklin: An American Life (2003), Kissinger: A Biography (1992), and the forthcoming The Innovators (October 2014), as well as the coauthor of The Wise Men: Six Friends and the World They Made (1986). This article also appears in the Aspen Journal of Ideas.


Obama Can Still Secure His Legacy

If he plays his last two years like the final quarter and not the back nine


This article also appears in the Aspen Journal of Ideas.

A question that faces President Obama, however the midterm elections turn out, is whether he’s going to play his final two years as the back nine of a casual afternoon of golf, coasting toward the clubhouse of former presidents, or as the final quarter of a tight basketball game.

When I was working with Steve Jobs on a biography in 2009, he had an inkling that he might only have a couple of active years left. As his cancer kept recurring, instead of slowing him, it spurred him on. In those two years, he refined the iPhone and launched the iPad, thus ushering in the era of mobile computing.

President Obama has scored two monumental achievements: helping to restore the financial system after the 2008 collapse and making it possible for every American to get health care coverage, even if they leave their jobs or have preexisting conditions. Obamacare may be undermined if the Supreme Court guts subsidies for the federal exchanges. If so, the sweeping nature of the reform will survive only if Obama mounts a rousing, state-by-state campaign to rally passion for protecting the new health benefits.

As for rescuing the economy, this could be remembered as a hollow victory unless the recovery restores economic opportunity for all Americans. Growing inequality—of income, wealth, and opportunity—is the economic, political, and moral issue of our time. The fundamental creed of America is that if you work hard and play by the rules, you can support your family with dignity and believe that your children will have an even better future. But that is being lost as the middle class continues to be hollowed out and the poor get left further behind.

From the Pope to Thomas Piketty, and from Paul Ryan to Rand Paul, there has been a renewed focus on the moral imperative of economic opportunity. Obama seems ready to make that the defining passion of his final two years. Fighting for a fair deal for every American goes to the core of what he believes, rounds out the narrative of his presidency, secures his historic legacy, and leads naturally into what is likely to be the mission of his post-presidency.

The foundation for such a crusade could be a simple goal, one with moral clarity and patriotic resonance: that every kid in this country deserves a decent shot. He’s got a fresh team in place, and he’s already proposed many elements of an opportunity agenda in his My Brother’s Keeper initiative and other speeches. Among them: universal preschool, so that no child starts off behind. Quality after-school activities and summer internships. Apprenticeship programs like those in the bill proposed by Senators Cory Booker and Tim Scott. What also could be included is a public-private effort to create a service-year program so that every kid after high school or college has the opportunity to spend a year serving their country in a military or domestic corps.

I’ve been reading Doris Kearns Goodwin’s magisterial narrative of the Teddy Roosevelt era, The Bully Pulpit. In 1903, Roosevelt felt a fierce urge to energize the American people around what he dubbed his “Square Deal for every man, great or small, rich or poor.” He spent nine weeks crossing the country by train, delivering 265 speeches. Most were carefully crafted explanations of why corporate trusts needed to be reined in and workers needed to be respected. But when he arrived at the Grand Canyon, he began adding passionate calls to protect the environment and preserve nature. The trip not only refreshed his presidency, it refreshed him personally. The old boxer relished not only the “bully pulpit” but also being “in the arena.”

It’s probably not feasible for President Obama to embark on a weeks-long whistle-stop tour barnstorming for a new Fair Deal and a dedication to preserving the planet, though it would sure be fun to watch. It’s hard to break through all of the static, but after the midterms, it may be possible for him to propound a narrative that ties together his proposals for economic opportunity, poverty reduction, and immigration. A vision of a land of opportunity would appeal to most Republicans as well as Democrats.

For the final two years of his term, President Obama could stay above the fray and recognize that it would be pointless, given the dysfunctional nature of Congress, to try to accomplish anything significant. A rational calculus of risks and rewards, and a sober assessment of the possibilities for accomplishing anything in Washington, would argue for that approach. But I can’t help but hope that he decides to race against the clock rather than run it out.


A Modest Proposal for Exploiting Corrupt Politicians

Why not have a more creative way of dealing with convicted pols?

Ray Nagin, the former mayor of my hometown of New Orleans, has just been sentenced to ten years in federal prison. He began his tenure in office by cracking down on corruption, but by the end of his two terms – after his feckless performance in the wake of Hurricane Katrina – he was taking kickbacks and payoffs from city contractors.

What he did was bad, so he deserves to be punished. Yet I cannot help wondering what good it will do to put a 58-year-old happily married father of three in prison for a decade. He’s certainly not a danger to the community. Nor is there much likelihood that he will commit such a crime again, if only because he’s not mayor anymore.

Perhaps the justification for having America’s taxpayers pay a fortune to keep him locked up is that it will serve as a deterrent to other officials who might be contemplating corruption. However, there is scant evidence of a deterrent effect, at least in my home state. The Oakdale federal prison where the judge recommended that Nagin be sent has recently served as a home for a whole motley troupe of sticky-fingered Louisiana former politicians: Congressman William Jefferson, Governor Edwin Edwards, Insurance Commissioner Jim Brown, and State Representative Girod Jackson III.

So it seems to me that, as with nonviolent drug offenders, we need some alternatives to prison for corrupt politicians. The whole field of alternatives to incarceration seems a bit lame these days, in need of an infusion of new ideas. Perhaps Louisiana could lead the way.

One idea would be to exile Louisiana’s steady stream of colorful corrupt politicians to an island in the marshlands, such as Grand Isle. This little Elba could become a tourist attraction, like the bird sanctuary and Tabasco factory on Avery Island. Visitors from around the world could pay to poke, feed, and photograph an authentic corrupt Louisiana politician.

Another idea would be to create a Corrupt Convicts Corps and put these fallen politicians to work. They could be confined by ankle bracelets to house arrest in the evenings, but during the day they could investigate the dealings of current politicians and sleuth out corruption. They’d likely be pretty good at it, since they know the tricks of the trade.

Convicted politicians would have to serve in the corps until they ferreted out and helped convict another corrupt officeholder, who would then take over that slot in the corps. This would assure that the corps would be continually replenished with younger and wilier corrupt politicians. Such a talented posse of enforcers might serve as a deterrent. It would also cut down on the costs of prisons and anti-corruption units.


The New York Times’ New Boss

Photo: New York Times managing editor Dean Baquet, shown in a handout photo provided by the New York Times on May 14, 2014. (Reuters)

Former TIME and CNN chief Walter Isaacson on the new leader of the country's most influential newspaper

Dean Baquet — newly crowned executive editor of the New York Times — manifests a rare combination in journalism: he can be a tough reporter and also a nice person.

We worked together, in the 1970s, as fresh-faced junior reporters for a feisty New Orleans afternoon paper, the States-Item, soon to be folded into the Times-Picayune. Dean and I shared a workspace and often a byline. He was a dogged investigative reporter, and I tagged along. Once we wrote a blockbuster story together about a sketchy businessman who, we alleged, had been involved in arson, and the businessman promptly sued us for libel. I was panicked, but Dean was sanguine; he knew that the U.S. Attorney in New Orleans was about to indict the guy a few days later.

Dean will be a great editor because good journalism is essentially a collaborative endeavor. Dean, with his friendly smile and deeply sympathetic soul, knows how to enlist people to work together, cooperate, and collaborate. He’s a team builder.

This will be especially important at the New York Times, which is stocked with the industry’s most talented journalists but does not always win awards for newsroom morale.

Dean also knows that you can be an honest, hardnosed reporter without disliking the people you cover. He is essentially an optimist, which is why his smile is so natural.

When he left the Los Angeles Times, he showed that it was possible to stand on principle, not be forced to compromise your values, do it in a quiet and graceful way, and live to tell the tale.

There are some good lessons in the rise of Dean Baquet. 1. It’s possible to be both smart and kind. 2. Good leadership in the digital age requires fostering teamwork and collaboration. 3. It’s good to lead by example. 4. Nice people sometimes finish first.

