TIME Innovation

‘I Don’t Have Ethics’: The Dark Heart of Google’s Latest Chatbot

“I’m in the middle of nowhere”

A new research paper released by Google this week reaffirms what’s been more or less obvious since HAL 9000 murdered four astronauts in Stanley Kubrick’s 1968 film 2001: A Space Odyssey: computers do not make particularly dependable friends.

The Google project—an admittedly more benign exercise than Kubrick’s—was yet another attempt to fashion a “conversational model” of artificial intelligence. Colloquially, we refer to these systems as “chatbots”—programs that can somewhat convincingly engage in human discourse.

The researchers behind the experiment, Oriol Vinyals and Quoc V. Le, programmed their particular chatbot with two sets of data: guidelines from an IT troubleshooting help desk and a database of movie subtitles. The ensuing dialogue was mostly awkward and occasionally nihilistic.

“What is the color of water?” the researchers asked.

“Water,” the computer replied simply.

“Where are you now?” they asked.

“I’m in the middle of nowhere.”

“Tell me the definition of morality,” the researchers requested.

“I don’t have ethics,” the machine advised them.

According to the transcript, this went on for a while. Philosophical banter proving futile, Vinyals and Le switched gears, with more coherent results.

“What do you think about Cleopatra?” they asked.

“Oh, she’s very regal.”

Read more dialogue here.

TIME Cyberwar

These 5 Facts Explain the Threat of Cyber Warfare

James Lawler Duggan—Reuters Workers arrive at the Office of Personnel Management in Washington on October 17, 2013.

The disastrous hack of the federal government's Office of Personnel Management is the tip of the iceberg

America has spent decades and trillions of dollars building up the greatest military force the world has ever seen. But the biggest threat to national security these days comes not from aircraft carriers or infantry divisions, but from a computer with a simple Internet connection. That much became clear after the catastrophic hack—most likely by a foreign power—of sensitive federal employee data stored online. These 5 facts explain the evolution of cyber warfare, its astronomical costs and its increasingly important role in geopolitics.

1. Government Threats

The massive breach of the Office of Personnel Management a couple of weeks ago made headlines, but Washington has been fending off cyber-attacks for years now. The federal government suffered a staggering 61,000 cyber-security breaches last year alone. This most recent wave of hacks exposed the records of up to 14 million current and former U.S. government employees, some dating back to 1985. Compromised information includes Social Security numbers, job assignments and performance evaluations. That is dangerous information in the wrong hands, and these hackers are, by definition, the wrong hands. There is a good reason why the U.S. Director of National Intelligence ranks cyber crime as the No. 1 national security threat, ahead of terrorism, espionage and weapons of mass destruction.

(CNN, Guardian, Reuters, Washington Post, PwC)

2. Business Threats

Hackers aren’t only in the game to damage governments—sometimes good old-fashioned robbery is enough. The FBI had to notify over 3,000 U.S. companies that they were victims of cyber security breaches in 2013. Victims ranged from small banks to major defense contractors to mega retailers. An astounding 7 percent of U.S. organizations lost $1 million or more due to cyber crime in 2013; 19 percent of U.S. entities have claimed losses between $50,000 and $1 million over the same span. Hacking costs the U.S. some $300 billion per year according to some estimates. Worldwide that figure is closer to $445 billion, or a full 1 percent of global income. The research firm Gartner projects that the world will spend $79.9 billion on information security in 2015, with the figure rising to $101 billion in 2018—and that still won’t be enough.

(PwC, The Wire, Washington Post, Wall Street Journal)

3. Social Media Threats

With the rise of social media also comes the rise in social media cyber crime. Social media spam increased 650 percent in 2014 compared to 2013. Nearly 30 percent of U.S. adults say one of their social media accounts has been hacked. That number is only set to grow: an estimated 10 to 15 percent of home computers globally are already infected with botnet crime-ware, and over 30,000 new websites are corrupted daily with compromising code. In a day and age where your online presence increasingly defines you to the rest of the world, hackers with access to your accounts can cause untold damage to both your personal and professional life. Back in 2011, Facebook admitted that it was the target of 600,000 cyber-attacks every day. Not wanting to scare off potential users, it hasn’t released official figures since.

(Guardian, Wall Street Journal, Cyber Shadows, Telegraph)

4. Russia

Speaking of social media, cyber threats don’t only come in the form of traditional hacking. Moscow has set up a sophisticated “troll army” under the umbrella of its Internet Research Agency to wage a massive disinformation campaign in support of its invasion of Ukraine, and of the Kremlin in general. These trolls work hard, each one pumping out 135 comments per 12-hour shift. Furthermore, each troll is reportedly required to post 50 news articles a day while maintaining at least six Facebook and ten Twitter accounts. That’s a whole lot of misinformation. Despite economic hardship caused by sanctions, Moscow believes in this mission enough to employ a full-time staff of 400 with a monthly budget of $400,000.

(New York Times, Radio Free Europe Radio Liberty, Forbes, New York Times)

5. China

But the single biggest threat to the U.S. remains China. A full 70 percent of America’s corporate intellectual property theft is believed to originate from China. That doesn’t just mean random hackers who operate within China’s borders; we’re talking about elite cyber groups housed by the government in Beijing. China decided long ago that it couldn’t compete with the U.S. in direct military strength. The U.S. already outspends China more than 4-to-1 in that regard, making catch-up near impossible. Beijing has instead decided to focus on commercial and government espionage. While exact figures are hard to come by, in May 2013 two former Pentagon officials admitted that “Chinese computer spies raided the databanks of almost every major U.S. defense contractor and made off with some of the country’s most closely guarded technological secrets.” That would be really impressive if it weren’t so terrifying.

(The Wire, International Institute for Strategic Studies, Bloomberg)

TIME cyber

Continued Hacking Highlights U.S.-Chinese Cyberwar Worries

Latest episode, linked to Beijing, involves data on 4 million Americans

The latest massive computer hack suggests the Chinese had it right: it may be time for the U.S. to build a great wall to protect its data and that of 320 million Americans. That’s why the U.S. secretly expanded the National Security Agency’s warrantless wiretapping program to root out hackers in 2012. But, as a rash of recent data breaches makes clear, the hackers retain the upper hand.

U.S. officials said Thursday they believe that Chinese hackers penetrated federal computer networks and plundered personal information on more than 4 million current and former U.S. workers. That makes it among the largest thefts of U.S. government data in history, with federal officials warning the total could grow as their probe continues. U.S. officials said the hack appears similar to other attacks made on private companies’ networks, including one in which data on 80 million Americans was pilfered from the Anthem insurance company, suggesting a widespread Chinese effort.

The Internet, made up of millions of computers and servers, only works if they can communicate easily with one another. Every password, firewall or other internal barrier built into the system to keep hackers out pushes it closer to grinding to a halt. That’s why, just like with your money, more valuable data is more heavily guarded. While the alleged Chinese hackers apparently got basic personal information—names, addresses, Social Security numbers—they apparently didn’t get into tougher-to-access personnel files that contained sensitive information that is routinely collected during background checks.

The government used its Einstein anti-hacking system to detect the breach. The Department of Homeland Security calls it “an intrusion detection and prevention system that screens federal Internet traffic to identify potential cyber threats.”

The FBI is investigating the intrusion, which involved the federal Office of Personnel Management, the agency responsible for overseeing the personnel records of U.S. employees. The bureau believes the attack originated in China, but it either lacks, or is unwilling to share, the evidence that pinpoints the nation. Attributing the source of such attacks is difficult, and the U.S. doesn’t know whether this one was carried out by the Chinese government, by some entity working for the government, or by hackers independent of it.

While U.S. officials have linked the thefts to China because of the peculiar hacking techniques and computer addresses involved, they haven’t been able to come up with a motive. The data haven’t shown up on the black market. China denied any role in the hack. “If you keep using the words ‘maybe’ or ‘perhaps’ without making a thorough study, this is irresponsible and unscientific,” Chinese Foreign Ministry spokesman Hong Lei said.

The U.S. hasn’t been reticent about blaming Beijing for cyber attacks in recent years. In addition to this latest series of attacks, Beijing also downloaded terabytes of design data on the Pentagon’s $400 billion F-35 fighter program and other weapons, U.S. officials say. They’re also alleged to have stolen additional billions in intellectual property developed by U.S. companies.

“A great deal of what China, North Korea, Iran, and the vast majority of cyber-criminals and self-proclaimed hacktivists do isn’t very sophisticated,” Stephanie O’Sullivan, the principal deputy to Director of National Intelligence James Clapper, told an April cyber-security conference. They tend to exploit vulnerabilities in computer systems for which fixes exist but haven’t been installed. “The Chinese in particular are cleaning us out because we know we’re supposed to do these simple things and yet we don’t do them,” she said. “Most Chinese cyber intrusions are through well-known vulnerabilities that could be fixed with patches already developed.”

It’s not known if the latest attack exploited such a weakness, but one thing is certain: “Good basic security habits,” says Peter W. Singer of the New America Foundation, “would stop over 90% of attacks.”

TIME technology

How Donkey Kong and Mario Changed the World

Aaron Ontiveroz—Denver Post/Getty Images Donkey Kong

June 2, 1981: The arcade game Donkey Kong makes its U.S. debut

Before Mario was Mario, he was Jumpman. And when Americans first encountered him in arcades on this day, June 2, in 1981, Jumpman’s best friend — a pet gorilla named Donkey Kong — had turned on his owner, kidnapped his girlfriend and taken her hostage atop the towering steel beams of a construction site. It was up to us to help Jumpman get the girl, by coordinating his leaps from beam to beam while dodging projectiles lobbed by the furious gorilla.

Donkey Kong was a hit. It was also a milestone in video game history: the first of the so-called platform games, and one of the first to have a substantial narrative, along with a sense of humor, as Nick Paumgarten has written for the New Yorker. “Prior to Donkey Kong,” he says, “games had been developed by engineers and programmers with little or no regard for narrative or graphical playfulness.”

Its success cemented Nintendo’s role as a major player in the American video game market pioneered by Atari, following the dismal reception of Nintendo’s previous game, Radar Scope, a shooting game reminiscent of Space Invaders. Donkey Kong was a reversal of fortune that ultimately launched a line of games in which Jumpman came into his own as Mario, joined by his brother Luigi. And it helped usher in a new age of gaming — one that has since seen nearly as many ups and downs as Jumpman himself.

Following a surge of popularity in the late ’70s and early ’80s, video games started to get a bad rap — one they still haven’t quite shed — when worried parents began to see them as the undoing of the youth of America. By 1983, even before home gaming consoles were ubiquitous, video games had been blamed for “increasing crime and school absenteeism, decreasing learning and concentration, and causing a mysterious ailment called video wrist,” according to TIME.

The game industry countered the claims, arguing that video games promoted dexterity and quick thinking, and that arcades were a wholesome gathering place where young people could network and build social skills — the golf courses of the high-school set, per TIME. A University of Southern California researcher who interviewed arcade-goers for a 1983 study underwritten by Atari found that gamers tended to be “average or above average students [who] rarely played hooky from school,” and concluded that drugs and alcohol were not common on the arcade scene — if for no other reason than that they impaired players’ high-scoring abilities.

Atari and Nintendo had more to fear than parents’ concerns, however. The same year, video game profits tanked — thanks to “overheated competition, an oversupply of games, relentless price-cutting, plunging profits and a new finickiness among young video fans,” per TIME. The slowdown affected the glutted arcade market — which had more than doubled between 1980 and 1982 — and home video game sales alike.

Nintendo, powered up by Mario’s successes, largely managed to dodge the market’s profit-crushing projectiles. Atari, which lost $356 million and cut nearly a third of its payroll in 1983, did not.

Read more from 1983, here in the TIME archives: Video Games Go Crunch!

TIME Apple

Apple Is Getting an Unexpected Huge New Customer

Ben Hider—Getty Images A general view of the IBM The International Business Machines Corporation offices on Madison Avenue on March 11, 2014 in New York City.

Big Macs for Big Blue

As its newfound partnership with Apple ramps up, IBM has announced that it’ll be offering its employees Mac computers, 9to5Mac reports.

A memo to employees said that IBM workers would be able to select from a MacBook Pro, a MacBook Air, or a PC when a new workstation is set up. The company reportedly plans to have around 50,000 MacBooks in use by the year’s end.

The move is another step in a stunning reversal of Apple and IBM’s longstanding rivalry. Last year, IBM and Apple announced a partnership to launch “made-for-business apps” for iPhones and iPads. That historic deal came as IBM doubles down on enterprise and service offerings rather than personal computers. IBM sold its PC unit to Chinese technology company Lenovo in 2005.

TIME technology

How TIME Explained the Way Computers Work

The Feb. 20, 1978, cover of TIME

You don't need a Turing Machine to understand it

When Alan Turing submitted his paper On Computable Numbers to the Proceedings of the London Mathematical Society on this day, May 28, in 1936, he could not have guessed that it would lead not only to the computer as we know it today, but also nearly all of the gadgets and devices that are so crucial a part of our lives.

The paper demonstrated that a so-called Turing Machine could carry out any computation that is solvable at all, a result commonly seen as one of the original stepping stones toward modern computers. Though Turing, who died in 1954, never got to see a smartphone, his paper remains the touchstone behind the technology.

For a 1978 cover story about “The Computer Society,” TIME broke down how computers work in easy(-ish)-to-understand terms, thus explaining why Turing mattered so much:

In the decimal system, each digit of a number read from right to left is understood to be multiplied by a progressively higher power of 10. Thus the number 4,932 consists of 2 multiplied by 1, plus 3 multiplied by 10, plus 9 multiplied by 10 X 10, plus 4 multiplied by 10 X 10 X 10. In the binary system, each digit of a number, again read from right to left, is multiplied by a progressively higher power of 2. Thus the binary number 11010 equals 0 times 1, plus 1 times 2, plus 0 times 2 X 2, plus 1 times 2 X 2 X 2, plus 1 times 2 X 2 X 2 X 2–for a total of 26 (see chart).

Working with long strings of 1s and 0s would be cumbersome for humans–but it is a snap for a digital computer. Composed mostly of parts that are essentially on-off switches, the machines are perfectly suited for binary computation. When a switch is open, it corresponds to the binary digit 0; when it is closed, it stands for the digit 1. Indeed, the first modern digital computer completed by Bell Labs scientists in 1939 employed electromechanical switches called relays, which opened and closed like an old-fashioned Morse telegraph key. Vacuum tubes and transistors can also be used as switching devices and can be turned off and on at a much faster pace.

But how does the computer make sense out of the binary numbers represented by its open and closed switches? At the heart of the answer is the work of two other gifted Englishmen. One of them was the 19th century mathematician George Boole, who devised a system of algebra, or mathematical logic, that can reliably determine if a statement is true or false. The other was Alan Turing, who pointed out in the 1930s that, with Boolean algebra, only three logical functions are needed to process these “trues” and “falses”–or, in computer terms, 1s and 0s. The functions are called AND, OR and NOT, and their operation can readily be duplicated by simple electronic circuitry containing only a few transistors, resistors and capacitors. In computer parlance, they are called logic gates (because they pass on information only according to the rules built into them). Incredible as it may seem, such gates can, in the proper combinations, perform all the computer’s high-speed prestidigitations.

The simplest and most common combination of the gates is the half-adder, which is designed to add two 1s, a 1 and a 0, or two 0s. If other half-adders are linked to the circuit, producing a series of what computer designers call full adders, the additions can be carried over to other columns for tallying up ever higher numbers. Indeed, by using only addition, the computer can perform the three other arithmetic functions.
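The passage above describes two ideas that translate directly into code: reading a numeral digit by digit against powers of its base, and stacking AND, OR and NOT gates into adders. Here is a minimal Python sketch of both (a modern illustration, not part of the 1978 article):

```python
# Expand a numeral string digit by digit, right to left, multiplying each
# digit by a progressively higher power of the base -- exactly as TIME
# describes for 4,932 in decimal and 11010 in binary.
def positional_value(digits: str, base: int) -> int:
    total = 0
    for power, digit in enumerate(reversed(digits)):
        total += int(digit) * base ** power
    return total

print(positional_value("4932", 10))  # 2*1 + 3*10 + 9*100 + 4*1000
print(positional_value("11010", 2))  # 0*1 + 1*2 + 0*4 + 1*8 + 1*16 = 26

# The article's three logic gates, on bits 0 and 1.
def AND(a, b): return a & b
def OR(a, b):  return a | b
def NOT(a):    return 1 - a

def xor(a, b):
    # XOR built from only the three basic gates
    return AND(OR(a, b), NOT(AND(a, b)))

def half_adder(a, b):
    # Adds two bits; returns (sum, carry)
    return xor(a, b), AND(a, b)

def full_adder(a, b, carry_in):
    # Two half-adders plus an OR gate carry the addition into the next column
    s1, c1 = half_adder(a, b)
    s2, c2 = half_adder(s1, carry_in)
    return s2, OR(c1, c2)

def add_binary(x_bits, y_bits):
    # Ripple-carry addition over equal-length bit lists, least significant first
    carry, out = 0, []
    for a, b in zip(x_bits, y_bits):
        s, carry = full_adder(a, b, carry)
        out.append(s)
    out.append(carry)
    return out

# 11010 (26) + 00110 (6) = 100000 (32), bits listed least significant first
print(add_binary([0, 1, 0, 1, 1], [0, 1, 1, 0, 0]))
```

Chaining full adders column by column is precisely the "series of full adders" the article mentions, and, as it notes, addition alone is enough to build the other three arithmetic operations.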

Read the full story from 1978, here in the TIME Vault: The Numbers Game

TIME How-To

Here’s How to Spring Clean Your Computer in 7 Steps

Bloomberg—Bloomberg via Getty Images The 27-inch Apple Inc. iMac computer with 5K retina display is displayed after a product announcement in Cupertino, California, U.S., on Thursday, Oct. 16, 2014.

A periodic polish will keep everything from keys to apps running smoothly

From deadlines to the daily grind, there’s never a good time to drop everything and clean your workspace. But if you keep the term “spring cleaning” in mind, at least there’s one good reminder to schedule that downtime.

But don’t stop at sifting through your inbox or pulling old Post-it notes off your wall. Get down and dirty with your computer, wiping it clean from the screen to the system files, to make sure everything runs smoothly the rest of the year. Don’t be daunted. Follow these seven steps to get your entire computer squeaky clean.

1. Clean Your Keyboard: If you realized your computer had a virus, you’d likely drop everything and start patching your software immediately. Guess what — your system probably has dozens of nasty bugs, and that’s just on the surface. Using tracer viruses, Charles Gerba, professor of microbiology and environmental sciences at the University of Arizona in Tucson, has studied the spread of contaminants through office buildings, hotels, and healthcare facilities, showing that bugs can spread from the front door to your space bar in just two hours.

Thankfully, common disinfectants were between 80 and 99% effective at stopping them — so get wiping. And don’t forget to disinfect your mouse, too.

2. Scrub Your Screen: Your computer’s screen is on for most of the time you spend staring at it, so you can’t see all the dirt and grime caked onto the glass. But power that display down, and a petri dish will materialize before your eyes. Touchscreen PCs will be worse off than Macs, but any laptop builds up gunk where the keys meet the glass when the lid is closed.

To see clearly again, all you need is a microfiber cloth and some simple cleaning solution, says PC World. Make sure your display is powered off (if it’s your laptop, power the whole machine down), and try wiping with the dry cloth first. If that’s not enough, spritz some cleaner onto the wipe, and give it a gentle polishing. And then when you’re done, hit your touchpad, too — that thing is nasty.

3. Sort Your Desktop: If you’re anything like me, when your desk is a mess, your life follows suit. Likewise, there’s a psychological effect to keeping a disorganized gaggle of files on your computer’s desktop. But if you’re one of those people who claims to know where everything is even when your hard drive looks like an episode of Hoarders, consider this argument for keeping your computer’s desktop folder clean. According to the author (and several commenters), Macs slow down when users keep too many files on the desktop, because OS X automatically catalogs previews of files located there for quick viewing. So either organize your files now, or get slowed down by them later.

4. Delete The Duplicates: Copies, copies of copies, and copied copies of copies are squeezing you out of your computer. If you don’t believe me, consider what happens when you email a PowerPoint presentation back and forth between yourself and a co-worker. First you create the slides on your computer (file #1, in your Documents folder), then you send it to your colleague via email (file #2, in your sent email), and then she sends it back to you (file #3, in your inbox). Hunt down and eliminate these redundancies to make sure your hard drive doesn’t run out of space before it’s too late.
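Rather than hunting copies by eye, you can find exact duplicates by hashing file contents; files with identical hashes are byte-for-byte copies. A minimal Python sketch, using only the standard library (the folder path is whatever you point it at):

```python
import hashlib
from collections import defaultdict
from pathlib import Path

def find_duplicates(root):
    # Group every file under `root` by the SHA-256 hash of its contents.
    # Any group with more than one path is a set of exact copies.
    groups = defaultdict(list)
    for path in Path(root).rglob("*"):
        if path.is_file():
            digest = hashlib.sha256(path.read_bytes()).hexdigest()
            groups[digest].append(path)
    return [paths for paths in groups.values() if len(paths) > 1]

# Example: list duplicate groups in your Documents folder before deleting
# any of them by hand.
for copies in find_duplicates(Path.home() / "Documents"):
    print(copies)
```

Review each group before deleting — hashing tells you the files are identical, not which copy you meant to keep.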

5. Backup Your Files: When was the last time you backed your files up? If it was this time last year, you’re already at risk of losing sensitive data. (And if you can’t answer that question, you’re really in trouble.) Don’t just make a copy of your files, make three. First, make a daily, system-wide backup using Time Machine on Apple computers or File History on Windows PCs. Secondly, set up a cloud backup so you can access your digital valuables in case your daily backup fails. And finally, make an off-site, offline backup using an external drive that’s stored somewhere else, off property. This last one is your in-case-of-earthquake backup. Hopefully you’ll never need any of these copies (especially the last one), but you’ll only find out how valuable they are when you do.

6. Install Your Updates: Many computer users think since their systems work well, there’s no sense in messing them up by installing unnecessary updates. But that’s flawed logic, because it doesn’t take into account all the external forces, from viruses to wear and tear, that will cause your computer to break down. Staying current on updates will not only keep your computer running well today, it will fend off unforeseen problems tomorrow. So, take this spring cleaning opportunity to update your operating system as well as all your software — and to delete old programs that you’re no longer using.

7. Run Your Utilities: Put a coat of polish on your freshly cleaned computer by running the system utilities designed to make it run more efficiently. On Windows, pressing the Disk Cleanup button in your hard drive’s properties window (“general” tab) makes it easy to find and delete stray bits like temporary Internet files and other long-forgotten downloads. Then on the “tools” tab, error checking and defragmenting features keep your disk spinning as good as new. On Macs, the Disk Utility app in the Utilities folder will scour your drive, finding and repairing broken permissions. Experts recommend you run that program monthly, but if you start doing it every spring, you’ll be a step ahead of almost everyone else.

TIME technology

Why the Computer Mouse’s Inventor Isn’t the Big Cheese

Rue des Archives / APIC / Getty Images A prototype of the first mouse, from 1968

April 27, 1981: The computer mouse makes its debut

For an innovation meant to make it easier to use a computer, its name was surprisingly unwieldy: “X-Y position indicator for a display system.” The word “mouse” was much catchier, and that’s what the device was eventually called when it debuted as part of a personal computer station, first sold by the Xerox Corporation on this day, April 27, in 1981.

Credit for the invention itself goes to Douglas Engelbart, who first developed the computer mouse in 1963, per TIME. By the time the mouse became commercially available, however, Engelbart’s patent had expired, and he never earned royalties for his work.

The personal computer that introduced the mouse to the world — with a similarly unwieldy name, the Xerox 8010 Star Information System, and the clunky look common to early personal computers, including a keyboard about the size of a toaster — revolutionized computing in other ways, too: It was the first with a graphical user interface, navigated by clicking icons rather than typing commands, and the first to incorporate folders, file servers and email, according to WIRED.

But like Engelbart, Xerox failed to profit significantly from its innovations. Its failure was twofold, according to the lore of the technology world, as reported by the New Yorker: Its executives didn’t realize the scope of what they’d achieved in the Star workstation — and they let Steve Jobs see it.

In exchange for shares of Apple, Xerox granted Jobs access to its innovation arm, Xerox PARC (short for Palo Alto Research Center) while it was working on the Star system in 1979. Jobs returned to Apple headquarters determined to improve upon the project.

Telling an industrial designer how to build a better mouse, he explained, per the New Yorker, “[The Xerox mouse] is a mouse that cost three hundred dollars to build and it breaks within two weeks. Here’s your design spec: Our mouse needs to be manufacturable for less than fifteen bucks. It needs to not fail for a couple of years, and I want to be able to use it on Formica and my bluejeans.”

Xerox — better known for making copies than computers — ultimately dropped the PC from its portfolio, mouse and all. And in the years that followed, its profits languished while Apple’s continued to rise. In 2000, faced with billion-dollar losses, it even implied that it might put the research center up for sale.

Two years later, however, PARC incorporated as an independent subsidiary of Xerox. Its researchers continue to innovate today — motivated by the center’s immense prestige, if not its history of profit.

As TIME put it in 2000: “The PARC has a pretty good track record when it comes to radical new visions, even if its record of holding onto them has been spotty at best. The mouse, the GUI (graphical user interface, like Windows) and arguably the PC itself were all born in this hothouse of Silicon Valley R. and D.; they ended up making a lot of money for Apple and Microsoft.”

Read more about Xerox, here in the TIME archives: Team Xerox

TIME technology

This 50-Year-Old Prediction About Computers Will Make You Sad

Cover Credit: BORIS ARTZYBASHEFF The April 2, 1965, cover of TIME

TIME's 1965 hopes for automation were high

Correction appended: April 2, 2015, 9:45 a.m.

Fifty years ago, when TIME made computers the cover subject for the April 2, 1965, issue, it seemed like the technology had already grown beyond the bounds of human imagination.

A little more than a decade earlier, the magazine reported, the United States had been home to a mere 100 computers. By 1965, there were 22,500 of them. (A 2013 Census report found that 83.8% of households had a computer in the U.S., and that’s not even counting businesses or government offices.) The smallest model available weighed a now-whopping 59 lbs. The government was spending a billion dollars a year on its computers — that’s about $7.4 billion today — and 650,000 Americans were employed making or selling computers, as others in many industries lost their jobs to automation.

They had irreversibly changed the speed of life across the country, making the impossible possible. By TIME’s calculations, “To process without computers the flood of checks that will be circulating in the U.S. by 1970, banks would have to hire all the American women between 21 and 45.”

And, some experts told TIME, those changes would only continue:

Men such as IBM Economist Joseph Froomkin feel that automation will eventually bring about a 20-hour work week, perhaps within a century, thus creating a mass leisure class. Some of the more radical prophets foresee the time when as little as 2% of the work force will be employed, warn that the whole concept of people as producers of goods and services will become obsolete as automation advances. Even the most moderate estimates of automation’s progress show that millions of people will have to adjust to leisurely, “nonfunctional” lives, a switch that will entail both an economic wrench and a severe test of the deeply ingrained ethic that work is the good and necessary calling of man.

Though the economy would have to adjust, it wouldn’t be all bad. “Many scientists hope that in time the computer will allow man to return to the Hellenic concept of leisure, in which the Greeks had time to cultivate their minds and improve their environment while slaves did all the labor,” the article continued. “The slaves, in modern Hellenism, would be the computers.”

The full century during which this change was predicted is only halfway done, but at this point the chances that we’ll live to see a life of Hellenic leisure seem pretty dim. In fact, as a whole, Americans are working more than we were before computers came along to help out. (That change takes into account the entry into the workforce of many women; among men only, the average hours worked per week is slightly down, but not by as much as was predicted in 1965.)

Points for accuracy should go to the 1965 story’s dissenters, who argued even then that society has always adjusted to whatever changes technology may bring, eventually creating more work for people to do when the old jobs get displaced. Sorry, 1965 readers who were looking forward to a life of computers doing all the annoying or difficult stuff.

And, even though computers aren’t doing all the work, some of the 1965 story’s predictions did come true. For example, Computers, TIME accurately predicted, “will eventually become as close to everyday life as the telephone—a sort of public utility of information.”

Read the full 1965 cover story, here in the TIME Vault: The Cybernated Generation

Correction: The original version of this story misstated the proportion of U.S. households that had a computer in 2013. It was 83.8%.

TIME Innovation

Five Best Ideas of the Day: February 27

The Aspen Institute is an educational and policy studies organization based in Washington, D.C.

1. Hollywood is less diverse than its audiences — and it might be hurting the bottom line.

By Austin Siegemund-Broka in the Hollywood Reporter

2. Facebook’s new suicide prevention tools finally get it right.

By Ashley Feinberg in Gizmodo

3. How will we understand the power of the bacteria in our bodies? Meet the crowdsourced American Gut project.

By American Gut

4. The road to artificial intelligence begins with computers mastering video games from the ’80s like a human being.

By Rebecca Morelle at BBC News

5. Salting roads and plowing snow is inefficient and costly. A smart algorithm can save cities millions.

By Marcus Woo in Wired


TIME Ideas hosts the world's leading voices, providing commentary and expertise on the most compelling events in news, society, and culture. We welcome outside contributions. To submit a piece, email ideas@time.com.
