
Technology: The Cybernated Generation


(See Cover)

Purring like contented kittens, the most remarkable support crew ever assembled kept unceasing vigil last week as Gemini spun through space with its two passengers. At Cape Kennedy and at the space complex in Houston, at Goddard Space Flight Center in Maryland and at 14 other sites from the Canary Islands to the South Indian Ocean, dozens of electronic computers guided, watched, advised and occasionally admonished the two astronauts. In fact, the Space Age’s first orbiting digital computer, a hatbox-sized model that can make 7,000 separate calculations a second, went along for the ride in Gemini. No space effort—American or Russian—had ever before made such extensive use of the computer, or depended more on it.

The computer is, in fact, the largely unsung hero of the thrust into space. Computers carefully checked out all Gemini’s systems before the launch, kept precise track of the spacecraft’s position in the heavens at every moment, plotted trajectories and issued precise commands to the astronauts. On their detailed instructions, the astronauts made the first change of orbit ever achieved in flight; computers not only designed the new orbit, but also told the command pilot at what time and for how long he should fire his thrusters to achieve it.

While man’s exploration of space would be impossible without computers, the biggest changes worked by these remarkable machines are taking place right on earth. Just out of its teens, the computer is beginning to affect the very fabric of society, kindling both wonder and widespread apprehension. Is the computer a friend or enemy of man? Will it cause hopeless unemployment by speeding automation, that disquieting term that it has brought into the language? Will it devalue the human brain, or happily free it from drudgery? Will it ever learn to think for itself?

The answers will not be in for quite a while, but one thing is already clear: swept forward by a great wave of technology, of which the computer is the ultimate expression, human society is surely headed for some deep-reaching changes.

“The electronic computer,” says Dr. Louis T. Rader, a vice president of General Electric, “may have a more beneficial potential for the human race than any other invention in history.” As viewed by Sir Leon Bagrit, the thoughtful head of Britain’s Elliott-Automation, the computer and automation will bring “the greatest change in the whole history of mankind.” The public, too, has begun to sense the power of the computer for good and evil. Cartoonists delight in giving computers robotlike stature and minds of their own that like to play tricks on ordinary mortals, and computers have been made the mute but decisive villains of three recent bestselling novels. The science of computers, called cybernetics after the Greek word for steersman, is the subject of an endless round of study and discussion devoted to pondering both the problems and opportunities that confront what social scientists call “the cybernated generation.”

Unto Caesar, unto God. As the most sophisticated and powerful of the tools devised by man, the computer has already affected whole areas of society, opening up vast new possibilities by its extraordinary feats of memory and calculation. It is changing the world of business so profoundly that it is producing a new era in Arnold Toynbee’s “permanent industrial revolution.” It has given new horizons to the fields of science and medicine, changed the techniques of education and improved the efficiency of government. It has affected military strategy, increased human productivity, made many products less expensive and greatly lowered the barriers to knowledge.

Arranged row upon row in air-conditioned rooms, waited upon by crisp, young, white-shirted men who move softly among them like priests serving in a shrine, the computers go about their work quietly and, for the most part, unseen by the public. Popping up across the U.S. like slab-sided mushrooms, they are the fastest growing element in the technical arsenal of the world’s most technologized nation. In 1951 there were fewer than 100 computers in operation in the U.S.; today 22,500 computers stand in offices and factories, schools and laboratories—four times as many as all the computers that exist elsewhere in the free world. Only eleven years ago, U.S. industry bought its first computer; today some single companies use as many as 200 computers.

The computer already has been put to work at more than 700 specific tasks, both mundane and exotic, from bookkeeping to monitoring underground nuclear explosions. Computers control the flow of electric current for much of the nation, route long-distance telephone calls, set newspaper type, even dictate just how sausage is made. They navigate ships and planes, mix cakes and cement, prepare weather forecasts, check income tax returns, direct city traffic and diagnose human—and machine—ailments. They render unto Caesar by sending out the monthly bills and reading the squiggly hieroglyphics on bank checks, and unto God by counting the ballots of the world’s Catholic bishops at sessions of the Ecumenical Council in St. Peter’s Basilica.

Formidable Adulthood. The whole science of cybernetics is now entering a new stage. In it, steadily more complex and powerful computers will be called upon to perform infinitely more varied and more difficult tasks. Boeing announced plans two weeks ago to outfit jetliners with computer-run systems that will land a plane in almost any weather without human help. A new “talking computer” at the New York Stock Exchange recently began providing instant stock quotations over a special telephone. In Chicago a drive-in computer center now processes information for customers while they wait, much as in a Laundromat. The New York Central recently scored a first among the world’s railroads by installing computer-fed TV devices that will provide instant information on the location of any of the 125,000 freight cars on the road’s 10,000 miles of track.

To perform their increasing tasks, computers are developing into formidable adulthood. Computermen claim that their machines are now entering a “third generation” in which the new science of microcircuitry and other advances in technology will enable them to reduce the bulkiness of computers, pack more ability into their frames and make them even more reliable and economical. Computers are now being banded together into “families”—compatible groups of machines, ranging from small to large, that are able to solve problems and perform functions from beginning to end by using a single language and program. To broaden the uses of computers, U.S. industry last year introduced 300 new “peripheral devices” to help out the machines—and will introduce at least as many this year.

“When these new machines realize their potential,” says John Diebold, chairman of the Diebold Group, Inc., consultants in the computer field, “there will be a social effect of unbelievable proportions. This impact on society is still to come.” Computermen have even been advised to get their machines out to “see life” in that society by setting up communications links between them and other computers in dispersed locations. Says R. M. Bloch, a vice president of Honeywell: “The computer that lacks an ability to communicate with the outside world is in danger of remaining an isolated marvel mumbling to itself in the air-conditioned seclusion of its company’s data-processing room.”

Baffling Blend. Much of the apprehension about the social effects of the computer arises from the machine’s baffling blend of complication and simplicity. Basically, the digital computer is nothing but an electronic machine that can do arithmetic and retrieve information with incredible speed—but that very speed makes it, in its way, superhuman. Inside the computer’s refrigerator-like cabinet dwells an intricate network of thin wires, transistors, and hundreds of thousands of tiny magnetized metal rings, all strung together into a memory- and arithmetic-processing unit. Each fact stored in the computer’s memory occupies a spot no bigger than the tip of a match, and the computer never forgets where it put any of them.

The computer receives its information, called input, from magnetic disks, magnetic tapes, punched cards or typewriter-like keyboards that feed the memory unit. Each fact is first translated into binary language, a system using two as a base instead of ten as in the decimal system, and then fed into the computer. Once it has received a given fact, the computer relays it to its memory unit via electronic impulses that “store” the numerically defined fact in several metal rings.
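Python did not exist in 1965, but the base-two translation described above can be sketched in it. The function name is invented for illustration; the machines of the period did this conversion in wired circuitry, not software:

```python
# Translate a decimal number into binary (base-two) digits by
# repeatedly dividing by two and keeping the remainders.
def to_binary(n):
    digits = []
    while n:
        digits.append(n % 2)  # the remainder is the lowest binary digit
        n //= 2
    return digits[::-1] or [0]  # reverse: most significant digit first

print(to_binary(87))  # [1, 0, 1, 0, 1, 1, 1], i.e. 1010111 in base two
```

Read back in base ten, 1010111 is 64 + 16 + 4 + 2 + 1 = 87, which is the form in which the fact sits in the metal rings.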

When someone wishes to solve a problem, he defines the problem in computer language—a combination of letters, numbers, punctuation marks and mathematical symbols. This is the part of computer science called programming, which is a way of telling a machine what to do with its information in order to achieve a desired result. As instructions are fed to the computer in this special language, the machine sends electric impulses coursing through its innards at the speed of light (186,300 miles per second), checking on each metal ring to see if it contains the information sought.

Basically, each metal ring, activated by the electric impulse, answers 1 or 0, meaning that it either does or does not represent a portion of the binary numeral sought. If the computer wishes to use the number 87, for example, it might get positive responses from the rings that make up the numbers 1, 2, 4, 16 and 64—for a total of 87—while receiving negative responses from the other rings. In a vast series of such instantaneous actions, thousands of transistors in the machine turn on and off in response to the electric impulses until the machine has assembled the information requested and performed the necessary calculations. The computer thus reaches a decision on demand by counting a great many combinations of binary numbers that stand for the coded and stored information. Today’s newest computers are capable of performing calculations in billionths of a second.
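The ring-by-ring lookup described above amounts to decomposing a number into powers of two. Here is a minimal modern sketch in Python; the function name is an invented illustration, not how any machine of the era was actually programmed:

```python
# Find the powers of two that sum to n -- the 'rings' that would answer 1.
def rings_for(n):
    powers = []
    value = 1
    while n:
        if n & 1:            # does this ring hold part of the number?
            powers.append(value)
        n >>= 1              # move on to the next ring
        value <<= 1          # each ring stands for twice the previous one
    return powers

print(rings_for(87))  # [1, 2, 4, 16, 64] -- exactly the rings named above
```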

Reliable & Cool. From the simple abacus of ancient times down through the mechanical adding machine, man has for centuries moved toward the computer. As early as 1671 Gottfried Leibniz sought unsuccessfully to invent a mechanical calculating machine. “It is unworthy of excellent men,” he wrote, “to lose hours like slaves in the labor of calculation.” In 1834 an eccentric Englishman named Charles Babbage conceived the idea of a steam-driven “Analytical Engine” that in many details anticipated the basic principles of modern computers. But not until 1944 did man invent the first true computer: the Mark I, developed by Harvard Professor Howard Aiken and used to compute weapon trajectories for the U.S. Navy.

By today’s standards, Mark I was as slow and awkward as a manual adding machine. In two years it was shoved aside by the University of Pennsylvania’s celebrated ENIAC, which, as the first electronic computer, used 18,000 vacuum tubes as circuits and quick-acting switches. Though they were a big advance, vacuum tubes proved too expensive, too unreliable and too bulky: ENIAC weighed 30 tons and took up 1,500 sq. ft. of floor space. Until 1954, when Remington Rand (now Sperry Rand) first sold its UNIVAC to industry, the few computers in the U.S. were largely experimental and custom designed.

Computers did not really hit their stride until transistors and other solid-state components—tiny, reliable and cool-running—took over from vacuum tubes in 1958. The state of the art has been speeded considerably by the U.S. military and its pressing demands for larger, faster computers. One of today’s computers can make more calculations in one hour than a Yankee Stadium full of scientists could make in a man’s lifetime. Some of the more sophisticated machines can multiply 500,000 ten-digit numbers in one second. Even if no further advances were made in computer technology, some scientists maintain, the computer has provided enough work and opportunities for man for another thousand years.

A Dynamic Alliance. Without the present generation of computers, man could never hope to reach the moon. The development of jet planes would have been delayed for many years. There would be no ballistic missiles or Polaris submarines. Scholars would still be struggling to decipher the Dead Sea Scrolls, a job of rapid indexing and analysis made possible by the computer. To process without computers the flood of checks that will be circulating in the U.S. by 1970, banks would have to hire all the American women between 21 and 45. If all the computers went on the blink, the country would be practically paralyzed: plants would shut down, finances would be thrown into chaos, most telephones would go dead, and the skies would be left virtually defenseless against enemy attack.

Despite his occasional fears about this dependence, man is joining with the machine in what IBM Chairman Thomas J. Watson Jr. calls “a dynamic alliance.” Business is what makes a nation run—and computers increasingly are what make business run. They not only handle such routine matters as paper work, payrolls, billing and inventories, but are also assuming a large role in production and decision making. Computers already control many of the production processes in the paper, petrochemical, petroleum and steel industries. At Western Electric’s “Plant of Tomorrow” in Kansas City, they control the billing, shipping and warehousing, order materials, write the checks to pay for them, decide what to produce and in what quantity. The New York City Bar Association last week staged a mock trial in which it subpoenaed computerized business records as evidence, thus raising questions about how to cross-examine a computer and whom to blame when a machine’s decisions cause a corporation to run afoul of antitrust laws.

The most expensive single computer system in U.S. business is American Airlines’ $30.5 million SABRE, a mechanical reservation clerk that gives instant up-to-the-minute information about every plane seat and reservation to American’s 55 ticket offices. International Harvester uses a computer to simulate driving conditions on the Ohio Turnpike and thus evaluate a truck’s probable life span. North American Aviation used computers to run 5,000 simulated test flights of the XB-70 before the plane ever got off the ground. Many large companies, especially in aerospace construction, are taking a lesson from the Navy’s PERT system (for Program Evaluation and Review Technique), which was used to analyze weekly progress reports from 10,000 contractors during the construction of the Polaris.

The world’s biggest single user of computers is the U.S. Government, which spends $1 billion a year to buy, rent and maintain 1,767 machines—not including most of the top-secret machines used by the Pentagon. “The electronic computer,” President Johnson said recently, “has enabled the Government to carry out programs which otherwise would have been impossible.” Computers make out 95% of the Government’s paychecks, keep track of all the G.I. shoes, socks and weapons all over the world, register the course, direction and speed of all shipping in the North Atlantic and, this year for the first time, have begun to check all business income tax returns and a third of individual returns. The White House is experimenting with computers to keep track of men and women available for high Government jobs.

Computers have helped scientists to discover more than 100 new subatomic particles, and are busy analyzing strange radio signals from outer space. Biochemists have used the computer to delve into the hitherto unassailable secrets of the human cell, and hospitals have begun to use it to monitor the condition of patients. Computers now read electrocardiograms faster and more accurately than a jury of physicians. The Los Angeles police department plans to use computers to keep a collection of useful details about crimes and an electronic rogue’s gallery of known criminals. And in a growing number of schools, computers have taken jobs as instructors in languages, history and mathematics.

Also the Seven Dwarfs. All this has meant fast growth for the large U.S. industry that makes computers. In the U.S. 21 companies now turn out the machines, and their production and sales ($5 billion last year) provide jobs for 650,000 people. This year they will put at least 8,000 more computers into operation. The industry offers 250 commercial models of machines, ranging in price from the $8,800 Data Systems DSI-1000 to the $4,300,000 Control Data 6600, and in size from the 59-lb. IBM computer aboard Gemini to machines as heavy as 180,000 lbs.

IBM is far and away the leader in the field, both in the U.S. and abroad. It has so far installed 13,000 computers in the U.S. and another 3,000 in Western Europe, where industry and laboratories are just beginning to computerize. The payoff: 74% of the U.S. computer market, a dominance that leads some to refer to the industry as “IBM and the Seven Dwarfs.” The dwarfs, small only by comparison with giant IBM: Sperry Rand, RCA, Control Data, General Electric, NCR, Burroughs, Honeywell. The computers have also spawned the so-called “software” industry, composed of computer service centers and independent firms that program machines and sell computer time (for as little as $10 an hour) to businesses that do not need a machine full time.

Because computer technology is so new and computers require such sensitive handling, a new breed of specialists has grown up to tend the machines. They are young, bright, well-paid (up to $30,000) and in short supply. With brand-new titles and responsibilities, they have formed themselves into a sort of solemn priesthood of the computer, purposely separated from ordinary laymen. Lovers of problem solving, they are apt to play chess at lunch or doodle in algebra over cocktails, and speak an esoteric language that some suspect is just their way of mystifying outsiders. Deeply concerned about logic and sensitive to its breakdown in everyday life, they often annoy friends by asking them to rephrase their questions more logically.

These men, ranging from the systems engineers at the top down to the machine operators, have made a pampered and all but adored child of the computer. Not content with having it perform wondrous feats in space and on earth, they are constantly trying to extend its capabilities. In the experimental milieu they have created, they have taught computers to play ticktacktoe, blackjack, checkers and a passable game of chess, instructed them to compose avant-garde music (the Illiac Suite at the University of Illinois), write simple TV westerns and whodunits, and even try their hand at beatnik poetry. Example:

The iron mother’s bouquet did rudely

Yes, I am as fine as many murmuring crates.

The problem with having a machine for a buddy, of course, is that it does not make a very good conversationalist—but the scientists are busy fixing that. Until now computer experts could only communicate with their machines in one of 1,700 special languages, such as COBOL (Common Business Oriented Language), Fortran (Formula Translation), MAD (Michigan Algorithmic Decoder) and JOVIAL (Jules’s Own Version of the International Algebraic Language). All of them are bewildering mixtures that only the initiated can decipher. Now some computers have reached the point where they can nearly understand—and reply in—plain English. The new Honeywell 20 understands a language similar enough to English so that an engineer can give it written instructions without consulting a programmer. The day is clearly coming when most computers will be able to talk back.

Mass Leisure. At least for now, the computer seems to raise almost as many problems as it solves. The most pressing and practical one is, of course, displacement of the work force. Each week, the Government estimates, some 35,000 U.S. workers lose or change their jobs because of the advance of automation. There are also thousands more who, except for automation, would have been hired for such jobs. If U.S. industry were to automate its factories to the extent that is now possible—not to speak of the new possibilities opening up each year—millions of jobs would be eliminated. Obviously, American society will have to undergo some major economic and social changes if those displaced by machines are to lead productive lives.

Men such as IBM Economist Joseph Froomkin feel that automation will eventually bring about a 20-hour work week, perhaps within a century, thus creating a mass leisure class. Some of the more radical prophets foresee the time when as little as 2% of the work force will be employed, and warn that the whole concept of people as producers of goods and services will become obsolete as automation advances. Even the most moderate estimates of automation’s progress show that millions of people will have to adjust to leisurely, “nonfunctional” lives, a switch that will entail both an economic wrench and a severe test of the deeply ingrained ethic that work is the good and necessary calling of man.

Liberated Brainpower. Ever since the Industrial Revolution, each major technological advance has caused unemployment, but society has somehow managed on each occasion to adjust and go forward. If the new technology eliminates many of the jobs that man has been accustomed to doing, it is also bound to expand greatly the level and variety of human wants. If U.S. farms had never mechanized, for instance, and thus displaced a large pool of labor, the U.S. would have been hard pressed for workers to develop its present industrial might. Says Dr. Yale Brozen, a University of Chicago economist: “Society uses whatever number of people it has. Seventy years ago, 50% of the population farmed. Now only 7% does. That enormous change took place in just a couple of generations.”

Automation is also certain to liberate both manpower and brainpower to tackle tasks hitherto considered impossible and to meet human needs till now deemed impractical. The world, after all, could certainly use a lot of improvement. “What the hell are we making these machines for,” says Dr. Louis Fein, a California computer consultant, “if not to free people?” Many scientists hope that in time the computer will allow man to return to the Hellenic concept of leisure, in which the Greeks had time to cultivate their minds and improve their environment while slaves did all the labor. The slaves, in modern Hellenism, would be the computers.

On the rocky road back to this Garden of Eden, a lot of people are bound to suffer for a while. But by gradually raising educational levels, retraining those displaced by automation, and seeing to it that displaced workers retain their buying power, society will somehow gradually manage to support the change. So far, the installation of computers in some industries has required so many new skills that the total unemployment level has hardly changed. In many cases, the computer has not meant an overall loss of jobs so much as a change in the type of jobs done. Says Sir Leon Bagrit: “Mechanization has sometimes given millions of people subhuman work to do. Automation does the exact opposite.”

Like a Ballerina. One area made mercilessly vulnerable by the computer is that of U.S. business management. The computer has proved that many management decisions are routine and repetitive and can be handled nicely by a machine. Result: many of the middle management jobs of today will go to computers that can do just about everything but make a pass at a secretary. As much as anything else, the computer is of great value to big business because it forces executives to take a hard, logical look at their own function and their company’s way of doing business. “Computers don’t take the risks out of business,” says Ted Mills of Manhattan’s Information Management Facilities Inc. “They just make the risks clearer.”

Though they have so far been largely excluded from management policy decisions, the men who run computer operations eventually are bound to have a bigger voice in business. Who else will understand the beasts? “There will be a small, almost separate society of people in rapport with the advanced computers,” predicts Donald M. Michael, a social psychologist at the Institute for Policy Studies in Washington. “They will have established a relationship with their machines that cannot be shared with the average man. Those with talent for the work will have to develop it from childhood and will be trained as intensively as the classical ballerina.”

Not even the computer experts, however, are quite sure what kind of thing they have created—or what its ultimate potential and limitations are. The computer, says Dr. Herbert A. Simon of Carnegie Tech, represents “an advance in man’s thinking processes as radical as the invention of writing.” Yet the computer is neither the symbol of the millennium nor a flawless rival of the human brain. For all its fantastic memory and superhuman mathematical ability, it is incapable of exercising independent judgment, has no sense of creativity and no imagination.

One of the great ironies of the computer is that it would rate as a low-grade moron if given an IQ test. “With a computer,” says Mathematician Richard Bellman of the Rand Corp., “everything is reversed. If a one-year-old child can do it, a computer can’t. A computer can calculate a trajectory to the moon. What it cannot do is to look upon two human faces and tell which is male and which is female, or remember what it did for Christmas five years ago.” Bellman might get an argument about that from some computermen, but his point is valid.

Learning About Life. Most scientists now agree that too much was made in the early days of the apparent similarities between computers and the human brain. The vacuum tubes and transistors of computers were easy to compare to the brain’s neurons—but the comparison has limited validity. “There is a crude similarity,” says Honeywell’s Bloch, “but the machine would be at about the level of an amoeba.” The neurons, which are the most important cells in the brain, number some 10 billion, and each one communicates with the others by as many as several hundred routes. So mysterious does the brain remain that few of the major connections among the neurons have been traced, and they may never be. It is clear, however, that each neuron is itself like a computer, and that eventually the idea that a machine has humanlike intelligence will become part of folklore. “We’ll laugh at the idea,” says Dr. Herbert Teager, an M.I.T. physicist, “as we do at Descartes’ theory that the pineal gland is the center of the mind.”

That is not to say that the computer cannot learn. Some computer experts believe that the machines can learn by themselves through trial and error, as children do, evaluating their mistakes and searching for better procedures. This is the heuristic approach: the method by which a computer acquires a knowledge of checkers or learns to play war games. In any case, nearly all experts agree that the computer will eventually achieve close symbiosis with man, more and more informing and reforming his entire society.
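That trial-and-error idea can be sketched in a modern language. In this illustrative Python program (the moves and their success rates are invented for the example), the machine is never told which of two moves is better; it discovers the answer by evaluating its own results, much as the heuristic approach described above:

```python
import random

random.seed(0)  # fixed seed so the sketch is repeatable

# Two possible 'moves'; move 1 succeeds far more often, but the
# machine is not told this -- it must learn it by trial and error.
success_rate = [0.3, 0.8]  # hidden from the learner
wins = [0, 0]
plays = [0, 0]

for trial in range(500):
    if random.random() < 0.1 or 0 in plays:
        move = random.randrange(2)  # explore: try a move at random
    else:
        # exploit: repeat whichever move has worked better so far
        move = 0 if wins[0] / plays[0] > wins[1] / plays[1] else 1
    plays[move] += 1
    if random.random() < success_rate[move]:
        wins[move] += 1

print(plays)  # by now, most plays have gone to the better move
```

Early on the machine plays both moves blindly; as its record accumulates, nearly all of its plays shift to the move that has rewarded it most often.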

Before that day arrives, the computer has quite a bit to learn about life. Put to picking ideal marriage partners, a computer has selected brother and sister. Asked, “How do amphibians protect themselves?”, a computer at California’s System Development Corp. typed out its reply: “Roman soldiers protected themselves by locking shields.” The computer had, of course, mindlessly confused key words. Though its memory units give it an impressive quantitative advantage, the computer is qualitatively inferior to any schoolboy.

Nonetheless, it is the closest that man has ever come to transferring his intellectual powers to machines, and its needs and accomplishments are sure to occasion a lot of debate. In a book written shortly before his death, M.I.T.’s Norbert Wiener, the father of cybernetics, said that “the reprobation attaching in former ages to the sin of sorcery attaches now in many minds to the speculations of modern cybernetics. The future offers us little hope for those who expect that our new mechanical slaves will offer us a world in which we may rest from thinking. Help us they may, but at the cost of supreme demands upon our honesty and our intelligence.”

Staggering Capacity. The most impressive fact about the age of the computer is how young it still is—and how little society has yet felt the full impact of the computer’s potential. In the years to come, computers will be able to converse with men, will themselves run supermarkets and laboratories, will help to find cures for man’s diseases, and will automatically translate foreign languages on worldwide TV relayed by satellite. Optical scanning devices, already in operation in some companies, will eventually enable computers to gobble up all kinds of information visually. The machines will then be able to memorize and store whole libraries, in effect acquiring matchless classical and scientific educations by capturing all the knowledge to which man is heir.

One of the most important computer innovations is the introduction of time sharing, in which many users across the nation have nearly simultaneous access to a central computer complex by teletype hookup. At M.I.T., Project MAC (for Machine-Aided Cognition) is already solving problems, answering questions and keeping books on an experimental basis for some 400 users. Scientists who know MAC’s language can feed their problems to the computer from typewriter-like keyboards in their own homes or labs. Thus, computers will eventually become as close to everyday life as the telephone—a sort of public utility of information.

The computer provides man with a staggering new capacity to discover, build, solve and think. Thomas L. Whisler, professor of industrial relations at the University of Chicago’s Graduate School of Business, points out that “it will change everybody’s life, and all change requires some effort and some cost to people.” So it must be: the computer is already upsetting old patterns of life, challenging accepted concepts, raising new specters to be conquered. Years from now man will look back on these days as the beginning of a dramatic extension of his power over his environment, an age in which technology began to recast human society. In the long run, the computer is not so much a challenge to man as a challenge for him: a triumph of technology to be developed, subdued and put to constantly increasing use.
