Video Games

Beyond Earth Interview: ‘No Civilization Game Would Be Made Without Sid. He’s the Guy.’

Firaxis

Turn-based science fiction games are scarce in gaming’s history, much less ones with insight into the history of the genre. There’s Julian Gollop’s X-COM (or Jake Solomon’s XCOM, the recent reboot), Brad Wardell’s Galactic Civilizations, Steve Barcia’s Master of Orion and the odd Civilization mod, but the one I’d wager most remember the fondest is Sid Meier’s Alpha Centauri from 1999.

Sid Meier’s Civilization: Beyond Earth is a spiritual sequel to the latter, a 4X (eXplore, eXpand, eXploit, and eXterminate) game that trades its namesake’s traditional obsession with things that’ve already happened for things that have yet to happen. It’s a game whose design team sounds as intrigued by the ramifications of our post-human future as they are obsessed with folding such heady concepts into a compulsive piece-pusher — something worthy both of the “one more turn” cliche and sci-fi’s legacy of stirring, often subversive fiction.

Firaxis unveiled the game at PAX East on April 12 — it’s due later this year. You can watch the PAX East panel’s announcement here:

This is the second part of a two-part interview, here with the game’s lead designer David McDonough and lead producer Lena Brenk. The first part — with gameplay designer Anton Strenger, Sid Meier’s Civilization series senior producer Dennis Shirk and associate producer Pete Murray — is here.

In Beyond Earth, you lead different factions with contrasting cultures. One of the critiques of Civilization V’s take on culture was that it felt like a second tech tree instead of a feature unto itself. How does culture work in Beyond Earth?

David McDonough: There’s a system called virtues, which is an expression of what your civilization cares about, so who they grow up to be, what their priorities are and so forth. It’s been totally redesigned for this game, meaning it’s different from any previous Civilization. Culture drives the acquisition of items within a virtue table, and those items have a lot of cross-linking benefits in and out of other systems in the game — everything from city progression to tile improvement to military strategies to territorial acquisition and diplomacy and so on.

Lena Brenk: The way Anton designed it, the trees are a lot deeper, so you have a tree that you can follow down, the whole column through, and the more points you spend in one tree, you get kickers — additional bonuses that you rack up. If you go very wide and select virtues from different branches of different trees, you get kickers as well, but they’re different in that they give you bonuses for going in very different directions and not focusing on one tree. So the system is quite different from prior Civilization games.

Recognizing that realism’s subordinate to gameplay, how hard-science-minded have you been able to keep Beyond Earth, for those who relished that aspect of Reynolds’ Alpha Centauri?

DM: We care deeply about exactly that thing. When we set out to design the game, we were already huge fans of not just science fiction, but actual science, and one of the first things we did, and you can find this on Wikipedia, is that we pulled together the original reading list that the designers of Alpha Centauri assembled. I think between us on the Beyond Earth team, we’d read about half of that list before we got started, so we read the other half, and then some.

Every part of the game’s been designed with a very careful eye toward achieving the sweet spot Alpha Centauri did, with finding a plausible link between science that everybody knows and that’s real, and science fiction that makes sense and comes from it. I think one of the best expressions of this in the game is the technology web. The future is treading technological ground that we don’t know yet, and we get to invent it. So we start you in the center of a web surrounded by technologies that are more or less recognizable, that are based on present-day Earth technologies plus a few hundred years. But then it radiates outward to any of a dozen very different technological places, and they all end up in a very interesting sci-fi place that is definitely sci-fi, but also definitely plausible, and you can see the thread all the way through from today to then and how humankind could have gotten to that technology, and why they would have, and what they’d do with it.

So as you play the game, you get to make these really interesting choices along the way, like what kind of technology is important to me, what fits my needs on the ground, what’s going to help me achieve victory, what do I just find the most attractive, and by the time the game is over, you have a collection that represents your priorities as the human race. Your neighbors on the planet will have made a different set of choices, of course, and you’ll clash because your technologies don’t line up.

LB: I can attest to the enthusiasm with which the design team went at it, and the art team as well. We love history here at Firaxis, we love Civilization of course, but going into the future — far into the future — was really cool. It was a challenge, but such an opportunity for the art team to stretch their legs. The designers came up with the technologies and said this is what we’re going for, and then the art team came in and had to imagine what that would mean for units and leaders and the alien environment, how that would look and be represented in the game. The enthusiasm was incredible, and still is incredible, since we’re only pre-alpha at this point.

What’s the timeframe in the game? How many years are we talking, from launching your colony ship to an average game’s conclusion?

DM: That’s a good question, because we don’t say specifically in the game. And we do that on purpose so the player can enjoy imagining the answers to the questions they’re asking. We hypothesize that it’s roughly 200 to 300 years from today, that that’s when the seeding occurs, and once you land on the planet, you play forward by somewhere between 1,000 and 2,000 years.

The reason I ask is that Alpha Centauri managed to sneak in some pretty out-there futurist notions, and if you follow guys like Ray Kurzweil today, you know he thinks the notion that Star Trek’s going to happen in another century or two misses the point — that we’re going to be clouds of foglets or whatever long before we’d ever get to Roddenberry’s naval-metaphor view of humans sailing through space while somehow remaining human as we define human today.

DM: Yeah, that was really the first kernel of the design, the first question we asked: What is the human race going to look like in 500 years, let alone 1,500 years? What kind of post-human weirdness is going to happen? There’s no shortage of interesting ideas in sci-fi, ideas that are plausible and, at the same time, sort of terrifying.

We sculpted the game around three impressions of that, which we call affinity, and each one represents a concept somewhere between an ideology and a religion — it’s more just a philosophy of what humankind is going to be like by the time you reach the next great turning point in our history.

One of them, supremacy, is very focused on technology as the savior of humanity, that by embracing the machines and eventually integrating them to the point of replacing yourself, the future of humanity is forever assured — that these machines can survive any environment, we will never be displaced from our home again and we’re saved by the machines. Living as a nano-cloud is reflected in the ultimate extent of those technologies in the web and in some of the things you’re able to build, some of the wonders and so on. We go right up to that threshold and hint at it, then suggest to the player, “Look at this crazy place humanity’s arrived at, and just imagine what’ll happen next.”

It sounds like you’re hoping to use the new quest system as your primary storytelling mechanism.

LB: That’s right. In Civilization we’ve generally been able to assume that players know what the history of humanity has been, more or less, to date. You don’t need to be a historian to know who Genghis Khan was, or the Maya. That lets players tell their own story because they have a historical framework to do so.

When we’re going into the future, that framework’s obviously unclear. We still want the player to tell their own story, but giving them that framework was important, and so one of the ways we found to do this was the quest system. We use it to give snippets of information, little insights into the alien planet and the wildlife there, to give the player a feel for where they’ve landed.

You’ve also added a second strategic angle that’s an actual layer physically superimposed above the traditional one. How does the new orbital system relate to the planetary one?

DM: The core experience still transpires on the planet, so think of the orbital layer that exists above it as an augmentation: It’s a different way to play with the same pieces. You build orbital units in your cities, then launch them into orbit, which exists on a camera level above the planet’s surface. All of the orbital units are designed around their effects on things on the ground (or water, as the case may be): everything from terraforming the ground, augmenting the improvements in your cities and buffing your military units to making new military tactics possible, up to outright bombarding holdings on the ground. And then it works the other way around, too, with things on the ground being able to shoot down orbital units. That’s how orbital play is done. Whatever your aims and ambitions and problems are on the surface of the planet, the orbital layer is an extension and complication of them.

Sid Meier was one of the lead designers on Alpha Centauri, and Beyond Earth carries his name in the full title. To what extent is that branding? Or put another way, how hands-on is he with Beyond Earth?

DM: Sid is really the benevolent uncle-godfather of all the designers at the studio. Every Civilization game bears his imprint and has his involvement in it. This is no exception. We never questioned that the game would be called Sid Meier’s Civilization. It belongs in the Civilization franchise and we want it to stand along with that incredible legacy.

That said, it’s a brand new experience, and it takes place literally beyond Earth. The title expresses exactly what the game is — that it belongs in the Civilization legacy, but that it’s a new idea within it. And as a designer I can tell you that Sid’s influence, his insight and his participation are extremely important. He’s always present, always willing to play the game and lend his thoughts and perspectives. I think no Civilization game would be made without Sid. He’s the guy.

technology

Now You Can Explore the Star Trek: Voyager Deck With the Oculus Rift

The virtual reality headset Oculus Rift already allows users to enter far-flung lands such as Tuscany, Game of Thrones’ Westeros and Jerry Seinfeld’s apartment. Now Oculus owners can beam up to the famous spaceship from Star Trek: Voyager as well.

The new demo, created by independent developer Thomas Kadlec, features an incredibly detailed recreation of the Voyager’s bridge, complete with computer monitors lit up with buttons and windows that offer a view out to the stars. The demo was made using Unreal Engine 4, a new game development engine that should allow more complex worlds to be built for the Rift.

Oculus VR, the company behind the Rift, has released multiple iterations of its headset to developers, who have tinkered with the technology in fascinating ways. The company, which was bought for $2 billion by Facebook in March, has yet to announce when the Rift will see a release as a consumer product.

[The Verge]

Innovation

Smooth Moves: The History and Evolution of Honda’s ASIMO Robot

As the robotics realm continues to heat up, Honda’s ASIMO (short for Advanced Step in Innovative Mobility) is something of an old-timer.

It’s been around for 14 years, and has seen continual improvements – check out the above video for more of the backstory.

While some robots have a more menacing look – ahem, Atlas – ASIMO has always played the part of a cutesy, Jetsons-style robot meant, in Honda’s words, “to help those in society who need assistance.”

In that spirit, ASIMO is able to do things like opening and serving beverages. It knows sign language – both Japanese and English. It can avoid bumping into people in hallways. Stuff like that.

At the International Auto Show in New York last week, Honda showed off ASIMO’s latest improvements. The robot, once relatively rigid and… well, robotic, is now far more nimble, able to run, jump, climb stairs and kick soccer balls with more human-like dexterity.

Big Picture

Who Needs a Memory When We Have Google?

I have a confession to make: I’m an infomaniac.

In high school, I was on the debate team and got an early taste of what it’s like to dig deep into information so that I could support my debate arguments. Ever since, I have been hooked on gathering and consuming information as part of my lifestyle. I still get a morning paper delivered to my house and I start my day by checking up on the local news. When I get to the office, I log on to all types of general news and tech sites to catch up on what I missed overnight. Curiosity is in my DNA and my type-A personality drives me to be addicted to information. In my line of work, this is good, but I admit that I overachieve in this area and it sometimes becomes overwhelming.

For most of my early life, this was a manageable problem. In those days, I had newspapers, magazines and a set time to watch the network news every night at 6:00 PM. But from the beginning of the information age and especially with the advent of the Internet, the amount of information sources at my fingertips grew exponentially. I admit that, more often than not, I now have information overload. To put it another way, I have way too many tabs open in my brain at any given time.

It’s almost impossible to keep some of that info straight or, even worse, remember most of it. That’s where Google and search engines come in. While I was at the TED conference recently, I talked with a lot of people from various industries. We often compared notes on things we were doing, people we know and items or events that we have been involved in over the years. What’s interesting is that the common denominator in many of these discussions is that when we got stumped on a person’s name, event or item we were talking about, instead of fretting about it, we all took out our smartphones and Googled for the answer. We almost always found what we were looking for, and the conversation continued only slightly interrupted.

In all honesty, that scene happens for me whether with business associates, friends or family. I clearly can’t remember all of the information I take in, so I now rely pretty heavily on Google and other search engines to either find the information I need at any given time or to jog my memory about the topic at hand.

I am sure that this has happened to a lot of people. The role technology plays as an extension of our memory banks has become quite important to us. I have found that when I’m digesting information now, many times I don’t even read the full stories — mostly just the headlines or a quick summary, knowing that if I ever have to recall it, I can just Google it.

In 2008, the Atlantic ran a great article by Nicholas Carr titled “Is Google Making Us Stupid?” In this excerpt, Carr says:

For me, as for others, the Net is becoming a universal medium, the conduit for most of the information that flows through my eyes and ears and into my mind. The advantages of having immediate access to such an incredibly rich store of information are many, and they’ve been widely described and duly applauded. “The perfect recall of silicon memory,” Wired’s Clive Thompson has written, “can be an enormous boon to thinking.” But that boon comes at a price. As the media theorist Marshall McLuhan pointed out in the 1960s, media are not just passive channels of information. They supply the stuff of thought, but they also shape the process of thought. And what the Net seems to be doing is chipping away my capacity for concentration and contemplation. My mind now expects to take in information the way the Net distributes it: in a swiftly moving stream of particles. Once I was a scuba diver in the sea of words. Now I zip along the surface like a guy on a Jet Ski.

I am not sure his premise that Google makes us stupid is exactly correct. In fact, I would argue that because of a search engine’s ability to help us quickly find the information we need, it’s actually making us smarter, to a degree. But what Google seems to be doing to me — and perhaps many others — is making our minds lazy. Many times, I may be told something without really concentrating on what is being said, knowing full well that as long as I get the bullet points straight, I can always go back and look up the info.

At first I wanted to chalk some of these memory lapses up to getting older. It’s just part of aging, right? But the more I read about aging, the more I realize that some of this is happening because we are not exercising our brains as much as we should be. More and more often, we’re relying on Google to be a fallback. We concentrate less on what’s in front of us, leaning on Google for anything we can’t remember.

A while back, my wife bought me a Nintendo handheld game system that included a game called Brain Age. It was my first foray into digital brain games, and I found that the more I used it, the more it helped me fine-tune my brain to be much more cognizant of what I was reading and observing. This game came out before everyone had smartphones, and now we have dozens of brain training tools such as my favorite, Lumosity, or Condura, another brain training app.

There are a lot of studies that talk about the Internet’s impact on memory. One that was highlighted in the New York Times in 2011 shared some specific research about this issue. In the article, Patricia Cohen wrote the following:

The widespread use of search engines and online databases has affected the way people remember information, researchers are reporting.

The scientists, led by Betsy Sparrow, an assistant professor of psychology at Columbia, wondered whether people were more likely to remember information that could be easily retrieved from a computer, just as students are more likely to recall facts they believe will be on a test.

Dr. Sparrow and her collaborators, Daniel M. Wegner of Harvard and Jenny Liu of the University of Wisconsin, Madison, staged four different memory experiments. In one, participants typed 40 bits of trivia — for example, “an ostrich’s eye is bigger than its brain” — into a computer. Half of the subjects believed the information would be saved in the computer; the other half believed the items they typed would be erased.

The subjects were significantly more likely to remember information if they thought they would not be able to find it later. “Participants did not make the effort to remember when they thought they could later look up the trivia statement they had read,” the authors write.

Whether our brains have become lazy or not, the Internet has clearly impacted the way we read and digest information, and as stated in the Times’ article, search engines have now become just a part of our memory processes. Search engines are very valuable, but if they become crutches that dull our thinking and make our brains lazy, then I believe people will need to use things like Lumosity and other brain-tuning games to help them stay sharp.

Information overload makes it impossible for many of us to keep up with the constant stream of information that’s available. Because we try to consume so much, most of us are forced to skim highlights and summaries just to keep up. However, I believe we can’t let search engines dull our memory. At least in my case, I don’t want that to happen, so I’m using these brain games to help me deal with this challenge.

Bajarin is the president of Creative Strategies Inc., a technology industry analysis and market-intelligence firm in Silicon Valley. He contributes to Big Picture, an opinion column that appears every week on TIME Tech.

Nintendo

Take Five Seconds to Honor Game Boy’s 25th Anniversary

I’m not much for anniversary retrospectives concerning classic video game systems. Not that there’s zero value in examining history, but the older a console gets, the more it feels like we’re recycling the same factoids every time a gaming system reaches another large, round number.

So it goes with the Nintendo Game Boy, which launched in Japan on April 21, 1989. In case your memory is foggy from the last round of retrospectives five years ago, you’ll find more look-backs around the Internet on today’s 25th anniversary. (Jeremy Parish’s write-up for USGamer is pretty good.)

Personally, I prefer to let the above video do all the talking. That little start screen is all I need to unlock a trove of memories, from stuffing too many cartridges into my carrying case at home to slumping in the corner of a dingy gym next to my best friend, playing Teenage Mutant Ninja Turtles: Fall of the Foot Clan while his mom Jazzercised.

Happy 25th anniversary, Game Boy.

Opinion

Boston Marathon Bombings: Making Sense of the Social Media Blitz

Scribbled notes for the first rough draft of history

When I first heard that two bombs had exploded at the finish line of the Boston Marathon, I was sitting on my couch — the afternoon sunlight streaming through from the tall window behind me — cradling my 12-day-old daughter in my arms. While she slept peacefully, I took the opportunity to catch up on my Twitter feed.

I’ll always have that tranquil moment as a reminder of how April 15 began, before the bombs. It is a stark contrast to the feeling that immediately followed, reading a barrage of tweets — information and misinformation — originating from Boylston Street, less than 10 miles south of where we sat.

Of course I found out from Twitter; everybody did, it seemed. After reading the news in my feed, my wife and I turned on the TV to see what was going on, but after a few minutes of watching television reporters spout empty speculation and unverified information, we turned it off again.

The previous week I had returned to work after a short paternity leave; I teach journalism and writing courses at a small college just south of the city. It happened that I was teaching an Introduction to Media Studies course that semester. In the last class before my daughter was born, I asked my students to consider what the most significant revolution brought on by the Internet might be. Just over a week after I returned, we were seeing it in action.

There has been no shortage of handwringing over the role that the Internet played in the events of that day and the tense week that followed. Though there is plenty to praise — the excellent work of some eyewitnesses who truly became amateur reporters, the absolute immediacy of information — there’s also much to worry about: the emotion-fueled speculation, the misinformation, the vigilante journalism.

The Internet made all of these things a reality, and while it’s impossible to assign a quantitative “good” or “bad” to these developments, what we can know for sure is that we will never go back. This is how big news is reported now.

In some ways, I’m right there with the hand wringers. I’ll take truth over immediacy any day. But, from the vantage point of a year later, I’m beginning to see a great value to the stream of tweets and status updates that (sometimes inaccurately) reported the news of the Marathon Bombings and the subsequent search for the bombers as it happened. But I wouldn’t be able to see this value if it weren’t for archivists and collectors, curators, scholars, and storytellers who have, in the year that’s passed, begun to make sense of the social media blitz of that week.

If journalism is the first rough draft of history, eyewitness reports captured on mobile phones and broadcast to the world are the first notes — scratchings, written hastily on Post-its, which later become an outline that eventually inform the first draft as well as the drafts that follow. They are hastily scribbled and stuck in the moment, but later, when a skilled storyteller comes along, they begin to take shape into a cohesive narrative. And, particularly in the case of the Marathon Bombings, they take on a life of their own as a kind of meta-narrative — we get a sense of how we respond when tragedy strikes.

It’s not always a pretty picture. The false identification of the perpetrators first on Reddit and then on the front page of The New York Post, the hasty and lazy reporting by cable news networks, the threats against Muslims here in Boston, serve as a perfect example of how instant news culture can do very real harm. But even those mishaps can teach us something valuable.

I’m grateful to organizations like the NULab at Northeastern University that, in collaboration with WBUR, are creating a digital archive of artifacts from the bombings. And for the marathon memorial that will open at the Boston Public Library, which features items such as running shoes, t-shirts and photos left near the finish line as an instant tribute. And for the hand-sewn flags, mailed in from around the country and the world, on display at the Museum of Fine Arts as part of its “To Boston With Love” installation.

What all of these have in common is that they are collections of natural and emotional reactions in the immediate wake of tragedy, and though they may not mean much individually, when collected and curated, they tell a story of their own; they truly become a memorial. This is how I choose to think of those tweets and status updates. Even with their misleading and sometimes flat-out wrong information, even if they are a testament to the ways in which the cult of now has reshaped our news consumption, taken together they are a monument to a city that experienced tremendous grief on an otherwise beautiful day in April, and they tell the story of how we continue to cope with that tragedy today.

Jonathan D. Fitzgerald is the author of Not Your Mother’s Morals: How the New Sincerity Is Changing Pop Culture for the Better and the editor of Patrolmag.com.

Internet

4 Reasons to Be Bullish About Netflix

A new survey of web users' entertainment habits finds Netflix has surpassed YouTube as the top online video site. As the company reports its first-quarter earnings, analysts are eager to see how much its original content is helping to bring in more subscribers

Few companies are better positioned to capitalize on the Internet video boom than Netflix, the erstwhile DVD rental business turned online streaming powerhouse.

That’s part of the reason why investors have pushed Netflix shares up more than 100% over the last year, although the stock is down about 20% over the last month amid broader weakness in the tech sector. A new study by Experian shows that Netflix is playing an important role in the still-nascent “cord-cutting” trend, in which users eschew cable TV service in favor of services like Netflix and Aereo.

On Monday, Netflix will report earnings results for the first three months of 2014, and Wall Street analysts are eager to see the extent to which the company’s original content—House of Cards, in particular—is helping to attract new subscribers.

“We believe the company should benefit from the launch and awareness around season 2 of House of Cards and continued improvements in content, as well as positive seasonality driven by more Internet-connected devices and colder weather,” Morgan Stanley technology analyst Doug Anmuth wrote in a note to clients.

Netflix’s stock price has historically been very volatile, and if the company fails to meet expectations Monday—analysts expect the company to announce that it has added 2.25 million new subscribers—its shares could take a tumble. But over the long term, there are several reasons to be optimistic about Netflix’s prospects, according to a recent study by investment bank RBC Capital Markets. Here are four of them:

1. For the first time, Netflix has surpassed YouTube to become the leading online video site, according to the RBC Capital Markets survey, which asked 1,033 Internet users about their entertainment habits. (This is the 10th such survey conducted by RBC technology analyst Mark S. Mahaney since May 2011.) Some 44% of respondents said they use Netflix to watch movies or TV shows—up from 37% one year ago—edging out YouTube, which came in at 43%.

2. Most Netflix customers are happy with the service, according to the survey. Overall Netflix satisfaction levels are now at record levels, with 66% of current subscribers responding that they are either “extremely satisfied” or “very satisfied” with their service, up from 62% one year ago.

3. Netflix subscribers are increasingly less likely to leave the service, the survey found, with 69% of current subscribers “not at all likely” to cancel their subscriptions in the next three months, up from 66% one year ago. “We note that this is the highest level we have tracked in more than two years,” Mahaney wrote.

4. There is increasing evidence that Netflix’s original content is keeping customers subscribed to the service by acting as “an anticipatory anti-churn factor,” as Mahaney describes it. Some 47% of Netflix subscribers said that original content was “extremely important,” “quite important,” or “moderately important” when deciding whether to remain a subscriber, up from 42% in November 2013.

Taken together, these trends provide reason to be optimistic about Netflix’s prospects. And despite the dramatic increase in Netflix’s stock price over the last year, Mahaney argues that the company remains undervalued. “We continue to believe that Netflix has achieved a level of sustainable scale, growth, and profitability that isn’t currently factored into its stock price,” he wrote in a recent note to clients.

Wall Street analysts will also be eager to hear more details from Netflix executives about the company’s controversial agreement to pay Comcast for a direct connection to the nation’s largest broadband provider. Although Netflix CEO Reed Hastings has expressed his displeasure about the deal, there is clear evidence that the interconnection pact is substantially boosting Netflix performance for Comcast subscribers, which should improve customer satisfaction even further.

“We believe it is likely that Netflix is having similar conversations concerning interconnection agreements with other broadband providers,” Mahaney wrote, “and we view the Comcast deal as incrementally positive for Netflix in the long term, as it should provide a better user experience for the company’s streaming subs.”

Internet security

Healthcare.gov Users Urged to Change Passwords Over Heartbleed Fears

No security breach has been detected, but online healthcare enrollees are being warned to change their passwords as a precaution against the programming flaw. The government is reportedly carrying out a review into the Heartbleed bug

People who used the Obama administration’s healthcare.gov website to enroll in insurance plans under the government’s healthcare reform law are being warned to change their passwords in defense against the notorious Heartbleed internet security flaw.

“While there’s no indication that any personal information has ever been at risk, we have taken steps to address Heartbleed issues and reset consumers’ passwords out of an abundance of caution,” said a post on the website. The government is reportedly carrying out a review into the Heartbleed bug, according to the Associated Press.

The Heartbleed programming flaw has affected widely used encryption technology, and major internet services have recommended that users change their website passwords. Critics have said the healthcare.gov online enrollment system presents myriad opportunities for hackers to exploit security flaws. The IRS has already said it was not affected by Heartbleed.

Obama announced this week that about 8 million people have enrolled in the insurance plans, exceeding forecasts.

Technologizer

The History of Technology, as Told in Wacky British Pathé Newsreels

How computers--gigantic, noisy ones--changed practically everything

In an inventive, generous act, British Pathé has uploaded its entire collection of 85,000 pieces of footage from vintage newsreels to YouTube. If you stop by to check it out, you might have trouble pulling yourself away. It’s a fascinating survey of what happened to the world from 1896 to 1976, told in bite-sized chunks.

The collection is searchable, so I pulled up some choice bits relating to computers–especially how they got used to automate practically everything in the 1960s. This stuff was amazing at the time–especially, it seems, if you were a British newsreel announcer.

1949: An engineer teaches a machine to play noughts and crosses, better known to you and me as tic-tac-toe

1962: Pan Am and IBM sign a deal to computerize airplane reservations (watching this, it hit me: how the heck did they do them before computers?)

1966: Rowland Emett, the Rube Goldberg of the U.K., demonstrates his homemade computer

1967: During an outbreak of hoof and mouth disease, horse-racing fans settle for a computerized simulation

1967: The latest in automation–from the Auto-Typist to a pocket-sized dictation machine–gets demonstrated at the Business Efficiency Exhibition

1968: Honeywell demonstrates its “girl robot,” Miss Honeywell, whom, I regret to say, I suspect of being an elaborate hoax

1968: A report on the Univac-powered Tinder of its day, complete with a Beatles soundtrack

1968: A Putney man composes music with his home computer, which happens to be a PDP-8 minicomputer

One thing I learned from watching all of these: Unless British Pathé sweetened its soundtracks, computers used to be noisy. I’m just as glad we no longer have to listen to that incessant clackety, clackety, clackety, clacking.

Technologizer

Bye-Bye FuelBand: Nike Won’t Be the Last Company to Get Out of Wearable Hardware

Nike FuelBand
Nike

A pioneer in fitness trackers decides they don't have a future--at least as a Nike product line

Nick Statt of Cnet has a scoop: He’s reporting that Nike is laying off most of the people on its team responsible for the FuelBand fitness tracker. Instead of making its own hardware, the company will focus on fitness-related software henceforth.

It’s impossible to hear this news without bringing up the fact that Apple CEO Tim Cook is on Nike’s board and wondering whether it relates in any way to any plans Apple might have in the smart watch/fitness category. There, I just did. But that’s all I’m going to say, because who knows?

This I do know: I’m sorry to see the FuelBand go away. Though it didn’t do anywhere near as much as a Fitbit or Jawbone Up, I loved Nike’s hardware design, with its straightforward display and a clasp that locked securely and doubled as a USB connector. (It’s one of the few wearables that doesn’t make you keep track of a stupid little charging dongle.) I was hoping to see it evolve further; Statt says an upcoming model was canceled, though the current FuelBand SE will stay on the market.

Still, I think it’s possible that Nike’s move is a smart one, strategically. There are just gazillions and gazillions of fitness trackers on the market now–a little like there were once gazillions of e-readers, and before that, gazillions of MP3 players. And now phones such as Samsung’s Galaxy S5 are adding enough fitness-related features–it even has a heart-rate monitor–to render a wristband superfluous for some folks. (The evidence suggests that Apple plans to turn the iPhone into a health aid, too.)

Bottom line: Whether or not Nike has any specific knowledge of anything Apple might be planning to unveil, it has a good idea which way the wind is blowing. I’ll bet it won’t be the last player to exit this category.
