MONEY Opinion

Innovation Isn’t Dead

Dave Reede—Getty Images A farmer looks out over his field of canola being grown for biofuel, with housing development encroaching on his farmland in the background, Winnipeg, Manitoba, Canada

Most important innovations are only obvious in hindsight.

Wilbur and Orville Wright’s airplane flew for the first time in December 1903. It was one of the most important innovations of human history, changing the world in every imaginable way.

To celebrate their accomplishment, the press offered a yawn and a shoulder shrug.

Only a few newspapers reported the Wrights’ first flight at Kitty Hawk, N.C. All of them butchered the facts. Later flights in Dayton, Ohio, the brothers’ home, still drew little attention.

David McCullough explains in his book The Wright Brothers:

“Have you heard what they’re up to out there?” people in town would say. “Oh, yes,” would be the usual answer, and the conversation would move on. Few took any interest in the matter or in the two brothers who were to become Dayton’s greatest heroes ever.

An exception was Luther Beard, managing editor of the Dayton Journal … “I used to chat with them in a friendly way and was always polite to them,” Beard would recall, “because I sort of felt sorry for them. They seemed like well-meaning, decent enough young men. Yet there they were, neglecting their business to waste their time day after day on that ridiculous flying machine.”

It wasn’t until 1908 — five years after the first flight and two years after the brothers patented their flying machine — that the press paid serious attention and the world realized how amazing the Wrights’ invention was. Not until World War II, three decades later, was the significance of the airplane fully appreciated.

It’s a good lesson to remember today, because there’s a growing gripe about our economy. Take these headlines:

  • “Innovation in America is somewhere between dire straits and dead.”
  • “Innovation Is Dead.”
  • “We were promised flying cars. Instead we got 140 characters.”

The story goes like this: American innovation has declined, and what innovation we have left isn’t meaningful.

Cancer? Not cured. Biofuel? An expensive niche. Smartphones? Just small computers. Tablets? Just big smartphones.

I think the pessimists are wrong. It might take 20 years, but we’ll look back in awe of how innovative we are today.

Just like with the Wright brothers, most important innovations are only obvious in hindsight. There is a long history of world-changing technologies being written off as irrelevant toys even years after they were developed.

Take the car. It was one of the most important inventions of the 20th century. Yet it was initially disregarded as something rich people bought just to show how deep their pockets were. Frederick Lewis Allen wrote in his book The Big Change:

The automobile had been a high-hung, noisy vehicle which couldn’t quite make up its mind that it was not an obstreperous variety of carriage.

In the year 1906 Woodrow Wilson, who was then president of Princeton University, said, “Nothing has spread socialistic feeling in this country more than the automobile,” and added that it offered “a picture of the arrogance of wealth.”

Or consider medicine. Alexander Fleming discovered the antibiotic effects of the mold penicillium in 1928. It was one of the most important discoveries of all time. But a decade later, penicillin was still a laboratory toy. John Mailer and Barbara Mason of Northern Illinois University wrote:

Ten years after Fleming’s discovery, penicillin’s chemical structure was still unknown, and the substance was not available in sufficient amounts for medical research. In fact, few scientists thought it had much of a future.

It wasn’t until World War II, almost 20 years later, that penicillin was used on a mass scale.

Or take this amazing 1985 New York Times article dismissing the laptop computer:

People don’t want to lug a computer with them to the beach or on a train to while away hours they would rather spend reading the sports or business section of the newspaper. Somehow, the microcomputer industry has assumed that everyone would love to have a keyboard grafted on as an extension of their fingers. It just is not so …

Yes, there are a lot of people who would like to be able to work on a computer at home. But would they really want to carry one back from the office with them? It would be much simpler to take home a few floppy disks tucked into an attache case.

Or the laser. Matt Ridley wrote in the book The Rational Optimist:

When Charles Townes invented the laser in the 1950s, it was dismissed as ‘an invention looking for a job’. Well, it has now found an astonishing range of jobs nobody could have imagined, from sending telephone messages down fiberglass wires to reading music off discs to printing documents, to curing short sight.

Here’s Newsweek dismissing the Internet as a fad in 1995:

The truth [is] no online database will replace your daily newspaper, no CD-ROM can take the place of a competent teacher and no computer network will change the way government works.

How about electronic publishing? Try reading a book on a computer. At best, it’s an unpleasant chore: the myopic glow of a clunky computer replaces the friendly pages of a book. And you can’t tote that laptop to the beach.

Yet Nicholas Negroponte, director of the MIT Media Lab, predicts that we’ll soon buy books and newspapers straight over the Internet.

Uh, sure.

You can go on and on. Rare is the innovation that is instantly recognized for its potential. Some of the most meaningful inventions took decades for people to notice.

The typical path of how people respond to life-changing inventions is something like this:

  1. I’ve never heard of it.
  2. I’ve heard of it but don’t understand it.
  3. I understand it, but I don’t see how it’s useful.
  4. I see how it could be fun for rich people, but not me.
  5. I use it, but it’s just a toy.
  6. It’s becoming more useful for me.
  7. I use it all the time.
  8. I could not imagine life without it.
  9. Seriously, people lived without it?

This process can take years, or decades. It always looks like we haven’t innovated in 10 or 20 years because it takes 10 or 20 years to notice an innovation.

Part of the problem is that we never look for innovation in the right spot.

Big corporations get the most media attention, but innovation doesn’t come from big corporations. It comes from the 19-year-old MIT kid tinkering in his parents’ basement. If you look at big companies and ask, “What have you done for the world lately?” you’re looking in the wrong spot. Of course they haven’t done anything for the world lately. Their sole mission is to repurchase stock and keep management consultants employed.

Someone, somewhere, right now is inventing or discovering something that will utterly change the future. But you’re probably not going to know about it for years. That’s always how it works. Just like Wilbur and Orville.


TIME LGBT

Why June 26 Should Be a National Holiday to Honor Progress

Andrew Harrer—Bloomberg Carlos McKnight, from Washington, D.C., waves a rainbow-colored flag outside the U.S. Supreme Court in Washington, D.C., on Friday, June 26, 2015.

Charlotte Alter covers women, culture, politics and breaking news for TIME in New York City.

It's an important date not just for gay Americans, but for us all

When several historic events happen on the exact same day, it’s a sign: June 26 should be a national holiday.

On Friday the Supreme Court ruled that gay Americans had the right to marry in every state in the country. On the exact same date two years ago, the same court struck down the Defense of Marriage Act, allowing same-sex couples to access federal benefits. And when the Supreme Court ruled that same-sex sexual activity should be legal in every state in the Lawrence v. Texas ruling in 2003, it did so on … June 26. It’s a coincidence, but also much more than that.

Because June 26 isn’t just an important date for gay Americans – it’s a date that symbolizes how rapidly change can happen in America, how quickly our attitudes can evolve, and how, when used correctly, our system is one that propels us all towards a more equal state.

In other words, June 26 is a date that represents what happens when America works the way it’s supposed to work. Only 11 years ago, in 2004, Massachusetts became the first state to allow gay couples to marry. In a little over a decade, gay marriage has gone from a provocative pipe dream to a legal and constitutional right. In that time, the battle has been fought in the legislatures, in the courts, and in the American national conscience.

In 2004, then-Senate candidate Barack Obama said he believed marriage was “between a man and a woman.” In 2010, he said his views on same-sex marriage were “evolving.” This morning in 2015, the White House Twitter avatar turned rainbow-colored, in celebration of the Supreme Court’s decision.

June 26 isn’t just a symbol of marriage equality or gay rights – it’s a day that commemorates a collective change of mind, the American ability to choose freedom and equality.

But wait! Isn’t June 26 a little too close to July 4? If we had two national holidays within the course of a week, wouldn’t the U.S. economy come grinding to a halt and the world implode?

Not necessarily. Just think about how glorious it would be to have two national holidays just over a week apart. It would be perfect timing for a summer vacation, one that all Americans could enjoy with their families. Maybe they’d celebrate by taking a trip to an American beach town, staying in an American hotel, eating at American restaurants. Maybe they’d fly somewhere on an American airline or grill some American burgers. Studies have shown that vacations are good for the economy, and that if everyone took their allotted vacation time, it would support 1.2 million jobs and create $21 billion in tax revenue.

June 26 and July 4 could be sister holidays – both celebrations of freedom, equality, and the promise of America.


MONEY Opinion

Defaulting on Student Loans Is Stupid, Not Brave

shirt saying "can't pay, won't pay"
Jeffrey Blackler—Alamy

Don't be a martyr.

Today’s college graduates need to get the message that defaulting on federal student loans is not just stupid, it is unnecessary.

That message easily could get lost in the controversy following a recent New York Times opinion piece, “Why I Defaulted on My Student Loans,” in which writer Lee Siegel not only defended his decision to ignore his education debt, but urged others to join him as a way of changing the economics of paying for college.

“If people groaning under the weight of student loans simply said, ‘Enough,’ then all the pieties about debt that have become absorbed into all the pieties about higher education might be brought into alignment with reality,” Siegel wrote. “Instead of guaranteeing loans, the government would have to guarantee a college education.”

Perhaps.

We now have some indication that debt strikes may work: The U.S. Department of Education expanded its forgiveness program after some former students of the Corinthian Colleges chain – which shut down all its remaining campuses in late April – publicly refused to pay their debt, saying they’d been defrauded.

Before the chain collapsed, it had been fined by the Education Department for falsifying job placement claims and was the target of numerous other investigations for predatory lending and deceptive recruitment.

But the Education Department’s willingness to reconsider likely was less a response to the debt strike than to the intervention of state attorneys general and consumer advocates who encouraged the feds to be more generous with Corinthian victims.

In any case, Siegel’s piece downplays the personal and lifelong financial devastation that default can inflict, while ignoring the fact that there are now plenty of options for dealing with most student loan debt.

If debt collectors are sometimes likened to junkyard dogs, the Department of Education would be the one that is not chained or fenced. There is no statute of limitations on federal student loans, which means its collectors can chase you to your grave.

Along the way, they can snatch your tax refunds and garnish your wages without having to go to court. They can even take a bite out of your Social Security checks, something other creditors cannot do.

Borrowers pursued by other collectors can escape into bankruptcy court. Student loan borrowers are unlikely to find relief this way, since few can meet the stringent “undue hardship” standard courts require to erase this debt.

These powerful tools are among the reasons why the federal government expects its collection efforts to recover enough money in principal, interest and penalty fees to offset any defaults in its student loan programs.

Even before collection actions start, though, the failure to pay will take a heavy toll on a borrower’s credit – and thus impair the person’s ability to get credit cards, apartments and jobs.

Bad credit can inflate the cost of insurance and the size of deposits required for utilities and wireless phone service.

All this pain is increasingly unnecessary, since improvements in federal payment plans mean the vast majority of today’s student borrowers can find ways to avoid default.

The latest iteration of the Education Department’s Pay As You Earn program, for example, caps payments at 10 percent of discretionary income – defined as the amount over 150 percent of the federal poverty level for the borrower’s household size and state of residence. For some low-income borrowers, that can translate into monthly payments of zero dollars.
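To make that arithmetic concrete, here is a minimal sketch of the formula in Python. The poverty-guideline figure and sample income are illustrative assumptions (the guideline is roughly the 2015 amount for a one-person household in the contiguous states), not official numbers; actual payments depend on current Education Department rules and the borrower’s household details.

```python
# Rough sketch of the Pay As You Earn calculation described above.
# The poverty guideline is an illustrative assumption; check the current
# HHS guidelines for the borrower's household size and state.

def paye_monthly_payment(agi: float, poverty_guideline: float,
                         cap_rate: float = 0.10) -> float:
    """Estimate the monthly payment: 10% of income above 150% of the
    poverty level, spread over 12 months, floored at zero."""
    discretionary = max(0.0, agi - 1.5 * poverty_guideline)
    return cap_rate * discretionary / 12

# Example: a borrower earning $20,000 against an assumed $11,770 guideline
# owes about $19.50 a month; at $17,655 or less, the payment is $0.
print(round(paye_monthly_payment(20_000, 11_770), 2))  # -> 19.54
```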

Forgiveness of remaining balances is possible after 10 years for those with public service jobs and after 20 years otherwise.

There are certainly some who will still struggle. Among them, for example, are higher-income borrowers with crushing levels of other debt, who may have trouble staying current even under Pay As You Earn.

People with older student loans are not eligible for the new arrangement and may have to pay up to 15 percent of their discretionary income with the other available income-based repayment plan.

Parents who took out federal PLUS loans also are not eligible for the more generous repayment options, although they may qualify for an income-contingent plan that caps payments at 20 percent of discretionary income.

Meanwhile, private student loans, which make up roughly 15 percent of the $1 trillion or so currently owed in education debt, offer far fewer repayment options and consumer protections than federal student loan debt. That is why Congress should seriously consider the Consumer Financial Protection Bureau’s suggestion that these loans – which involve no taxpayer funds or government guarantees – be easier to discharge in bankruptcy court.

Other reforms, such as expanding grant aid, simplifying financial aid forms and expanding Pay As You Earn to other borrowers, should be considered as well. What is not worth discussing is default as political protest. Today’s graduates have much better options than to make human sacrifices of themselves.


TIME Opinion

Why History Urges Caution on Proposed Trade Deal

Universal History Archive/Getty Images Photomechanical print of the Chicago Special, Burlington Route, a Class I railroad that operated in the Midwestern United States, commonly referred to as the Burlington or Q. Photographed by William Henry Jackson (1843-1942). Dated 1900.

When it comes to globalization, the 19th century has a lot to teach the 21st

Many 21st-century Americans like to believe that globalization—like sex, profanity, and partisan media—is an invention of the modern era.

But of course it is not.

Never mind NAFTA. Consider, instead, the case of a 16th-century enslaved African man, forcibly transported to the Caribbean to grow South Asian sugarcane for English consumers. Or the story of an early-modern South American indigenous person, put to work in the Bolivian silver mines, so fat cats in Europe could buy Chinese porcelain and Indian Ocean spices.

And yet this long history of globalization is rarely invoked in contemporary debates over globalization—most recently in the wrangling between Congress and President Obama over the Trans-Pacific Partnership (TPP) trade deal (currently stuck in legislative limbo until at least late July, as President Obama and House Speaker John Boehner search for a path forward). Instead, pundits focus on the abstract metrics that tend to dominate discussions of today’s political economy. Proponents argue, for instance, that TPP and other trade deals will boost economic efficiency and lead to cost savings for consumers by removing barriers to trade. Opponents, on the other hand, contend that it will kill jobs and erode both labor and consumer protections. (They also argue that it’s an offense against democratic transparency—but that’s another story.)

Lost in conversations surrounding globalization, however, is the kind of concrete, human-scale perspective that history can provide: a perspective that would seem to support a course of caution on TPP, lest we overtax the human capacity for wrenching social and economic change. Especially relevant, in this instance, is the story of an earlier flashpoint in the history of globalization: a moment, spanning the early 19th century, when dramatic shifts in transportation technology, investment, and infrastructure put a growing number of backwoods Americans into contact with wider, increasingly-global markets.

These developments had their advantages. The construction of the Erie Canal—linking the Hudson River with the Great Lakes—meant that farmers as far west as Wisconsin now had easy access to New York’s networks of goods and buyers. An advantageously placed railroad put two-bit towns, denied the blessings of a navigable river, in touch with a wider world of people and things. Steamboats, improved oceangoing vessels, and the telegraph radically reduced the time and cost associated with both transportation and communication.

The result was a broadening of Americans’ economic horizons. For some, like a hardscrabble Illinoisan named Abraham Lincoln, these developments were a godsend. Markets and infrastructure meant education and opportunity for the young Lincoln—an antidote to the material privation of his rural childhood and an escape from the small, narrow life that his log-cabin upbringing seemed to offer. Not surprisingly, these early experiences made him a lifelong advocate of what contemporaries called ‘internal improvements’: public investments in infrastructure and technology designed to integrate American markets with one another and with the wider world.

Of course, these early benefits of globalization also came at a steep cost. The ability of Wisconsin farmers to sell grain in Europe and beyond meant they were now vulnerable to distant, dangerous and unpredictable oscillations in global cereal markets. And new urban credit instruments—carried to rural areas on the same canals, railroads and steamboats that brought clocks, clothing and other desirable consumer goods—not only expanded possibilities for gain but for failure as well.

Indeed, for every cagey backwoods entrepreneur who made a fortune from these new possibilities—using bank loans to successfully speculate in Western lands or purchase new fields on which to grow grain for the lucrative export trade—one or more of his neighbors likely failed. Financial neophytes of the first order, some omitted to read the fine print of their loans, and soon found themselves buried under mountains of interest. Others, meanwhile, simply had poor timing. When financial disaster struck—as it did in 1819, 1837, and again in 1857—those with outstanding debt found the international markets on which they increasingly depended callously slack. Left without the cash flow to meet their credit obligations, these debtors found their creditors—unsurprisingly—quick to pounce.

Beset by the specters of foreclosure and dispossession (sound familiar?), many of the casualties in this early round of globalization took flight for the cold comfort of urban wage labor. In this process, too, there would be winners and losers: those who preferred urban energy to rural somnolence, commercial and manufacturing work to backbreaking agricultural toil.

For nearly all who went through this process of dispossession and migration, it would be a wrenching one: a profound shift in the pace and content of their work, in the rhythms of their lives, in their social and cultural horizons, in their identity and sense of self. Those who experienced it, from rural Midwestern families and Irish potato farmers to German peasants and countless others, would never forget the transformation they underwent.

These changes, one might argue, were inevitable. And, in the aggregate, they were positive. The march of globalization has, overall, produced a more prosperous and cosmopolitan planet, releasing large swaths of humanity from the grip of parochialism and privation in the process. Those who argue otherwise have engaged in a rather selective reading of the human past.

But just because the aggregate outcome of social and economic change was positive doesn’t mean that it was easy for the people who experienced it. Indeed, the experience of economic modernization was so unsettling for some Americans that many 19th-century politicians, including the otherwise reprehensible Andrew Jackson, struggled to slow—if not halt—the version of globalization that was shaking their constituents’ world. Vilifying federal support for everything from banking to infrastructure, these politicians sought to preserve what they saw as a traditional way of life for as long as possible.

Today, many Americans likely struggle to understand what these politicians were all about. Who wouldn’t want a functional national banking system or decent roads, they might ask. But, in many respects, the political questions these early American politicians were navigating were remarkably similar to the ones at the heart of the TPP debate: how to embrace the benefits of social and economic change, while making those changes manageable in human terms. How to have a dynamic economy without overtaxing individuals’ capacity for change.

Then as now, we would do well to err on the side of caution: to welcome aspects of globalization without providing it undue encouragement. Change will come—with all its promises and pitfalls—regardless of what Congress and the President decide on TPP. But our elected officials needn’t make that process more jarring by removing some of the last remaining brakes on globalization.

The Long View: Historians explain how the past informs the present

Sean Trainor has a Ph.D. in History & Women’s, Gender, and Sexuality Studies from Penn State University. He blogs at seantrainor.org.

MONEY Opinion

How to Protect Our Kids’ Credit Now

Bill Clark—CQ-Roll Call, Inc. Rep. Jim Langevin, D-R.I., participates in a news conference on Foreign Intelligence Surveillance Act (FISA) improvement legislation on Tuesday, March 25, 2014.

Children are particularly vulnerable to identity theft because they have little reason to access their credit histories.

An 18-year-old looking to purchase his first car.

A young woman applying for the student loan that will put her through college.

A foster youth aging out of the system and eager to get a place of his own.

These are exciting milestones in the lives of young people, turning points that mark new beginnings and the start of independence. Now imagine you’ve reached this crossroads only to discover that your identity has been stolen. Instead of the pristine, untapped credit record you’re expecting, you find years of charges, debt and defaults racked up by a criminal using your name and Social Security number.

It’s a scary thought, and not as rare as you may think. Identity theft has been the top consumer complaint received by the Federal Trade Commission for more than a decade, and those complaints increasingly involve minors or young adults tapping into their credit for the first time. The ensuing chaos and barrage of paperwork is a difficult maze to navigate for most adults, never mind young people who have not yet even opened their first credit card.

Children are particularly vulnerable because they have little reason to access their credit histories. By the time the discrepancies are discovered, the damage has been done. We must make it easier for parents to protect their children’s financial futures.

All children are vulnerable to identity theft, but foster youth are especially susceptible. Their personal information, including their Social Security numbers, is passed through many hands, increasing the chances of abuse. Moreover, when they age out of the system, they often lack a parent advocate to fight on their behalf. As a co-chair of the Congressional Caucus on Foster Youth and someone who grew up with foster siblings, I care deeply about this issue.

In 2011, I successfully incorporated a provision into the Child and Family Services Improvement Act mandating free credit checks for foster youth over 16 years old, giving them time – and assistance – to clear inaccuracies from their records before they aged out of the system.

I believe similar protections are necessary for all children, and I continue to call on my colleagues in Congress to enact a solution.

The Protect Children from Theft Act, which I introduced in April, aims to safeguard children from becoming victims of identity theft. The bill directs the Consumer Financial Protection Bureau to write a rule that gives parents and guardians the ability to create a protected, frozen credit file for their children. Placing a freeze on a credit report would prevent lenders and others from accessing a credit report entirely, which in most instances would stop an extension of credit. I hope that this legislation, if passed, would create a simple, easy-to-understand process for families to protect their child’s financial interests. New parents are consumed with many questions and concerns; diapers and teething likely take precedence over their child’s future credit score. We need a process by which parents and guardians have an easy, streamlined way to freeze a child’s credit.

As co-founder and co-chair of the Congressional Cybersecurity Caucus, I am well aware that cybersecurity is not a problem that can be solved, only managed. An often overlooked component of that management is resilience: being able to recover from an incident. We are all increasingly reliant on technology and the data that drive it; today, we trust a multitude of networks with personal financial data and private information, including health care records and, yes, even our Social Security numbers. If we want to benefit from the economic efficiencies of technology but still avoid identity theft, we need personal cyber resiliency so that we can recover when our data are compromised. We need to keep tabs on who has our personal information and what is at risk in the case of a breach. We need to check our credit scores, put alerts on our credit cards and work with our banks to ensure our financial information is as safe as possible. And we need to exercise the same vigilance for our children and their data.

I will continue to fight to protect children from identity theft to give them a fair shot when their time comes. Let’s share our good cyber habits with the next generation and make sure that when they are ready to buy that car, take out that student loan or sign a lease on that new apartment, identity theft doesn’t derail the milestone.



TIME Opinion

Happy Birthday, Popeyes Chicken

Al Copeland
AP Images Al Copeland holds a piece of his spicy fried chicken outside one of his 34 fast food outlets in New Orleans, on June 20, 1979.

A personal ode to Popeyes chicken in honor of the chain's birthday

When I first heard that Popeyes was turning 43 on Friday, I found it hard to believe. After all, to me, Popeyes never seems like it’s more than about 5 minutes old, since that’s how long it takes me to inhale an 8-piece meal (dark meat only). But, on the chicken chain’s birthday, it’s worth remembering that Popeyes almost didn’t exist at all.

The June 12 date is actually the birthday of a restaurant called “Chicken on the Run,” which Al Copeland opened on that day in 1972 in a New Orleans suburb. As the official Popeyes history tells it, the chicken was traditional (non-spicy) Southern style and sales were underwhelming. Chicken on the Run was also late to the poultry fast food game. KFC had opened more than 40 years earlier in 1930, while Chick-fil-A arrived in 1946. The former has established itself as America’s preeminent chicken chain and the latter has developed a cult following for its chicken sandwiches and limited availability west of the Mississippi or north of the Mason-Dixon.

But rather than give up, Copeland closed the place, reopened it under the new name “Popeyes” (after a character in the movie The French Connection) and made the chicken spicier. And, in the last 40 years, Americans have made it clear that they have room in their hearts and stomachs for another chicken chain. It paid off for Copeland—when he died in 2008, TIME noted that “whatever his success, he wasn’t shy about public displays of wealth, indulging in over-the-top Christmas-light displays and Lamborghinis and Rolls-Royces.”

More than four decades after launching, Popeyes isn’t the most ubiquitous or the most profitable fast-food chain. But it has managed to carve out a reputation for good eats among the general public and those with more discerning palates. Rare is the fast food chain lauded by consumers and critics alike, but Popeyes—along with West Coast burger joint In-n-Out—has proved itself a worthy outlier. As a longtime fan, I think the chain’s success comes down to the three core components of fried chicken excellence:

  1. Meat. This aspect of fried chicken is actually often overlooked in favor of its sexier, more-discussed counterpart, skin (more on this in a bit). Nevertheless, it is absolutely crucial if you’re going to have an excellent piece of fried chicken. Popeyes has incredibly tender, moist meat. Sure, no one ever likes to use the word moist, but that’s the best way to describe Popeyes’ meat, so use it we shall.
  2. Skin. The most common descriptor of so-called “good” fried chicken is the phrase “crispy skin.” Lots of places have crispy skin, but what Popeyes and other excellent fried chicken joints have is crunchy skin.
  3. Flavor. The final and perhaps most important component of fried chicken is the actual taste itself. Popeyes seasonings are quite likely the key to world peace.

You may not agree. But this loyal fan stands firm.

MONEY Opinion

The US Could Have Blocked the Massive Cyberattack on Federal Employee Data

open padlock
Jose Luis Pelaez—Getty Images

Chinese hackers are suspected of stealing personal information on 4.1 million workers.

True or False? There was no way the Office of Personnel Management could have prevented hackers from stealing the sensitive personal information of 4.1 million federal employees, past and present.

If you guessed “False,” you’d be wrong. If you guessed, “True,” you’d also be wrong.

The correct response is: “Ask a different question.” Serious data breaches keep happening because there is no black-and-white answer to the data breach quagmire. So what should we be doing? That’s the right question, and the answer is decidedly that we should be trying something else.

The parade of data breaches that expose information that should be untouchable continues because we’re not asking the right questions. It persists because the underlying conditions that make breaches not only possible, but inevitable, haven’t changed—and yet we somehow magically think that everything will be all right. And of course we keep getting compromised by a shortlist of usual suspects, and there’s a reason. We’re focused too much on the “who” and not asking simple questions, like, “How can we reliably put sensitive information out of harm’s way while we work on shoring up our cyber defenses?”

According to the New York Times, the problems with two systems maintained by the agency that stored the pilfered data were so severe that its inspector general recommended “temporarily shutting them down because the security flaws ‘could potentially have national security implications.’”

Instead, the agency tried to patch together a solution. In a hostile environment where there are known vulnerabilities, allowing remote access to sensitive information is not only irresponsible — regardless of the reason — it’s indefensible. Yet according to the same article in the Times, the Office of Personnel Management not only allowed it, but it did so on a system that didn’t require multifactor authentication. (There are many kinds, but a typical setup uses a one-time security code needed for access, which is texted to an authorized user’s mobile phone.) When asked by the Times why such a system wasn’t in place at the OPM, Donna Seymour, the agency’s chief information officer, replied that adding more complex systems “in the government’s ‘antiquated environment’ was difficult and very time consuming, and that her agency had to perform ‘triage’ to determine how to close the worst vulnerabilities.”

Somehow I doubt knowing that protecting data “wasn’t easy” will make the breach easier to accept for the more than 4 million federal employees whose information is now in harm’s way (or their partners or spouses whose sensitive personal information was collected during security clearance investigations, and may have been exposed as well).

A New Approach

Given the above circumstances, the game changer — at least for the short-term — may be found in game theory. In an “imperfect information game,” players are unaware of the actions chosen by their opponent. They know who the players are, and their possible strategies and actions, but no more than that. When it comes to data security and the way the “game” is set up now, our opponent knows that there are holes in our defenses and that sensitive data is often unencrypted.

Since we can’t resolve vulnerabilities on command, one way to change the “game” would be to remove personal information from systems that don’t require multifactor authentication. Another game changer would be to only store sensitive data in an encrypted, unusable form. According to Politico, the OPM stored Social Security numbers and other sensitive information without encryption.
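As a purely illustrative sketch of the “encrypted, unusable form” idea, the snippet below uses the third-party Python cryptography package to encrypt a field before it is stored. The field name, the library choice, and the key handling are my assumptions for illustration, not anything OPM actually ran; a real deployment would keep the key in a hardware security module or key-management service, well away from the data it protects.

```python
# Illustrative only: field-level encryption at rest with the `cryptography`
# package (pip install cryptography). Key management is the hard part in
# practice and is reduced here to a single in-memory call.
from cryptography.fernet import Fernet

key = Fernet.generate_key()   # in a real system, held in an HSM/KMS,
fernet = Fernet(key)          # never stored alongside the data

def encrypt_field(value: str) -> bytes:
    """Return an opaque token; only this blob is ever persisted."""
    return fernet.encrypt(value.encode("utf-8"))

def decrypt_field(token: bytes) -> str:
    """Recover the plaintext, available only to holders of the key."""
    return fernet.decrypt(token).decode("utf-8")

stored = encrypt_field("123-45-6789")   # dummy value, not real data
assert decrypt_field(stored) == "123-45-6789"
```

Stolen copies of such tokens are useless without the key, which is the point: even if a database is exfiltrated, the sensitive fields stay unreadable.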

This fixable problem is not getting the attention it demands, in part because Congress hasn’t decided it’s a priority.

The U.S. is not the only country getting hit hard in the data breach epidemic. The recent attack on the Japanese Pension Service compromised 1.25 million records, and Germany’s Bundestag was recently hacked (though the motivation there appeared to be espionage, according to a report in Security Affairs).

According to an IBM X-Force Threat Intelligence report earlier this year, cyberattacks caused the leak of more than a billion records in 2014. The average cost for each record compromised in 2014 was $145, and has increased to $195, according to Experian. The average cost to a breached organization was $3.5 million in 2014, but is now up to $3.8 million. More than 2.3 million people have become victims of medical identity theft, with a half million last year alone. Last year, $5.8 billion was stolen from the IRS and the Treasury Inspector General for Tax Administration predicts that number could hit $26 billion by 2017.

If you look at the major hacks in recent history — a list that includes the White House, the U.S. Post Office and the nation’s second largest provider of health insurance — it would seem highly unlikely that a lax attitude is to blame, but that is precisely the problem. A former senior administration adviser on cyber-issues spoke off the record with the New York Times about the OPM hack: “The mystery here is not how they got cleaned out by the Chinese. The mystery is what took the Chinese so long.”

During this siege-period, while our defenses are no match for the hackers targeting our information, evasive measures are necessary. I agree with White House Press Secretary Josh Earnest, who said, “We need the United States Congress to come out of the Dark Ages and actually join us here in the 21st century to make sure that we have the kinds of defenses that are necessary to protect a modern computer system.”

But laws take a long time, and we’re in a cyber emergency. The question we need to ask today is whether, in the short term, the government can afford not to put our most sensitive information behind a lock that requires two key-holders — the way nukes are deployed — or to store it offline until proper encryption protocols can be put in place.

This story is an Op/Ed contribution to Credit.com and does not necessarily represent the views of the company or its affiliates.


MONEY Opinion

How History Can Mess With Your Investing Strategy

Shannon Stapleton—Reuters Traders work on the floor of the New York Stock Exchange March 2, 2009.

People get history wrong when they look back at specific events and expect them to repeat in the future.

Do you read history? Believe almost all of it? Use it as a guide to the future?

Me too. But let’s consider something. Take these two statements:

“11 million jobs have been created since 2009. The stock market has tripled. The unemployment rate has been nearly cut in half. The U.S. economy has enjoyed a strong recovery under President Obama.”

“The recovery since 2009 has been one of the weakest on record. The national debt has ballooned. Wages are stagnant. Millions of Americans have given up looking for work. The economy has been a disappointment under President Obama.”

Both of these statements are true. They are both history. Which one is right?

It’s a weird question, because history is supposed to be objective. There’s only supposed to be one “right.”

But that’s almost never the case, especially when an emotional topic like your opinion of the president is included. Everyone chooses the version of history that fits what they want to believe, which tends to be a reflection of how they were raised, which is different for everybody. We do this with the economy, the stock market, politics — everything.

It can make history dangerous. What starts as an honest attempt to objectively study the past quickly becomes a field day of confirming your existing beliefs. This is like steroids for inflating your confidence and puts you on a path to misguided, regrettable decisions. (Misguided and regrettable decisions being the one thing everyone agrees history is filled with).

In his book Why Don’t We Learn From History?, B.H. Liddell Hart wrote:

[History] cannot be interpreted without the aid of imagination and intuition. The sheer quantity of evidence is so overwhelming that selection is inevitable. Where there is selection there is art.

Those who read history tend to look for what proves them right and confirms their personal opinions. They defend loyalties. They read with a purpose to affirm or to attack. They resist inconvenient truth since everyone wants to be on the side of the angels. Just as we start wars to end all wars.

I see this all the time in investing. The amount of investing data is incomprehensible and growing by the day. Anyone can think up a narrative, then sift through mountains of historical data to find examples backing it up.

Think stocks are expensive? History agrees. Think stocks are cheap? History agrees. Think tax cuts spur economic growth? History agrees. Think tax cuts don’t spur economic growth? History agrees. History shows that raising interest rates is both good and bad for stocks. It proves that buy-and-hold investing is the best and the worst strategy. In the age of big data, no idea is so absurd that a good spreadsheet can’t make it look right.

And a lot of the historical events investors try to study — recessions, bear markets, bouts of hyperinflation — are rare enough that we don’t have many episodes to draw conclusions from.

To know a lot about recessions, for example, you’d ideally want hundreds of examples to study. But there have only been 33 recessions in the last 150 years. And the data we have on most of them is dubious. Estimates on how much the economy contracted during the 1920 recession range from 2.4% to 6.9%, which is the difference between a moderate recession and a near-depression. In the last 50 years, when data is more reliable, there have been just seven U.S. recessions. So how are we supposed to take seriously any historical statistic about the average recession? How long the average recession lasts? How frequently they occur? How high unemployment goes? We’re talking about something that has occurred just seven times in the last half-century.

So, what good does looking at history do us?

A lot, in fact. You just can’t take it too far.

People get history wrong when they look back at specific events and expect them to repeat in the future. It’s so easy to underestimate how much past events were caused by trivia and accident rather than trends that should repeat in some clean way. Investors who have unshakable faith in markets reverting back to specific historical averages have some of the worst track records you can imagine. This goes into overdrive when you acknowledge the subjectiveness of historical recordkeeping. “I have written too much history to believe in it,” historian Henry Adams once said.

But history can be great at teaching broad, unspecific lessons. Here are five.

Something usually occurs to keep good news and bad news from going on forever. Recessions end because excess gets washed away; booms end because everything gets priced in. Most people wake up every morning wanting to make the world a better place, but psychopaths, idiots, charlatans, and quacks are persuasive enough to occasionally shake things up.

Unsustainable things last longer than you think. Every war was supposed to be over in a month, every boom was surely going to pop any day, and every round of Federal Reserve money printing meant high inflation right around the corner. In reality, things that look unsustainable can last for years or decades longer than seems reasonable. “I was right, just early,” are famous last words, and indistinguishable from “wrong”.

Normal things change faster than you expect. “History doesn’t crawl,” Nassim Taleb writes, “it leaps.” Things “go from fracture to fracture, with a few vibrations in between. Yet we like to believe in the predictable, small incremental progression.”

Irrationality spreads at the worst possible times. Most people can keep their heads straight when things are calm. It’s when things get exciting — bull markets, bear markets, wars, recessions, panics — that emotions take over. Importantly, decisions made during those crazy moments are the most important decisions you make over the long run.

Nothing is stronger than self-interest. When someone in charge of lots of people gains the most by promoting their own interests, you get inefficiencies at best, disaster more often. What everyone knows to be true, or the right thing to do, gets ignored because a few people can get ahead by doing something else. This describes most organizations.


MONEY Opinion

Putting My Money Where My Mouth Is

Gary Musgrave

A low-cost snoring fix led to a good night's sleep -- and a very happy wife.

By the time she met me, my wife had already put up with her fair share of snoring: Her late mother, bless her heart, sounded like the devil’s motorcycle.

I never knew I had a problem. I assumed I was tired, cranky, and forgetful because I was a parent, not because I was snoring. My wife, however, informed me I’d become a menace when I slept on my back. For two years she’d roll me onto my side and build a barricade of pillows so I wouldn’t topple back down. I’d topple anyway. She’d send me to the couch, or—and this is what broke my heart and made me seek help—she’d stagger off herself, trailing her blanket like Linus in “Peanuts.”

I called a clinic I’ll refer to as the Crystal Dreamery. Turns out, there was a whole industry waiting to determine if my snoring was benign (that is, harmful to my marriage but not my health) or a sign of sleep apnea (scarier because this would mean I stopped breathing repeatedly).

The first step was a consult with an ENT doctor. He gave me a home sleep monitor so I didn’t have to spend the night in some weird pod, like Michael Jackson. My diagnosis: apnea. I needed either a Top Gun–style mask to force air down my throat or a pointy, custom-made mouthpiece to nudge my lower jaw forward, widen my airway, and make me look like an inbred vampire. Either way, Crystal Dreamery would charge my insurance company about $5,000, though my out-of-pocket costs wouldn’t top $300.

I went the vampire route. I could have bought a generic guard for $50, but I like a medical professional in the loop when I contemplate the flow of oxygen to my brain. Still, it felt like a shakedown of the insurance company. The snore guard worked, but it broke three times.

It was around this time that I noticed a similar device at my son’s orthodontist, nestled on a shelf next to casts of terrifying tween teeth. The doctor told me he could make me a quasi-bionic mouth guard for $350. We’re still waiting to hear if insurance will cover it, but I’m not losing sleep either way. It works like a vampiric charm.

My wife’s only complaint is that the guard makes me lisp. So I don’t talk when it’s in. She’s the love of my life, and she deserves some peace and quiet.

Jeff Giles’s novel, The Mercy Rule, will be published next year by Bloomsbury. You can follow him on twitter @MrJeffGiles.


TIME Opinion

Which Ronald Reagan Are the GOP Presidential Candidates Embracing?

Ronald Reagan Speaks
Bill Ray—The LIFE Picture Collection/Getty Ronald Reagan giving a speech in 1965

Is it the one who said he wanted to cut the debt or the one who actually left us with a bigger debt?

History News Network

This post is in partnership with the History News Network, the website that puts the news into historical perspective. The article below was originally published at HNN.

At the start of another Republican presidential campaign, we can be assured that the candidates will at some point express their fidelity to the spirit of Ronald Reagan. And yet this fidelity, often bordering on reverence, is largely unjustified. For it is an inconvenient fact about Reagan that he was a failure when judged on his own policy terms.

Reagan came into public life as a deficit hawk and a crusader for smaller government. Starting with his first public political address in 1964, he constantly bemoaned the large deficits, the growing national debt, and the creeping inflation of the era as evidence that the United States had lost its way. In 1975, when he decided to challenge Gerald Ford for the Republican nomination, Reagan laid out his policy solution to address the crisis. He looked forward to, in his words, “a systematic transfer of authority and resources to the states.” By giving up federal responsibility for welfare, education, housing, food stamps, Medicaid, and community and regional development, Reagan believed that he could create efficiencies that would save the government $90 billion. That amount, he predicted, would be enough to pay down the national debt, balance the budget, reduce inflation through tax cuts, and prime the economy for further growth. With one masterstroke, he claimed to be able to solve all problems.

Although Ford had a field day with Reagan’s proposal and Reagan eventually lost the nomination, he never really departed from his fundamental belief that the nation’s problems had a simple solution. And after the campaign, when he was introduced to supply-side economics and the idea that you could grow the economy through the magic of tax cuts that would pay for themselves, he recast his policy vision in that new economic language.

But once in office, his vision ran into problems. In his first meeting with the cabinet, he told them what he most wanted was to “reduce the size of government very drastically.” His team mobilized to push through a $35 billion budget cut that was approved just before the July recess. With victory in hand, he then successfully promoted a massive tax cut that would occur in several steps. A 5 percent reduction was scheduled to begin on October 1, 1981, followed by an additional 10 percent reduction in each of the following two years. The Congressional Budget Office projected the cost at $180 billion over three years, but Reagan waved the concerns away. “Every major tax cut that has been made in this century in our country has resulted in even the government getting more revenue than it did before, because the base is so broadened by doing it,” he told reporters.

Unfortunately, rather than the expected boom in tax receipts, Reagan almost immediately faced huge deficits that grew into the biggest budget shortfalls in peacetime. Reagan’s budget director, David Stockman, began pushing to give back some of the tax cuts. But Reagan refused, complaining after one budget meeting, “I think Dave S. tells us more than we need to know for budget decisions.” But eventually he had to concede there was a problem. In 1982, as the size of the deficits became clear, he sadly admitted to his diary, “We who were going to balance the budget face the biggest budget deficits ever.” Still, he never took responsibility, consistently blaming the economy, the Democrats, the social welfare system, anything but his own policies.

Yet it was what he said in public that generated his true success—and the mystique that we still see today. Instead of admitting his failure and adjusting accordingly, he turned to another longstanding habit: sloganeering about the Founding Fathers. It turns out that he didn’t really care about the deficit or even smaller government, which also grew under Reagan largely because of his increased military spending. “Our real concerns are not statistical goals or material gain,” he told the Conservative Political Action Conference in 1982. What he wanted instead was a renewal of “‘the sacred fire of liberty’ that President Washington spoke of two centuries ago.”

And so, as his policy failures emerged, Reagan increasingly turned to the Founders to cover his shortcomings. His policies would no longer be measured in numbers or reality, but instead with the putative values of the founding generation. To those who criticized the deficit, he responded that he stood for “the dream conceived by our Founding Fathers,” which honored “individual freedom consistent with an orderly society.” He then varied that theme depending on audience. To the nation’s governors, he praised the fact that “Jefferson and Adams and those other far-sighted individuals” had created a “system of sovereign States” that was “vital to the preservation of freedom.” It was that system that he was working to restore. To evangelicals, he said that “the Founding Fathers were sustained by the faith in God” and had written that faith into the founding documents. Their religious vision was the basis of his policies. In response to Walter Mondale’s suggestion that taxes be raised to address the exploding deficit, Reagan claimed that Mondale saw “an America in which every single day is tax day, April 15th,” whereas he saw “an America in which every day is Independence Day, the Fourth of July.”

As strained gestures to the Founders took the place of reasoned analysis and debate, the Republican Party turned in on itself. The niceties of economics, the numbers that guide policy decisions, the trade-offs and choices inherent in governance—all these in Reagan’s Republican Party were beside the point. But the deficit never went away. By the time he left office, the national debt had tripled. The United States went from being the world’s largest creditor nation to being the world’s largest debtor, which it remains to this day. Taxes didn’t even go down much, except for the very wealthy. When he entered office, 19.4 percent of national income went towards taxes. Eight years later it was 19.3 percent. But because he cut so much in domestic spending (most of the increases were for the military), many state and local governments had to raise taxes, so the overall tax burden actually shifted upward.

These numbers tell a different story than you are likely to hear in the Republican campaign. But I suggest that we judge Reagan on these terms—his original ones—rather than falling for the propaganda.

David Sehat is an associate professor of history at Georgia State University and the author of The Jefferson Rule: How the Founding Fathers Became Infallible and Our Politics Inflexible (May 2015).
