TIME Retail

This Japanese Company Is Sending Whiskey to Space

NASA—Getty Images Expedition 42 Flight Engineer Terry Virts and Commander Barry "Butch" Wilmore work outside the International Space Station (ISS) on their third spacewalk, March 1, 2015.

They promise it's for science

Japanese whiskey maker Suntory is sending samples to space in an experiment to see how the trip might affect the drink’s taste.

Suntory, one of Japan’s largest makers of alcoholic beverages, said Friday that the samples would be stored in a Japanese facility at the International Space Station, the AFP reports.

The company’s researchers believe that storing whiskey in zero-gravity for longer than a year could cause it to age differently than it would on Earth, perhaps leading to a mellower flavor.

The AFP reports that the space whiskey will not be made available for sale, but rather tested in a laboratory by researchers. “For the moment, we’re not thinking about applying the study results to commercial products,” a Suntory spokeswoman told the AFP.

TIME museums

See Original Models of the Apple I and Other Iconic American Inventions

The first U.S. patent was issued on July 31, 1790

It was 225 years ago Friday that Samuel Hopkins of Philadelphia was granted a U.S. patent for his new method of making potash, a salt useful for fertilizer. The patent was signed by George Washington, who had established the patent system mere months earlier.

Hopkins’ patent was the first such document in the nation’s history, but it was far from the last. As can be clearly seen by the documents and objects on show at Inventing in America—an exhibit that opened earlier this month at the Smithsonian’s National Museum of American History, in collaboration with the U.S. Patent and Trademark Office, and will be on view through 2020—the tradition of ingenuity in the United States has been a fruitful one. And that makes sense: as John Gray, the museum’s director, said in a statement, the U.S. itself was a new invention when it was founded.

It used to be required that a patent application come with a model of the idea, and now the museum has thousands of those models, along with prototypes and trademark examples. From the printing presses and typewriters of the 19th century, to DuPont Kevlar—celebrating its 50th birthday this year—and the Apple computer, here are some examples to get the inspiration going for the next big invention. (Sorry, a thinking cap isn’t one of them.)

TIME Science

The Case Against Designer Babies Falls Apart

It is time we moved away from absolute bans and started focusing on how to mitigate the dangers and risks instead

When it comes to technological advances that could reduce human suffering, improve health and reduce disease, we are generally all in favour. But recent advances in procedures that tinker with reproductive cells are often seen as an exception. They attract fierce opposition from people who believe they are unethical and should be treated as serious criminal offences – which in some jurisdictions they already are. I don’t think these arguments are decisive, however. Indeed some of them are not convincing at all.

Ethical debates about changing the human genome make a distinction between two different types of cells. All cells except those involved in reproduction are known as somatic. These have been the subject of less controversial research for a number of years now – for example editing a type of white blood cell known as T-cells has become a major area of inquiry in cancer research.

Cells involved in reproduction are called germ cells. Changing them, which is sometimes described as germline editing, can have effects that can be inherited by the offspring of the people whose bodies are amended. In other words, the changes can enter the gene pool.

The main objections to such procedures fall into four categories: they are unpredictable and dangerous; they are the slippery slope to eugenics and designer babies; they interfere with nature and involve playing God; and they will exacerbate social inequality and cause a division between the genetically enhanced and the rest of us.

First among equals

To begin with, not everything that leads to social inequality is unethical. And even in instances when such practices are unethical, it doesn’t automatically follow that they should be illegal.

For instance, it is clearly arguable that any advanced system of higher education might perpetuate social inequality. Those who succeed in their studies might tend to get better jobs than people who are less educated. And the children of parents who are highly educated are more likely to become highly educated than other children. But very few would argue that this makes higher education or indeed family units unethical. Neither do we normally say that scientific research should be undertaken only if it won’t lead to social inequality and divisiveness. It is whimsical to attach such a requirement in the case of genome editing.

As for the course of nature, we alter it when we dam a stream or build a house. We play God if we inoculate a child against polio or operate upon a baby with a hole in her heart rather than watch her die. If we could edit a germline so that such afflictions were permanently eradicated, why shouldn’t we consider doing so? We play God when we consider it reasonable, and so we should.

You have to consider the ethics of acts of omission in this context. It is wrong to push someone off a cliff. But it is also sometimes wrong to fail to prevent someone from accidentally falling off a cliff. In the same way, it would surely be wrong to deliberately edit a germline so that someone who would have lived a long and healthy life will lead a short, miserable one. But what about the reverse? What if we could deliberately edit a germline to lengthen someone’s life expectancy and make them healthier? Would we not have an ethical obligation to do so? And surely if their descendants would also enjoy the same benefits, the duty to intervene becomes even stronger.

Indeed you can turn the question around and say: if one could design healthy babies, what would be the moral justification for failing to do so? It is not obvious that there is one. The term “designer baby” is emotive, pejorative and misleading. But if there is a slope that leads to them, we should perhaps edge along it carefully.

Risks and rewards

This brings us to the remaining objection: that such procedures are unpredictable and dangerous. Several prominent scientists argued in Nature in March that there were “serious risks” around editing germ cells. They wrote:

In our view, genome editing in human embryos using current technologies could have unpredictable effects on future generations. This makes it dangerous and ethically unacceptable.

They argued that it may be impossible to know the precise effects of modifying an embryo until after birth. I would readily accept that. Yet risk and uncertainty are different concepts. Germline editing might be as likely to produce unpredicted benefits as harms. It does not follow that it is dangerous. It is, rather, uncertain.

The precise effects of failing to proceed with germline editing can be uncertain too. We are far from certain that developing such procedures will be more dangerous than avoiding them. And in some cases we can be pretty sure that some people will otherwise either die or only survive in pain, illness or incapacity. If we know that the absence of genetic editing is dangerous, why shun it? Surely we have a moral duty to do the opposite.

Finally, think about what happens with normal childbirth. In such situations, the genetic outcomes are generally not known until after birth. It does not follow that normal childbirth is dangerous, however. Even when it is dangerous and risky, it does not follow that it is unethical – nor of course that it should be illegal.

Equally it is far from clear that germ editing is dangerous. Even if it were, it does not follow that it is unethical or that it should be banned. For too long we have allowed religious groups and other well meaning people to prevent us from exploring avenues that are potentially vital to human progress with arguments that range from questionable to completely wrong. It is time we moved away from absolute bans and started focusing on how to mitigate the dangers and risks instead.

This article originally appeared on The Conversation

TIME Ideas hosts the world's leading voices, providing commentary and expertise on the most compelling events in news, society, and culture. We welcome outside contributions. To submit a piece, email ideas@time.com.

TIME Culture

A Dying Language Is Making a Comeback

The story of the language’s decline, loss and rebirth is a remarkable example of cultural survival

In the summer of 1930, at the dawn of the Great Depression, a 21-year-old linguist named Morris Swadesh set out for Louisiana to record the area’s Native American languages, which were disappearing rapidly.

Morris and his peers were in a race against time to document them, and in the small town of Charenton on the Bayou Teche, he encountered Benjamin Paul and Delphine Ducloux, members of a small tribe called Chitimacha – and the last two speakers of their language.

But today, if you visited the Chitimacha reservation, you’d never know that their language went unspoken for half a century.

Over the past several decades, many Native American tribes have participated in what has become a robust language revitalization movement. As their populations of fluent speakers dwindle and age, tribes want to ensure that their heritage languages are passed on to the next generation – before it’s too late.

But because the Chitimacha tribe had no living speakers for a number of decades, it made the challenge that much greater. In the end, the story of the language’s decline, loss and rebirth is a remarkable example of cultural survival.

Why document a language?

Unlike some other cultural legacies, languages leave no trace in the archaeological record. There’s often no trace in the written record, either.

Only a small portion of the world’s estimated 7,000 languages are well-documented in places like dictionaries and grammar books. Those that are least well-documented are the most endangered.

Many dead or dying languages contain exotic features of verbal and written communication. Chitimacha, for example, doesn’t use the verb “be” in phrases like “she is reading.” Instead, speakers must use a verb of position, such as “she sits reading” or “she stands reading.” Features like these challenge linguists’ understanding of how language works.

By working with Ben and Delphine, Morris was trying to capture a small piece of that linguistic diversity before it vanished.

One day, with Morris sitting on Ben’s porch dutifully scribbling down his every word in a composition notebook, Ben finished a story (a riveting tale of how the Chitimacha first acquired fire by stealing it from a mythical old blind man in the west). He then went on to tell Morris:

There were very many stories about the west. I believe I am doing well. I have not forgotten everything yet. When I die, you will not hear that sort of thing again. I am the only one here who knows the stories.

Ben passed away three years later, and Delphine not long thereafter. After their deaths, it seemed the Chitimacha language was doomed to silence.

Why do languages die?

How does a language come to have only two speakers? Why have so many Native American languages become endangered? The causes are manifold, but there are two main ones: sharp reductions in the population of the community that speaks the language, and interruptions in the traditional means of transferring the language from one generation to the next.

In the past, the former caused the most damage. Native American peoples were decimated by European diseases and subject to outright warfare.

Prior to European contact, the Chitimacha were lords of the bayou, with a territory stretching from Vermillion Bay in the west to present-day New Orleans in the east. They were expert canoe-makers and wielded extensive knowledge of the region’s labyrinthine network of waterways.

But by the time the French arrived in present-day Louisiana in 1699, the tribe’s numbers had dwindled to around 4,000, their communities gutted by European diseases that spread faster than the Europeans themselves.

After a protracted war with the French, they retreated deep into the bayou, where their reservation at Charenton sits today. The 1910 census recorded just 69 people living there.

Only later did the second cause of language decline occur, when children on the reservation were sent to the infamous Carlisle Indian School in Pennsylvania, which interrupted the transmission of the language to the next generation.

Ben and Delphine, born in the latter half of the 1800s, were part of the last generation to learn the language at home. Eventually their parents and many of their peers passed away, leaving them as the last two speakers of the language.

Renaissance on the bayou

Ben probably never imagined that his and Delphine’s efforts would spark the tribe’s linguistic renaissance, awakening their language from 60 years of silence.

In the early 1990s, Kim Walden, the tribe’s cultural director, received a call from the American Philosophical Society Library informing her that they had all of Morris’ notebooks, and even his drafts for a grammar manual and dictionary, which totaled hundreds of pages in all. Thus began the herculean effort to revive the language.

The tribe put together a small-but-dedicated team of language experts, who set out to learn their language as quickly as possible. They began to produce storybooks based on Ben and Delphine’s stories, and word lists from the dictionary manuscript.

In 2008, the tribe partnered with the software company Rosetta Stone on a two-year project to create computer software for learning the language, of which every registered tribal member now has a copy. This is where I came in, serving as editor and linguist consultant for the project, a monumental collaborative effort involving thousands of hours of translating, editing, recording and photographing. We’re now hard at work finishing a complete dictionary and learner’s reference grammar for the language.

Today, if you stroll through the reservation’s school, you’ll hear kids speaking Chitimacha in language classes, or using it with their friends in the hall. At home they practice with the Chitimacha version of Rosetta Stone, and this past year the tribe even launched a preschool immersion program.

The kids even make up slang that baffles adult ears, a sure sign that the language is doing well – and hopefully will continue to thrive, into the next generation and beyond.

This article originally appeared on The Conversation

TIME Ideas hosts the world's leading voices, providing commentary and expertise on the most compelling events in news, society, and culture. We welcome outside contributions. To submit a piece, email ideas@time.com.

TIME Science

Scientists Identify Long-Lost Remains of Early Virginia Settlers

Mladen Antonov—AFP/Getty Images A stone cross marking the grave of a 17th-century British settler is seen at the archaeological site of Jamestown, Va., on November 22, 2011.

The bodies were buried in the 17th century

Scientists used technology to identify the remains of four early residents of Jamestown, Va., the first permanent English settlement in what would become the United States.

The Jamestown Rediscovery Foundation at Historic Jamestowne and the Smithsonian’s Museum of Natural History announced on Tuesday that the settlers lived—and held high positions—in early English America as far back as 1608. About 100 people settled along the James River in 1607, founding what would become the first permanent English settlement. The colony, however, was nearly wiped out by conflict—with Native Americans in the area and with each other—as well as famine and disease. Among the identified remains were those of Rev. Robert Hunt, Jamestown’s first Anglican minister, and Captain Gabriel Archer, a leader among the early settlers and a rival of Captain John Smith. The remaining two, Sir Ferdinando Wainman and Captain William West, were relatives of the governor, Lord De La Warr.

Archeologists with Jamestown Rediscovery have been working to identify the remains since they were found in November 2013. Scientists from both the Smithsonian and the Rediscovery Foundation examined artifacts from the graves, forensic evidence and technology such as CT scans to determine who they were. (A video on the foundation’s website explains the process.) The discovery of the burial site, however, dates back to 2010, when Jamestown Rediscovery uncovered what the organization says is the earliest known Protestant church in North America. Within that church—in the chancel, considered the holiest part of the building—scientists found the four burial sites that held the remains of these early settlers.

“This is an extraordinary discovery, one of the most important of recent times,” said James Horn, President of the Jamestown Rediscovery Foundation, in a press release. “These men were among the first founders of English America. They lived and died at a critical time in the history of the settlement — when Jamestown was on the brink of failure owing to food shortages, disease, and conflict with powerful local Indian peoples, the Powhatans.”

The church they were buried in is significant, too. According to Jamestown Rediscovery, Pocahontas and John Rolfe were married there.

TIME psychology

The Trick to Memorizing an Entire Foreign Dictionary

Practice makes perfect

New Zealand’s Nigel Richards, who doesn’t speak French, has won the French-language Scrabble world championships. In the Scrabble world, Richards is considered to be the best player ever, having won the English world Scrabble championships three times, the U.S. national championships five times and the U.K. Open six times. His latest remarkable feat was achieved after reportedly memorising the entire French Scrabble dictionary in just nine weeks.

Richards is not the only person who has wowed the world with exceptional memory skills. Dave Farrow is the Guinness World Record holder for greatest memory. In 2007 he spent around 14 hours memorising a random sequence of 59 separate packs of cards (3,068 individual cards), looking at each card once. In 1981, Rajan Mahadevan recited from memory the first 31,811 digits of pi, a record that was astonishingly broken by Hideaki Tomoyori in 1987, who recited 40,000 digits.

For those of us struggling to remember what happened a couple of days ago, such innately superior memory capacity is remarkable. The question of whether these people are born with exceptional memory ability or acquire it by deliberate practice has interested both scientists and the general public alike for hundreds of years.

Memory genius comes with practice

Many books were published in the 1980s and 90s on the topic of genius and exceptional performance, with pioneering research comparing the superior performance of chess experts over beginners.

What became apparent, however, is that, although some people were able to recall large amounts of information seemingly effortlessly, their memory was truly exceptional only for materials specific to their expertise. In one study in the 1970s, William Chase and Herbert Simon at Carnegie Mellon University had world chess experts recall the configuration of chess pieces on a chessboard. When shown a board arranged as in an actual game, the experts’ recall of the pieces was far superior to that of novices. However, with randomly arranged boards, players of all skill levels had the same poor recall performance.

In order to answer the question of how to achieve exceptional memory performance, Chase, alongside K Anders Ericsson, developed the “skilled memory theory” which proposed three basic principles.

First, individuals need to rely on prior knowledge and patterns to encode and store the material in long-term memory – what they called the “encoding principle.” Second, encoded information needs a “retrieval structure” – meaning it is associated with a cue when first seen so that it can be triggered during retrieval from long-term memory. And third, with additional practice people become more proficient in their encoding and can store the same amount of presented information in less time – the “speed-up principle.”

Techniques to try

What this is referring to is a mnemonic strategy. We are all capable of using such strategies, although some of us are more skilled at it than others. The oldest and most common method is the method of loci (Latin for “places”). In the method of loci, the mnemonist first creates a series of places, imagined rooms (the encoding principle), then puts what is to be remembered in said rooms, and finally walks from room to room in a fixed order, to recall the material (retrieval structure principle).

The more familiar and elaborate the detail of the imagined place, the faster the mnemonist will be able to place and retrieve material (the speed-up principle). Many mnemonic methods such as loci require such visualisation. For example, digit sequences can be associated with word links. If 59 is “lip” and 47 is “rock”, then 5947 can be remembered by an interactive image of “lips kissing a rock.” Other mnemonic techniques include a digit-consonant system or converting digits into syllables (based on the Japanese language), which are then regrouped into words.
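For readers curious how such a digit-pair system works mechanically, here is a minimal sketch in Python. The pair-to-word table is hypothetical; practised mnemonists build their own lists of 100 memorable peg words:

```python
# Hypothetical peg-word table: each two-digit pair maps to a vivid word.
# Real mnemonists would fill in all 100 pairs ("00" through "99").
PAIR_WORDS = {"59": "lip", "47": "rock"}

def encode(digits):
    """Split a digit string into two-digit pairs and map each to its peg word."""
    pairs = [digits[i:i + 2] for i in range(0, len(digits), 2)]
    return [PAIR_WORDS[p] for p in pairs]

print(encode("5947"))  # ['lip', 'rock'] -> imagine "lips kissing a rock"
```

The memorable part is not the word list itself but the interactive image the mnemonist forms from the resulting words, which is what the retrieval structure cues later.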

Although Richards is a somewhat reclusive figure, so we can’t say for certain what techniques he used, it is more than likely that he is highly skilled in mnemonic strategies, along with having an exceptional mathematical talent for Scrabble. He and others like him are able to use mnemonic strategies far beyond what most of us can fathom. However, whether it is his dedication to practice or some innate superior memory that is responsible for this ability is still under scientific investigation.

This article originally appeared on The Conversation

TIME Ideas hosts the world's leading voices, providing commentary and expertise on the most compelling events in news, society, and culture. We welcome outside contributions. To submit a piece, email ideas@time.com.

TIME Science

This Affects Your Ability to Do Well on Exams

girl-writing-rear-view
Getty Images

Differences in academic exam results are—to a large extent—explained by differences in people’s DNA

Could it be that genetic differences can affect how well children perform in exams? Our research suggests that this may well be the case and that individual differences between children are, to a large extent, due to the inherited genetic differences between them that predispose them to do well academically, whatever the subject.

We also found that there is shared genetic influence across a range of subjects, even after controlling the exam results for general intelligence.

It goes without saying that children’s exam results at the end of compulsory education play a significant role in their future education and career paths. And it is also reasonable to assume that schools play a major role in school achievement. But children differ in educational achievement within the same school – and even the same classroom. This suggests that factors other than school or classroom differences explain the wide variation in pupils’ exam results.

Our new research, published in Scientific Reports, examined GCSE results using the classical twin method, which compares the correlations between identical and non-identical twins, and found that individual differences in exam results are to a large extent explained by inherited differences in children’s DNA sequence.

We also found that many of the same genes influence achievement across a range of subjects – so, children who tend to do well in one subject tend to do well in others, largely for genetic reasons.

Previous research using data from the Twins Early Development Study (TEDS), found that there is substantial heritability for educational achievement in early and middle school years. Heritability is a population statistic, which describes the extent to which differences between children can be explained by the differences in their DNA, on average, in a particular population at a particular time.

So, for example, a heritability of 90% means that 90% of the individual differences observed in a group of people for a trait are explained by genetic differences between them, and 10% by environmental factors. What it doesn’t tell us is anything about an individual.

We already knew, based on our research published in 2007 (on a UK representative sample of 7,500 twin pairs tested at the ages of 7, 9 and 12), that the average heritability of literacy and numeracy is almost 70%. In other words, more than two-thirds of the variation seen in academic test results is explained by genetic differences between children.
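The twin comparison behind such heritability estimates can be sketched with Falconer's formula, the simplest version of the classical twin method: heritability h² = 2(r_MZ − r_DZ), where r_MZ and r_DZ are the trait correlations for identical and non-identical twin pairs. The correlations below are hypothetical, chosen only to illustrate an estimate near the 70% figure reported above; they are not the study's actual method or data:

```python
def falconer_h2(r_mz, r_dz):
    """Falconer's estimate of heritability from twin correlations.

    Identical (MZ) twins share ~100% of their genes, non-identical (DZ)
    twins ~50%, so doubling the gap between the two correlations gives
    a rough estimate of the genetic share of the variance.
    """
    return 2 * (r_mz - r_dz)

# Hypothetical correlations for an academic trait.
h2 = falconer_h2(0.85, 0.50)
print(round(h2, 2))  # 0.7, i.e. roughly 70% of variance attributed to genes
```

The intuition: if sharing twice as many genes makes twins noticeably more similar, the trait has a substantial genetic component; if MZ and DZ correlations were equal, the formula would yield zero.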

Further research from 2013 also found that educational achievement, as measured by standardised exams (GCSEs), at age 16 is also substantially heritable, with genetic factors explaining about 60% of the variance in results of the core subjects of English, mathematics and sciences.

How genes influence achievement

Our new study sought to determine whether the high heritability of core academic subjects also extends to various other subjects, such as history and geography, which involve more fact-based knowledge – or art, music and drama, which are more subjective subjects.

We analysed achievement data from the twins in TEDS to assess the extent to which genetic factors influence various school subjects – and, in particular, GCSE exam results.

We found that genes explain a larger proportion of the differences between children across different subjects (54-65%) than shared environmental factors, such as home and school environment combined (14-21%).

However, it’s important to stress that heritability is a population statistic and this does not mean that genetics explain 54-65% of a single child’s school achievement. But it does indicate that differences in academic exam results are, to a large extent, explained by differences in people’s DNA.

Our study indicates that this substantial heritability for school exams is not explained by intelligence alone, as heritability for GCSE grades for all subjects remained substantial even after statistically removing the intelligence scores from the exam results. This finding is in line with our previous research in which we found a similar result for the mandatory subjects of English, maths and science.

We had also found that heritability of GCSE exam results involves the joint contribution of many other factors, including children’s self-efficacy, or pupil’s belief in his/her abilities, behavioural problems, personality traits, well-being, and their perceptions of school environment – as well as their intelligence.

Although our results cannot be applied directly to classroom teaching right now, they do, however, add to the growing knowledge of why children differ so widely in educational achievement.

Same genes, range of subjects

Our new results also indicate that achievement across a wide range of academic subjects including English, mathematics, science, humanities, second languages, business and art are influenced by many of the same genes.

This shared genetic influence is, to a large extent, independent of intelligence, suggesting that there is a genetically driven “general academic achievement factor”. This means that it is largely down to genetic reasons that children who tend to do well in one subject also tend to do well in others, even when differences in intelligence are controlled for.

Our findings could also facilitate molecular genetic research that aims to identify the genes responsible for academic achievement by focusing on achievement across different subjects, rather than focusing only on a specific subject such as mathematics or English.

This article originally appeared on The Conversation

TIME Ideas hosts the world's leading voices, providing commentary and expertise on the most compelling events in news, society, and culture. We welcome outside contributions. To submit a piece, email ideas@time.com.

TIME MERS

There May Have Been a Major Breakthrough in MERS Treatment

487737801
Getty Images

Researchers in Hong Kong have cured infected monkeys of MERS using existing drugs

Two existing and widely available drugs may prove to be effective treatments for Middle East Respiratory Syndrome (MERS), new research published by the University of Hong Kong suggests.

According to the South China Morning Post, the medicines—lopinavir with ritonavir, and a type of interferon—were tested on marmosets, small monkeys that a 2014 U.S. study concluded would be the best subjects for MERS trials because of the way their reactions to the virus mimic human illness. The drugs, currently used to treat HIV and multiple sclerosis, were found to be effective in curing MERS-infected marmosets.

The research is the first of its kind in the world.

“We would recommend doctors to start using both drugs immediately to treat MERS patients if they are critical,” Jasper Chan Fuk-woo, one of the researchers, told SCMP. “The evidence in this study is quite strong in proving the effectiveness of these two drugs.”

Currently, there is no known cure for MERS.

Meanwhile, South Korea, which struggled with a MERS outbreak in May and June, has not reported any new MERS cases for 23 days and no deaths for more than two weeks. The country declared a “de-facto end” to its outbreak on July 28, although a spokesman for the World Health Organization told the BBC it would not declare an official end to the country’s outbreak until 28 days had passed with no new infections—twice the disease’s incubation period.

[SCMP]

TIME Science

Watch Neil DeGrasse Tyson Explain the History of the Universe in 8 Minutes

Because of course he does

Expert mind-blower, Pluto-hater and all-around explainer Neil DeGrasse Tyson has now proved he can explain everything in the universe by explaining literally everything in the universe.

On MinutePhysics, the famed astrophysicist draws from his book “Origins: Fourteen Billion Years of Cosmic Evolution.” He starts with the beginning of time about 13.7 billion years ago, when “all the space, and all the matter, and all the energy of the known universe was contained in a volume less than one-trillionth the size of the point of a pin.” He ends with Homo sapiens, who are enabled “to deduce the origin and the evolution of the universe,” like Tyson just did.

TIME Science

How We Measured Time Before Clocks

Big Ben, Palace of Westminster, London.
Heritage Images / Getty Images A view of the space behind one of the glass faces of Big Ben, in the clock tower at the Palace of Westminster, London, with a man inspecting repairs.

Our conceptions of time have become more accurate but less personal

History Today

This post is in partnership with History Today. The article below was originally published at HistoryToday.com.

If we think of time at all it is as a dimension: something we travel through, an abstract and universal measure against which we mark our progress and against which we are judged, from minute to minute, from hour to hour, from day to day, from birth to death. It dominates our lives and, like life under all tyrannies, we are so immersed in the ubiquity of its oppression we don’t notice the constraints. From where I sit now, I can see the time in three places. If I cared to, I could find it in four more without moving from my chair. The computer on which I am writing this can, with a little effort, be made to measure time in microseconds.

I can think of no practical use for that level of knowledge; but it is difficult not to feel the anxiety of its influence. Quick is good. Fast is better. Speed is everything. Most of us mark out our working days — and too much of our private lives — in the fine-sliced minutes of deadlines, alarms, appointments and schedules.

The ability to dissect time in such detail is a relatively recent phenomenon. Clocks did not have minute- or second-hands until the late 1600s. Hence there are no seconds in Shakespeare and minutes are mostly metaphor. The shortest practical unit of time in his plays is the quarter hour, as in the length of time Lady Macbeth has been seen trying to scrub the imagined blood from her hands, or that Prince Hal boasts it would take him to learn how to speak like a tinker.

But what was it like to live in a world so heedless of the passage of time? One answer is that one’s relationship with it becomes far more subjective and personal. The Greeks recognized two kinds of time: chronos — the scientific measurement of its passage — which is the sense we have retained; and kairos, which is more epiphanic, opportunistic and experiential. It was, and remains, also the Greek word for weather.

Even Renaissance science had to resort to more ad hoc, human measures, a quality of experience we can savor in a weather-related story. Among the papers of Thomas Harriot, the English mathematician and sometime scientific adviser to Walter Ralegh, is the record of a rainy afternoon in his room up beneath the leads in Durham House, Ralegh’s magnificent London home on the Strand, overlooking the Thames. Presumably at a loose end, Harriot decided to calculate how much rain would have fallen in his room over a 24-hour period were it not protected by its roof.

He had no means of measuring the passage of minutes or seconds, so he used his pulse, assuming that each beat of his heart equated to a second.

This was poor science but it points to an understanding of the world which we can no longer share. Time wasn’t only, or even principally, an external measure but also something to which our bodies and our experience of our bodies, our sense of ourselves, could be wholly aligned.

There is a similar story about the Counter-Reformation Cardinal and Jesuit, Robert Bellarmine, one of the judges who sentenced Giordano Bruno to be burned at the stake and the man who told Galileo to abjure Copernicanism. He was not in any sense anti-intellectual and had always had a deep personal interest in astronomy and science, but simply refused to accept that it could not be reconciled with doctrine.

On one occasion he set out to measure the speed of the sun’s rotation about the earth by sitting on a beach in south-west Italy — most likely Calabria — and timing the sunset. With no means of measuring time, however, he fell back on an intensely familiar, regular, unvaried unit of time, the recitation of Psalm 51, the miserere: ‘Have mercy upon me, Oh God.’ It is an acutely poignant image, the very measure of time he used embodying both the futility of his actions and the devotional passion of his certainties.

On some level, then, the emerging tension between chronos and kairos was also the struggle between empiricism and, for want of a better word, spirituality. The shadow of these tensions falls across Henry IV. Prince Hal’s destiny, his royal inheritance, is the arrow of time pulling him forward towards history. Falstaff is all kairos, life in the moment, to whom the measure of minutes and hours is superfluous. ‘What a devil hast thou to do with the time of the day?’ Hal asks him in their first scene. Falstaff – no one’s idea of spiritual – has no answer. But then, what is the answer to the demands of chronos?

Mathew Lyons is author of The Favourite: Ralegh and His Queen (Constable & Robinson, 2011).
