Inside The Weird Brains Of Real-Time Translators

The world’s most powerful computers can’t perform accurate real-time translation. Yet interpreters do it with ease. Geoff Watts meets the neuroscientists who are starting to explain this remarkable ability.

One morning, earlier this year, I paid a visit to the sole United Nations agency in London. The headquarters of the International Maritime Organisation (IMO) sit on the southern bank of the Thames, a short distance upstream from the Houses of Parliament. As I approached, I saw that a ship’s prow, sculpted in metal, was grafted like a nose to the ground floor of this otherwise bland building. Inside I met a dozen or so mostly female IMO interpreters. They were cheerful and chatty and better dressed than you might imagine for people who are often heard but rarely seen.

I walked upstairs to a glass-fronted booth, where I prepared to witness something both absolutely remarkable and utterly routine. The booth was about the size of a garden shed, and well lit but stuffy. Below us were the gently curving desks of the delegate hall, which was about half-full, occupied mostly by men in suits. I sat down between two interpreters named Marisa Pinkney and Carmen Soliño, and soon the first delegate started talking. Pinkney switched on her microphone. She paused briefly, and then began translating the delegate’s English sentences into Spanish.

Let’s unpick what she did that morning and itemise its components.

As the delegate spoke, Pinkney had to make sense of a message composed in one language while simultaneously constructing and articulating the same message in another tongue. The process required an extraordinary blend of sensory, motor and cognitive skills, all of which had to operate in unison. She did so continuously and in real time, without asking the speaker to slow down or clarify anything. She didn’t stammer or pause. Nothing in our evolutionary history can have programmed Pinkney’s brain for a task so peculiar and demanding. Executing it required versatility and nuance beyond the reach of the most powerful computers. It is a wonder that her brain, indeed any human brain, can do it at all.

Neuroscientists have explored language for decades and produced scores of studies on multilingual speakers. Yet understanding this process — simultaneous interpretation — is a much bigger scientific challenge. So much goes on in an interpreter’s brain that it’s hard even to know where to start. Recently, however, a handful of enthusiasts have taken up the challenge, and one region of the brain — the caudate nucleus — has already caught their attention.

The caudate isn’t a specialist language area; neuroscientists know it for its role in processes like decision making and trust. It’s like an orchestral conductor, coordinating activity across many brain regions to produce stunningly complex behaviours. Which means the results of the interpretation studies appear to tie into one of the biggest ideas to emerge from neuroscience over the past decade or two. It’s now clear that many of our most sophisticated abilities are made possible not by specialist brain areas dedicated to specific tasks, but by lightning-fast coordination between areas that control more general tasks, such as movement and hearing. Simultaneous interpretation, it seems, is yet another feat made possible by our networked brains.

Simultaneous interpretation often evokes a sense of drama. This may be because of its history: the creation of the League of Nations after World War I established the need for it, and use of the technique during the trials of senior Nazis at Nuremberg showcased its power. Doubts about accuracy lingered nonetheless; the UN Security Council didn’t fully adopt simultaneous interpretation until the early 1970s. “Until then they didn’t trust the interpreters,” says Barbara Moser-Mercer, an interpreter and researcher at the University of Geneva. But now the two traditional capitals of the multilingual conference world — the UN offices in Geneva and New York — have been joined by Brussels, as the expanding European Union incorporates more and more languages. The current total is 24, and some meetings involve interpretation of every one.

Looking down over the delegates at the IMO, I was reminded of the view from a captain’s bridge, or the gallery of a television studio. I had a feeling of control, a perverse reaction given that control is one thing interpreters lack. The words they utter and the speed at which they talk are determined by others. And even though Pinkney and Soliño had copies of some of the speeches that had been prepared for that morning, they had to be alive to humorous asides. Puns, sarcasm, irony and culture-specific jokes are an interpreter’s nightmare. As one interpreter has noted in an academic article, “Puns based on a single word with multiple meanings in the source language should generally not be attempted by interpreters, as the result will probably not be funny.” Quite.

Many of the delegates spoke in English, so the pressure on Anne Miles in the into-English booth down the hall was sporadic. Miles speaks French, German, Italian and Russian, and has been interpreting for 30 years. In between stints she told me about word order, another challenge that interpreters face daily. “With German the ‘nicht’, the ‘not’, can come at the very end of the sentence. So you may be enthusing about something and then the speaker finally says ‘nicht’. But if you’re a German native you can hear the ‘nicht’ coming by the intonation.” Word order is a particular problem in fish meetings, which Miles said she dreads. In a long sentence about a particular variety of fish, and in a language where the noun — the name of the fish — comes towards the end, the interpreter is left guessing about the subject of the sentence until it’s completed.

There’s humour in these pitfalls, of course. Miles told me about an agricultural meeting at which delegates discussed frozen bull’s semen; a French interpreter translated this as “matelots congelés”, or ‘deep-frozen sailors’. And she shared an error of her own, produced when a delegate spoke of the need to settle something “avant Milan” — ‘before Milan’, the city being the venue for a forthcoming meeting. Miles didn’t know about the Milan summit, so said that the issue wasn’t going to be settled for “mille ans”, or ‘a thousand years’.

Some speakers talk too fast. “There are various strategies. Some interpreters think it’s best just to stop and say that the delegate is speaking too fast.” Miles herself doesn’t find that useful, because people have a natural pace and someone asked to slow down is likely to pick up speed again. The alternative is to précis. “You have to be quick on the uptake. It’s not just language skills in this job, it’s being quick-brained and learning fast.”

Challenges of this kind make simultaneous interpretation tiring, and explain why the two interpreters took it in turns to rest every half-hour. Interpreting via video link is even more taxing. “We don’t like it at all,” Miles told me. Studies confirm that the process is more exhausting and stressful, probably because body language and facial expressions provide part of the message, and are harder to decipher when working remotely. “You get fewer visual clues as to what’s going on, even with a video link,” said Miles.

Then there’s the tedium. Crisis talks in New York might be gripping, but the average politician, never mind the average technical expert on marine regulations, isn’t likely to induce rapt attention for hours on end. The audience may slumber, but the interpreter must remain vigilant. As the meeting sailed on into a polyglot fog of procedural niceties and resolutions, each with sections and subsections, I realised how tiring this vigilance must be. Having nodded off in many a science conference — even once when chairing — I was in awe of the interpreters’ fortitude.

Moser-Mercer trained as an interpreter — she is fluent in German, English and French — before being sidetracked by neuroscience. “I got very intrigued with what was going on in my brain while I was interpreting,” she says. “I thought there has to be a way to find out.” When she arrived at the University of Geneva in 1987 there wasn’t a way — the interpretation department was concerned with training, not research. So she set out to create one by collaborating with colleagues in the brain sciences.

“Language is one of the more complex human cognitive functions,” Narly Golestani, Group Leader of the university’s Brain and Language Lab, tells me during a recent visit. “There’s been a lot of work on bilingualism. Interpretation goes one step beyond that because the two languages are active simultaneously. And not just in one modality, because you have perception and production at the same time. So the brain regions involved go to an extremely high level, beyond language.”

In Geneva, as in many other neuroscience labs, the tool of choice is functional magnetic resonance imaging (fMRI). Using fMRI, researchers can watch the brain as it performs a specific task; applied to interpretation, it has already revealed the network of brain areas that make the process possible. One of these is Broca’s area, known for its role in language production and working memory, the function that allows us to maintain a grasp on what we’re thinking and doing. The area is also linked with neighbouring regions that help control language production and comprehension. “In interpretation, when a person hears something and has to translate and speak at the same time, there’s very strong functional interplay between these regions,” says Golestani.

Many other regions also seem to be involved, and there are myriad connections between them. The complexity of this network deterred Moser-Mercer from tackling them all at once; unravelling the workings of each component would have been overwhelming. Instead the Geneva researchers treat each element as a black box, and focus on understanding how the boxes are linked and coordinated. “Our research is about trying to understand the mechanisms that enable the interpreter to control these systems simultaneously,” says team member Alexis Hervais-Adelman.

Two regions in the striatum, the evolutionarily ancient core of the brain, have emerged as key to this executive management task: the caudate nucleus and the putamen. Neuroscientists already know that these structures play a role in other complex tasks, including learning and the planning and execution of movement. This means that there is no single brain centre devoted exclusively to the control of interpretation, say Hervais-Adelman and his colleagues. As with many other human behaviours studied using fMRI, it turns out that the feat is accomplished by multiple areas pitching in. And the brain areas that control the process are generalists, not specialists.

§

One of the triggers of this piece was a trivial conversation. Someone told me of a simultaneous interpreter so proficient that he could do a crossword while working. No name or date or place was mentioned, so I was sceptical. But just to check I contacted a few professional interpreters. One thought he might have heard a rumour; the others were dismissive. An urban myth, they said.

I ask Moser-Mercer if interpreters ever do anything else while interpreting. In a job dominated by women, she tells me, some knit — or used to when it was a more popular pastime. And you can see how a regular manual action might complement the cerebral activity of translation. But a crossword puzzle? Moser-Mercer hasn’t tried it, but she tells me that under exceptional circumstances — a familiar topic, lucid speakers, etc. — she thinks she could.

That such a feat might be possible suggests that interesting things are indeed happening in the brains of simultaneous interpreters. And there are other reasons for thinking that interpreters’ brains have been shaped by their profession. They’re good at ignoring themselves, for example. Under normal circumstances listening to your voice is essential to monitoring your speech. But interpreters have to concentrate on the word they’re translating, so they learn to pay less attention to their own voice.

This was first demonstrated 20 years ago in a simple experiment devised by Franco Fabbro and his colleagues at the University of Trieste in Italy. Fabbro asked 24 students to recite the days of the week and the months of the year in reverse order while listening to themselves through headphones. First they heard themselves with no delay. They then repeated the exercise with delayed feedback of 150, 200 and 250 milliseconds. Even a slight delay subverts speech, forcing speakers to slow down, stutter, slur and even come to a halt. Sure enough, many of the students made errors. But half of the group were in their third or fourth year at the university’s School of Translators and Interpreters, and these students suffered no significant disruption.
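The disorienting effect Fabbro measured is easy to try for yourself. The sketch below is my own illustration, not code from the study: a minimal delayed-auditory-feedback loop written in Python, assuming the third-party sounddevice library is installed and a microphone and headphones are available. It simply feeds your own voice back to your ears after a fixed delay, roughly reproducing the 150 to 250 millisecond conditions the Trieste team used.

```python
# A minimal delayed-auditory-feedback sketch (assumes the third-party
# "sounddevice" library plus working microphone and headphone hardware).
import numpy as np
import sounddevice as sd

SAMPLE_RATE = 44100          # samples per second
DELAY_MS = 200               # the Trieste study used 150, 200 and 250 ms
delay_samples = int(SAMPLE_RATE * DELAY_MS / 1000)

# FIFO buffer pre-filled with silence equal to the desired delay.
fifo = np.zeros((delay_samples, 1), dtype="float32")

def callback(indata, outdata, frames, time, status):
    """Enqueue fresh microphone audio; play back audio from DELAY_MS ago."""
    global fifo
    fifo = np.concatenate([fifo, indata])   # newest samples join the back
    outdata[:] = fifo[:frames]              # oldest samples come out the front
    fifo = fifo[frames:]

# Full-duplex stream: wear headphones to avoid acoustic feedback, then speak.
with sd.Stream(samplerate=SAMPLE_RATE, channels=1, callback=callback):
    input("Recite the days of the week backwards; press Enter to stop.\n")
```

With DELAY_MS set to zero, speaking feels normal; at the delays used in the experiment, most people start to hesitate and slur, which is exactly the disruption the trainee interpreters largely escaped.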

Some habits acquired in the workplace may carry over to the home. One way that experienced interpreters acquire speed is by learning to predict what speakers are about to say. “I will always anticipate the end of a sentence, no matter who I’m talking to and whether or not I’m wearing a headset,” says Moser-Mercer. “I will never wait for you to finish your sentence. Many of us interpreters know this from our spouses and kids. ‘You never let me finish…’ And it’s true. We’re always trying to jump in.”

Interpreters also have to be able to cope with stress and exercise self-control when working with difficult speakers. I read one review, based on questionnaires given to interpreters, which suggested that members of the profession are, as a consequence, highly strung, temperamental, touchy and prima donna-ish. Maybe. But I couldn’t see it in Marisa, Carmen or Anne.

A few years ago, the Geneva researchers asked 50 multilingual students to lie in a brain scanner and carry out a series of language exercises. In one, subjects merely listened to a sentence and said nothing. Another involved the students repeating the sentence in the same language. The third was the most onerous: subjects were asked to repeat what they were hearing, this time translating it into another language.

In cognitive terms this seems like a big step up. Initially the students just had to listen, and then to repeat. Task three required them to think about meaning and how to translate it: to interpret simultaneously. But the scans didn’t reveal any neural fireworks. “There wasn’t a huge amount of additional engagement,” says Hervais-Adelman. No extra activity in regions that handle comprehension or articulation, for example. “It was just a handful of specific regions that were handling the extra load of the interpreting.” These included areas that control movement, such as the premotor cortex and the caudate. Interpretation, in other words, may be about managing specialised resources rather than adding substantially to them.

This idea remains unconfirmed, but the Geneva team added weight to it when they invited some of the same students back into the fMRI scanner a little over a year later. During that period 19 of the returnees had undergone a year of conference interpretation training, while the others had studied unrelated subjects. The brains of the trainee interpreters had changed, particularly parts of the right caudate, but not in the way you might expect — activity there lessened during the interpretation task. It is possible that the caudate had become a more efficient coordinator, or had learned how to farm out more of the task to other structures.

“It could be that as people become more experienced in simultaneous interpretation there’s less need for the kind of controlled response provided by the caudate,” says David Green, a neuroscientist at University College London who was not involved in the Geneva work. “The caudate plays a role in the control of all sorts of skilled actions. And there’s other work showing that as people get more skilled at a task you get less activation of it.”

The story that is emerging from the Geneva work — that interpretation is about coordinating more specialised brain areas — seems to gel with interpreters’ descriptions of how they work. To be really effective, for example, a simultaneous interpreter needs a repertoire of approaches. “The process has to adapt to varying circumstances,” says Moser-Mercer, who still does 40 to 50 days of interpretation a year, mainly for UN agencies. “There could be poor sound quality, or a speaker with an accent, or it might be a topic I don’t know much about. For instance, I wouldn’t interpret a fast speaker in the same way I would a slow one. It’s a different set of strategies. If there isn’t time to focus on each and every word that comes in you have to do a kind of intelligent sampling.” It may be that the flexible operation of the brain networks underpinning interpretation allows interpreters to optimise strategies for dealing with different types of speech. And different interpreters listening to the same material may use different strategies.

The results from the Geneva group also fit with a wider theme in neuroscience. When fMRI became widely available in the 1990s, researchers rushed to identify the brain areas involved in almost every conceivable behaviour (including, yes, sex: several researchers have scanned the brains of subjects experiencing an orgasm). But on their own those data didn’t prove terribly useful, partly because complex behaviours don’t tend to be controlled by individual brain areas. Now the emphasis has shifted to understanding how different areas interact. Neuroscientists have learned that when we consider a potential purchase, for example, a network of areas that includes the prefrontal cortex and insula helps us decide whether the price is right. Interplay between another set of brain areas, including the entorhinal cortex and the hippocampus, helps store our memories of routes between places.

This more sophisticated understanding has been made possible in part by improvements in scanning technology. In the case of the caudate, activity there can now be distinguished from that in other parts of the basal ganglia, the larger brain area within which it is located. The finer-grained scans have revealed that the caudate is often involved in networks that regulate cognition and action, a role that puts it at the heart of an extraordinarily diverse range of behaviours. As a team of British researchers noted in a 2008 review, studies have shown that the caudate helps control everything from “a rat’s decision to press a lever to a human’s decision about how much to trust a partner in a financial exchange”.

One of the review’s authors was John Parkinson of Bangor University in Wales. I ask him if he would have predicted that the caudate would be involved in simultaneous interpretation. He says that at first he wouldn’t have. “The caudate is involved in the intentionality of an action, in its goal-directedness. Not so much in carrying it out but in why you’re doing it.” Then he thought about what interpreters do. Computers translate by rote, often with risible results. Humans have to think about meaning and intent. “The interpreter must actually try to identify what the message is and translate that,” says Parkinson. He agrees that the involvement of the caudate makes sense.

Given that the Geneva research is based partly in a department tasked with training interpreters, it’s natural to wonder if their scientific findings might eventually find a direct practical application. Moser-Mercer and her colleagues are careful to avoid extravagant claims, and rule out suggestions that brain scanners might be used to assess progress or select candidates with an aptitude for interpreting. But even if studying simultaneous interpretation doesn’t lead to immediate applications, it has already extended our knowledge of the neural pathways that link thinking with doing, and in the future it may help neuroscientists gain an even deeper understanding of the networked brain. The Geneva team next wants to explore the idea that some high-level aspects of cognition have evolved from evolutionarily older and simpler behaviours. The brain, they suggest, builds its complex cognitive repertoire on a lower level of what they call “essential” processes, such as movement or feeding. “This would be a very efficient way to do things,” Moser-Mercer and her colleagues tell me in an email. “It makes sense for the brain to evolve by reusing or by adapting its processors for multiple tasks, and it makes sense to wire the cognitive components of control directly into the system that will be responsible for effecting the behaviour.” Simultaneous interpreting, with its close back-and-forth relationship between cognition and action, may be an ideal test bed for such thinking.

Pictures: World Economic Forum, download.net.pl

This article first appeared on Mosaic and is republished here under a Creative Commons licence. It was written by Geoff Watts.

