She looks as innocuous as Miss Marple, Agatha Christie’s famous detective.
But also like Miss Marple, Julia Hirschberg, a professor of computer science at Columbia University, may spell trouble for a lot of liars.
That’s because Dr. Hirschberg is teaching computers how to spot deception — programming them to parse people’s speech for patterns that gauge whether they are being honest.
For this sort of lie detection, there’s no need to strap anyone into a machine. The person’s speech provides all the cues — loudness, changes in pitch, pauses between words, ums and ahs, nervous laughs and dozens of other tiny signs that can suggest a lie.
Dr. Hirschberg is not the only researcher using algorithms to trawl our utterances for evidence of our inner lives. A small band of linguists, engineers and computer scientists, among others, are busy training computers to recognize hallmarks of what they call emotional speech — talk that reflects deception, anger, friendliness and even flirtation.
Programs that succeed at spotting these submerged emotions may someday have many practical uses: software that suggests when chief executives at public conferences may be straying from the truth; programs at call centers that alert operators to irate customers on the line; or software at computerized matchmaking services that adds descriptors like “friendly” to the usual ones like “single” and “female.”
The technology is becoming more accurate as labs share new building blocks, said Dan Jurafsky, a professor at Stanford whose research focuses on the understanding of language by both machines and humans. Recently, Dr. Jurafsky has been studying the language that people use in four-minute speed-dating sessions, analyzing it for qualities like friendliness and flirtatiousness. He is a winner of a MacArthur Foundation fellowship commonly called a “genius” award, and a co-author of the textbook “Speech and Language Processing.”
“The scientific goal is to understand how our emotions are reflected in our speech,” Dr. Jurafsky said. “The engineering goal is to build better systems that understand these emotions.”
The programs that these researchers are developing aren’t likely to be used as evidence in a court of law. After all, even the use of polygraphs is highly contentious. But the new programs are already doing better than people at some kinds of mind-reading.
Algorithms developed by Dr. Hirschberg and colleagues have been able to spot a liar 70 percent of the time in test situations, while people confronted with the same evidence had only 57 percent accuracy, Dr. Hirschberg said. The algorithms are based on an analysis of the ways people spoke in a research project when they lied or told the truth. In interviews, for example, the participants were asked to press one pedal when they were lying about an activity, and another pedal when telling the truth. Afterward, the recordings were analyzed for vocal features that might signal deception.
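The pipeline the article describes — record speech labeled truth or lie, extract vocal features, then classify new utterances — can be sketched with a simple nearest-centroid rule. Everything below (the feature choices, the numbers, the function names) is invented for illustration and is not Dr. Hirschberg’s actual model:

```python
def centroid(vectors):
    """Average the feature vectors of one class, dimension by dimension."""
    n = len(vectors)
    return [sum(v[i] for v in vectors) / n for i in range(len(vectors[0]))]

def distance(a, b):
    """Euclidean distance between two feature vectors."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

# Toy training data: [pause_seconds, pitch_change_hz, fillers_per_minute],
# labeled by which pedal the speaker pressed. The values are made up.
truthful = [[0.3, 5.0, 2.0], [0.4, 6.0, 1.5], [0.2, 4.0, 2.5]]
deceptive = [[0.8, 15.0, 6.0], [0.9, 12.0, 5.0], [0.7, 14.0, 7.0]]

c_truth, c_lie = centroid(truthful), centroid(deceptive)

def classify(features):
    """Label an utterance by whichever class centroid is nearer."""
    return "lie" if distance(features, c_lie) < distance(features, c_truth) else "truth"

print(classify([0.85, 13.0, 6.5]))  # → lie
print(classify([0.25, 4.5, 2.0]))   # → truth
```

Real systems replace the hand-made vectors with measurements from recordings and the centroid rule with a trained statistical classifier, but the shape of the approach is the same.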
For her continuing research, Dr. Hirschberg and two colleagues recently received a grant from the Air Force for nearly $1.5 million to develop algorithms to analyze English speakers and those who speak Arabic and Mandarin Chinese.
Shrikanth Narayanan, an engineering professor at the University of Southern California who also uses computer methods to analyze emotional speech, notes that some aspects of irate language are easy to spot. In marital counseling arguments, for instance, the word “you” is a lot more common than “I” when spouses blame each other for problems.
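The pronoun cue Dr. Narayanan describes can be approximated with a plain word count. This is a minimal sketch; the function names and the you-versus-I rule are illustrative assumptions, not his lab’s actual method:

```python
import re

def pronoun_counts(utterance):
    """Count second-person vs. first-person pronoun uses in an utterance."""
    words = re.findall(r"[a-z']+", utterance.lower())
    you = sum(w in ("you", "your", "you're") for w in words)
    i = sum(w in ("i", "my", "i'm") for w in words)
    return you, i

def sounds_blaming(utterance):
    """Heuristic: blame-laden talk leans on 'you' more than 'I'."""
    you, i = pronoun_counts(utterance)
    return you > i

print(sounds_blaming("You never listen and you always interrupt me"))  # → True
print(sounds_blaming("I feel like I haven't been heard lately"))       # → False
```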
But homing in on the finer signs of emotions is tougher. “We are constantly trying to calculate pitch very accurately” to capture minute variations, he said. His mathematical techniques use hundreds of cues from pitch, timing and intensity to distinguish between patterns of angry and non-angry speech.
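One way to quantify the pitch variation Dr. Narayanan mentions is with summary statistics over a pitch contour, a sequence of fundamental-frequency estimates in hertz. The contours and feature names below are hypothetical, meant only to show the kind of cues such systems compute:

```python
from statistics import mean, stdev

def pitch_features(contour_hz):
    """Summarize a pitch contour: overall level, spread, and range."""
    return {
        "mean": mean(contour_hz),
        "stdev": stdev(contour_hz),
        "range": max(contour_hz) - min(contour_hz),
    }

# Hypothetical contours: agitated speech often shows higher, more variable pitch.
calm = [110, 112, 108, 111, 109, 110]
agitated = [150, 190, 130, 210, 120, 200]

print(pitch_features(calm)["range"])      # → 4
print(pitch_features(agitated)["range"])  # → 90
```

A real system would feed hundreds of such measurements of pitch, timing and intensity into a trained model rather than reading off any single number.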
His lab has also found ways to use vocal cues to spot inebriation, though it hasn’t yet had luck in making its computers detect humor — a hard task for the machines, he said.