#SootinClaimon.Com : Thanks to the source: The Nation newspaper.
In 2018, John Rael, a volunteer track coach in Taos, N.M., was on trial for allegedly raping a 14-year-old girl when his lawyer made an unusual request.
He wanted the judge to admit evidence from “EyeDetect,” a lie-detection test based on eye movements that Rael had passed.
The judge agreed, and five of the 12 jurors wound up voting not to convict. A mistrial was declared.
EyeDetect is the product of the Utah company Converus. “Imagine if you could exonerate the innocent and identify the liars . . . just by looking into their eyes,” the company’s YouTube channel promises. “Well, now you can!” Its chief executive, Todd Mickelsen, says the company has built a better truth-detection mousetrap; he believes eye movements betray deception far better than the much older and largely discredited polygraph.
Its critics, however, say EyeDetect is just the polygraph in more algorithmic clothing. The machine is fundamentally unable to deliver on its claims, they argue, because human truth-telling is too subtle to be captured by any data set.
And they worry that relying on it can lead to tragic outcomes, like punishing the innocent or providing a cloak for the guilty.
EyeDetect raises a question that dates all the way back to the Garden of Eden: Are humans so wired to tell the truth that we’ll give ourselves away when we don’t?
And it raises a more 21st-century query: Can modern technology come up with the tools to detect those tells?
In an EyeDetect test, the subject sits in front of a monitor fitted with a digital camera and, as with the polygraph, is first asked generic true-false questions like “Have you ever hurt anybody?” to establish a baseline. Then come the specific questions. If the subject’s physical responses are more demonstrative there, they are presumed to be lying; less demonstrative, telling the truth. The exact number of flubbed questions that constitutes a failure is governed by an algorithm; the computer spits out a yes-or-no verdict based on an adjustable formula.
Where the polygraph measures blood pressure, breathing and sweat to determine the flubbing, EyeDetect looks at factors like pupil dilation and the rapidity of eye movement. “A polygraph is emotional,” Mickelsen said. “EyeDetect is cognitively based.” He explains the reason the company believes eye movements would be affected: “You have to think harder to lie than to tell the truth.”
EyeDetect plays into a form of techno-aspirational thinking. Our Web browser already pitches us a vacation we swear has only lived in our minds while dating apps serve up a romantic partner dreamed up in our hearts. Surely an algorithm can also peer into our soul?
But experts say such logic may not have much basis in science.
“People have been trying to make these predictions for a long time,” said Leonard Saxe, a psychologist at Brandeis University who has conducted some of the leading research in the field of truth-detection. “But the science has not progressed much in 100 years.”
Like most renowned experts, he has not reviewed EyeDetect’s research specifically. But, he says, “I don’t know of any evidence that eye movements are linked to deception.”
When it comes to the polygraph, experts have a long history of declaring failure.
The machine, which celebrates its centennial this year, continues to be used in areas like police interrogations, government security-clearance investigations and sex-offender monitoring. The market is valued at as much as $2 billion, powered by the many federal and local offices that use it for hiring purposes.
Yet the American Psychological Association takes an unequivocal position. “Most psychologists agree that there is little evidence that polygraph tests can accurately detect lies,” it declares on its website. Good liars, after all, can suppress their tics, while nervous truth-tellers might send the machine berserk.
A 1988 federal law bans private employers from administering polygraphs, though with loopholes. Most states don’t accept them as evidence in court (New Mexico is famously looser), and a 1998 Supreme Court ruling found that federal criminal defendants have no constitutional right to introduce them.
If it turns out to be more accurate than a polygraph, EyeDetect could conjure a number of useful consequences – and a few dystopian ones. What grisliness awaits if anyone could know whether you were telling the truth just by looking at you? That lie to spare your Aunt Lily’s feelings at Christmas would be out the window; so would being a teenager.
If it proves hollow, though, an entirely different danger lurks: With its veneer of authority, many legal experts worry, it could lead law enforcement, private employers, government agencies and even some courts even further down the wrong path than the polygraph.
“It’s the imprimatur that’s the issue: we tend to believe that where there’s science involved it’s reliable,” said Loyola Marymount University law professor Laurie Levenson, who has studied the issue.
She said she was concerned that people wouldn’t get clearances for jobs or would otherwise be held accountable for things they did not do because of false positives. She noted it also could help the guilty get away.
There are historical reasons for skepticism about any new truth-telling tech. Like diet sweeteners to the soft-drink industry, such innovations come along in the legal world at regular intervals. But they often fall short of their promise. About 15 years ago, the functional MRI, which posited that blood flow to the brain could be the key to truth-detection, enjoyed a period of buzz. But the device largely did not meet scientific standards for broad usage, and the procedure’s cost and intensiveness further inhibited wide-scale adoption.
The P300 guilty-knowledge method championed by the longtime Northwestern professor Peter Rosenfeld, who asserted that the future of truth-detection lay in brain waves, gained some enthusiasm from the scientific community; it was the subject of several dozen outside academic articles, a number of them with positive results, and has won the tentative support of Henry Greely, a Stanford Law professor and one of the leading experts on tech and the law. Among other advantages, it involves an EEG that is fairly cheap and easy to use.
Still, Rosenfeld died last winter with the method not in widespread use. The work is continued by his lab, where a group of graduate students has carried on the P300 research, and is championed in the field by people like John Meixner, an assistant U.S. attorney in Michigan and a protege of Rosenfeld’s.
On a recent afternoon, Mickelsen sat in his office in the tech corridor of Lehi, Utah, and, over Zoom, coolly screen-shared a series of graphs and charts to make the case why EyeDetect is different from failed past technologies.
EyeDetect’s accuracy rate is determined by a simulated-crime interrogation. A group of “innocent” and “guilty” subjects are told whether or not to commit a mock petty-cash theft in a manufactured environment and are then administered the ocular test. The rate at which the machine correctly predicts each person’s truth-telling status – Converus researchers already know the right answer for each subject – is between 83 and 87 percent, the company says.
That’s about the same as a polygraph tends to achieve in such tests, though the polygraph can discard up to 10 percent of borderline results as “inconclusive,” while EyeDetect renders a verdict on every test, making its effective accuracy higher. Mickelsen also says the system is preferable to a polygraph because, by fully automating the test, it removes the possibility of human bias.
The man at EyeDetect’s scientific core is John Kircher, a now-retired University of Utah professor who has consulted for the CIA and had his lab funded by the Defense Department. Kircher had been researching and writing software for lie-detection technologies for decades when, in the early 2000s, he came across a University of New Hampshire professor who was researching how our eye patterns change during reading.
“Suddenly it hit me: all the software I had developed for 30 years could be applied to this problem,” Kircher said. He wedded the two and, for much of the past two decades, has been perfecting EyeDetect, now as Converus’s chief scientist.
Kircher says the software for his eye tracker captures some 350,000 eye-movement measurements – including “fixations,” the milliseconds-long pauses between words – over a 25-minute test; four metrics are taken every second. (Converus also has the EyeDetect+, which adds a computer-administered variation on a traditional polygraph.)
EyeDetect has won its supporters in the field. Law-enforcement customers laud the system as smoother than a traditional polygraph.
“People will come in nervous because they’re expecting what they see on TV, where you’re hooked up to this machine and sweating and it just seems really invasive,” said Lt. Josh Hardee of the Wyoming Highway Patrol. “This is just clean and quick.”
Hardee’s department has used EyeDetect to screen more than 150 prospective job candidates in the last two years. His department and others pay around $5,000 for the EyeDetect system – which consists of a high-definition camera, a head-rest and software that generates the questions and records and scores the responses – and then $80 to Converus to score each individual test. (Staff must also be trained to use the system.) The machine produces fewer false alarms, Hardee says, because anxiety plays less of a role.
Other public officials have also been persuaded. The Tucson, Ariz., fire department uses EyeDetect to screen employees. The machine is also put to use, Converus says, by law-enforcement or corrections departments in states including Idaho, New Hampshire, Washington, Utah, Ohio and Connecticut. Defense lawyers in the ongoing case of Jerrod Baum, accused of killing two teens in Utah, have petitioned the judge to allow EyeDetect. The jury in the Rael case appeared amenable too. (The defendant later pleaded guilty and avoided jail time; he was given four years’ probation.)
But many experts are not swayed by the enthusiasm. Saxe makes the point many scientists and academics do: even if eye movements are fundamentally different under different sets of circumstances, there’s no way to directly link them to lying. In fact, they could simply reflect the fact that the subject is taking a test.
“Fear of detection is not a measure of deception,” he said. At heart, the issue may come down to the 21st-century desire to automate and digitize a process – human emotion and motivation – that fundamentally resists the enterprise.
Stanford’s Greely says EyeDetect doesn’t dislodge his broader skepticism about truth-telling tech, either.
“I see no reason to believe that this works well, or, really, at all,” he said in an email. “Show me large, well-designed impartial studies and I’ll be interested.”
In a phone interview he noted that while it’s not impossible in principle for the body to undergo particular physiological changes in response to lies, the burden of proof lies heavily on the new technologies. The simulated-crime tests that EyeDetect relies on, he said, contain a fundamental flaw: people instructed to lie in a test situation may well react differently, and more demonstratively, than a criminal lying in the real world.
He also noted a lack of published research by people not affiliated with Kircher or his lab.
Kircher says the Defense Department is currently conducting a study of ocular technologies which he hopes will conclude by the summer. A DoD spokesman did not reply to a request for comment.
Not all outside experts are unmoved, however.
“All truth-detection methods are imperfect. But here’s the reason it’s worth relying – not over-relying, but relying – on them,” said Clark Freshman, a law professor at UC Hastings who specializes in lie detection, expressing optimism about the EyeDetect.
“People can do even worse than a coin-flip at telling whether someone is lying. So if you get better results with it – even if it’s just 70 or 80 percent accurate – than without it, and it’s generally free from bias, I don’t understand why you wouldn’t make it part of the picture.”
He said studies showed juries do not overly rely on these technologies, but instead include them as one factor among many.
There is also a particular abuse concern with EyeDetect. Without the human element, there may be less bias. But the device could also be intentionally set at algorithmic levels that would make it difficult to pass – at least with the polygraph there’s a transcript of human conversations. (Converus says that it trains customers on how to use the machine and, while it acknowledges that it allows them to set their own “base rate of guilt,” a spokesman also says that “if we observe a BRG set outside of a reasonable range, we make an inquiry.”)
Even if ocular technology can’t actually root out fibbers, there may be some value in how it could discourage them in the first place. In other words, EyeDetect may not need whiz-bang technology. It just needs to look high-tech. As Brad Bradley, the fire chief in Tucson, says in materials from Converus: “A guilty applicant, such as one with a drug history, looks at it and thinks, ‘I’m not going to pass. So, I’m not going to even apply.'”
Still, the odds of this moving from realms like hiring to mainstream courtrooms are slim. Plato said the eye is the window to the soul; he was silent on whether it was a ticket out of jail.
“This is going to be an uphill climb in almost any court, especially after the debacle that was the polygraph,” said Loyola Marymount’s Levenson.
Nor, in her opinion, does such technology deserve to scale the hill. “Truth-telling should be determined by people, not machines,” she said.
Published : November 16, 2021
By : The Washington Post