Feds: Accuracy of Face Recognition Software Skyrockets
Federal investigators interested in gauging the usefulness of face recognition software in national security applications say that, after months of testing, its accuracy has improved by a factor of 20 over the last five years.
At least, those are the recently reported results of the latest Face Recognition Vendor Test (FRVT), conducted over the past year by the National Institute of Standards and Technology (NIST) to determine which algorithms did the best job of verifying a person's identity by examining his or her face.
Jonathon Phillips, an electrical engineer who directed the test at NIST, explained to LiveScience that a similar FRVT conducted in 2002 showed that the best algorithms failed to make a correct comparison 20 percent of the time. But the rate of false rejections in the latest test was only 1 percent.
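The figure Phillips cites is a false rejection rate: the fraction of genuine comparisons (two photos of the same person) that an algorithm wrongly declares a non-match. A minimal sketch of how such a rate is tallied, with the similarity scores and threshold below invented purely for illustration:

```python
# Illustrative false-rejection-rate tally; the scores and threshold
# are made up for this example, not taken from FRVT data.
genuine_scores = [0.91, 0.87, 0.45, 0.93, 0.88]  # same-person comparison scores
threshold = 0.80                                  # decision threshold

false_rejects = sum(score < threshold for score in genuine_scores)
frr = false_rejects / len(genuine_scores)
print(f"False rejection rate: {frr:.0%}")  # 1 of 5 genuine pairs rejected -> 20%
```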
Those were the results of the best contenders, he cautioned; some of the 13 groups that entered performed no better than in 2002. The contestants submitted Windows or Linux algorithms, all of which NIST ran on the same computer against a large government-supplied database of pictures spanning various pixel resolutions and lighting angles. Among them were 36,000 photos taken by the U.S. State Department of people applying for non-immigrant visas at various U.S. consulates in Mexico.
Some of the algorithms took hundreds of hours to find matches in the database, but Phillips noted that speed is less of an issue in real-world biometric security applications. Instead of comparing every face in the database with every other face, security verification usually involves a single comparison: is the person standing in front of the camera claiming to be Mr. Smith really the same person as the Mr. Smith whose picture is on file in the computer?
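In software terms, that 1:1 verification check compares the live capture against only the single photo enrolled under the claimed identity, rather than searching the whole gallery. A minimal sketch of the difference, assuming a hypothetical compare() matcher that returns a similarity score between 0 and 1 and a tuned acceptance threshold:

```python
# Sketch of 1:1 verification versus 1:N identification.
# compare(), THRESHOLD and the gallery layout are illustrative assumptions,
# not part of any NIST-specified interface.

THRESHOLD = 0.80  # acceptance threshold, tuned on held-out data

def compare(probe_image, enrolled_image):
    """Hypothetical matcher: returns a similarity score in [0, 1]."""
    raise NotImplementedError  # stand-in for a real face recognition algorithm

def verify(probe_image, claimed_id, gallery):
    """1:1 check: compare the capture only against the claimed identity's photo."""
    return compare(probe_image, gallery[claimed_id]) >= THRESHOLD

def identify(probe_image, gallery):
    """1:N search: compare against every enrolled photo (far more work)."""
    best_id, best_score = max(
        ((pid, compare(probe_image, img)) for pid, img in gallery.items()),
        key=lambda pair: pair[1],
    )
    return best_id if best_score >= THRESHOLD else None
```

The verification case is why Phillips downplays raw speed: a deployed system answers one yes-or-no question per attempt, not millions of cross-comparisons.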
“We fed the algorithms lots of data to get a statistically meaningful answer,” he said. “Our goal was to encourage improvement in the technology, and provide decision makers with numbers that would let them make an educated assessment of the technology itself.”
With random lighting on each face (as opposed to tightly controlled studio lighting), the rejection rate rose to about 12 percent for the best algorithms, he noted, though that was still much better than the 20 percent rejection rate recorded in 2002.
The best overall results, Phillips said, came from Neven Vision (since acquired by Google Inc.), Viisage Technology (since acquired by L-1 Identity Solutions of Stamford, Connecticut) and Cognitec Systems of Germany. The University of Houston and China’s Tsinghua University also did well, he added.
Phillips noted that all the pictures were “full frontal,” meaning they were taken with the person facing the camera. No tests were made with random camera angles, but he noted that biometric security systems require the person to face the camera anyway. Uncontrolled camera angles might be the subject of a future FRVT, but none is currently scheduled.
