Feds: Accuracy of Face Recognition Software Skyrockets

After months of testing aimed at gauging its usefulness in national security applications, federal investigators say face recognition software has improved by a factor of 20 in the last five years.

At least, those are the recently reported results of the latest Face Recognition Vendor Test (FRVT), conducted over the past year by the National Institute of Standards and Technology (NIST) to determine which algorithms do the best job of verifying a person's identity by examining his or her face.

Jonathon Phillips, an electrical engineer who directed the test at NIST, explained to LiveScience that a similar FRVT conducted in 2002 showed that the best algorithms failed to make a correct comparison 20 percent of the time. But the rate of false rejections in the latest test was only 1 percent.
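The headline "factor of 20" follows directly from those two false-rejection rates; a quick sanity check (illustrative arithmetic only, not NIST's methodology):

```python
# The reported best-case false-rejection rates from the two tests.
frr_2002 = 0.20   # best algorithms in the 2002 FRVT
frr_latest = 0.01 # best algorithms in the latest FRVT

# Improvement factor: how many times lower the error rate is now.
improvement = frr_2002 / frr_latest
print(improvement)  # 20.0
```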

At least, he cautioned, those were the results of the best contenders; some of the 13 groups who entered had results no better than in 2002. The contestants submitted algorithms running on Windows or Linux, all of which NIST ran on the same computer against a large government-provided database of pictures with various pixel resolutions and lighting angles. These included 36,000 pictures taken by the U.S. State Department of people applying for non-immigrant visas at various U.S. consulates in Mexico.

Some of the algorithms took hundreds of hours to find matches in the database, but Phillips noted that speed is less of an issue in real-world biometric security applications. Instead of comparing every face in the database with every other face, security verification usually involves a single comparison: is the person standing in front of the camera claiming to be Mr. Smith really the same person as the Mr. Smith whose picture is on file in the computer?
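The contrast Phillips draws can be sketched in code. This is a hypothetical illustration, not NIST's test harness: the similarity score, threshold, and function names are all stand-ins, with toy feature vectors in place of real face images.

```python
from itertools import combinations

def similarity(face_a, face_b):
    # Stand-in for a real face-matching algorithm: faces here are plain
    # feature vectors, scored with a toy inverse-distance measure.
    return 1.0 / (1.0 + sum((a - b) ** 2 for a, b in zip(face_a, face_b)))

def verify(probe, enrolled, threshold=0.5):
    # Security verification: ONE comparison of the live image against
    # the picture on file for the claimed identity.
    return similarity(probe, enrolled) >= threshold

def evaluate_all_pairs(database):
    # The benchmark workload: every face compared with every other face,
    # which is why some entries took hundreds of hours on a large database.
    return {(i, j): similarity(database[i], database[j])
            for i, j in combinations(range(len(database)), 2)}

db = [(0.1, 0.2), (0.1, 0.21), (0.9, 0.8)]
print(verify((0.1, 0.2), db[1]))    # one comparison: True
print(len(evaluate_all_pairs(db)))  # 3 pairwise comparisons for 3 faces
```

For a database of N faces, the all-pairs evaluation makes N·(N−1)/2 comparisons, while verification makes exactly one, which is why benchmark runtime matters far less in deployment.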

“We fed the algorithms lots of data to get a statistically meaningful answer,” he said. “Our goal was to encourage improvement in the technology, and provide decision makers with numbers that would let them make an educated assessment of the technology itself.”

With random lighting on each face (as opposed to tightly controlled studio lighting), the rejection rate rose to about 12 percent for the best algorithms, he noted, although that was still much better than the 2002 result, when the rejection rate was 20 percent.

The best overall results, Phillips said, were by Neven Vision (since acquired by Google Inc.), Viisage Technology (since acquired by L-1 Identity Solutions of Stamford, CT), and Cognitec Systems of Germany. The University of Houston and China’s Tsinghua University also did well, he added.

Phillips noted that all the pictures were “full frontal,” meaning they were taken with the person facing the camera. No tests were made with random camera angles, but he noted that biometric security systems require the person to face the camera anyway. Uncontrolled camera angles might be the subject of a future FRVT, but none is currently scheduled.