
Published on: Aug 16, 2017

Recent research has shown that fingerprint examination can produce erroneous results. A report from the National Academy of Sciences found that results are not necessarily repeatable and that even experienced examiners may disagree with their own past conclusions when they re-examine the same prints later. Errors of this kind have led to innocent people being accused while the actual perpetrators remain free. Scientists are therefore striving to reduce this human error, and this week researchers from the National Institute of Standards and Technology (NIST) and Michigan State University announced an algorithm that automates a key step in the fingerprint analysis process. Elham Tabassi, a computer engineer at NIST, said that reducing human subjectivity makes fingerprint analysis more reliable and efficient.

If all fingerprints were of high quality, matching them would be straightforward: computers can easily match two sets of rolled prints collected under controlled conditions. But as Anil Jain, a co-author of the study and a computer scientist at Michigan State University, pointed out, no one at a crime scene is directing the perpetrator on how to leave good prints. The latent prints left behind are often partial, distorted, or smudged, and they frequently sit on a busy background such as a $100 bill, which makes separating the fingerprint from the background difficult. Currently, judging the quality of a latent print is subjective, and different examiners can come to different conclusions. Automating this step would let examiners process evidence more efficiently and help solve crimes more quickly; a simple illustration of what an automated quality score might look like follows.
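To make the idea of "automated quality assessment" concrete, here is a minimal sketch of a numeric quality score for a latent print image. This is not the NIST/Michigan State algorithm described in the article; the function name, block size, and contrast threshold are all hypothetical choices made purely for illustration. The point is only that a repeatable, computed score can replace a subjective judgment about whether a print has enough usable ridge detail.

```python
import numpy as np

def latent_quality_score(image: np.ndarray, block_size: int = 16) -> float:
    """Toy quality proxy for a grayscale fingerprint image (values 0-255).

    Splits the image into blocks and measures how many blocks show the
    strong local contrast typical of clear ridge/valley structure.
    Hypothetical illustration only, not the published algorithm.
    """
    img = image.astype(float)
    h, w = img.shape
    contrasts = []
    for y in range(0, h - block_size + 1, block_size):
        for x in range(0, w - block_size + 1, block_size):
            block = img[y:y + block_size, x:x + block_size]
            contrasts.append(block.std())  # contrast as a ridge-clarity proxy
    # Fraction of blocks whose contrast clears a (hypothetical) threshold.
    return float((np.array(contrasts) > 25.0).mean())

# Usage sketch: a higher score suggests more usable ridge detail.
# score = latent_quality_score(latent_image)
# if score < 0.4:
#     print("Low-quality latent; flag for closer human review.")
```

Because the score is computed the same way every time, two examiners (or the same examiner on two different days) would get the same answer for the same image, which is exactly the repeatability the article says current subjective assessments lack.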