
Researchers demonstrate that fingerprint readers can be fooled by master fingerprints




Master fingerprints are real or synthetic fingerprints that coincidentally match a large number of real fingerprints. In this work, a team of researchers from New York University (NYU) and Michigan State University (MSU) created master fingerprint images using a machine learning method known as Latent Variable Evolution. The resulting synthetic fingerprints, called "DeepMasterPrints", were effective around 20% of the time and make it possible to reproduce fingerprints accepted by identification systems, which can then be exploited through an attack similar to a "dictionary attack".
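The scenario resembles a dictionary attack against passwords: instead of guessing one victim's fingerprint, the attacker tries a small library of master prints against many enrolled users. The sketch below illustrates that idea only; the matcher function, threshold, and data structures are assumptions for illustration, not the systems evaluated by the researchers.

```python
# Minimal sketch of a "dictionary attack" with a library of master prints.
# The matcher API (match(probe, template) -> similarity score) and the
# acceptance threshold are hypothetical placeholders.

from typing import Callable, Sequence

def dictionary_attack(
    master_prints: Sequence[object],          # library of synthetic master-print images
    enrolled_templates: Sequence[object],     # templates stored by the identification system
    match: Callable[[object, object], float], # similarity score returned by the matcher
    threshold: float,                         # system's acceptance threshold
) -> int:
    """Count how many enrolled users are impersonated by at least one master print."""
    broken = 0
    for template in enrolled_templates:
        if any(match(mp, template) >= threshold for mp in master_prints):
            broken += 1
    return broken
```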

In the paper, presented at the biometrics security conference BTAS 2018, the researchers explain that DeepMasterPrints builds on two observations. First, for ergonomic reasons, fingerprint sensors are often very small (as in smartphones), so they work by reading only a portion of the user's fingerprint. Because identifying someone from a small portion of a fingerprint is less reliable than reading a full fingerprint, the probability that a portion of one finger coincides with a portion of a different person's finger is relatively high. Researcher Aditi Roy took this into account and introduced the concept of master fingerprints: a set of real or synthetic fingerprints that coincidentally match a large number of other fingerprints.
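A rough back-of-the-envelope calculation shows why partial readings help the attacker: if each enrolled finger is stored as several partial templates and a probe is accepted when it matches any of them, the chance of a false match grows quickly. The numbers below are assumptions chosen only to illustrate the effect, not figures from the paper.

```python
# If a single partial comparison has false-match probability p, and each
# enrolled finger stores k partial templates, a probe is accepted when it
# matches ANY of them, so the effective false-match rate is 1 - (1 - p)**k.

def effective_fmr(p: float, k: int) -> float:
    """False-match probability when a probe is compared against k partial templates."""
    return 1.0 - (1.0 - p) ** k

# Assumed numbers: 0.1% per comparison, 12 partial templates per finger.
print(effective_fmr(0.001, 12))   # ~0.0119, i.e. roughly a 1.2% chance of a false match
```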

The second observation is that some fingerprint features are common across many fingerprints. This means that a fake print packed with these common features has a better chance of matching other fingerprints.
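As a toy illustration of this second point (not the researchers' method), think of each fingerprint as a set of features: a fake print assembled from the most frequent features in a population will overlap with many real prints.

```python
# Toy example with made-up feature names: build a fake print from the most
# common features and count how many real prints it overlaps with.

from collections import Counter

population = [
    {"whorl", "ridge_ending_A", "bifurcation_B"},
    {"whorl", "ridge_ending_C", "bifurcation_B"},
    {"loop",  "ridge_ending_A", "bifurcation_B"},
]

feature_counts = Counter(f for real in population for f in real)
fake_print = {feature for feature, _ in feature_counts.most_common(2)}

# A real print "matches" here if it shares at least 2 features with the fake one.
matches = sum(len(fake_print & real) >= 2 for real in population)
print(fake_print, matches)
```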

From there, the researchers used a type of artificial intelligence algorithm called a Generative Adversarial Network (GAN) to artificially create new fingerprints that match as many real fingerprints as possible. In this way they managed to build a library of artificial fingerprints that act as master keys for a given biometric identification system. Moreover, the attack does not require a sample of fingerprints belonging to a particular individual; it can be run against anonymous users and still have some chance of success.
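Conceptually, the approach combines a trained GAN generator with a black-box search over its latent space for an image that matches as many enrolled fingerprints as possible. The sketch below shows that structure in PyTorch with a dummy generator, a placeholder scoring function, and a naive random search standing in for the evolutionary optimizer described in the paper; all names, sizes, and the objective are assumptions, not the authors' code.

```python
# Sketch only: a GAN generator maps a latent vector to a fingerprint-like
# image, and a black-box search looks for a latent vector whose image
# matches many enrolled templates. Generator and match_count are placeholders.

import torch
import torch.nn as nn

LATENT_DIM = 100

class Generator(nn.Module):
    """Tiny stand-in for a trained GAN generator producing 64x64 grayscale images."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(LATENT_DIM, 256), nn.ReLU(),
            nn.Linear(256, 64 * 64), nn.Tanh(),
        )

    def forward(self, z):
        return self.net(z).view(-1, 1, 64, 64)

def match_count(image: torch.Tensor) -> int:
    """Placeholder for 'how many enrolled fingerprints does this image match?'."""
    return int((image.mean() > 0).item())  # dummy objective for the sketch

generator = Generator()  # in the real attack this would be trained on fingerprint images

# Crude random search over the latent space; the paper uses an evolutionary
# strategy, but any black-box optimizer of match_count plays the same role.
best_z, best_score = None, -1
for _ in range(200):
    z = torch.randn(1, LATENT_DIM)
    score = match_count(generator(z))
    if score > best_score:
        best_z, best_score = z, score
```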

Although it would be very difficult for an attacker to use something like DeepMasterPrints in practice, since optimizing the artificial intelligence for a particular system takes a lot of work and every system is different, it is an example of what may become possible over time and something worth knowing about. Something similar was seen at this year's Black Hat security conference, where IBM researchers demonstrated that it is possible to develop malware that uses artificial intelligence to carry out attacks based on facial recognition.
