Fingerprint “Master Keys” Generated by a Neural Network Could Unlock Your Phone
Working with biometrics is always a balancing act.
With passwords, authentication is simple: either the input matches or it fails.
But when that "password" is part of someone's body, whether an iris, a face
scan, or a plain old fingerprint, mobile systems have to anticipate and allow
for a little wiggle room. After all, you wouldn't want your face scan to fail
because you got a pimple, or your fingerprint to be rejected just because you
touched the sensor at a slightly different angle each time.
A new attack, however, takes advantage of exactly that built-in flexibility by
generating fake "universal" fingerprints.
These synthetic fingerprints, which the researchers call “DeepMasterPrints,”
were created by feeding a neural network images of real fingerprints until it
could generate convincing prints of its own.
The generated prints were then scored using the same kind of matching
algorithms employed by the scanners on users' phones, and adjusted again and
again in small ways until they passed, even though they didn't genuinely match
anyone.
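For readers curious how that feedback loop might look in code, here is a
minimal sketch. It is not the researchers' implementation (their work pairs a
trained generative network with an evolutionary search over its latent space);
the generator, the matcher score, and the enrolled gallery below are all
hypothetical stand-ins.

```python
import numpy as np

# Hypothetical stand-ins: in the real work a trained generator produces
# fingerprint images from a latent vector, and a commercial matcher scores
# them against enrolled prints. Both are simulated here.
def generate_print(latent):
    return latent  # a real generator would map latent vector -> fingerprint image

def matcher_score(print_image, enrolled_gallery):
    # Placeholder score: average similarity against the enrolled gallery.
    return float(np.mean([np.dot(print_image, p) for p in enrolled_gallery]))

def evolve_master_print(enrolled_gallery, dims=64, iters=5000, step=0.1, seed=0):
    """Hill climbing over the generator's latent space: keep any random
    tweak that fools a larger share of the gallery."""
    rng = np.random.default_rng(seed)
    latent = rng.normal(size=dims)
    best = matcher_score(generate_print(latent), enrolled_gallery)
    for _ in range(iters):
        candidate = latent + step * rng.normal(size=dims)
        score = matcher_score(generate_print(candidate), enrolled_gallery)
        if score > best:  # accept only changes that score better
            latent, best = candidate, score
    return generate_print(latent), best
```

The specifics don't matter so much as the idea: the matcher's own score becomes
the attacker's optimization target.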
By repeating this process over a large data set, the team was able to produce
fingerprint images that share enough features with the prints of an average
person that, the researchers claim, scanners can readily be tricked into
returning a false positive. And this isn't a matter of matching just one
individual, either: the “DeepMasterPrints” are meant to work equally well
against any user.
How well, exactly? That depends on how demanding the scanner you're trying to
fool is. Every fingerprint scanner has to tolerate some rate of false
positives, cases where an unauthorized print is mistakenly accepted as
authorized. A very permissive scanner might accept 1% of non-matching real
fingerprints, but a “DeepMasterPrint” can fool that kind of scanner an alarming
77% of the time.
Stricter scanners that allow no more than 0.1% false positives are still
tricked by “DeepMasterPrints” more than 22% of the time, and even ones that
reject all but 0.01% of impostor prints can still occasionally fall for a
“DeepMasterPrint.”
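To put those per-attempt rates in perspective, here is a quick
back-of-the-envelope calculation (my own illustration, not a figure from the
research): if one master print fools a scanner with probability p and an
attacker gets k independent tries with different master prints, the chance of
at least one success is 1 - (1 - p)^k.

```python
def unlock_probability(p_per_attempt: float, attempts: int) -> float:
    """Chance that at least one of `attempts` independent tries succeeds,
    given each try fools the scanner with probability p_per_attempt."""
    return 1.0 - (1.0 - p_per_attempt) ** attempts

# Illustrative only; independence across attempts is an assumption, and
# "5" is just a typical attempt limit before a phone demands a PIN.
print(unlock_probability(0.77, 5))  # permissive scanner: ~0.999
print(unlock_probability(0.22, 5))  # stricter scanner:   ~0.71
```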
None of this is quite enough to make us swear off biometrics entirely and go
back to PINs and passwords, but it is certainly a sobering look at just how
much security we're giving up for the sake of convenience. All we can do is
hope that future devices are built with attacks like this in mind and offer
stronger false-positive rejection.
Mathew Anderson is a Microsoft Office expert who has been working in the tech
industry since 2002. He has written technical blogs, manuals, white papers, and
reviews for many websites, including office.com/setup.