Monday, October 31, 2022

Unintentionally helping others steal your biometric data

The pieces of the puzzle

  1. Let's start by stating the obvious: people upload a lot of videos and images to social media showing their family vacations, new dance moves, and, yes, twerking. These files are publicly available and can be easily gathered. Remember the old warning about being very careful about what you share on the internet? The security and privacy concerns used to be about revealing where you live, who your family members are, and when you will be away from home. Thanks to advances in AI, we can add a new reason to slow down before posting so much about ourselves.
  2. Biometric-based authentication is the process of authenticating people based on something you are, i.e. a unique physical feature -- a fingerprint, iris, or retina, to name a few -- instead of something you know (a password) or something you have (a token). Applications include multifactor authentication and face recognition, which are used to unlock smartphones and to identify people in a crowd; a minimal sketch of how these factors combine follows this list.
  3. Deepfake is an evolution of the long tradition of inserting (or removing) people in pictures and videos using cropping and blue screens. Benign results have been seen in movies like Zelig and Forrest Gump; George Orwell's 1984 describes using the same trick for malign purposes, namely rewriting history. The difference is that, thanks to AI, deepfaking is automated to the point that it runs in real time. The classic example of the potential of this technology is a Tom Cruise deepfake video created by Belgian visual effects artist Chris Ume:

    It did not take long for malicious individuals to apply deepfakes to celebrity porn videos, fake news, hoaxes, and financial fraud. What about average people? They are not famous politicians, singers, or athletes; can they shrug it off with "this does not affect me; I am too small a target for anyone to take an interest in me," as they have done many times before, or should they be worried? The reality is that

    • Attackers are always looking for opportunities, and will strike at the low-hanging fruit.
    • The cost of the resources required to create a deepfake has dropped dramatically in the last few years.
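
To make the factor distinction in item 2 concrete, here is a minimal sketch in Python of combining something you know with something you are. The match_face scoring function and the 0.6 threshold are placeholders made up for illustration; real systems rely on dedicated biometric engines and liveness detection, but the shape of the check is the same.

    import hashlib
    import hmac

    # Something you know: a password, stored only as a salted hash.
    SALT = b"example-salt"
    STORED_HASH = hashlib.pbkdf2_hmac("sha256", b"correct horse", SALT, 100_000)

    def check_password(candidate: str) -> bool:
        candidate_hash = hashlib.pbkdf2_hmac("sha256", candidate.encode(), SALT, 100_000)
        return hmac.compare_digest(candidate_hash, STORED_HASH)

    # Something you are: similarity between an enrolled biometric template and
    # a fresh capture, both reduced to feature vectors. Toy scoring function.
    def match_face(enrolled: list[float], capture: list[float]) -> float:
        diffs = [abs(a - b) for a, b in zip(enrolled, capture)]
        if not diffs:
            return 0.0
        return max(0.0, 1.0 - sum(diffs) / len(diffs))

    def authenticate(password: str, enrolled: list[float], capture: list[float]) -> bool:
        # Multifactor: both the knowledge factor and the biometric factor must pass.
        return check_password(password) and match_face(enrolled, capture) > 0.6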

Let's have some fun

How can we combine all of these pieces? In 2007 (yes, time flies), Microsoft identified the following as the most popular types of biometric authentication devices at the time:

  • Fingerprint scanners
  • Facial pattern recognition devices
  • Hand geometry recognition devices
  • Iris scan identification devices
  • Retinal scan identification devices

Nowadays we can do all of that using just a camera. Let's consider a few applications that are possible today:

  • Videos and pictures collected from your social media provide enough information about your face to unlock your phone; a rough sketch of how little it takes appears after this list.
  • Inserting you into the CCTV footage of a riot is just a matter of being able to access that footage and change it. The only limiting factor here is bypassing tamper detection, which is not as common as you are led to believe. Granted, we are not at the Ghost in the Shell level, where video streams were being tampered with in real time at the camera itself, but there is enough knowledge out there to do some damage right now.
  • Back to those high-quality videos found on social media: they are (not may be) good enough to collect your fingerprints or ear shape. The latter has been successfully used to identify people at riots even while they were wearing masks.
  • Saving the best for last, imagine someone harvesting your videos for images and voice samples and then using a deepfake to hold a web conference with your children's school or your doctor. I will leave it to your imagination to ponder the consequences of that. Before you say anything, the Tom Cruise video I mentioned earlier is already old by Moore's Law standards.
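
To give a feeling for how low the bar already is for the first item above, here is a rough sketch using the open source face_recognition Python library (built on dlib) to check whether a face lifted from public photos matches a new capture. The file names are placeholders; the point is that a handful of vacation pictures is plenty of enrollment material.

    import face_recognition  # open source library built on top of dlib

    # Photos scraped from a public social media profile (placeholder file names).
    public_photos = ["vacation.jpg", "birthday.jpg", "concert.jpg"]

    # Build a 128-dimensional face encoding from each public photo.
    known_encodings = []
    for path in public_photos:
        image = face_recognition.load_image_file(path)
        encodings = face_recognition.face_encodings(image)
        if encodings:
            known_encodings.append(encodings[0])  # assume one face per photo

    # A new capture to match against the scraped profile.
    probe = face_recognition.load_image_file("new_capture.jpg")
    probe_encodings = face_recognition.face_encodings(probe)

    if known_encodings and probe_encodings:
        # compare_faces returns one boolean per known encoding.
        matches = face_recognition.compare_faces(known_encodings, probe_encodings[0])
        print("Same person?", any(matches))
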
We could go over an example of how to do that, but that is not the point of this article. If you thought your identity, and as a result your privacy, was already at risk, I think we have reached a whole new level.

What can be done to minimize exposing biometric data?

Think before posting! This rule has not changed. Some argue that millennials and the Gen Z crowd are the biggest offenders, but that is just a matter of training. If you have to post, be mindful of what is being exposed. Or cut down the quality of the pictures a bit so the bad guys do not have a nice, clean image to start with. As for the images and videos you have already posted: once something is out on the internet, there is no taking it back.
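
For the "cut down the quality" advice, here is a minimal sketch using the Pillow library; the file names and the reduction factor are just examples. Downscaling and recompressing strips the fine detail, such as skin ridges and iris texture, that biometric extraction depends on.

    from PIL import Image  # Pillow

    # Placeholder file name; use whatever you are about to upload.
    original = Image.open("beach_day_original.jpg")

    # Shrink to a quarter of the original dimensions and recompress aggressively.
    smaller = original.resize((original.width // 4, original.height // 4), Image.LANCZOS)
    smaller.save("beach_day_to_post.jpg", quality=60)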

Make protecting your privacy a priority in your life. If people are going to steal your data, make them work for it.

Further reading

Trend Micro published a great paper on the risks of exposed biometric data.