Avoiding a biometric dystopia
In part one of our two-part series, we explored how biometric authentication methods are being defeated. In this second part, we'll look at how the manipulation of biometrics can alter society, and what can be done to avoid a biometric dystopia.
Biometric authentication secures access to most consumer phones, many laptops and PCs, and even physical access to homes and offices. Many of the consequences of defeating biometric authentication are no different than those of defeating other forms of authentication like a password or house key: stolen property, account takeover, fraudulent purchases, identity theft, violation of privacy, etc.
To neutralize the threat of a stolen password or key, we can simply change or replace it at will, but the same can't be said for our biometry. Modifying our biometry is impractical at best and impossible at worst, so most of it is effectively static. This permanence is what makes biometric data so inherently ours, but it's also what makes that data so valuable and dangerous: once it's known, it can be replayed, shared and manipulated.
Bending biometrics
In courts of law and forensic laboratories around the world, biometrics carry the final authority for identifying people. Whether the task is identifying someone captured in video evidence or identifying human remains, society depends on biometrics when it matters most. If biometric identification can't be trusted, attribution can't be made definitively: criminals will evade justice, innocent people will be wrongly convicted, and crimes will be easier to mask.
Such an outcome may seem unrealistic, but advancements in AI have already brought this reality-bending risk to our collective doorstep. A burgeoning technology known as generative adversarial networks (GANs) uses machine learning to create synthetic data derived from a source dataset. In the context of biometrics, that source could be a set of authentic selfie photos. A GAN can learn every trait of the pictured individual, including their biometry, and build a model capable of generating convincing but inauthentic photos of that person in manipulated or entirely fictitious settings.
GANs are the technology behind so-called deepfakes, and they can be applied not just to photos but also to video, audio and any other digital medium. They can be used to impersonate others, manipulate digital media, and extract information from datasets in ways that were never before possible or practical. This technology is still in its infancy.
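For readers who want a concrete picture of how this works, below is a heavily simplified sketch of the adversarial training loop that gives GANs their name, written here in PyTorch with arbitrary, made-up layer sizes. Real face-synthesis models are vastly larger, but the core idea is the same: a generator learns to produce fakes that a discriminator can no longer tell apart from real data.

```python
# Minimal, illustrative GAN training step (not a production face-synthesis model).
import torch
import torch.nn as nn

latent_dim, data_dim = 64, 784  # assumed sizes, e.g. flattened 28x28 images

# Generator: maps random noise to a synthetic sample.
G = nn.Sequential(nn.Linear(latent_dim, 256), nn.ReLU(),
                  nn.Linear(256, data_dim), nn.Tanh())
# Discriminator: scores how "real" a sample looks (outputs a logit).
D = nn.Sequential(nn.Linear(data_dim, 256), nn.LeakyReLU(0.2),
                  nn.Linear(256, 1))

opt_g = torch.optim.Adam(G.parameters(), lr=2e-4)
opt_d = torch.optim.Adam(D.parameters(), lr=2e-4)
loss_fn = nn.BCEWithLogitsLoss()

def train_step(real_batch):
    batch = real_batch.size(0)
    fake = G(torch.randn(batch, latent_dim))

    # 1) Train the discriminator to separate real samples from generated ones.
    opt_d.zero_grad()
    d_loss = (loss_fn(D(real_batch), torch.ones(batch, 1)) +
              loss_fn(D(fake.detach()), torch.zeros(batch, 1)))
    d_loss.backward()
    opt_d.step()

    # 2) Train the generator to fool the discriminator.
    opt_g.zero_grad()
    g_loss = loss_fn(D(fake), torch.ones(batch, 1))
    g_loss.backward()
    opt_g.step()
    return d_loss.item(), g_loss.item()
```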
In our world of real-time social media, deepfakes are especially dangerous. The capability to manipulate and fabricate reality is the dream of propagandists, internet trolls, adversaries, bullies and dictators everywhere. Deepfakes can be used to manipulate citizens, frame political opponents, and incite violence and division. The ability to manipulate and simulate biometry is a threat to civil society everywhere.
All a GAN needs to wreak this sort of havoc is source data, and that data is already being uploaded online every second, captured by cameras on every street corner, and leaked every day on the internet.
Steps to prevent a biometric dystopia
To prevent the corruption of biometrics from undermining our security and privacy, societies must take proactive and defensive steps to protect the integrity of biometric authentication before it’s too late.
Collecting and storing data
First and foremost, we must regulate and protect sources of biometric data, and hold those who collect and store that data accountable. This must also include broadening our understanding of what constitutes a source of biometric data, since emerging technologies make it possible to extract biometry in new and unforeseen ways.
In jaw-dropping research unveiled recently, researchers at the Massachusetts Institute of Technology demonstrated that AI can reconstruct a person's face with surprising accuracy purely from an audio recording of their voice. Such advancements further cement the need for secure collection and storage of biometric data.
Securing data
When the collection and storage of biometric data is warranted, that data must be adequately secured using standardized and well-established practices. This means end-to-end encryption of data in transit and strong encryption of data at rest, decentralized storage rather than centralized cloud repositories, and the ability to purge all instances of that data at will.
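As one illustration of the at-rest piece, the sketch below encrypts a biometric template with AES-GCM using Python's widely available cryptography package. It deliberately omits the hard parts, such as key management, rotation and verified deletion, which in practice belong in an HSM or key management service.

```python
# Minimal sketch: encrypting a biometric template at rest with AES-GCM.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

def encrypt_template(key: bytes, template: bytes, user_id: str) -> bytes:
    aesgcm = AESGCM(key)
    nonce = os.urandom(12)                       # unique nonce per encryption
    ciphertext = aesgcm.encrypt(nonce, template, user_id.encode())
    return nonce + ciphertext                    # stored alongside the record

def decrypt_template(key: bytes, blob: bytes, user_id: str) -> bytes:
    aesgcm = AESGCM(key)
    nonce, ciphertext = blob[:12], blob[12:]
    return aesgcm.decrypt(nonce, ciphertext, user_id.encode())

key = AESGCM.generate_key(bit_length=256)        # in practice, held in an HSM or KMS
```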
Additional authentication factors
Multifactor authentication must be employed even when biometrics are used. Biometric authentication is not a panacea, and pairing it with additional authentication factors removes biometrics as a single point of failure.
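A minimal sketch of that pairing is shown below, combining a biometric check with a time-based one-time password via the pyotp library. The biometric_match function is a hypothetical placeholder for whatever matcher the device or sensor actually provides.

```python
# Minimal sketch: biometric match plus TOTP, so a spoofed face or fingerprint
# alone is not enough to authenticate.
import pyotp

def biometric_match(probe, enrolled_template) -> bool:
    # Hypothetical placeholder: in reality this is the sensor/OS matcher's decision.
    return probe == enrolled_template

def authenticate(probe, enrolled_template, totp_secret: str, otp_code: str) -> bool:
    if not biometric_match(probe, enrolled_template):
        return False
    return pyotp.TOTP(totp_secret).verify(otp_code)   # both factors must pass

# Enroll a TOTP secret once per user, then verify both factors at each login.
secret = pyotp.random_base32()
```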
New technology to fight deepfakes
Lastly, we must develop new technologies and approaches to uncover and stop the spread of deepfakes. Hidden watermarks, digital signatures, distributed ledgers, and even GANs themselves, turned toward detecting AI-generated and manipulated content, all show promise as defenses against the deepfake threat.
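As a taste of the digital-signature approach, the sketch below signs media at capture time with an Ed25519 key (again using Python's cryptography package), so that any later manipulation of the file breaks verification. The key handling shown is purely illustrative; a real deployment would anchor keys in hardware and publish them through a trusted registry.

```python
# Minimal sketch: signing media at capture time so tampering is detectable.
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

camera_key = Ed25519PrivateKey.generate()   # would live inside the capture device
public_key = camera_key.public_key()        # published for anyone verifying content

def sign_media(media_bytes: bytes) -> bytes:
    return camera_key.sign(media_bytes)

def is_authentic(media_bytes: bytes, signature: bytes) -> bool:
    try:
        public_key.verify(signature, media_bytes)
        return True
    except InvalidSignature:
        return False                        # the file was altered after signing
```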
Now is the time to act
Since the dawn of humanity, mankind has been bound to biometry. The survival of our cave-dwelling ancestors depended on their subconscious ability to discern friend from foe through unique physical traits. Our ability to distinguish each other's biometry enables our species to form the communities and relationships that are central to the human experience. It's what makes the concept of individuality possible, and without individuality, the notion of privacy is moot. Biometrics are society's ultimate arbiter of identification, and ultimately they shape our perception of reality.
To ensure that the attributes that define us don’t also divide and destroy us, we must act now.