February 24, 2020 | Alumni
A password you can't change: U of T alumnus Karl Martin on how to keep biometric data safe
By Amanda Hacio
Karl Martin (BASc 2001, MASc 2003, PhD 2010), a biometrics expert and alumnus of U of T's Faculty of Applied Science and Engineering, says biometric data is increasingly used as an added layer of security to authenticate users on handheld devices (photo by Lukenn Sabellano via Unsplash)
If you unlock your smartphone with facial recognition or your fingerprint, you’re using biometrics.
In recent years, biometric data – an added layer of security that verifies a person’s identity using unique physical traits – has become a reliable method of authenticating access to handheld devices.
But this convenience comes with strings attached: the risk of identity theft, data gathering without consent, and physical and online surveillance.
“If our biometric data is stolen, it’s equivalent to stealing a password that you can’t change,” says Karl Martin (BASc 2001, MASc 2003, PhD 2010), an alumnus of the University of Toronto’s Faculty of Applied Science & Engineering who is also a biometrics expert and entrepreneur.
U of T Engineering writer Amanda Hacio recently spoke with Martin to learn more about the security implications of giving out our most personal and unique data.
What is your experience with biometric data?
I co-founded – and led for many years – the company Nymi, which developed a biometric authentication wristband that simplifies authentication and compliance for workers in regulated industrial settings. The Nymi Band uses fingerprints and the electrocardiogram (ECG) to ensure high trust while maintaining privacy and usability. Prior to this, during both my PhD studies and while running a boutique consulting firm, I was involved in developing systems that used facial, ECG and handwritten-signature recognition.
What were some of the security concerns and challenges you ran up against when creating Nymi?
From the beginning, we recognized the importance of handling biometric data with a high degree of care. We took the stance that biometric data must only be stored on a local device controlled by the user. We had to ensure that users could trust that no one else could access the data.
Additionally, the communication between the wristband and other systems, such as mobile devices and computers, was based on Bluetooth, which is generally considered an insecure means of communicating information. We had to develop a proprietary protocol that assumed third parties would be snooping.
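To make the idea concrete, here is a minimal Python sketch of the general pattern Martin describes – treat the transport as hostile and layer your own encrypted, integrity-protected channel on top of it. This illustrates the principle using the widely available cryptography package; it is not Nymi’s proprietary protocol, and all names in it are illustrative.

```python
# Sketch: assume the radio link (e.g., Bluetooth) is hostile and build an
# authenticated, encrypted channel on top of it. Illustrative only.
import os
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
from cryptography.hazmat.primitives.kdf.hkdf import HKDF
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.ciphers.aead import ChaCha20Poly1305

# Each side generates an ephemeral key pair; only the public halves
# ever cross the air, so a snooper learns nothing secret.
band_priv = X25519PrivateKey.generate()
host_priv = X25519PrivateKey.generate()

# Both sides compute the same shared secret via Diffie-Hellman.
shared_secret = band_priv.exchange(host_priv.public_key())

# Derive a symmetric session key from the shared secret.
session_key = HKDF(
    algorithm=hashes.SHA256(), length=32, salt=None, info=b"demo-session",
).derive(shared_secret)

# Every application message is both encrypted and integrity-protected,
# so an eavesdropper can neither read nor silently tamper with it.
aead = ChaCha20Poly1305(session_key)
nonce = os.urandom(12)
ciphertext = aead.encrypt(nonce, b"unlock workstation", b"msg-header")
assert aead.decrypt(nonce, ciphertext, b"msg-header") == b"unlock workstation"
```

A production protocol would also authenticate the exchanged public keys (for example, with device certificates) to defeat man-in-the-middle attacks; the sketch omits that step for brevity.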
What are the security concerns related to biometric data more broadly?
We’re increasingly relying on biometrics to enable easy and reliable authentication to devices and systems. If our biometric data is stolen, it’s equivalent to stealing a password that you can’t change. Our accounts and data become vulnerable to unauthorized access by people using our biometric data to impersonate us.
Another danger is unauthorized surveillance. If biometric data is being gathered without our permission, it may be used to monitor and track us through a variety of sensors such as surveillance cameras or our online presence.
If biometric data is stored locally on a device, it’s less likely to be targeted by attackers, since there’s less opportunity for a mass, scalable data breach. It’s worth noting, however, that not all device-based storage is created equal. At the secure end, systems such as Apple’s Touch ID and the Nymi Band use dedicated cryptographic hardware for storage. At the vulnerable end, a typical app on your phone is not secure and may itself be the source of a breach.
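As a rough illustration of that difference, here is a toy Python sketch of the secure-storage pattern: the raw biometric template is encrypted at rest, and the decryption key lives in hardware the app cannot read. The hardware_key variable below is a stand-in for a key held inside a secure enclave or TPM – an assumption for illustration, not a real enclave API.

```python
# Toy sketch: the app's writable storage only ever holds ciphertext;
# the key is assumed to live in secure hardware. Illustrative only.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

# Stand-in for a key generated inside, and never exported from,
# secure hardware (this is the assumption, not a real enclave call).
hardware_key = os.urandom(32)

template = b"...biometric template bytes..."  # never written to disk raw

# Seal the template with authenticated encryption before storing it.
nonce = os.urandom(12)
stored_blob = nonce + AESGCM(hardware_key).encrypt(nonce, template, None)

# A breach of the app's files yields only stored_blob, which is useless
# without the hardware-held key. Matching happens after unsealing.
recovered = AESGCM(hardware_key).decrypt(stored_blob[:12], stored_blob[12:], None)
assert recovered == template
```

In a real secure element, the unsealing (and often the matching itself) happens inside the hardware, so the raw template never reaches app-accessible memory at all.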
What can companies and users do to make sure the biometric technology they’re using isn’t stealing personal data or information?
I believe that people should not accept applications that move their biometric data into the cloud. Users should demand a “privacy-by-design” approach, which ensures that system design puts user privacy at the forefront. However, individual users are often at a disadvantage, given the lack of transparency around how their data is handled. I believe this is where regulations have a role to play in ensuring transparency and the adoption of best practices in system design.
What do you do if your data is already in the cloud?
The now-classic adage is unfortunately true: once something is on the internet, it’s there forever. But all is not lost, depending on the situation. If you’re enrolled in a system that stores your biometric data in the cloud, it’s worth unenrolling yourself and attempting to have your data deleted. Unfortunately, there’s no guarantee that the service provider will comply or execute a secure deletion.
More generally, for most of us, images of our faces are likely already online and associated with our identities through various social media sites. Given that this data is already out there and even being exploited, this is where regulations come into play. We should all consider advocating for regulations that prevent corporations from exploiting our data without our permission. And at least in the short term, we should give preferential treatment to companies and products that follow the Privacy by Design framework.
What unconventional forms of biometrics are being collected that the average person might not be aware of?
Two modalities that are now actively commercialized, but not well known, are electrocardiogram (ECG) and gait – our individually distinct manner of walking. At Nymi, we were the first to fully commercialize ECG recognition into a market-ready product. Gait, while not a strong identifier, can be used in video surveillance along with other factors to identify individuals, often without their knowledge.
As artificial intelligence (AI) becomes more sophisticated, what security concerns do you think will arise with biometric technology?
One of the applications of biometrics is emotion recognition, which can use a variety of signals such as facial expression and heart rate. While this technology is still in its infancy, there are both positive and negative potential implications.
On the positive side, it creates the opportunity to build applications that adapt to a user’s state of mind, delivering more customized and relevant experiences. On the negative side, with the proliferation of AI technologies, there is a risk of mass manipulation of the population – like what we saw with the Cambridge Analytica scandal – should biometric data not be protected and controlled by individuals.