When your flesh becomes someone else’s data, start asking serious questions. This happens all the time in the medical field, but there are usually strictly enforced conditions of informed consent. These consent conditions are part of an evolution of ethical frameworks stemming from the horrors revealed during the Nuremberg trials, and include the Nuremberg Code of ethics, the Declaration of Helsinki and the Belmont Report. These documents, and others, outline principles of ethical conduct that are relevant not just to medicine and medical research, but to all human interaction where one party wants something (including information) from another – that is, the interaction should be based on respect, justice and a clear intention to do no harm.
The right to privacy is enshrined in Article 12 of the Universal Declaration of Human Rights.
This right to privacy must include aspects of our most intimate selves – our corporeality in all its manifestations: our many types of smiles, the way we rub our heads or stare into the distance, roll our eyes, wrinkle our noses, tap our toes, or shift our lips to let laughter leap from our throats.
Some nations and international organisations have attempted to enshrine these principles in law and regulations regarding the harvesting, use and storage of biometric data. This type of data is not only information about a person — for example, the type of information you share when you set up online accounts — but information directly of the person. Biometric data ‘enables the use of unique physical characteristics — such as a person’s face, voice or fingerprint — for verification or identification purposes’ (Royakkers et al., 2018, p.2). It is information of biological and physical attributes which can be linked to behavioural data.
The European Union’s (EU) General Data Protection Regulation (GDPR) defines biometric data as ‘personal data resulting from specific technical processing relating to the physical, physiological or behavioural characteristics of a natural person, which allow or confirm the unique identification of that natural person…’.
This can include data on physical attributes involving facial, voice or fingerprint recognition technology, or behavioural data such as gait, head, body and arm movements, keystroke behaviours, and, in the near future, eye tracking and pupil dilation to assess the emotional state or engagement of users. A key ethical issue with these behavioural biometrics is that users may not even know this type of data is being harvested, because they do not need to go through an extra step of providing intentional access to it – for example, by allowing a machine to get a clear face view, voice signature or palm print. This makes the covert harvesting and nefarious use of behavioural data an area of serious concern for anybody interested in privacy and law, technology ethics and human rights (both in terms of privacy and bodily integrity).
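To see how little cooperation behavioural biometrics require, consider keystroke dynamics. The sketch below is purely illustrative (it is not any specific vendor’s system, and the timing data is invented): it shows how ‘dwell’ and ‘flight’ times – features commonly used to profile typists – can be derived from the ordinary key-event timestamps any webpage or app can observe, with no fingerprint reader, camera or explicit biometric permission involved.

```python
# Illustrative sketch only: deriving keystroke-timing features (a
# behavioural biometric) from ordinary typing events. All event data
# below is invented for the example.

def keystroke_features(events):
    """Compute dwell times (how long each key is held) and flight times
    (the gap between releasing one key and pressing the next) from a
    list of (key, press_time, release_time) tuples, times in ms."""
    dwell = [release - press for _, press, release in events]
    flight = [
        events[i + 1][1] - events[i][2]  # next press minus previous release
        for i in range(len(events) - 1)
    ]
    return {"dwell_ms": dwell, "flight_ms": flight}

# Hypothetical timings for a user typing "cat"
events = [("c", 0, 95), ("a", 150, 240), ("t", 310, 400)]
features = keystroke_features(events)
print(features["dwell_ms"])   # [95, 90, 90]
print(features["flight_ms"])  # [55, 70]
```

The point is not the arithmetic but the asymmetry: the user typed three letters and, as a side effect, produced a small behavioural signature – exactly the kind of covert harvesting the paragraph above warns about.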
Given widespread concerns about the harvesting and use of such data, what international approaches are there to governing biometrics? Arguably, the EU’s GDPR provides one of the most comprehensive data protection laws covering biometrics. Approved by the EU Parliament on 14 April 2016, it comes into full effect on 25 May 2018, when heavy fines may be handed out to organisations that do not comply with the regulation. Importantly, the GDPR enshrines stronger consent conditions:
‘Consent must be clear and distinguishable from other matters and provided in an intelligible and easily accessible form, using clear and plain language. It must be as easy to withdraw consent as it is to give it.’
The GDPR outlines a series of ‘Data Subject Rights’ some of which are:
- Right of an individual to be notified of a data breach, without undue delay (within 72 hours), where the breach puts at risk the rights and freedoms of individuals.
- Right of access, to obtain from the data controller confirmation of what personal data is being processed, where and for what purpose, and to receive a copy of that data in electronic form, free of charge.
- Right to be forgotten (erasure), subject to data controllers balancing an individual’s rights against the public interest (this promises to be an interesting area to watch).
- Privacy by design, where data protection is included from the onset of the design of systems, instead of being an afterthought.
Beyond the EU, laws and regulations relating to biometrics look a little different. In the US, for example, there is no federal law regarding biometric data, but certain states have enacted their own legislation. In 2008, Illinois passed the Biometric Information Privacy Act (BIPA), which provides rules for the collection and use of biometric data. BIPA prohibits selling or otherwise profiting from biometric data and imposes strict conditions (including consent) on its disclosure. Texas and Washington State have also enacted biometric data legislation.
An area that has received little international attention, but which highlights future concerns, is how biometric legislation deals with the US schooling system. News stories are emerging of US schools investing in facial recognition technology for security reasons (and there are also stories about a Chinese school using similar technology to monitor student engagement!). An article published by the American Bar Association provides a good summary of current state legislation relating to biometric data and schools:
“California law prohibits operators of websites geared towards K-12 school purposes from selling students’ biometric data and restricts their use. Delaware has a similar law. In North Carolina and West Virginia, student biometric data may not be kept in the student data systems. Illinois law prohibits school districts from collecting biometric information from students without parental consent, and they must stop using such information when the student graduates, leaves the school district, or when the district received a written request from the student and all biometric information must be destroyed within 30 days of discontinued use. The school district may only use biometric information for student identification or fraud prevention and may not sell or disclose to third parties without parental consent or pursuant to a court order. Arizona, Wisconsin, Louisiana, and Kansas have similar laws. Colorado law prohibits its Department of Education from collecting student biometric information unless required by state or federal law. A new Florida law enacted in 2014 goes even further than the foregoing state laws by prohibiting schools from collecting, obtaining, or retaining biometric information from students, their parents, or their siblings.”
In the Australian context, there is no specific law covering biometric data at a national level; however, the issue was considered in depth by the Australian Law Reform Commission in 2008. I recently wrote to the Office of the Australian Information Commissioner to ask about the status of biometrics and privacy legislation. Their response reveals that biometrics would be considered sensitive data under the Privacy Act (1988).
This is how the Australian Privacy Act (1988) positions biometrics as sensitive data:
The Office of the Privacy Commissioner of Canada states that the use of biometric data by the federal government falls under the provisions of the Privacy Act, and that biometric data collected, used or disclosed by private-sector organizations falls under the jurisdiction of the Personal Information Protection and Electronic Documents Act, or PIPEDA.
The Canadian Privacy Commissioner specifies several principles, including that people should be informed if their personal information is being collected; that personal information should only be used for the purpose for which it was collected; and that personal information should only be collected for a clearly identified purpose. The Commissioner offers the sage advice that there must be a clear justification for using biometrics as a solution to a problem:
Legal battles and profound ethical objections regarding the use of biometrics continue to emerge globally. For example, human rights groups have recently raised serious concerns about India’s Aadhaar digital identity program, including cases where poor citizens have been denied subsidized food because of data linking issues.
As the technology for/of biometrics continues to slice our bodies into pieces of data we must begin to ask whether now is the time to defend our right to bodily integrity. While feminism continues to fight for the right of women to maintain control over their own bodies, the rise of biometrics challenges us all to extend the battle. As the National Research Council (US) Whither Biometrics Committee (citing van der Ploeg) suggests:
‘(The rise of the) “informatized” body—a body that is represented not by human-observable anatomical and physical features but by the digital information about the body housed in databases….has implications for how we ultimately perceive and conceive of the individual.’
If we don’t begin to ask ourselves what the implications of biometrics are for our understanding of ourselves and others as human, we may well be reduced to disassembled doll parts in the flows of data that attempt to capture, caress and corral our lives.
Royakkers, L., Timmer, J., Kool, L., & van Est, R. (2018). Societal and ethical issues of digitization. Ethics and Information Technology, 1-16. https://doi.org/10.1007/s10676-018-9452-x
Featured image: Jules Morgan, Doll parts, https://flic.kr/p/6vc9LF