Panopticon … that’s a new word for me. It is satisfying to say and, particularly in the context of automatic facial recognition (AFR), gratifying to learn about.
I was drawn to the Wikipedia page on the Panopticon by this article on Medium by contributor Meltem Demirors.
The panopticon is a type of building and a form of control. Jeremy Bentham, an 18th-century social theorist, came up with a design for a prison in which all the inmates could be observed by a single watchman. As stated in the Wikipedia entry:
Although it is physically impossible for the single watchman to observe all the inmates’ cells at once, the fact that the inmates cannot know when they are being watched means that they are motivated to act as though they are being watched at all times. Thus, they are effectively compelled to regulate their own behaviour.
The use of AFR is a form of panopticon. The threat of observation and monitoring is bound to influence how we regulate our behaviour. There are a multitude of protests happening right now – how will the increasing (and currently unregulated) use of AFR in public spaces affect people’s desire to stand up and be counted in the future?
Demirors speaks of the necessary privacies that we should demand in the face of pervading surveillance.
- Privacy in Economic Interactions, meaning who we send money to, how, when, in what amount, and why is something we have a right to keep private to ourselves and the recipient
- Privacy in Movement, meaning we should be able to move about physical, digital, and virtual space with anonymity, and we should be able to enter and leave spaces, whether in real life or online, without giving out identifying information
- Privacy in Communications, meaning we should be able to conduct conversations with certainty that they will remain private, and that we should be able to abstract our identity from our communications both in the physical world and online
I would add a fourth, perhaps: Privacy in Protest, meaning we should be able to attend a gathering with like-minded others to raise awareness of a critical issue, and we should be able to do this without fear of our biometric information (or that of our children and young people) being added to a database for purposes beyond our knowledge or control.
The protest, the peace gathering, the vigil, the action – all forms of action that empower people to stand up and speak out on behalf of others – could be rendered impotent under the threat of surveillance and its unknown implications.
This tweet generated over 200 replies … with a real split between those who will NEVER fly again and those who are either resigned to the intrinsic use of AFR technologies for travel or think the security afforded by the technologies is a good thing.
AFR technologies have been used in Canadian airports since 2017. Our EU biometric passports utilise AFR technology: these e-passports contain a chip holding the holder’s facial biometric. And, as I said in a previous post, research into AFR began in the 1960s with the work of Woody Bledsoe, Helen Chan Wolf, and Charles Bisson.
Despite the seemingly unstoppable rollout of AFR for border security, its use in public life can still be questioned. Should we have AFR in the high street or in hospitals? Should it be used at football games or at music concerts? It is already happening in some of these places. That is why Ed Bridges, represented by Liberty, is taking South Wales Police to court over their use of AFR in public spaces. ‘South Wales Police has used facial recognition in public spaces on at least 22 occasions since May 2017. Ed believes his face was scanned by South Wales Police at both a peaceful anti-arms protest and while doing his Christmas shopping.’ (Liberty website accessed 25-04-19).
Ed Bridges says:
“Without warning the police have used this invasive technology on peaceful protesters and thousands of people going about their daily business, providing no explanation of how it works and no opportunity for us to consent. The police’s indiscriminate use of facial recognition technology on our streets makes our privacy rights worthless and will force us all to alter our behaviour – it needs to be challenged and it needs to stop.”
The key here is that the indiscriminate use of AFR, without regulation or limits, will force us all to alter our behaviour. It thwarts our right to autonomous action and therefore, limits our individual and collective potential to envision and create a better future.
Postscript: That ‘final’ sentence clearly indicates my inherent bias, so in the interests of balance here is a link to an article from digital security company Gemalto, published this month, on current trends in AFR (it’s a good read and worth going to): https://www.gemalto.com/govt/biometrics/facial-recognition.
BTW: many thanks to Dr Ian Cook who forwarded me the tweet. It provided many hours of onward links.
I spent Tuesday morning making my face into its own dot-to-dot drawing.
Using this diagram as a guide (taken from the Wonderworks Museum information panel), I drew dots on my face that align with its form and structure – points such as REyebrowEnd, REyebrowMid, NoseBridge, LOrbitalUpper, LEar, LOrbitalLower, RJawEnd, RMidForehead, and so on. Then I drew lines between the dots.
Despite feeling like I was getting ready for an off-the-wall Halloween party, this was a useful exercise.
The measuring and scanning and recording of our faces is an intimate activity. By spending about 30 minutes first drawing dots and then joining them together, I spent more time looking at myself than I have in the past 10 years. (It is a little embarrassing. But undoubtedly funny too.)
This dot-to-dot exercise emphasised to me what it means to have your face captured and scrutinised, and brought my head-space rational thinking into a bodily-felt emotion-inducing space.
My initial research into automatic facial recognition (AFR) has thrown up a long list of links. This blog is going to be really useful in helping me to select the ones I have the most to learn from.
There is a mass of information online, partly because research into AFR began in the 1960s and partly because governments appear very enthusiastic to support its development. The main aim is to make AFR more efficient and effective – to push its accuracy as close to 100% as possible.
I had no idea there were so many different forms of AFR. Fundamentally, the software needs to measure a face and then compare those measurements to a database of faces. There are many ways to capture a face: plotting points and taking measurements, using infrared light, 3D scanning, and analysing skin texture.
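The “measure, then compare to a database” idea can be sketched in a few lines of code. This is purely an illustrative toy – the landmark names, coordinates, and the idea of reducing a face to pairwise distances are my own simplification, not how any real AFR system works:

```python
import math

def feature_vector(landmarks):
    """Turn named (x, y) landmark points into a vector of pairwise distances."""
    names = sorted(landmarks)
    dists = []
    for i, a in enumerate(names):
        for b in names[i + 1:]:
            (x1, y1), (x2, y2) = landmarks[a], landmarks[b]
            dists.append(math.hypot(x2 - x1, y2 - y1))
    return dists

def best_match(probe, database):
    """Return the name of the stored face whose measurements are closest."""
    pv = feature_vector(probe)
    def score(item):
        dv = feature_vector(item[1])
        return math.sqrt(sum((p - d) ** 2 for p, d in zip(pv, dv)))
    return min(database.items(), key=score)[0]

# Toy "database" of two faces, each reduced to three landmark points.
database = {
    "face_a": {"eye_l": (30, 40), "eye_r": (70, 40), "nose": (50, 60)},
    "face_b": {"eye_l": (25, 45), "eye_r": (80, 45), "nose": (52, 70)},
}
# A new capture of an unknown face, to be matched against the database.
probe = {"eye_l": (31, 41), "eye_r": (69, 40), "nose": (50, 61)}
print(best_match(probe, database))  # prints "face_a"
```

Real systems replace the handful of distances with hundreds of learned features, but the core mechanic – turn a face into numbers, then find the nearest stored set of numbers – is the same.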
The databases themselves throw up a multitude of problems. The early databases consist almost entirely of white males (reflecting, I guess, the associates of the computer science researchers), with later databases featuring women and people who aren’t white. See this article in the Guardian.
The link below lists university databases that are available for use: http://www.face-rec.org/databases/ – a fascinating insight into the mechanics of AFR research. I also find myself questioning the ethics of how these databases are compiled in the first place. One dataset consists of women (presumably including young women and girls) doing YouTube make-up tutorials. Once their videos are online, they have no control over where their images go or how they are used.