The changing contours of the face: facial estrangement?  

Gavin Smith


Another key issue that the increasing use of FRT sets in train is its capacity to render the face into a tool or medium of mass surveillance, an entity that can betray who, what and where you are. This has all manner of implications for our ability to retain a sense of anonymity as we engage in everyday life. But it can also have implications for various civil rights, especially when the technology arbitrarily identifies and tracks us, or when it is utilised to make decisions that affect our actions and mobility. Moreover, being misidentified (the so-called false positive or false negative) can have serious consequences in the security field, as when one is wrongfully arrested or mistakenly banned from a social space or activity. The case of non-white Uber drivers prevented from working because the company's mobile facial recognition software failed to recognise them is instructive in this regard.

But beyond this, as the technology simultaneously scans and compares faces in real time (actual faces and simulated others), it comes to transform the meaning and phenomenology of the face in quite significant ways. In the process, the face becomes more than a material, unique, expressive and indeterminable medium – a thing that we are physically attached to, communicate through and exercise sovereignty over. Instead, it becomes more akin to a virtualised interface, an entity that we are increasingly dispossessed and detached from as its geometric surface is flattened into a set of machine-readable binary numbers, and as it is effectively rendered into a proxy interface of measurement, identification and governance.

In this way, the technology functions to place what might be thought of as virtual borders around social spaces and agents, rendering the physical face into a virtual passcode. Who you are and what you can do is contingent on what your face communicates about your biography and position. If you have at some point been designated a problem patron by a nightlife venue in a central Brisbane precinct, and this detail has been recorded against your database-situated faceprint, you may acquire (or flag up) an orange rating when your face resurfaces on another occasion at a different premises. And this historic rating will likely reduce your chances of being admitted, notwithstanding the fact that your conduct now complies with the prescribed social conventions of the context. These systems can be codified in ways that give them long and fairly static memories. And they function to engineer lists of individuals who are to be recognised either because they are deemed to be problematic in some way (and thus deserving of some corrective action), or because they are categorised as being of higher value by a company (and thus deserving of specialised treatment).
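To make the watchlist mechanism concrete, here is a minimal sketch of how a stored faceprint and its historic rating might combine to govern entry. It is purely illustrative: the vectors, threshold, ratings and matching logic are invented for the example, not drawn from any deployed system.

```python
import numpy as np

# Purely illustrative sketch of the venue watchlist mechanism described
# above. Everything here is hypothetical: real systems derive faceprint
# vectors ("templates") from neural-network embeddings, not random data,
# and their thresholds, ratings and match logic are proprietary.

RATINGS = {"orange": "flag for manual review", "red": "refuse entry"}


def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Score how alike two faceprint vectors are (1.0 = identical direction)."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))


def check_entry(probe, watchlist, threshold=0.8):
    """Return the action attached to the first watchlist match, else admit.

    The threshold encodes the false-positive/false-negative trade-off noted
    above: lower it and more innocent patrons are flagged; raise it and more
    listed patrons slip through.
    """
    for template, rating in watchlist:
        if cosine_similarity(probe, template) >= threshold:
            return RATINGS.get(rating, "admit")
    return "admit"


# A patron rated "orange" long ago resurfaces at a different venue: the
# historic rating follows the faceprint, whatever their conduct tonight.
rng = np.random.default_rng(0)
old_faceprint = rng.normal(size=128)                        # template stored years earlier
tonight = old_faceprint + rng.normal(scale=0.05, size=128)  # fresh scan, same face
print(check_entry(tonight, [(old_faceprint, "orange")]))    # flag for manual review
```

Note that nothing in the matching loop consults the patron's present conduct; the decision turns entirely on the stored rating, which is precisely the long and fairly static memory described above.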

The kinds of issues discussed above relate to broader biopolitical concerns regarding the increasing, and largely unregulated, use of FRT in various sectors, and attendant struggles over bodily materials that are re-purposed in technocratic ways for processes of governance. Important questions remain about relations of ownership and informed consent when sensitive social and biological phenomena, such as faces, are colonised by digital technologies and then datafied for political and economic ends. Members of the public are purposely left in the dark, by both the biometric industry and those utilising the tech, about how and when their face data is being collected and acted on, and by whom and for what purposes. Similarly, those seeking to better understand through research how these systems actually work in practice are denied access to observe daily operations, as management worry about how the technology might be represented in the public sphere. For these and other reasons, FRT operates in a context of limited accountability, where information on the technical, procedural and operating dimensions of the tech is scant or absent, and users of the tech can essentially evade being answerable for the harms they may cause to the subjects of their biometric gaze. Another concerning aspect of the tech is its leveraging for inferential ends: using it to make calculated and dehumanised determinations about a person's demographic status or emotional state, and to target responses accordingly based on these categorisations. In applications of this kind, we can perceive the revival, in highly technical forms, of the discredited and racialised episteme of physiognomy, where determinative judgements are taken on tenuous physiological data or behavioural signatures that are risk-coded.

In all of these applications, members of the public – be they workers, consumers or children – are not provided with the information required to make an informed decision about their participation in a system, nor are they given details about which databases or parties their faceprints are shared with. In this way, individuals are given limited opportunity or choice to opt in or out. And yet, people are intrinsically and biologically attached to their face, and these systems operate in ways that both exploit and undermine this relation and sense of facial autonomy. In cases where a system leverages power over the mobility of an individual by virtue of what their face reveals or signals about them, we can easily imagine relations of facial estrangement, insecurity and alienation emerging, where a person comes to perceive their face in reconfigured ways, as a source of risk or an object in need of masking. Yet faces are not merely the passive or innate objects for mechanised modes of scanning and tracking that the industry configures them as; rather, they are complex masks imbued with considerable symbolic meaning and socio-cultural significance. In this way, facial recognition programs fundamentally impinge on the right to the face, and for this reason – and the myriad ways they unilaterally mine and exploit the face – we can expect these technologies of governance to mobilise forms of dissent.
