You no longer own your face

I was quite aghast to chance upon this article the other day.

Granted, judging from the article’s leisurely tone, it is addressed to an audience not informed about the many ways a facial recognition system could harm everyone. The system is said to be ‘brilliant’: it can ‘help’ track students’ attendance, it can ‘help’ eliminate the archaic system of signing your name on a piece of paper or tapping your ID card on a scanner. You could also know where your students have been, for it can track their locations in real time! (This is where I scream: NOOO)

This article’s relentlessly positive framing of the program is exactly where things can go wrong: researchers or engineers can exploit the public’s lack of tech literacy.

My first thought upon reading this article was about the limitations and/or ignorance of university ethics boards. Probably blinded by how ‘groundbreaking’ the innovation is and how it could elevate the university’s status, the board approved the project from the point of view of the ‘what’ (a facial recognition system), the ‘how’ (biometric identification software) and a very flawed, narrow ‘why’ (tracking students’ attendance), disregarding consequences far beyond what the researchers can imagine.

Facial recognition systems like this are trained on datasets; as the article mentions, the system works by “utilising existing images from a given database in order to accurately log a student’s presence in class” — this is how it knows who came to class and on how many days. Before collecting these datasets, researchers are supposed to seek informed consent from students, complete with the specific scope and limitations of what their data will be used for. And rightfully, if a dataset is collected for one particular research project, then that is its only permitted use; it should not be repurposed for other projects. The method of collecting the data also has to be specific and transparent to the students. Otherwise it ends up like the case of faculty at three universities who faced backlash after building databases from surveillance footage of students as they walked around campus. They claimed to have deleted the datasets, but because this is the Internet, the data lived on: it was found to have been used in more than 100 machine-learning projects by universities, start-ups, and institutions worldwide, including SenseTime and Megvii, Chinese surveillance firms linked to the state repression of Muslim minorities in China.

As Sidney Fussell writes in ‘You no longer own your face’:

Every time a data set is accessed for a new project, the intention, scope, and potential for harm changes. The portability and pliability of data meet the speed of the internet, massively expanding the possibilities of any one research project, and scaling the risk far beyond what any one university can be held accountable for. For better or worse, they can only regulate the intentions of the original researchers.

Data privacy concerns are just one aspect of this. More questions to ask: what would you do with the information that some of your students are skipping classes? What about the many factors such a system fails to account for, the ones hindering students from attending classes regularly: mental or physical health problems, poverty, disabilities of any kind, and so on?

It may sound almost entitled to say this, but: please consult social scientists too! Here are things computing students can start to read before they embark on any research project, not only in AI, and especially ones involving marginalised groups:

Related reads:
