No more whoopsies

I came across this very insightful talk by Ethan Marcotte called The World Wide Work, in which he talked about automation, power, justice, and labour in the tech industry. These might sound like big words to some of you, but that is because they are just as important, and Marcotte made some good points about the power of the collective in making sure that we, the people in the tech industry, no longer overlook the harmful impacts our products and processes can inflict on other people. He explained this through several analogies, one of them referring to my favourite natural formation of all, the murmuration of starlings, stressing our individual potential in the industry and how, as a whole community, we can become stronger and even more capable of making change, just like starlings: “individually, starlings are beautiful. Collectively, starlings become a wonder.” But that’s not all: Marcotte also addressed the tendency to treat the web and technology industry as apolitical, and enlightened us on how the power and greed of the elites (who own the means, whom we serve, and who might also be some of us) can in turn be unintentionally reflected in the technology we build.

Unintentionally.

Not long after I posted the link to Marcotte’s talk on Twitter, The Engine Room co-founder Alix Dunn posted a separate tweet referring to Tim Berners-Lee’s talk at the ODI Summit, in which he said, “People are people and systems are weird, so we need to think about unintended consequences of the things we build.” Dunn quote-tweeted, “I wish we could stop using the phrase ‘unintended consequences’ about harm in tech. It implies a) that positive intent of the maker matters and that it should be assumed b) that ‘thinking’ about possible harms is the bar for responsibility and c) that whoopsies is sufficient.” I read it through again and it gave me a change of perspective, the same way I once came across the argument that we should not talk about these harms as biases (for that insinuates individual accountability instead of the systemic flaw from which they originate). And then I thought about it: ‘unintended’, who am I kidding?

I felt like time and again, when I talked about ‘whoopsies’ in tech (to borrow Dunn’s word), I would often use ‘unintended’ or ‘nothing out of malice’ to hedge and pardon our doing. That’s internalised. As someone previously in, or on a break from, tech, I recognised that the ‘whoopsies’ happened, and more often than not “it was never out of malice”. We kept telling ourselves we are not avowed racists or classists like Robert Moses, who built overpasses so low that the buses carrying people of colour and the less affluent could not pass under them to reach the beautiful parks of Long Island. We are not bad people; it’s just a thing we overlooked, we didn’t mean it, we were just doing our job. But we have to recognise that in today’s world, where there is a wealth of information and resources coming from all directions that could educate us, there is no more room for apologies. We need to make space to recognise the harms we can inflict upon others in so many ways. ‘Whoopsies’ are the typos you find in your paper on the day of submission; ‘whoopsies’ are when you accidentally step on a colleague’s shoes. ‘Whoopsies’ are not for when you run over people with your company’s self-driving car, and certainly not for when you murder someone in cold blood. (Serious question: what kind of moral compass do these tech CEOs have? No wonder we are so screwed!) And it is our job to make sure that in our products, our processes, our policies, our culture and so on, we do not overlook people who might not be able to experience the world with the privilege that we do.

Earlier this month, I also came across this article on the importance of empathy by product manager Can Duruk, recounting his moral dilemma between navigating work as sets of data and placing himself in the shoes of the users of the products he was involved in building (oh, the familiarity). He tells of a documentary he watched on air traffic controllers:

In it, the controller was talking about how he had to stop thinking of the planes as giant aluminum tubes full of people because that’s the only way he could function. You could see him tense up as he spoke. He didn’t mean to belittle the hundreds of lives he was responsible for, he kept repeating. Just the opposite. He meant that the only way he could keep his composure, the only way he could function with so many lives on the line, was to not think about them.

Needless to say, I was horrified. But I was glad that in some ways, like Duruk, I too understood how he might have arrived at that solution. Still, we decided that this idea of not thinking of people is a terrible, horrible one (one that doesn’t warrant a ‘whoopsie’), even though none of us has ever been responsible for thousands of other people’s lives. But if we think about it, maybe we are: perhaps not for thousands at the same time, but through a chain of events? I hate to think about it, but I would hate it more if I spent my life not caring.

