I had been reading Ibram X. Kendi’s How To Be An Antiracist over the weekend and was struck by how much his central argument — that there is no such thing as neutrality — should be one of the central tenets and cautionary tales when we deal with technology. According to Kendi, one should strive to be antiracist, which is more than just, well, not being racist. Not being racist is the barest minimum anyone could do in a racist, non-neutral system, and so it is a stance that still in some way reinforces the status quo.
I remember spending years in the tech industry internalising the idea that our products were built on neutral ground. Code, being just words and numbers, was supposedly objective, so it seemed productive enough to leave human biases and judgements out of the equation. But then I kept thinking: if there is such a term as ‘tech for good’, is there such a thing as ‘tech for evil’, albeit a subtler one?
And then again, technology, just like any other creation, is a product of human beings. On that premise, any form of technology is encoded with years of human biases and structural inequalities — some of them unintended, but some much more deliberate than the rest — which means it can never be neutral. Of course, none of these ideas are new, but they seemed groundbreaking to me when I first discovered them as a first-year PhD student who had returned to school after years in an industry I believed contributed so much to the advancement of humankind, only to realise it had also inflicted harm on so many people for as long as the field has existed.
But the good news is that tech scholars and practitioners alike are becoming more aware of this issue. In that sense, so much work is now being done to mitigate the gap between our technologies and the harms they produce from their creators’ ingrained biases. There are still some disagreements between them, sure, but there is great progress in acknowledging that there is no such thing as neutrality in technology. Right now, I am observing all of these developments with great hope that from all of this, we can incrementally learn how to do better.
Reading in my tab:
- Don’t ask if artificial intelligence is good or fair, ask how it shifts power.
- When scholars collaborate with tech companies, how reliable are the findings?
- K-pop activism and US politics, explained.
- “Given the unemployment landscape we’re facing, however, we need to acknowledge and plan for the reality of a rapidly expanding gig economy. Instead of hoping in vain for gig employers to reclassify their workers as employees, we should accept that the gig model will only become more entrenched, and as such we should focus on expanding the temporary gains gig workers have seen during the pandemic into a permanent social safety net.” Gig workers are here to stay. Give them benefits.
- “In normal conversations with other people, we might choose to code-switch, alternating between languages, accents or ways of speaking, depending on one’s audience. But with automated speech recognition programs, there is no code-switching—either you assimilate, or you are not understood. This effectively censors voices that are not part of the ‘standard’ languages or accents used to create these technologies.” Speech recognition tech is yet another example of bias.
- Defund facial recognition!
- “But what are the things that make people panic? A sense of emergency, of unsustainability, of escalating harm and violence, of impending and irrevocable loss. That is definitely the backdrop. The already vast amount of human suffering in this pandemic is about to skyrocket as employment and housing protections are lifted; we’re going to see racial and economic inequality reinscribed so painfully with these huge disparities in the pandemic experiences of school-age children. The needless death will continue for months, all these incremental gains that people have struggled their whole lives to claim have been wiped out in a single season, and in the meantime Joe Biden is the Democratic nominee.” An interview with Jia Tolentino on the discipline of hope.
- “In a way, quarantine marks the triumph of humanity above a human — that survival of the former might mean inconvenience, suffering, or even demise of the latter.” To believe in quarantine meant to have faith in afterward.
- Why time feels so weird in 2020.
- I wish someone had told me this years ago, “This type of guilt tends to go away really quickly once you’re gone. You’re going to move on to the next phase in your career, your staff will move on to theirs, and you’re going to look back and wonder why you stayed so long. Take care of your health, and don’t feel guilty.”
- “This is the end, we leave the rest to you.” See Kaveh Akbar’s analysis of this 500-year-old Mesoamerican poem by the Nahuatl people.
- Reading: Sasha Costanza-Chock’s Design Justice, and finished Marjan Kamali’s The Stationery Shop of Tehran, whose accounts of Persian food made my mouth water.
- Listening: “Well, before we release a product or make a design decision, we get together in a room and think really hard about what could go wrong with this product. But, it’s very difficult to imagine what might go wrong for people whose lives don’t look very much like the people who are inside that room doing that imagining.” Tech companies should make it someone’s job to think about ethics.
- Viewing: Open a new window somewhere in the world. I love this project so much.
- Food & Drink: I ordered a margherita pizza and an iced vanilla latte.