“Move fast and break things” is an abomination if your goal is to create a healthy society. Taking shortcuts may be financially profitable in the short-term, but the cost to society is too great to be justified. In a healthy society, we accommodate differently abled people through accessibility standards, not because it’s financially prudent but because it’s the right thing to do. In a healthy society, we make certain that the vulnerable amongst us are not harassed into silence because that is not the value behind free speech. In a healthy society, we strategically design to increase social cohesion because binaries are machine logic not human logic.
The problem is that digital design isn’t cynical enough.
First, the Internet is an amoral force that reduces friction, not an inevitable force for good. Second, sometimes different cultures simply have fundamentally different values. Third, if values are going to be preserved, they must be a leading factor in economic entanglement, not a trailing one. This is the point that Clinton got the most wrong: money, like tech, is amoral. If we insist it matters most, our own morals will inevitably disappear.
Students’, educators’, and regulators’ critical resistance to edtech is likely to grow as we learn more about the ways it works, how it treats data, and in some cases how dysfunctional it is.
Increasingly, journalists are on to edtech, and are feeding the growing sense of frustration and resistance by demonstrating that these technologies often don’t even do what they claim to do.
So, there is a rising wave of edtech resistance from a wide variety of perspectives—from activists to students, journalists to regulators, and legal experts to ethicists.
Without a grounding in theory or knowledge or ethics or care, the Silicon Valley machine rewards stupid and dangerous ideas, propping up and propped up by ridiculous, self-serving men. There won’t ever be a reckoning if we’re nice.
Source: HEWN, No. 321
we cannot presume that the adjective “open” is sufficient when it comes to re-orienting our technologies towards justice.
Source: HEWN, No. 321
The plutocrat-backed neoliberal technocracy is being manufactured at universities around the world, and its corrupt ideology is being laundered by publications and think tanks funded by these same, unethical billionaires. And plenty of folks look the other way because they’re more committed to being in networks with the “innovators” than they are in building a world that is caring and just.
Source: HEWN, No. 320
Change also means that the ideas and concerns of all people need to be a part of the design phase and the auditing of systems, even if this slows down the process. We need to bring back and reinvigorate the profession of quality assurance so that products are not launched without systematic consideration of the harms that might occur. Call it security or call it safety, but it requires focusing on inclusion. After all, whether we like it or not, the tech industry is now in the business of global governance.
“Move fast and break things” is an abomination if your goal is to create a healthy society.
A key question to building any software in the modern age is: “In the wrong hands, who could this harm?”
Decades ago, software seemed harmless. In 2019, when facial recognition is used to deport refugees and data from online services is used to jail journalists, understanding who you’re building for, and who your software could harm, is vital. These are ideas that need to be incorporated not just into the strategies of our companies and the design processes of our product managers, but into the daily development practices of our engineers. These are questions that need to be asked over and over again.
It’s no longer enough to build with rough consensus and running code. Empathy and humanity need to be first-class skills for every team – and the tools we use should reflect this.
After watching a few reviews of the new Leatherman multi-tool line during a logged out session, YouTube served me far right conspiracy theories and fear mongering as ads.