Students’, educators’ and regulators’ critical resistance to edtech is likely to grow as we learn more about the ways it works, how it treats data, and in some cases how dysfunctional it is.
Increasingly, journalists are on to edtech, and are feeding the growing sense of frustration and resistance by demonstrating that these technologies often don’t even do what they claim to do.
So, there is a rising wave of edtech resistance from a wide variety of perspectives—from activists to students, journalists to regulators, and legal experts to ethicists.
Without a grounding in theory or knowledge or ethics or care, the Silicon Valley machine rewards stupid and dangerous ideas, propping up and propped up by ridiculous, self-serving men. There won’t ever be a reckoning if we’re nice.
Source: HEWN, No. 321
we cannot presume that the adjective “open” is sufficient when it comes to re-orienting our technologies towards justice.
Source: HEWN, No. 321
The plutocrat-backed neoliberal technocracy is being manufactured at universities around the world, and its corrupt ideology is being laundered by publications and think tanks funded by these same, unethical billionaires. And plenty of folks look the other way because they’re more committed to being in networks with the “innovators” than they are to building a world that is caring and just.
Source: HEWN, No. 320
Change also means that the ideas and concerns of all people need to be a part of the design phase and the auditing of systems, even if this slows down the process. We need to bring back and reinvigorate the profession of quality assurance so that products are not launched without systematic consideration of the harms that might occur. Call it security or call it safety, but it requires focusing on inclusion. After all, whether we like it or not, the tech industry is now in the business of global governance.
“Move fast and break things” is an abomination if your goal is to create a healthy society.
A key question to building any software in the modern age is: “In the wrong hands, who could this harm?”
Decades ago, software seemed harmless. In 2019, when facial recognition is used to deport refugees and data provided by online services has been used to jail journalists, understanding who you’re building for, and who your software could harm, is vital. These are ideas that need to be incorporated not just into the strategies of our companies and the design processes of our product managers, but into the daily development processes of our engineers. These are questions that need to be asked over and over again.
It’s no longer enough to build with rough consensus and running code. Empathy and humanity need to be first-class skills for every team – and the tools we use should reflect this.
After watching a few reviews of the new Leatherman multi-tool line during a logged-out session, YouTube served me far-right conspiracy theories and fear mongering as ads.
Why is fairness to people with disabilities a different problem from fairness concerning other protected attributes like race and gender?
Disability status is much more diverse and complex in the ways that it affects people. A lot of systems will model race or gender as a simple variable with a small number of possible values. But when it comes to disability, there are so many different forms and different levels of severity. Some of them are permanent, some are temporary. Any one of us might join or leave this category at any time in our lives. It’s a dynamic thing.
I think the more general challenge for the AI community is how to handle outliers, because machine-learning systems—they learn norms, right? They optimize for norms and don’t treat outliers in any special way. But oftentimes people with disabilities don’t fit the norm. The way that machine learning judges people by who it thinks they’re similar to—even when it may never have seen anybody similar to you—is a fundamental limitation in terms of fair treatment for people with disabilities.
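The point about norms and outliers can be sketched concretely. Below is a minimal, hypothetical example (invented data, not from any real system): a nearest-centroid classifier learns the "norms" of its training groups and then forces every input into the closest one, with no way to say "this person is unlike anyone I've seen."

```python
# Minimal sketch with made-up data: a nearest-centroid classifier
# learns a norm (centroid) per group, then assigns every input to
# the nearest norm -- even inputs far outside anything it has seen.

def centroid(values):
    return sum(values) / len(values)

# Hypothetical training data: typing speed (words/min) for two groups.
group_a = [58, 62, 60, 61, 59]   # centroid ~60
group_b = [38, 42, 40, 41, 39]   # centroid ~40

def classify(x):
    # There is no "none of the above" outcome: an outlier is simply
    # mapped to whichever learned norm happens to be less distant.
    ca, cb = centroid(group_a), centroid(group_b)
    return "A" if abs(x - ca) <= abs(x - cb) else "B"

print(classify(61))  # typical input -> "A"
print(classify(4))   # far from both norms, still forced into "B"
```

A person typing at 4 words per minute resembles neither training group, yet the model confidently buckets them anyway, which is the fairness failure described above: judgment by similarity to learned norms, even when no one similar was ever seen.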
We accelerated progress itself, at least the capitalist and dystopian parts. Sometimes I’m proud, although just as often I’m ashamed. I am proudshamed.
I miss making things. I miss coding. I liked having power over machines. But power over humans is often awkward and sometimes painful to wield. I wish we’d built a better industry.
I believe the relative ease — not to mention the lack of tangible cost — of software updates has created a cultural laziness within the software engineering community. Moreover, because more and more of the hardware that we create is monitored and controlled by software, that cultural laziness is now creeping into hardware engineering — like building airliners. Less thought is now given to getting a design correct and simple up front because it’s so easy to fix what you didn’t get right later.
Every time a software update gets pushed to my Tesla, to the Garmin flight computers in my Cessna, to my Nest thermostat, and to the TVs in my house, I’m reminded that none of those things were complete when they left the factory — because their builders realized they didn’t have to be complete. The job could be done at any time in the future with a software update.