When we develop and use educational technologies that monitor a student’s every moment in school and online, we groom that student for a lifetime of surveillance.
Nevertheless, those who work in and work with education technology need to confront and resist this architecture – the “surveillance dataism,” to borrow Morozov’s phrase – even if (especially if) the outcomes promised are purportedly “for the good of the student.”
Students’, educators’, and regulators’ critical resistance to edtech is likely to grow as we learn more about the ways it works, how it treats data, and in some cases how dysfunctional it is.
Increasingly, journalists are on to edtech, and they are feeding the growing sense of frustration and resistance by demonstrating that these technologies often don’t even do what they claim to do.
So, there is a rising wave of edtech resistance from a wide variety of perspectives—from activists to students, journalists to regulators, and legal experts to ethicists.
When we draft students into education technologies and enlist their labor without their consent, or even the ability to choose, we enact a pedagogy of extraction and exploitation. It’s time to stop.
“We should be thinking about how to have pedagogies of care, not of surveillance,” she said. “I’m not sure we can ask Google to be a part of that.”
Personalized learning – the kind hyped these days by Mark Zuckerberg and many others in Silicon Valley – is just the latest version of Skinner’s behavioral technology. Personalized learning relies on data extraction and analysis; it urges and rewards students and promises everyone will reach “mastery.” It gives the illusion of freedom and autonomy, perhaps – at least in its name – but personalized learning is fundamentally about conditioning and control.
A pedagogy controlled by algorithms can never be a pedagogy of care, integrity, or trust.
Teachers: stop uncritically adopting and promoting Google products, for crying out loud. It doesn’t make you innovative or progressive. It makes you a shill for surveillance capitalism. You’re not preparing your students for a better future simply by using the latest shiny tech. You’re aiding a company — indeed a system — that’s stealing their future.
Knowledge production has a new police force: digital technology.
Source: HEWN, No. 317
I think there’s something wrapped up in all this marketing and mythology that might explain in part why the tech industry (and, good grief, the ed-tech industry) is so incredibly and dangerously dull. You can’t build thinking machines (or teaching machines for that matter) if you’re obsessed with data but have no ideas.
Source: HEWN, No. 296
I maintain that behaviorism never really went away and, despite all the talk otherwise, it remains central to computing — particularly educational computing. And as Shoshana Zuboff argues, of course, behaviorism remains central to surveillance capitalism.
Source: HEWN, No. 314
Engineering is a social production not merely a scientific or technological one. And educational engineering is not just a profession; it is an explicitly commercial endeavor. For engineers, as historian David Noble has pointed out, are not only “the foremost agents of modern technology,” but also “the agents of corporate capital.” “Learning engineers,” largely untethered from history and purposefully severed from the kind of commitment to democratic practices urged by Dewey, are now poised to be the agents of surveillance capital.
Source: HEWN, No. 312