“We should be thinking about how to have pedagogies of care, not of surveillance,” she said. “I’m not sure we can ask Google to be a part of that.”
A pedagogy controlled by algorithms can never be a pedagogy of care, integrity, or trust.
Teachers: stop uncritically adopting and promoting Google products, for crying out loud. It doesn’t make you innovative or progressive. It makes you a shill for surveillance capitalism. You’re not preparing your students for a better future simply by using the latest shiny tech. You’re aiding a company — indeed a system — that’s stealing their future.
Knowledge production has a new police force: digital technology.
Source: HEWN, No. 317
I maintain that behaviorism never really went away and, despite all the talk otherwise, it remains central to computing — particularly educational computing. And as Shoshana Zuboff argues, of course, behaviorism remains central to surveillance capitalism.
Source: HEWN, No. 314
Surveillance capitalism now claims private human experience as free raw material for translation into behavioral predictions that are bought and sold in a new kind of private marketplace.
Ed-tech relies on amnesia.
Ed-tech is a confidence game. That’s why it’s so full of marketers and grifters and thugs. (The same goes for “tech” at large.)
Source: HEWN, No. 297
Despite scant evidence in support of the psychopedagogies of mindsets, mindfulness, wellness, and grit, the ed-tech industry (press) markets these as solutions to racial and gender inequality (among other things). The psychotechnologies of personalization, meanwhile, are now increasingly intertwined not just with surveillance and behavioral data analytics but with genomics as well. "Why Progressives Should Embrace the Genetics of Education," a NYT op-ed argued in July, perhaps forgetting that education's progressives (including Montessori) have been down this path before.
“Does It Make More Sense to Invest in School Security or SEL?” Edsurge asked its readers this summer. Those are the choices – surveillance or surveillance.
What an utter failure of imagination.
I told her that I don’t believe in hope and I don’t believe in hopelessness; I believe in compassion and pragmatism. Hope can be lethal when you are fighting an autocracy. Hope is inextricable from time, and as anyone who has studied the entrenchment of dictators knows, the longer they stay in, the harder it is to get them out. Every day passed is damage done.
We were never going to be okay because America had never been okay. In January 2017, America emerged from an election that not only brought an unworthy leader but also exploited every pre-existing crisis in U.S. history: racism, income inequality, geographic inequality, misogyny, xenophobia, battles over surveillance and privacy, and so on.
The assurance that “the child will be the customer” underscores the belief – shared by many in and out of education reform and education technology – that education is simply a transaction: an individual’s decision-making in a “marketplace of ideas.” (There is no community, no public responsibility, no larger civic impulse for early childhood education here. It’s all about private schools offering private, individual benefits.)
This idea that “the child will be the customer” is, of course, also a nod to “personalized learning” as well, as is the invocation of a “Montessori-inspired” model. As the customer, the child will be tracked and analyzed, her preferences noted so as to make better recommendations to up-sell her on the most suitable products. And if nothing else, Montessori education in the United States is full of product recommendations.
The fact that an organization that should be leading the effective, thoughtful, responsible use of technology in education implemented such a fad at an event for educators is troubling. The ISTE Expo Halls were a frenzy of Apple, Google, Microsoft and others creating demand for their “learning opportunities” and giveaways with massive lines of early morning attendees hoping for tickets, invites, tokens. The whole time, throughout the Convention Centre, the Big Players deployed troops to frantically scan the QR codes of individuals waiting in line. So what exactly does this evidence tell us about personalized learning and how instructive will it be to ISTE’s sponsors when they receive this data? How will this data shape education? What does it tell us about learning, about institutional deprivation in the teaching profession? Is this about improving learning or the relentless drive of the ed tech industry?
At one expo stand we spoke with a thoughtful educator who asked if we were interested in the “monitor” function of the software on display. We asked what this did. “It allows you to monitor the activities of your students while they use the software. You can see if they are on-task.” We groaned. “Well, you are clearly not American,” came the reaction. Is the mindless use of personal data really going to result in such unfortunate generalisations? As we were leaving the booth the attempt to scan our badges failed. The blank spaces on our badges were noted gravely. Knowing glances were exchanged. We were part of The Others.
Everyone involved in education needs to take a stand against this kind of "personalized learning". Forgo the tee-shirt, the exclusive "hands-on" session invitation, offers to see the School of the Future, the stickbait badges, the free chargers.
Remember who schools are for. Before it’s too late.
When location trackers are marketed as 'smart badges' by trusted brands (like ISTE), when their operations are not explained, and when the technology is obfuscated, people become desensitized to practices they might otherwise object to.
Did the people in these pictures know they were socializing and learning in an environment where each of their movements was tracked to within a meter of accuracy? Did they understand how these data will be used, how they are secured, and with whom they will be shared? Do they each think the cost-benefit of sharing these location data is worth the yet-to-be-sent conference summary emails? Did the surveillance system actually allow ISTE to make real-time adjustments to popular sessions that were turning away participants?
Yet, despite the email and the physical signs at the registration desk, many people asserted to me that they never received notice of the use of the 'smart badge'. Of those who did, many had no idea how it worked. They thought it was just a QR code for vendors to scan. Many didn't understand that it was a battery-powered transmitter without an off-switch. Not a one was happy upon learning what I discovered.
Did ISTE offer enough information to participants so they could make an informed judgment about the value of wearing the badge vs. the potential risks? Should wearing the badges have been an opt-in vs. opt-out decision for participants? For educators trying to manage the privacy and security risks of edtech in their own classrooms, what lesson does this incident impart about best practices and informed consent?
As ISTE and its members collectively mull over these questions, I look forward to hearing the reactions to the after-conference reports that participants are slated to receive about their movements and presumed interests. Will folks feel it added value? I bet that for some segment of participants, receiving that email will trigger concerns they didn't even know enough to worry about in the first place. It will be the first time they realize their movements have been tracked.