It’s tempting straight away to see a whole range of educational platforms and apps as condensed forms of surveillance capitalism (though we might just as easily invoke ‘platform capitalism’). The classroom behaviour monitoring app ClassDojo, for example, is a paradigmatic example of a successful Silicon Valley edtech business, with vast collections of student behavioural data that it is monetizing by selling premium features for use at home and offering behaviour reports to subscribing parents. With its emphasis on positive behavioural reinforcement through reward points, it represents a marriage of Silicon Valley design with Skinner’s aspiration to create ‘technologies of behaviour’. ClassDojo amply illustrates the combination of behavioural data extraction, behaviourist psychology and monetization strategies that underpin surveillance capitalism as Zuboff presents it.

Zuboff then goes beyond human-machine confluences in the workplace to consider the instrumentation and orchestration of other types of human behaviour. Drawing parallels with the behaviourism of Skinner, she argues that digitally-enforced forms of ‘behavioral modification’ can operate ‘just beyond the threshold of human awareness to induce, reward, goad, punish, and reinforce behaviour consistent with “correct policies”’, where ‘corporate objectives define the “policies” toward which confluent behaviour harmoniously streams’ (413). Under conditions of surveillance capitalism, Skinner’s behaviourism and Pentland’s social physics spill out of the lab into homes, workplaces, and all the public and private space of everyday life – ultimately turning the world into a gigantic data science lab for social and behavioural experimentation, tuning and engineering.

For surveillance capitalists human learning is inferior to machine learning, and urgently needs to be improved by gathering together humans and machines into symbiotic systems of behavioural control and management.

With the advance of AI-based technologies into schools and universities, policy researchers may need to start interrogating the policies encoded in the software as well as the policies inscribed in government texts. These new programmable policies potentially have a much more direct influence on ‘correct’ behaviours and maximum outcomes by instrumenting and orchestrating activities, tasks and behaviours in educational institutions.

Source: Learning from surveillance capitalism | code acts in education

Ed-tech relies on amnesia.

Ed-tech is a confidence game. That’s why it’s so full of marketers and grifters and thugs. (The same goes for “tech” at large.)

Source: HEWN, No. 297

Despite scant evidence in support of the psychopedagogies of mindsets, mindfulness, wellness, and grit, the ed-tech industry (press) markets these as solutions to racial and gender inequality (among other things), as the psychotechnologies of personalization are now increasingly intertwined not just with surveillance and with behavioral data analytics, but with genomics as well. “Why Progressives Should Embrace the Genetics of Education,” a NYT op-ed piece argued in July, perhaps forgetting that education’s progressives (including Montessori) have been down this path before.

“Does It Make More Sense to Invest in School Security or SEL?” EdSurge asked its readers this summer. Those are the choices – surveillance or surveillance.

What an utter failure of imagination.

Source: The Stories We Were Told about Education Technology (2018)

Rewards ARE COERCIVE. They ARE MANIPULATIVE. They ARE CONTROLLING.

Giving contingent rewards is not a compassionate, kind, or loving action. Kids understand this fact, and they fight against it. So when a child accuses me of manipulating them, they are right.

I have been through my approach before. I will refer you here and here to learn what I recommend regarding PBIS and here for behavior management in general. Overall, what I wholeheartedly believe is that we need to stop using external motivation as a way of getting kids to engage. We are depriving them of learning for themselves how to act and behave because they want to be good and because they like how they feel when they do the right thing. We are teaching them that, at least in school, their primary motivation to complete work should be to receive a reward from teachers and other adults. We teach them to distrust their intrinsic motivations and desires. We are robbing them of the ability to develop their socioemotional sense of self on their terms.

Source: External Incentives DECREASE Intrinsic Motivation: Implications for Classroom Management – Why Haven’t They Done That Yet?

Dr. Hunsaker is a behavioral neuroscientist and special education teacher. I’ve shared his work before.

But the goal of disinformation isn’t really around these individual transactions. The goal of disinformation is to, over time, change our psychological set-points. To the researcher looking at individuals at specific points in time, the homeostasis looks protective – fire up Mechanical Turk, see what people believe, give them information or disinformation, see what changes. What you’ll find is nothing changes – set-points are remarkably resilient.

But underneath that, from year to year, is drift. And it’s the drift that matters.

Source: The Homeostatic Fallacy and Misinformation Literacy | Hapgood

Most of the reading I’m doing right now in my final weeks of research I’d describe as “contextual” – that is, I’m reading the bestsellers and articles that reflect ideas influencing and influenced by and adjacent to teaching machines and behaviorism in the 1950s and 1960s. Needless to say, I’ve been reading a lot about cybernetics – something that totally colored how I thought about the article Mike Caulfield published this week on “The Homeostatic Fallacy and Misinformation Literacy.” Homeostasis is a cornerstone of cybernetic (and information) theory. And yet here we are, thanks to data-driven “feedback,” all out of whack.

I think there’s something wrapped up in all this marketing and mythology that might explain in part why the tech industry (and, good grief, the ed-tech industry) is so incredibly and dangerously dull. You can’t build thinking machines (or teaching machines for that matter) if you’re obsessed with data but have no ideas.

Source: HEWN, No. 296

I updated “Persuasion and Operant Conditioning: The Influence of B. F. Skinner in Big Tech and Ed-tech” with selections from “A Call for Critical Instructional Design”.

Operant conditioning and the manipulation of response to stimuli are at the heart of theories that support instructional design. But more, they form the foundation of almost all educational technology – from the VLE or LMS to algorithms for adaptive learning. Building upon behaviorism, Silicon Valley – often in collaboration with venture capitalists with a stake in the education market – has begun to realize Skinner’s teaching machines in today’s schools and universities.

And there’s the rub. When we went online to teach, we went online almost entirely without any other theories to support us besides instructional design. We went online first assuming that learning could be a calculated, brokered, duplicatable experience. For some reason, we took one look at the early internet and forgot about all the nuance of teaching, all the strange chaos of learning, and surrendered to a philosophy of see, do, hit submit.

The problem we face is not just coded into the VLE, either. It’s not just coded into Facebook and Twitter and the way we send an e-mail or the machines we use to send text messages. It’s coded into us. We believe that online learning happens this way. We believe that discussions should be posted once and replied to twice. We believe that efficiency is a virtue, that automated proctors and plagiarism detection services are necessary – and more than necessary, helpful.

But these are not things that are true, they are things that are sold.

Source: A Call for Critical Instructional Design
