Nevertheless, those who work in and work with education technology need to confront and resist this architecture – the “surveillance dataism,” to borrow Morozov’s phrase – even if (especially if) the outcomes promised are purportedly “for the good of the student.”

Source: Audrey Watters — Education Technology and The Age of Surveillance Capitalism (A Review of Shoshana Zuboff, The Age of Surveillance Capitalism) | boundary 2

Personalized learning – the kind hyped these days by Mark Zuckerberg and many others in Silicon Valley – is just the latest version of Skinner’s behavioral technology. Personalized learning relies on data extraction and analysis; it urges and rewards students and promises everyone will reach “mastery.” It gives the illusion of freedom and autonomy perhaps – at least in its name; but personalized learning is fundamentally about conditioning and control.

Source: Audrey Watters — Education Technology and The Age of Surveillance Capitalism (A Review of Shoshana Zuboff, The Age of Surveillance Capitalism) | boundary 2

A pedagogy controlled by algorithms can never be a pedagogy of care, integrity, or trust.

Teachers: stop uncritically adopting and promoting Google products, for crying out loud. It doesn’t make you innovative or progressive. It makes you a shill for surveillance capitalism. You’re not preparing your students for a better future simply by using the latest shiny tech. You’re aiding a company — indeed a system — that’s stealing their future.

Knowledge production has a new police force: digital technology.

Source: HEWN, No. 317

I maintain that behaviorism never really went away and, despite all the talk otherwise, it remains central to computing — particularly educational computing. And as Shoshana Zuboff argues, of course, behaviorism remains central to surveillance capitalism.

Source: HEWN, No. 314

Engineering is a social production, not merely a scientific or technological one. And educational engineering is not just a profession; it is an explicitly commercial endeavor. For engineers, as historian David Noble has pointed out, are not only “the foremost agents of modern technology,” but also “the agents of corporate capital.” “Learning engineers,” largely untethered from history and purposefully severed from the kind of commitment to democratic practices urged by Dewey, are now poised to be the agents of surveillance capital.

Source: HEWN, No. 312

it’s tempting straight away to see a whole range of educational platforms and apps as condensed forms of surveillance capitalism (though we might just as easily invoke ‘platform capitalism’). The classroom behaviour monitoring app ClassDojo is a paradigmatic example of a successful Silicon Valley edtech business, with vast collections of student behavioural data that it monetizes by selling premium features for use at home and offering behaviour reports to subscribing parents. With its emphasis on positive behavioural reinforcement through reward points, it represents a marriage of Silicon Valley design with Skinner’s aspiration to create ‘technologies of behaviour’. ClassDojo amply illustrates the combination of behavioural data extraction, behaviourist psychology and monetization strategies that underpin surveillance capitalism as Zuboff presents it.
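
To make the mechanism described above concrete, here is a minimal, purely illustrative Python sketch of a ClassDojo-style reward loop. Every class name, field, and the premium gate below are invented for illustration; this is not ClassDojo’s actual code or data model. The point is the dual character of a single teacher tap: it is at once a reinforcement signal to the student and a stored, monetizable behavioural data point.

```python
# Illustrative sketch only: a toy ClassDojo-style reward loop.
# All names and the premium gate are invented; this is not
# ClassDojo's actual data model.
from dataclasses import dataclass, field
from datetime import datetime


@dataclass
class BehaviorEvent:
    student_id: str
    behavior: str      # e.g. "on task", "disruptive"
    points: int        # Skinner-style positive reinforcement
    timestamp: datetime = field(default_factory=datetime.now)


class ClassroomLedger:
    def __init__(self) -> None:
        self.events: list[BehaviorEvent] = []  # the behavioural data trail

    def award(self, student_id: str, behavior: str, points: int) -> None:
        # One tap by the teacher both "rewards" the student and
        # extends the data asset being monetized.
        self.events.append(BehaviorEvent(student_id, behavior, points))

    def report(self, student_id: str, premium: bool) -> dict:
        # The monetization step: the data is free to collect, but the
        # derived behaviour report is gated behind a subscription.
        if not premium:
            raise PermissionError("Behaviour reports require a subscription")
        events = [e for e in self.events if e.student_id == student_id]
        return {
            "student": student_id,
            "total_points": sum(e.points for e in events),
            "events": len(events),
        }
```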

Zuboff then goes beyond human-machine confluences in the workplace to consider the instrumentation and orchestration of other types of human behaviour. Drawing parallels with the behaviourism of Skinner, she argues that digitally-enforced forms of ‘behavioral modification’ can operate ‘just beyond the threshold of human awareness to induce, reward, goad, punish, and reinforce behaviour consistent with “correct policies”’, where ‘corporate objectives define the “policies” toward which confluent behaviour harmoniously streams’ (413). Under conditions of surveillance capitalism, Skinner’s behaviourism and Pentland’s social physics spill out of the lab into homes, workplaces, and all the public and private space of everyday life – ultimately turning the world into a gigantic data science lab for social and behavioural experimentation, tuning and engineering.

For surveillance capitalists, human learning is inferior to machine learning, and urgently needs to be improved by gathering together humans and machines into symbiotic systems of behavioural control and management.

With the advance of AI-based technologies into schools and universities, policy researchers may need to start interrogating the policies encoded in the software as well as the policies inscribed in government texts. These new programmable policies potentially have a much more direct influence on ‘correct’ behaviours and maximum outcomes by instrumenting and orchestrating activities, tasks and behaviours in educational institutions.

Source: Learning from surveillance capitalism | code acts in education
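
As a thought experiment on Williamson’s closing point about interrogating ‘the policies encoded in the software’, here is a hedged Python sketch of how a behaviour policy stops being a paragraph in a government text and becomes an executable rule that intervenes automatically. The rule names, thresholds, and interventions are all invented; a real platform’s policies would be far more opaque. The point is only that the ‘policy’ here is code a researcher could, in principle, read and audit.

```python
# Hypothetical sketch: a "programmable policy" as executable code rather
# than a written document. Rule names, thresholds, and interventions
# are invented for illustration.
from dataclasses import dataclass
from typing import Callable


@dataclass
class Policy:
    name: str
    applies: Callable[[dict], bool]  # condition over a student's metrics
    intervention: str                # what the platform does automatically


# The 'correct' behaviour is defined by a threshold in code,
# not debated in a policy document.
engagement_policy = Policy(
    name="minimum-engagement",
    applies=lambda metrics: metrics["minutes_on_platform"] < 30,
    intervention="send reminder notification; flag on teacher dashboard",
)


def enforce(policies: list[Policy], student_metrics: dict) -> list[str]:
    """Return the interventions automatically triggered for one student."""
    return [p.intervention for p in policies if p.applies(student_metrics)]


# Auditing these rules is what 'interrogating the policies encoded in
# the software' might look like in practice:
print(enforce([engagement_policy], {"minutes_on_platform": 12}))
# -> ['send reminder notification; flag on teacher dashboard']
```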

Writing long before Mark Zuckerberg was born, and anxiously gazing towards the computer-dominated future, the social critic Lewis Mumford tried to understand why people would willingly (even eagerly) embrace technologies with severe downsides. To Mumford there were two types of technologies: democratic ones (such as bicycles) that strengthened personal autonomy, and authoritarian ones (such as computers) that ultimately came to exert total power over their users. In seeking to explain why people, and a society, would opt for authoritarian technologies over democratic ones, Mumford argued that authoritarian technologies (which he also called megatechnics) operate as a wonderful bribe: in exchange for acquiescence, they offer people a share of the impressive things they can produce. Writing in 1970, Mumford warned that accepting the bribe gradually led to the elimination of alternatives to it, and he noted that for those who accept the bribe, “their ‘real’ life will be confined within the frame of a television screen” (Mumford, 331) – though today we might just as easily say “within the frame of a computer or smartphone screen.” And as he glumly continued, “to enjoy total automation, a significant portion of the population is already willing to become automatons” (Mumford, 332). Granted, as Mumford also noted, it was not that everything offered by the bribe was rubbish; rather, “if one examines separately only the immediate products of megatechnics, these claims, these promises, are valid, and these achievements are genuine.” What Mumford highlighted was that “all these goods remain valuable only if more important human concerns are not overlooked or eradicated” (Mumford, 333).

Facebook is an excellent example of this bribe at work.

Platforms like Facebook, Google, Amazon, and Twitter are all bribes that convince people not to war against computerized control by offering them a little share of the goodies. A distinction Mumford returned to repeatedly throughout his oeuvre is that between “the good life” and “the goods life” – and he argued that things such as the bribe were the tools by which people came to mistake “the goods life” for “the good life.”

Source: Facebook – to delete, or not to delete? | LibrarianShipwreck

“I’ve long tracked Facebook’s serial admission to having SIGINT visibility that nearly rivals the NSA: knowing that Facebook had intelligence corroborating NSA’s judgment that GRU was behind the DNC hack was one reason I was ultimately convinced of the IC’s claims, in spite of initial questions.”

Source: Yet More Proof Facebook’s Surveillance Capitalism Is Good at Surveilling — Even Russian Hackers – emptywheel