It’s tempting straight away to see a whole range of educational platforms and apps as condensed forms of surveillance capitalism (though we might just as easily invoke ‘platform capitalism’). The classroom behaviour monitoring app ClassDojo, for example, is a paradigmatic Silicon Valley edtech success, with vast collections of student behavioural data that it monetizes by selling premium features for use at home and offering behaviour reports to subscribing parents. With its emphasis on positive behavioural reinforcement through reward points, it represents a marriage of Silicon Valley design with Skinner’s aspiration to create ‘technologies of behaviour’. ClassDojo amply illustrates the combination of behavioural data extraction, behaviourist psychology and monetization strategies that underpins surveillance capitalism as Zuboff presents it.
Zuboff then goes beyond human-machine confluences in the workplace to consider the instrumentation and orchestration of other types of human behaviour. Drawing parallels with the behaviourism of Skinner, she argues that digitally-enforced forms of ‘behavioral modification’ can operate ‘just beyond the threshold of human awareness to induce, reward, goad, punish, and reinforce behaviour consistent with “correct policies”’, where ‘corporate objectives define the “policies” toward which confluent behaviour harmoniously streams’ (413). Under conditions of surveillance capitalism, Skinner’s behaviourism and Pentland’s social physics spill out of the lab into homes, workplaces, and all the public and private spaces of everyday life, ultimately turning the world into a gigantic data science lab for social and behavioural experimentation, tuning and engineering.
For surveillance capitalists, human learning is inferior to machine learning and urgently needs to be improved by gathering humans and machines together into symbiotic systems of behavioural control and management.
With the advance of AI-based technologies into schools and universities, policy researchers may need to start interrogating the policies encoded in the software as well as the policies inscribed in government texts. These new programmable policies potentially have a much more direct influence on ‘correct’ behaviours and maximum outcomes by instrumenting and orchestrating activities, tasks and behaviours in educational institutions.
Source: Learning from surveillance capitalism | code acts in education
Surveillance capitalism now claims private human experience as free raw material for translation into behavioral predictions that are bought and sold in a new kind of private marketplace.
Source: ‘Surveillance capitalism’ has gone rogue. We must curb its excesses.
Surveillance capitalism is in our schools. It’s a feature of behaviorist ed-tech and “personalized learning”. Students are being commodified.
Writing long before Mark Zuckerberg was born, and anxiously gazing towards the computer-dominated future, the social critic Lewis Mumford tried to understand why people would willingly (even eagerly) embrace technologies with severe downsides. To Mumford there were two types of technologies: democratic ones (such as bicycles) that strengthened personal autonomy; and authoritarian ones (such as computers) that ultimately came to exert total power over their users. In seeking to explain why people, and a society, would opt for authoritarian technologies over democratic ones, Mumford argued that authoritarian technologies (which he also called megatechnics) operate as a wonderful bribe. The bribe was the way in which these technologies, in exchange for acquiescence, offered people a share of the impressive things they could produce. Writing in 1970, Mumford warned that accepting the bribe gradually led to the elimination of alternatives to it, and he noted that for those who accept the bribe, “their ‘real’ life will be confined within the frame of a television screen” (Mumford, 331) – though today we might just as easily say “within the frame of a computer or smartphone screen.” And as he glumly continued, “to enjoy total automation, a significant portion of the population is already willing to become automatons” (Mumford, 332). Granted, as Mumford also noted, it was not that everything offered by the bribe was rubbish; rather, “if one examines separately only the immediate products of megatechnics, these claims, these promises, are valid, and these achievements are genuine.” But what Mumford highlighted was that “all these goods remain valuable only if more important human concerns are not overlooked or eradicated” (Mumford, 333).
Facebook is an excellent example of this bribe at work.
Platforms like Facebook, Google, Amazon, Twitter, and the like are all the bribes that convince people not to war against computerized control by offering them a little share of the goodies. A turn of phrase that Mumford returned to repeatedly throughout his oeuvre is the difference between “the good life” and “the goods life” – and he argued that things such as the bribe were the tools by which people came to mistake “the goods life” for “the good life.”
Source: Facebook – to delete, or not to delete? | LibrarianShipwreck
“I’ve long tracked Facebook’s serial admission to having SIGINT visibility that nearly rivals the NSA: knowing that Facebook had intelligence corroborating NSA’s judgment that GRU was behind the DNC hack was one reason I was ultimately convinced of the IC’s claims, in spite of initial questions.”
Source: Yet More Proof Facebook’s Surveillance Capitalism Is Good at Surveilling — Even Russian Hackers – emptywheel