Surveillance capitalism now claims private human experience as free raw material for translation into behavioral predictions that are bought and sold in a new kind of private marketplace.
Ed-tech relies on amnesia.
Ed-tech is a confidence game. That’s why it’s so full of marketers and grifters and thugs. (The same goes for “tech” at large.)
Source: HEWN, No. 297
Despite scant evidence in support of the psychopedagogies of mindsets, mindfulness, wellness, and grit, the ed-tech industry (press) markets these as solutions to racial and gender inequality (among other things), as the psychotechnologies of personalization are now increasingly intertwined not just with surveillance and with behavioral data analytics, but with genomics as well. “Why Progressives Should Embrace the Genetics of Education,” a NYT op-ed piece argued in July, perhaps forgetting that education’s progressives (including Montessori) have been down this path before.
“Does It Make More Sense to Invest in School Security or SEL?” Edsurge asked its readers this summer. Those are the choices – surveillance or surveillance.
What an utter failure of imagination.
We are confronted by the complicated/complex division every day in education. Do I want to know if a medical student has remembered the nine steps of a process of inquiry to work with a patient, or do I want to know if they built a good rapport? How often do we choose the thing that is easier to measure… simply because we can verify that our grading is ‘fair’? How often do we get caught in conversations around how ‘rigorous’ an assessment is when what we really mean is ‘how easy is it to defend to a parent who’s going to complain about a child’s grade’?
Learning, like love, can’t have a lean six sigma chart designed for it. Once we’ve identified something in our education space as complex (as opposed to complicated) a new set of tools has to emerge. We have to have deep conversations about what our goals are. We need to talk about what our values are and how they translate to our lives. And then we need to engage with our system in a broad based, patient way that allows us to make change. As Snowden would put it, Probe, Sense and Respond. Try some things, see how they work, iterate and try again. You’re never going to get to best practice, because the situation is always changing.
We need to understand that our protectionist strategies (limiting screen time, web blocking apps) just push the dangerous and mean activities our children encounter on the internet further underground. We need training, we need dialogue, we need courage… but most of all we all need to get together and decide that our goal is to try and make the internet a better place… rather than trying to hide from it. No LSS approach is going to do that. Only human approaches… only messy results.
Once we’ve identified something in our education space as complex (as opposed to complicated) a new set of tools has to emerge.
Human approaches, tools for complexity:
- Design is Tested at the Edges: Intersectionality, The Social Model of Disability, and Design for Real Life
- Equity Literate Education: Fix Injustice, Not Kids
- The right to learn differently should be a universal human right that’s not mediated by a diagnosis.
- Compassion is not coddling. Design for real life.
- Education, Neurodiversity, the Social Model of Disability, and Real Life
We are confronted by the complicated/complex division every day in education.
Start “foregrounding complexity as the baseline”. Behaviorism and mindset marketing are not human approaches because they bikeshed human complexity. They are convenient detours, not direct confrontations.
There is no path toward educational justice that contains convenient detours around direct confrontations with injustice. The desperate search for these detours, often in the form of models or frameworks or concepts that were not developed as paths to justice, is the greatest evidence of the collective desire among those who count on injustice to give them an advantage to retain that advantage. If a direct confrontation of injustice is missing from our strategies or initiatives or movements, that means we are recreating the conditions we’re pretending to want to destroy.
How often do we choose the thing that is easier to measure… simply because we can verify that our grading is ‘fair’? How often do we get caught in conversations around how ‘rigorous’ an assessment is when what we really mean is ‘how easy is it to defend to a parent who’s going to complain about a child’s grade’?
Picking the easy-to-measure things and building pedagogy and culture around data and behaviorism disguises the ways they kill us.
The irony of turning schools into therapeutic institutions when they generate so much stress and anxiety seems lost on policy-makers who express concern about children’s mental health.
Here’s a rule of thumb for you: An individual’s enthusiasm about the employment of “data” in education is directly proportional to his or her distance from actual students. Policy makers and economists commonly refer to children in the aggregate, apparently viewing them mostly as a source of numbers to be crunched. They do this even more than consultants and superintendents, who do it more than principals, who do it more than teachers. The best teachers, in fact, tend to recoil from earnest talk about the benefits of “data-driven instruction,” the use of “data coaches,” “data walls,” and the like.
Making matters worse, the data in question typically are just standardized test scores – even though, as I’ve explained elsewhere, that’s not the only reason to be disturbed by this datamongering. And it doesn’t help when the process of quantifying kids (and learning) is festooned with adjectives such as “personalized” or “customized.”
But here’s today’s question: If collecting and sorting through data about students makes us uneasy, how should we feel about the growing role of Big Data?
Part of the problem is that we end up ignoring or minimizing the significance of whatever doesn’t lend itself to data analytics. It’s rather like the old joke about the guy searching for his lost keys at night near a street light even though that’s not where he’d dropped them. (“But the light is so much better here!”) No wonder education research – increasingly undertaken by economists – increasingly relies on huge data sets consisting of standardized test results. Those scores may be lousy representations of learning – and, indeed, egregiously misleading. But, by gum, they sure are readily available.
“What’s left out?”, then, is one critical question to ask. Another is: “Who benefits from it?” Noam Scheiber, a reporter who covers workplace issues, recently observed that big data is “massively increasing the power asymmetry between exploiters and exploitees.” (For more on this, check out Cathy O’Neil’s book Weapons of Math Destruction).
Anyone who has observed the enthusiasm for training students to show more “grit” or develop a “growth mindset” should know what it means to focus on fixing the kid so he or she can better adapt to the system rather than asking inconvenient questions about the system itself. Big data basically gives us more information, based on grades, about which kids need fixing (and how and when), making it even less likely that anyone would think to challenge the destructive effects of – and explore alternatives to – the practice of grading students.
Predictive analytics allows administrators to believe they’re keeping a watchful eye on their charges when in fact they’re learning nothing about each student’s experience of college, his or her needs, fears, hopes, beliefs, and state of mind. Creating a “personalized” data set underscores just how _im_personal the interaction with students is, and it may even compound that problem. At the same time that this approach reduces human beings to a pile of academic performance data, it also discourages critical thought about how the system, including teaching and evaluation, affects those human beings.
Our public school policymakers want us to do the latter. In fact, they have a whole pedagogical justification for ignoring the needs of children.
And it goes something like this:
That child isn’t learning? If she just worked harder, she would.
It’s the political equivalent of “pull yourself up by your own bootstraps” applied to the classroom.
And it’s super helpful for politicians reluctant to allocate tax dollars to actually help kids succeed.
But what no one wants to admit is that grit is… well… shit.
It’s just an excuse for a society that refuses to help those most in need.
Yet when anyone suggests offering help to even the playing field – to make things more fair – a plethora of policy wonks wag their fingers and say, “No way! They did it to themselves.”
It’s typical “blame the victim” pathology to say that some kids get all the love, time and resources they need while others can do without – they just need more “grit” and a “growth mindset.”
As with the corporate flavor, ed-tech mindfulness, like other mindset marketing, disguises the ways it kills us.
Source: Mindfulness in Education – rnbn
And so we can firmly put the insistence on data-driven instruction in the trash bin of bad ideas.
It is unscientific, unproven, harmful, reductive, dehumanizing and contradictory.
The next time you hear an administrator or principal pull out this chestnut, take out one of these counterarguments and roast it on an open fire.
No more data-driven instruction.
Focus instead on student-driven learning.
I think there’s a lot to say about machine learning and the push for “personalization” in education. And the historian in me cannot help but add that folks have been trying to “personalize” education using machines for about a century now. The folks building these machines have, for a very long time, believed that collecting the student data generated while using the machines will help them improve their “programmed instruction” – this decades before Mark Zuckerberg was born.
I think we can talk about the labor issues – how this continues to shift expertise and decision making in the classroom, for starters, but also how students’ data and students’ work is being utilized for commercial purposes. I think we can talk about privacy and security issues – how sloppily we know that these companies, and unfortunately our schools as well, handle student and teacher information.
Anytime you hear someone say “personalization” or “AI” or “algorithmic,” I urge you to replace that phrase with “prediction.”
Sadly though, the social, political, and economic narrative of schooling in the past has been grounded in a “soft eugenics” belief that while some children have the capacity to become whatever they choose to be in life, others do not. This plays out in the decisions that educators make, often based on decontextualized data and confirmation biases that stem from immersion in traditions of education that did the same to us. Even if lip service is given to words such as equity, accessibility, inclusivity, empathy, cultural responsiveness, and connected relationships, schooling today is still far more likely to support practices from the past that have created school cultures in which none of those words define who educators really are, no matter what they aspire to be.
Consider how the “habitable world” concept developed by Rosemarie Garland‐Thomson, Emory University researcher and professor, sits at the core of the philosophy of educators who developed and now sustain the structures and processes of schooling that impact young people such as Kolion (Garland‐Thomson 2017b). Garland‐Thomson views public, political, and organizational philosophy as representative of one of “two forms of world‐building, inclusive and eugenic” (Garland‐Thomson 2017a). Unfortunately, it’s the soft educational eugenics philosophy that is most often expressed in practice, if not in words, across the nation’s schools, rather than the creation of habitable worlds that are inclusive of all learners.
If we want our schools to be learning spaces that reveal the strengths of children to us, we have to create a bandwidth of opportunities that do so. That means making decisions differently, decisions driven from values that support equity, accessibility, inclusivity, empathy, cultural responsiveness, and connected relationships inside the ecosystem. Those are the words representative of habitable worlds, not words such as sort, select, remediate, suspend, or fail.
The solution, the way decisions are best made, lies in empowering teachers and students to make choices. Any systemic or institutional decision made for “all kids” or “most kids” or based on quantitative research will – guaranteed – be the wrong decision. Any decision based in “miracle narratives” will be at least as bad. We are not discussing “the average child” or “the average dyslexic” (neither of which exists), nor are we going to base policy on the exceptional case. Instead, we will “solve this” by making individual decisions with individual students. (Socol 2008)
The assurance that “the child will be the customer” underscores the belief – shared by many in and out of education reform and education technology – that education is simply a transaction: an individual’s decision-making in a “marketplace of ideas.” (There is no community, no public responsibility, no larger civic impulse for early childhood education here. It’s all about private schools offering private, individual benefits.)
This idea that “the child will be the customer” is, of course, also a nod to “personalized learning” as well, as is the invocation of a “Montessori-inspired” model. As the customer, the child will be tracked and analyzed, her preferences noted so as to make better recommendations to up-sell her on the most suitable products. And if nothing else, Montessori education in the United States is full of product recommendations.
The image of data-intensive startup pre-schools with young children receiving ‘recommended for you’ content as infant customers of ed-tech products is troubling. It suggests that from their earliest years children will become targets of intensive datafication and consumer-style profiling. As Michelle Willson argues in her article on algorithmic profiling and prediction of children, they ‘portend a future for these children as citizens and consumers already captured, modelled, managed by and normalised to embrace algorithmic manipulation’.
The tech elite now making a power-grab for public education probably has little to fear from FBI warnings about education technology. The FBI is primarily concerned with potentially malicious uses of sensitive student information by cybercriminals. There’s nothing criminal about creating Montessori-inspired preschool networks, using ClassDojo as a vehicle to build a liberal society, reimagining high school as personalized learning, or reshaping universities as AI-enhanced factories for producing labour market outcomes – unless you consider all of this a kind of theft of public education for private commercial advantage and influence.
The fact that an organization that should be leading the effective, thoughtful, responsible use of technology in education implemented such a fad at an event for educators is troubling. The ISTE Expo Halls were a frenzy of Apple, Google, Microsoft and others creating demand for their “learning opportunities” and giveaways with massive lines of early morning attendees hoping for tickets, invites, tokens. The whole time, throughout the Convention Centre, the Big Players deployed troops to frantically scan the QR codes of individuals waiting in line. So what exactly does this evidence tell us about personalized learning and how instructive will it be to ISTE’s sponsors when they receive this data? How will this data shape education? What does it tell us about learning, about institutional deprivation in the teaching profession? Is this about improving learning or the relentless drive of the ed tech industry?
At one expo stand we spoke with a thoughtful educator who asked if we were interested in the “monitor” function of the software on display. We asked what this did. “It allows you to monitor the activities of your students while they use the software. You can see if they are on-task.” We groaned. “Well, you are clearly not American,” came the reaction. Is the mindless use of personal data really going to result in such unfortunate generalisations? As we were leaving the booth the attempt to scan our badges failed. The blank spaces on our badges were noted gravely. Knowing glances were exchanged. We were part of The Others.
Everyone involved in education needs to take a stand against this kind of “personalized learning”. Forgo the tee-shirt, the exclusive “hands-on” session invitation, offers to see the School of the Future, the clickbait badges, the free chargers.
Remember who schools are for. Before it’s too late.