Operant conditioning and the manipulation of response to stimuli are at the heart of the theories that support instructional design. But more than that, they form the foundation of almost all educational technology, from the VLE or LMS to algorithms for adaptive learning. Building upon behaviorism, Silicon Valley, often in collaboration with venture capitalists with a stake in the education market, has begun to realize Skinner’s teaching machines in today’s schools and universities.
And there’s the rub. When we went online to teach, we went online with almost no theory to support us besides instructional design. We went online first assuming that learning could be a calculated, brokered, duplicatable experience. For some reason, we took one look at the early internet and forgot all the nuance of teaching, all the strange chaos of learning, and surrendered to a philosophy of see, do, hit submit.
The problem we face is not just coded into the VLE, either. It’s not just coded into Facebook and Twitter and the way we send an e-mail or the machines we use to send text messages. It’s coded into us. We believe that online learning happens this way. We believe that discussions should be posted once and replied to twice. We believe that efficiency is a virtue, that automated proctors and plagiarism detection services are necessary, and more than necessary, helpful.
But these are not things that are true; they are things that are sold.
Yet somehow the algorithm had correctly identified this as the thing likeliest to make me click, then followed me across continents to ensure that I did.
It made me think of the old “Terminator” movies, except instead of a killer robot sent to find Sarah Connor, it’s a sophisticated set of programs ruthlessly pursuing our attention. And exploiting our most human frailties to do it.
Source: Social Media’s Re-engineering Effect, From Myanmar to Germany – The New York Times
demand algorithmic transparency in all software systems used by public entities, including schools.
Source: Machine Teaching, Machine Learning, and the History of the Future of Public Education
Anytime you hear someone say “personalization” or “AI” or “algorithmic,” I urge you to replace that phrase with “prediction.”
Source: Machine Teaching, Machine Learning, and the History of the Future of Public Education
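Watters’ substitution is easy to make concrete. Here is a minimal, hypothetical sketch (in Python; every name in it is invented for illustration, not taken from any real product) of what a “personalized” feed typically reduces to: a model’s predicted click probability and a sort.

```python
# A minimal, hypothetical sketch: "personalization" as click prediction.
# Every name here is invented for illustration.

from dataclasses import dataclass


@dataclass
class Item:
    item_id: str
    predicted_click_probability: float  # a model's prediction, nothing more


def personalize(items: list[Item], top_n: int = 10) -> list[Item]:
    """Return the items the model predicts you are likeliest to click.

    This is the whole trick: rank by predicted engagement, take the top few.
    """
    ranked = sorted(items, key=lambda i: i.predicted_click_probability, reverse=True)
    return ranked[:top_n]


if __name__ == "__main__":
    feed = [
        Item("long-division-practice", 0.03),
        Item("outrage-bait", 0.41),
        Item("cute-animal-video", 0.27),
    ]
    for item in personalize(feed, top_n=2):
        print(item.item_id, item.predicted_click_probability)
```

Nothing in this ranking knows what a learner needs; it only predicts what a user will click.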
The image of data-intensive startup pre-schools with young children receiving ‘recommended for you’ content as infant customers of ed-tech products is troubling. It suggests that from their earliest years children will become targets of intensive datafication and consumer-style profiling. As Michelle Willson argues in her article on algorithmic profiling and prediction of children, they ‘portend a future for these children as citizens and consumers already captured, modelled, managed by and normalised to embrace algorithmic manipulation’.
The tech elite now making a power-grab for public education probably has little to fear from FBI warnings about education technology. The FBI is primarily concerned with potentially malicious uses of sensitive student information by cybercriminals. There’s nothing criminal about creating Montessori-inspired preschool networks, using ClassDojo as a vehicle to build a liberal society, reimagining high school as personalized learning, or reshaping universities as AI-enhanced factories for producing labour market outcomes. Unless, that is, you consider all of this a kind of theft of public education for private commercial advantage and influence.
Source: The tech elite is making a power-grab for public education | code acts in education
Can the gig economy and the algorithm ever provide high-quality preschool? For all the flaws in the public school system, it’s important to remember: there is no accountability in billionaires’ educational philanthropy.
there will be much dying: even more so than during the worst conflicts of the 20th century. But rather than conventional wars (“nation vs nation”) it’ll be “us vs them”, where “us” and “them” will be defined by whichever dehumanized enemy your network filter bubble points you at.
You don’t need to build concentration camps with barbed wire fences and guards if you can turn your entire society into a machine-mediated panopticon with automated penalties for non-compliance.
Tomorrow’s genocides will be decentralized and algorithmically tweaked, quite possibly executed without human intervention.
Forget barbed wire, concentration camps, gas chambers and gallows, and Hugo Boss uniforms. That’s the 20th century pattern of centralized, industrialized genocide. In the 21st century deep-learning mediated AI era, we have the tools to inflict agile, decentralized genocide as a cloud service on our victims.
Trump has discovered that in times of insecurity, the spectacle of cruelty provides a shared common focus for his supporters.
What’s new is the speed and specificity with which the cruelty can be applied, and the ability to redirect it in a matter of hours—increasing the sense of insecurity, which in turn drives social conservatism and support for violent self-defense.
Take a step back and remind yourself how Facebook’s attempts to bring you news ended in fake news becoming the norm. Its algorithms created not friendship but hate bubbles. And now the same company will create an algorithm of love?
Source: Dating with Facebook: What’s love got to do with it? – Om on Tech
It’s important, she says, that the technology came out of a lengthy process of understanding what challenges existed in the system. “I think if you look at certain attempts to use technology within complex bureaucratic systems, you’ll very often have people writing a beautiful algorithm, but for a problem that’s the wrong problem,” Pahlka says. “What I’m proud of our team doing is the work to figure out where the real problem is.”
Source: This algorithm is quickly clearing old marijuana convictions in San Francisco
“To that end, I propose a Data Bill of Rights. It should have two components: The first would specify how much control we may exert over how our individual information is used for important decisions, and the second would introduce federally enforced rules on how algorithms should be monitored more generally.”
Source: Congress Is Missing the Point on Facebook – Bloomberg