The first is that implementation is policy. Whatever gets decided at various times by leadership (in this case, first to separate families, then to reunite them), what happens in real life is often determined less by policy than by software. And until the government starts to think of technology as a dynamic service, imperfect but ever-evolving, not just a static tool you buy from a vendor, that won’t change.

The second lesson has to do with how Silicon Valley — made up of people who very much think of technology as something you do — should think about its role in fighting injustice.

This is one of the lessons you can’t escape if you work on government tech. When government is impaired, who gets hurt? More often than not, the most vulnerable people.

In order to properly administer a social safety net, a just criminal justice system, and hundreds of other functions that constitute a functioning democracy, we must build government’s technology capabilities. In doing that, we run the risk of also increasing government’s capacity to do harm.

Which is why Silicon Valley can’t limit its leverage over government to software. Software doesn’t have values. People do. If the people who build and finance software (in Silicon Valley and elsewhere) really want government that aligns with their values, they must offer government not just their software, but their time, their skills, and part of their careers. The best way to reclaim government is to become part of it.

Source: Border family separation and how computer software can make policy – Vox

Because endless growth and data collection is the foundation of their business, and that necessitates doing gross invasive things to their users.

They need you to feed the beast, and they certainly don’t want you to think about it. So they use cartoon animals and sneaky happy paths to make sure you stay blissfully ignorant.

Using software is inherently a handshake agreement between you and the service provider. It’s not unlike paying for a physical service.

The problem is, many of the dominant software makers are abusing your handshake in increasingly dastardly ways. They treat their customers like sitting ducks — just a bunch of dumb animals waiting to be harvested. And when growth slows, they resort to deceptive and creepy tactics to keep the trend lines pointing skyward.

Source: How You’re Being Manipulated By Software – Signal v. Noise

The image of data-intensive startup pre-schools with young children receiving ‘recommended for you’ content as infant customers of ed-tech products is troubling. It suggests that from their earliest years children will become targets of intensive datafication and consumer-style profiling. As Michelle Willson argues in her article on algorithmic profiling and prediction of children, they ‘portend a future for these children as citizens and consumers already captured, modelled, managed by and normalised to embrace algorithmic manipulation’.

The tech elite now making a power-grab for public education probably has little to fear from FBI warnings about education technology. The FBI is primarily concerned with potentially malicious uses of sensitive student information by cybercriminals. There’s nothing criminal about creating Montessori-inspired preschool networks, using ClassDojo as a vehicle to build a liberal society, reimagining high school as personalized learning, or reshaping universities as AI-enhanced factories for producing labour market outcomes, unless you consider all of this a kind of theft of public education for private commercial advantage and influence.

Source: The tech elite is making a power-grab for public education | code acts in education

The fact that an organization that should be leading the effective, thoughtful, responsible use of technology in education implemented such a fad at an event for educators is troubling. The ISTE Expo Halls were a frenzy of Apple, Google, Microsoft, and others creating demand for their “learning opportunities” and giveaways, with massive lines of early-morning attendees hoping for tickets, invites, and tokens. The whole time, throughout the Convention Centre, the Big Players deployed troops to frantically scan the QR codes of individuals waiting in line. So what exactly does this evidence tell us about personalized learning, and how instructive will it be to ISTE’s sponsors when they receive this data? How will this data shape education? What does it tell us about learning, and about institutional deprivation in the teaching profession? Is this about improving learning, or about the relentless drive of the ed tech industry?

At one expo stand we spoke with a thoughtful educator who asked if we were interested in the “monitor” function of the software on display. We asked what this did. “It allows you to monitor the activities of your students while they use the software. You can see if they are on-task.” We groaned. “Well, you are clearly not American,” came the reaction. Is the mindless use of personal data really going to result in such unfortunate generalisations? As we were leaving the booth the attempt to scan our badges failed. The blank spaces on our badges were noted gravely. Knowing glances were exchanged. We were part of The Others.

Everyone involved in education needs to take a stand against this kind of “personalized learning”. Forgo the tee-shirt, the exclusive “hands-on” session invitation, the offers to see the School of the Future, the stickbait badges, and the free chargers.

Remember who schools are for. Before it’s too late.

Source: ISTE, Digital Tracking, and the Myth of Personalized Learning – maelstrom