During our research, we also found ourselves reflecting on the unique position of the school as an institution tasked not only with educating its students but also with managing their personal data. Couldn’t one then argue that, since the school is a microcosm of the wider society, the school’s own data protection regime could be explained to children as a deliberate pedagogical strategy? Rather than something quietly managed by the GDPR compliance officer and conveyed as a matter of administrative necessity to parents, the school’s approach to data protection could be explained to students so they could learn about the management of data that is important to them (their grades, attendance, special needs, mental health, biometrics).
I think there’s a lot to say about machine learning and the push for “personalization” in education. And the historian in me cannot help but add that folks have been trying to “personalize” education using machines for about a century now. The folks building these machines have, for a very long time, believed that collecting the student data generated while using the machines will help them improve their “programmed instruction” – and this decades before Mark Zuckerberg was born.
I think we can talk about the labor issues – how this continues to shift expertise and decision-making in the classroom, for starters, but also how students’ data and students’ work are being utilized for commercial purposes. I think we can talk about privacy and security issues – how sloppily we know these companies, and unfortunately our schools as well, handle student and teacher information.
I told her that I don’t believe in hope and I don’t believe in hopelessness; I believe in compassion and pragmatism. Hope can be lethal when you are fighting an autocracy. Hope is inextricable from time, and as anyone who has studied the entrenchment of dictators knows, the longer they stay in, the harder it is to get them out. Every day passed is damage done.
We were never going to be okay because America had never been okay. In January 2017, America emerged from an election that not only elevated an unworthy leader but exploited every pre-existing crisis in U.S. history: racism, income inequality, geographic inequality, misogyny, xenophobia, battles over surveillance and privacy, and so on.
We emphasize budgeting and password management in our home curriculum.
Give every dollar a job, and never reuse or share passwords.
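The “never reuse” half of that rule is easiest to keep when passwords are generated rather than invented. A minimal sketch using Python’s standard-library `secrets` module (the function name and default length are my choices, not from any particular curriculum – in practice a password manager does this for you):

```python
import secrets
import string

def make_password(length: int = 16) -> str:
    """Generate a random password from letters, digits, and punctuation."""
    alphabet = string.ascii_letters + string.digits + string.punctuation
    # secrets.choice draws from the OS's cryptographic randomness source,
    # unlike random.choice, which is predictable.
    return "".join(secrets.choice(alphabet) for _ in range(length))

# Generate a fresh password for each account -- never reuse one.
print(make_password())
```

Because each call draws fresh randomness, every account gets a unique credential, which is the point of the rule.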
When location trackers are marketed as ‘smart badges’ by trusted brands (like ISTE), when their operations are not explained, and when the technology is obfuscated, people become desensitized to practices they might otherwise object to.
Did the people in these pictures know they were socializing and learning in an environment where each of their movements was tracked to within a meter of accuracy? Did they understand how these data would be used, how they would be secured, and with whom they would be shared? Did they each think the cost-benefit of sharing these location data was worth the yet-to-be-sent conference summary emails? Did the surveillance system actually allow ISTE to make real-time adjustments to popular sessions that were turning away participants?
Yet, despite the email and the physical signs at the registration desk, many people asserted to me that they never received notice of the use of the ‘smart badge’. Of those who did, many had no idea how it worked. They thought it was just a QR code for vendors to scan. Many didn’t understand that it was a battery-powered transmitter without an off switch. Not a one was happy upon learning what I discovered.
Did ISTE offer enough information to participants so they could make an informed judgment about the value of wearing the badge vs. the potential risks? Should wearing the badges have been an opt-in vs. opt-out decision for participants? For educators trying to manage the privacy and security risks of edtech in their own classrooms, what lesson does this incident impart about best practices and informed consent?
As ISTE and its members collectively mull over these questions, I look forward to hearing reactions to the after-conference reports that participants are slated to receive about their movements and presumed interests. Will folks feel it added value? I bet that for some segment of participants, receiving that email will trigger concerns they didn’t even know enough to worry about in the first place. It will be the first time they realize their movements have been tracked.
My greatest objection to being tagged like livestock was that it would be only a matter of time before some bonehead referred to the fantabulous “Smart Badges” as educational technology. When I mentioned this to my friend Chris Lehmann, he told me that someone already had.
Q: Why is ISTE using smart badges?
A: ISTE recognizes the value of personalized learning and wants to do all we can to create custom and individualized educational experiences for each of our attendees. Smart badges will allow us to provide you with your own “ISTE 2018 Journey” post-conference. The journey will detail the sessions you attended and the resources you collected. It’s like taking notes with your feet! Additionally, this data will allow the ISTE team to further personalize the conference experience now and in the future. This aggregate data, combined with registration information, will provide more comprehensive insights into attendee patterns and activities.
Therein lies the problem. Tracking students’ legs, bums, or corneas is not education. It is not personalization, a fantasy that after decades has produced little more than dispensing a multiple-choice question based on how well you answered another multiple-choice question. Personalized learning is at best machine-based testing. It has little to do with teaching beyond automation and nothing to do with learning. Yet, ISTE’s largest corporate sponsors profit greatly by this hideous handful of magic beans.
The greatest threat of the ISTE “Smart Badges” is the denaturing of educational computing’s powerful potential and the organization’s misanthropic service of corporate sponsors, often in ways detrimental to its members – the ones who justify its tax-exempt status.
I updated “Mindset Marketing, Behaviorism, and Deficit Ideology” with selections from “How (and Why) Ed-Tech Companies Are Tracking Students’ Feelings – Education Week” and “Are Students Benefiting From the Growth Mindset Model?”.
Overall, weak effects across both analyses indicate that mindset alone fails to facilitate significant shifts in student academic performance and in-school success. While mindsets, also referred to as implicit theories, may influence educational trajectories, there are likely other factors that are better at predicting student success, such as school and classroom characteristics.
Mindset interventions have gained traction in recent years because they’re intuitive and marketable. The idea that confidence facilitates success is accessible and, as a result, it is incorporated into many programs designed to support students. Unfortunately, programs advertised to promote a growth mindset in students are often poorly developed, ineffective, or lack empirical support.
The notion that a growth mindset is enough to inspire success in students is also problematic, in that it disregards powerful circumstantial features of students’ in-school experiences, such as nutrition, poverty, instructional quality, psychosocial stress, external pressures, abuse, etc. Although future research may serve to disentangle the components of certain growth mindset programs that are effective and help to eliminate pieces that are not, perhaps the abundant resources devoted to growth mindset program development and research would be more appropriately applied to other efforts to improve in-school instructional quality and social-emotional supports.
“If you generate detailed information about students’ feelings, then it becomes possible to target them in sophisticated ways in order to nudge them to behave in ways that conform with a particular, idealized model of a ‘good student,'” Williamson said.
Government agencies and Silicon Valley companies deciding how students should be thinking and what they should be feeling – then collecting massive amounts of data and deploying invisible algorithms to enact that agenda – is something to be fought now, before the horse is all the way out of the barn.
It’s hard to explain to regular people how much technology barely works, how much the infrastructure of our lives is held together by the IT equivalent of baling wire.
Computers, and computing, are broken.
The NSA is doing so well because software is bullshit.
Computers have gotten incredibly complex, while people have remained the same gray mud with pretensions of godhood.
Now imagine billions of little unknowable boxes within boxes constantly trying to talk and coordinate tasks at around the same time, sharing bits of data and passing commands around from the smallest little program to something huge, like a browser – that’s the internet. All of that has to happen nearly simultaneously and smoothly, or you throw a hissy fit because the shopping cart forgot about your movie tickets.
We often point out that the phone you mostly play casual games on and keep dropping in the toilet at bars is more powerful than all the computing we used to go to space for decades.
NASA had a huge staff of geniuses to understand and care for their software. Your phone has you.
The number of people whose job it is to make software secure can practically fit in a large bar, and I’ve watched them drink. It’s not comforting. It isn’t a matter of if you get owned, only a matter of when.
This is because all computers are reliably this bad: the ones in hospitals and governments and banks, the ones in your phone, the ones that control light switches and smart meters and air traffic control systems. Industrial computers that maintain infrastructure and manufacturing are even worse. I don’t know all the details, but those who do are the most alcoholic and nihilistic people in computer security. Another friend of mine accidentally shut down a factory with a malformed ping at the beginning of a pen test. For those of you who don’t know, a ping is just about the smallest request you can send to another computer on the network. It took them a day to turn everything back on.
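To make concrete just how small a ping is: an ICMP echo request with no payload is only eight bytes of header. A sketch that builds (but does not send – actually sending one requires a raw socket and root privileges) such a packet; the function names are mine, the packet layout is from the ICMP specification:

```python
import struct

def inet_checksum(data: bytes) -> int:
    """RFC 1071 Internet checksum: one's-complement sum of 16-bit words."""
    if len(data) % 2:
        data += b"\x00"  # pad odd-length data to a whole number of words
    total = sum(struct.unpack("!%dH" % (len(data) // 2), data))
    while total >> 16:  # fold any carry bits back into the low 16 bits
        total = (total >> 16) + (total & 0xFFFF)
    return ~total & 0xFFFF

def build_echo_request(ident: int = 1, seq: int = 1, payload: bytes = b"") -> bytes:
    """Build an ICMP echo request: type 8, code 0, checksum, identifier, sequence."""
    header = struct.pack("!BBHHH", 8, 0, 0, ident, seq)  # checksum placeholder = 0
    checksum = inet_checksum(header + payload)
    return struct.pack("!BBHHH", 8, 0, checksum, ident, seq) + payload

packet = build_echo_request()
print(len(packet))  # an empty ping is just the 8-byte ICMP header
```

That eight-byte request is what, malformed, was enough to stop a factory.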
When we tell you to apply updates we are not telling you to mend your ship. We are telling you to keep bailing before the water gets to your neck.
Executable mail attachments (which include things like Word, Excel, and PDF files) that you get just about every day could be from anyone – people can write anything they want in the From: field of an email, and any of those attachments could take over your computer as handily as an 0day. This is probably how your grandmother ended up working for Russian criminals, and why your competitors anticipate all your product plans. But if you refuse to open attachments you aren’t going to be able to keep an office job in the modern world. There’s your choice: constantly risk clicking on dangerous malware, or live under an overpass, leaving notes on the lawn of your former house telling your children you love them and miss them.
I updated “Sex Ed: Toxic Masculinity, Emotional Expression, Online Privacy, Identity Management, Dress Codes, Bodily Autonomy, and Purity Culture”, “Privacy and Passwords”, and “Communication is oxygen.” with a selection from “On Privacy – Human Systems – Medium”.
Living Privately.
- Building and maintaining a sense of what to show in each social environment.
- Discovering and creating new environments in which we can show more of ourselves.
- Assessing where you can grow new parts of yourself which aren’t (yet) for public display.
“We’re finally getting the stage where a large portion of the population can’t really ignore the fact that they’re using free services in return for pervasive and always-on surveillance.”