There will be much dying: even more so than during the worst conflicts of the 20th century. But rather than conventional wars (“nation vs nation”) it’ll be “us vs them”, where “us” and “them” will be defined by whichever dehumanized enemy your network filter bubble points you at.

You don’t need to build concentration camps with barbed wire fences and guards if you can turn your entire society into a machine-mediated panopticon with automated penalties for non-compliance.

Tomorrow’s genocides will be decentralized and algorithmically tweaked, quite possibly executed without human intervention.

Forget barbed wire, concentration camps, gas chambers and gallows, and Hugo Boss uniforms. That’s the 20th century pattern of centralized, industrialized genocide. In the 21st century deep-learning mediated AI era, we have the tools to inflict agile, decentralized genocide as a cloud service on our victims.

Trump has discovered that in times of insecurity, the spectacle of cruelty provides a shared common focus for his supporters.

What’s new is the speed and specificity with which the cruelty can be applied, and the ability to redirect it in a matter of hours—increasing the sense of insecurity, which in turn drives social conservativism and support for violent self-defense.

Source: Happy 21st Century! – Charlie’s Diary

Deciding what to believe based on other people’s opinions is not only not journalistic, it’s arguably hostile to the press as a democratic institution. The truth may be nuanced, but reportable facts are often quite straightforward. As any journalist can tell you, the best answer to the question “what happened?” is not “why don’t you ask a bunch of your friends what they think, organize their views along a spectrum, and then decide where to plant yourself.”

Source: Mark Zuckerberg Doesn’t Understand Journalism – The Atlantic

It’s important, she says, that the technology came out of a lengthy process of understanding what challenges existed in the system. “I think if you look at certain attempts to use technology within complex bureaucratic systems, you’ll very often have people writing a beautiful algorithm, but for a problem that’s the wrong problem,” Pahlka says. “What I’m proud of our team doing is the work to figure out where the real problem is.”

Source: This algorithm is quickly clearing old marijuana convictions in San Francisco

It was hard to accept some of the (valid) criticism, especially the idea that women and people of color felt particularly unwelcome. There’s a weird paradox with bias. Those of us who have privilege, but care deeply about reducing bias should be uniquely positioned to help, but we struggle the hardest to recognize that we are (unintentionally) biased ourselves. As it happens, making people feel left out is a deep personal fear of mine. (There is probably a seriously repressed playground kickball thing in my past somewhere.) Ironically, that made it harder for me to accept the possibility that something I work on could make outsiders feel unwanted. So I focused on what we were proud of: We _are_ one of the only large sites where it’s practically impossible to find a single slur – our community takes them down in minutes. We _don’t_ tolerate our female users being called “sweetie” or getting hit on. But we _weren’t listening._ Many people, especially those in marginalized groups, _do_ feel less welcome. We know because they tell us.

Source: Stack Overflow Isn’t Very Welcoming. It’s Time for That to Change. – Stack Overflow Blog

That there are two philosophies does not necessarily mean that one is right and one is wrong: the reality is we need both. Some problems are best solved by human ingenuity, enabled by the likes of Microsoft and Apple; others by collective action. That, though, gets at why Google and Facebook are fundamentally more dangerous: collective action is traditionally the domain of governments, the best form of which is bounded by the popular will. Google and Facebook, on the other hand, are accountable to no one. Both deserve all of the recent scrutiny they have attracted, and arguably deserve more.

That scrutiny, though, and whatever regulations that result, must keep in mind this philosophical divide: platforms that create new possibilities – and not just Apple and Microsoft! – are the single most important economic force when it comes to countering the oncoming wave of computers doing people’s jobs, and lazily written regulation that targets aggregators but constricts platforms will inevitably do more harm than good.

Source: Tech’s Two Philosophies – Stratechery by Ben Thompson

We should preserve the past on the web, and learn to make sites even more future-safe. The web is where human knowledge accumulates.

It’s time for tech people to have values, as journalism, medicine and law do. Deliberately taking features out of the web, claiming pieces of the web as corporate property, forcing the history offline, all are terrible abuses of what makes the Internet great. An ethical technologist would refuse to do this work.

When we teach people to create technology, they should learn to respect and enhance the things that make the Internet great, not help modern day robber barons appropriate them.

The Internet is a place for the people, like parks, libraries, museums, historic places. It’s okay if corporations want to exploit the net, like DisneyLand or cruise lines, but not at the expense of the natural features of the net.

Source: Scripting News: The Internet is going the wrong way

I updated “Books that influenced my views on education and learning” with some new books.