The Big Switch

Great excerpt from Nicholas Carr’s new book, The Big Switch:

It’s a stirring thought, but like most myths it’s at best a half-truth and at worst a delusion. Computer systems in general and the Internet in particular put enormous power into the hands of individuals, but they put even greater power into the hands of companies, governments, and other institutions whose business it is to control individuals. Computer systems are not at their core technologies of emancipation. They are technologies of control. They were designed as tools for monitoring and influencing human behavior, for controlling what people do and how they do it. As we spend more time online, filling databases with the details of our lives and desires, software programs will grow ever more capable of discovering and exploiting subtle patterns in our behavior. The people or organizations using the programs will be able to discern what we want, what motivates us, and how we’re likely to react to various stimuli. They will, to use a cliché that happens in this case to be true, know more about us than we know about ourselves.

Read more from this chapter on his blog.

Lots of great tech reading I need to do, including the new Rushkoff and Lanier books…

The Pitfalls (& Opportunities) of News Algorithms

Fantastic article {Nieman Journalism Lab} about Google News and the issues we face when algorithms pick our news for us. The article details the specific ways algorithms choose what to filter:

For example, unless directly programmed to do so, the Google News algorithm won’t play favorites when picking representative articles for a cluster on a local political campaign — it’s essentially non-partisan. But one of its criteria for choosing articles is “frequency of appearance.” That may seem neutral — but if one of the candidates in that race consistently got slightly more media coverage (i.e. higher “frequency of appearance”), that criterion could make Google News’ output appear partisan.

In other words, even a seemingly objective measure like the number of times an article is shared through social media doesn't account for biases introduced by socioeconomic status and unequal access to the internet and social media sites.
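To make the mechanics concrete, here's a minimal sketch of how a frequency-of-appearance criterion can tilt a cluster's "representative" article. This is not Google News's actual code; the Article fields and the selection rule are invented for illustration:

```python
from collections import Counter
from dataclasses import dataclass

@dataclass
class Article:
    title: str
    candidate: str  # which campaign the story centers on (hypothetical field)

def pick_representative(cluster: list[Article]) -> Article:
    """Pick the article about the most frequently covered subject.

    A 'neutral' frequency-of-appearance criterion: no editorial
    preference is coded in, yet whichever candidate gets slightly
    more coverage wins the top slot every time.
    """
    coverage = Counter(a.candidate for a in cluster)
    top_candidate, _ = coverage.most_common(1)[0]
    return next(a for a in cluster if a.candidate == top_candidate)

# 6 stories about candidate A, 5 about candidate B: a small
# coverage gap, but the representative article is always about A.
cluster = [Article(f"Story {i}", "A") for i in range(6)] + \
          [Article(f"Story {i}", "B") for i in range(5)]
print(pick_representative(cluster).candidate)  # -> "A"
```

The point of the toy example: the rule itself contains no partisanship, but it faithfully amplifies whatever imbalance already exists upstream in the coverage.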

But perhaps smarter filters could learn to outsmart their own biases:

For instance, at the Korea Advanced Institute of Science and Technology (KAIST), Souneil Park and his collaborators have been experimenting with aggregation algorithms that feed into a news presentation called NewsCube, which nudges users towards consuming a greater variety of perspectives. Forget leaving things to chance with serendipity — their research is working on actively biasing your exposure to news in a beneficial way. Richard Thaler and Cass Sunstein, in their book Nudge, call this kind of influence “libertarian paternalism” — biasing experiences to correct for cognitive deficiencies in human reasoning. Not only can algorithms bias the content that we consume — someday they might do so in a way that makes us smarter and less prone to our own shortcomings in reasoning. Perhaps an algorithm could even slowly push extremists towards the center by exposing them to increasingly moderate versions of their ideas.
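In that spirit, here's a toy sketch of one way a "nudging" aggregator might work. To be clear, this is not the NewsCube algorithm itself; the perspective labels, share counts, and round-robin strategy are assumptions made for illustration:

```python
from collections import defaultdict

def diversify(articles, k):
    """Fill k result slots by round-robining across perspective buckets.

    A popularity-only ranker would let the best-covered viewpoint
    sweep every slot; interleaving guarantees each perspective a
    seat near the top, while articles stay popularity-ordered
    within their own bucket.
    """
    buckets = defaultdict(list)
    for a in sorted(articles, key=lambda a: a["shares"], reverse=True):
        buckets[a["perspective"]].append(a)

    picks, queues = [], list(buckets.values())
    while queues and len(picks) < k:
        queues = [q for q in queues if q]  # drop exhausted buckets
        for q in queues:
            if len(picks) == k:
                break
            picks.append(q.pop(0))
    return picks

stories = [
    {"title": "Left op-ed",   "perspective": "left",   "shares": 900},
    {"title": "Left feature", "perspective": "left",   "shares": 800},
    {"title": "Right op-ed",  "perspective": "right",  "shares": 300},
    {"title": "Wire report",  "perspective": "center", "shares": 250},
]
# Popularity alone would rank both "left" stories first; the
# round-robin yields one pick per perspective before repeating.
print([s["title"] for s in diversify(stories, 3)])
# -> ['Left op-ed', 'Right op-ed', 'Wire report']
```

The design choice worth noticing is that the bias here is deliberate and legible: the code openly trades a little raw popularity for guaranteed variety, which is exactly the "libertarian paternalism" the excerpt describes.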

More on news algorithms here, here and here.