I’ve written a lot (here, informally) on the subject of computational control of society. I’m not the only one, of course. There has in the past few years been a growing fear that one day artificial intelligence might control everything. I’ve argued that this is akin to older fears that, under capitalism, instrumentality would run amok.
Recently, thinking a little more seriously about what’s implied by an economy of control, I’ve been coming around to a quite different conclusion. What if the general tendency of these algorithmic systems is not the enslavement of humanity but rather the opening up of freedom and opportunity? This is not a critical attitude and might be read as simple shilling for industrial powers, so let me put the point slightly more controversially. What if the result of these systems is to provide so much freedom and opportunity that it undermines the structure that makes social action significant? The “control” of these systems could just be the result of our being exposed, at last, to our individual insignificance in the face of each other.
As a foil, I’ll refer again to Frank Pasquale’s The Black Box Society, which I’ve begun to read again at the prompting of Pasquale himself. It is a rare and wonderful thing for the author of a book you’ve written rude things about to write you and tell you you’ve misrepresented the work. So often I assume nobody’s actually reading what I write, making this a lonely vocation indeed. Now I know that at least somebody gives a damn.
In Chapter 3, Pasquale writes:
“The power to include, exclude, and rank [in search results] is the power to ensure which public impressions become permanent and which remain fleeting. That is why search services, social and not, are ‘must-have’ properties for advertisers as well as users. As such, they have made very deep inroads indeed into the sphere of cultural, economic, and political influence that was once dominated by broadcast networks, radio stations, and newspapers. But their dominance is so complete, and their technology so complex, that they have escaped pressures for transparency and accountability that kept traditional media answerable to the public.”
As a continuation of the “technics-out-of-control” meme, there’s an intuitive thrust to this argument. But looking at the literal meaning of the sentences, none of it is actually true!
Let’s look at some of the reasons why these claims are false:
- There are multiple competing search engines, and switching costs are very low. There are Google and Bing and DuckDuckGo, but there are also more specialized search engines for particular kinds of things. Literally every branded shopping website has a search engine that includes only what it chooses to include. This competitive pressure drives search engines generally to provide people with the answers they are looking for.
- While there is a certain amount of curation that goes into search results, the famous early ranking logic that made large-scale search possible relied mainly on data created as part of the content itself (hyperlinks, in the case of Google’s PageRank) or its usage (engagement, in the case of Facebook’s EdgeRank). To the extent that these algorithms have changed, much of the change has come from public pressure, in the form of market pressure. Many of these changes draw on dynamic, socially created data as well (such as spam flagging). Far from being manipulated by a secret, powerful force, search engine results are always a dynamic social accomplishment that reflects the public.
- Alternative media forms, such as broadcast radio, print journalism, cable television, storefront advertising, and so on, still exist and still influence people’s decisions. No single digital technology ensures anything! A new restaurant that opens up in a neighborhood is free to gain a local reputation in the old-fashioned way. And then these same systems for ranking and search incentivize the discovery of these local gems by design. The information economy doesn’t waste opportunities like this!
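To make concrete the claim that early ranking logic derives from data in the content itself, here is a minimal sketch of the PageRank idea: a page’s score comes from the scores of the pages linking to it, computed by power iteration. The toy link graph, damping factor, and iteration count below are my own illustrative choices, not Google’s actual parameters.

```python
# Minimal sketch of the PageRank idea: a page's rank is derived from the
# ranks of the pages that link to it, so the ranking is a function of the
# hyperlink structure the public itself created.
# The toy link graph and parameters below are illustrative only.

links = {
    "a": ["b", "c"],   # page "a" links to pages "b" and "c"
    "b": ["c"],
    "c": ["a"],
}

def pagerank(links, damping=0.85, iterations=50):
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}   # start with a uniform distribution
    for _ in range(iterations):
        # Each page keeps a small baseline, plus shares flowing in via links.
        new = {p: (1 - damping) / n for p in pages}
        for page, outgoing in links.items():
            share = rank[page] / len(outgoing)
            for target in outgoing:
                new[target] += damping * share
        rank = new
    return rank

scores = pagerank(links)
# The ranks sum to ~1, and "c" scores highest here because
# both "a" and "b" link to it.
```

The point of the sketch is that nothing in it is editorial: the ordering falls out of who linked to whom, which is exactly the sense in which results are a social accomplishment rather than a hidden editorial decision.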
So what’s the problem? If algorithms aren’t controlling society, but rather are facilitating its self-awareness, maybe these kinds of polemics are just way off base.