a new kind of scientism
by Sebastian Benthall
Thinking it over, there are a number of problems with my last post. One was the claim that the scientism addressed by Horkheimer in 1947 is the same as the scientism of today.
Scientism is a pejorative term for the belief that science defines reality and/or is the solution to all problems. It’s not in common use now, but maybe it should be, at least among the critical thinkers of today.
Frankfurt School thinkers like Horkheimer and Habermas used “scientism” to criticize the positivists, the 20th century philosophical school that sought to reduce all science and epistemology to formal empirical methods, and to reduce all phenomena, including social phenomena, to empirical science modeled on physics.
Lots of people find this idea offensive for one reason or another. I’d argue that it’s a lot like the idea that algorithms can capture all of social reality or perform the work of scientists. In some sense, “data science” is a contemporary positivism, and the use of “algorithms” to mediate social reality depends on a positivist epistemology.
I don’t know any computer scientists who believe in the omnipotence of algorithms. I did get an invitation to this event at UC Berkeley the other day, though:
This Saturday, at [redacted], we will celebrate the first 8 years of the [redacted].
Current students, recent grads from Berkeley and Stanford, and a group of entrepreneurs from Taiwan will get together with members of the Social Data Lab. Speakers include [redacted], former Palantir financial products lead and course assistant of the [redacted]. He will reflect on how data has been driving transforming innovation. There will be break-out sessions on sign flips, on predictions for 2020, and on why big data is the new religion, and what data scientists need to learn to become the new high priests. [emphasis mine]
I suppose you could call that scientistic rhetoric, though honestly it’s so preposterous I don’t know what to think.
Though I would recommend the term “scientism” to the critical set, I’m ambivalent about whether it’s appropriate to call the contemporary emphasis on algorithms scientistic, for the following reason: it may be that ‘data science’ processes improve on the procedures developed for the advancement of physics in the mid-20th century, because they stand on sixty years of foundational mathematical work in which modeling cognition was an important aim. Recall that the AI research program didn’t start until Chomsky took down Skinner. Horkheimer quotes Dewey’s observation that until naturalist researchers were able to use their methods to understand cognition, they wouldn’t be able to develop (this is my paraphrase:) a totalizing system. But the foundational mathematics of information theory, Bayesian statistics, etc. are robust enough, or could be made robust enough, to simply be universally intersubjectively valid. That would mean data science stands on transcendental, not socially contingent, grounds.
That would open up a whole host of problems that take us even further back than Horkheimer, to early modern philosophers like Kant. I don’t want to go there right now. There’s still plenty to work with in Horkheimer, and in “Conflicting panaceas” he points to one of the critical problems: how to reconcile lived reality, in all its contingency, with the formal requirements of positivist or, in the contemporary data scientific case, algorithmic epistemology.