Process theory; generative epistemology; configurative ontology: notes on Cederman, part 1

by Sebastian Benthall

I’ve recently had recommended to me the work of L.E. Cederman, who I’ve come to understand is a well-respected and significant figure in computational social science, especially agent-based modeling. In particular, I’ve been referred to this paper on the theoretical foundations of computational sociology:

Cederman, L.E., 2005. Computational models of social forms: Advancing generative process theory. American Journal of Sociology, 110(4), pp.864-893.

This is a paper I wish I had encountered years ago. I’ve written much here about my struggles with “interdisciplinary” research. In short: I’ve been trying to study social phenomena with scientific rigor. This is a very old problem, fraught with division. On top of that, there has been, it seems, an epistemological upset: advances in data collection and processing pose a practical challenge to many established disciplines. Moreover, the social phenomena I’m most interested in tend to involve the interaction between people and technology, which brings an association with disciplines specialized to that domain (HCI, STS) that, for me, have not made my research any more straightforward. After trying for some time to do the work I wanted to do under the new heading of data science, I did not find what I was looking for intellectually in that emerging field, however important the practical skill-set involved has been to me.

Computational social science, I’ve convinced myself if not others, is where the answers lie. My hope for it is that as a new discipline, it’s able to break away from dogmas that limited other disciplines and trapped their ambitions in endless methodological debates. What is being offered, I’ve imagined, in computational social science is the possibility of a new paradigm, or at least a viable alternative one. Cederman’s 2005 paper holds out the promise for just that.

Let me address for now some highlights of his vision of social science and how they relate to one another. I hope to come to the rest in a later post.

Sociological process theory. This is a position in sociological theory that Cederman attributes to 19th century sociologist Georg Simmel. The core of this position is that social reality is not fixed, but rather the result of an ongoing process of social interactions that give rise to social forms.

“The large systems and the super-individual organizations that customarily come to mind when we think of society, are nothing but immediate interactions that occur among men constantly every minute, but that have become crystallized as permanent fields, as autonomous phenomena.” (Simmel quoted in Wolff 1950, quoted in Cederman 2005)

There is a lot to this claim. If one is coming from the field of Human Computer Interaction (HCI), what may seem most striking about it is how well it resonates with a scholarly tradition that is most frequently positioned as a countercurrent to an unthinking positivism in design. Lucy Suchman, Etienne Wenger, and Jean Lave are scholars that come to mind as representative of this way of thinking. Much of the intellectual thrust of Simmel can be found in Paul Dourish’s criticism of positivist understandings of “context” in HCI.

For Dourish, the intellectual ground of this position is phenomenological social science, often associated with ethnomethodology. Simmel predates phenomenology but is a neo-Kantian, a contemporary of Weber, and a critic of the positivism of his day (the original positivism). As a social scientific tradition, process theory has had its successors (maybe most notably George Herbert Mead) but has since been submerged under other theoretical traditions. From Cederman’s analysis, one gathers that this is largely due to process theory’s inability to ground itself in rigorous method. Its early proponents were fond of metaphorical writing in a way that didn’t age well. Cederman pays homage to sociological process theory’s origins, but quickly moves to discuss an epistemological position that complements it. Notably, this position is neither positivist, nor phenomenological, nor critical (in the Frankfurt School sense), but something else: generative epistemology.

Generative epistemology. Cederman positions generative epistemology primarily in opposition to positivism, and particularly to a facet of positivism that he calls “nomothetic explanation”: explanation in terms of laws and regularities. The latter is considered the gold standard of natural science and of the social sciences that attempt to mimic it. This tendency is independent of whether the inquiry is qualitative or quantitative. Both comparative analysis and statistical control look for a conjunction of factors that is regularly predictive of some outcome. (Cederman’s sources on this: King, Keohane, and Verba (1994) and Goldthorpe (1997). The Gary King cited is, I assume, the same Gary King who went on to run Harvard’s IQSS; I hope to return to this question of positivism in computational social science in later writing. I tend to disagree with the idea that ‘data science’ or ‘big data’ has a primarily positivist tendency.)

Cederman describes the ‘process theorist’s’ alternative as based on abduction, not induction. Recall that ‘abduction’ was Peirce’s term for ‘inference to the best explanation’. The goal is to take an observed sociological phenomenon and explain its generation by accounting for how it is socially produced. The preference for generative explanation, in Simmel, comes in part from a pessimism about isolating regularities in complex social systems. Through this theorization, knowledge is gained: a theoretical advance that makes a social phenomenon less ‘puzzling’.

“The construction of generative explanations based on abductive inference is an inherently theoretical endeavor (McMullin, 1964). Instead of subsuming observations under laws, the main explanatory goal is to make a puzzling phenomenon less puzzling, something that inevitably requires the introduction of new knowledge through theoretical innovation.”

The specifics of the associated method are less clear than the motivation for this epistemology. Many early process theorists resorted to metaphors. But where all this is going is into the construction of models, and especially computational models, as a way of presenting and testing generative theories. Models generate forms through logical operations based on a number of parameters. A comparison between the logical form and the empirical form is made. If it is favorable, then the empirical form can be characterized as the result of a process described by the variables and the model. (Barth, 1981)
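This generate-then-compare logic can be made concrete with a canonical example that is not from Cederman’s paper: Schelling’s (1971) segregation model, often cited as the paradigm case of generative explanation. The grid size, tolerance threshold, and the clustering statistic below are my own illustrative choices; the point is only the shape of the method: a process model generates a form, and a summary statistic of that form is what gets compared against an empirical measure (here, of residential segregation).

```python
import random

def schelling(width=20, height=20, empty_frac=0.2, threshold=0.3,
              steps=50, seed=0):
    """Minimal Schelling-style segregation model: agents of two types
    relocate to an empty cell when fewer than `threshold` of their
    neighbors share their type. Returns the mean fraction of like-typed
    neighbors, a clustering statistic one could compare to an
    empirical segregation measure."""
    rng = random.Random(seed)
    cells = [(x, y) for x in range(width) for y in range(height)]

    # Random initial configuration: some cells empty, the rest split
    # between two agent types (0 and 1).
    grid = {}
    for c in cells:
        r = rng.random()
        if r < empty_frac:
            continue
        grid[c] = 0 if r < empty_frac + (1 - empty_frac) / 2 else 1

    def neighbors(x, y):
        return [(x + dx, y + dy)
                for dx in (-1, 0, 1) for dy in (-1, 0, 1)
                if (dx, dy) != (0, 0)
                and 0 <= x + dx < width and 0 <= y + dy < height]

    def like_frac(c):
        occupied = [grid[n] for n in neighbors(*c) if n in grid]
        if not occupied:
            return 1.0
        return sum(1 for t in occupied if t == grid[c]) / len(occupied)

    # Generative process: unhappy agents move to a random empty cell.
    for _ in range(steps):
        unhappy = [c for c in grid if like_frac(c) < threshold]
        empties = [c for c in cells if c not in grid]
        rng.shuffle(unhappy)
        for c in unhappy:
            if not empties:
                break
            dest = empties.pop(rng.randrange(len(empties)))
            grid[dest] = grid.pop(c)
            empties.append(c)

    # The 'logical form' summarized as one statistic.
    return sum(like_frac(c) for c in grid) / len(grid)
```

In this sketch, `schelling(steps=0)` gives the clustering of a random (unmodeled) configuration, while a full run gives the clustering the process generates; comparing the generated statistic to an empirically measured one is the favorable-comparison step Cederman describes.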

Cederman draws from Barth (1981) and Thomas Fararo (1989) to ally himself with ‘realist’ social science. The term is clarified later: ‘realism’ is opposed to ‘instrumentalism’, a reference that cuts to one of the core epistemological debates in computational methods. An instrumentalist method, such as a machine learning ensemble, may provide a model that is very useful for prediction and control but that nevertheless does not capture what’s really going on in the underlying process. Realist mathematical sociology, on the other hand, attempts to capture the reality of the process generating the social phenomenon in the precise language of mathematics and computation. The underlying metaphysical point is one that many people would rather not attend to. For now, we will follow Cederman’s logic to a different ontological point.

Configurative ontology. Sociological process theory requires explanations to specify the process that generates the social form observed. The entities, relations, and mechanisms may be unobserved or even unobservable. Positivists, Cederman argues, will often take the social forms to be variables themselves and undertheorize how the variables have been generated, since they care only about predicting actual outcomes. Whereas positivists study ‘correlations’ among elements, Simmel studies ‘sociations’, the interactions that result in those elements. The ontology, then, is that social forms are “configurations of social interactions and actors that together constitute the structures in which they are embedded.”

In this view, variables, such as would be used in a more positivist social scientific study, “merely measure dimensions of social forms; they cannot represent the forms themselves except in very simple cases.” While a variable-based analysis detaches a social phenomenon from space and time, “social forms always possess a duration in time and an extension in space.”

Aside from a deep resonance with Dourish’s critique of ‘contextual computing’ (noted above), this argument once again recalls much of what now comes under the expansive notion of ‘criticism’ of the social sciences. Ethnomethodology, and ethnography more generally, are now often raised as alternatives to simplistic positivist methods. In my experience at Berkeley and exposure so far to the important academic debates, the noisiest contest is between allegedly positivist or instrumentalist (they are different, surely) quantitative methods and phenomenological ethnographic methods. Indeed, it is the latter who more often now claim the mantle of ‘realism’. What is different about Cederman’s case in this paper is that he is setting up a foundation for a realist sociology that is nevertheless mathematized and computational.

What I am looking for in this paper, and haven’t found yet, is an account of how these ‘realist’ models of social processes are tested for their correspondence to empirical social form. Here is where I believe there is an opportunity that I have not yet seen fully engaged.
