Tag: beniger

economy of control

We call it a “crisis” when the predictions of our trusted elites are violated in one way or another. We expect, for good reason, things to more or less continue as they are. They’ve evolved to be this way, haven’t they? The older the institution, the more robust to change it must be.

I’ve gotten comfortable in my short life with the global institutions that appeared to be the apex of societal organization. Under these conditions, I found James Beniger’s work to be particularly appealing, as it predicts the growth of information processing apparatuses (some combination of information worker and information technology) as formerly independent components of society integrate. I’m of the class of people that benefits from this kind of centralization of control, so I was happy to believe that this was an inevitable outcome according to physical law.

Now I’m not so sure.

I am not sure I’ve really changed my mind fundamentally. This extreme Beniger view is too much like Nick Bostrom’s superintelligence argument in form, and I’ve already thought hard about why that argument is not good. That reasoning stopped at the point of noting how superintelligence “takeoff” is limited by data collection. But I did not go to the next and probably more important step, which is the problem of aleatoric uncertainty in a world with multiple agents. We’re far more likely to get into a situation with multi-polar large intelligences that are themselves fraught with principal-agent problems, because that’s actually the status quo.

I’ve been prodded to revisit The Black Box Society, which I’ve dealt with inadequately. Its beefier chapters deal with a lot of the specific economic and regulatory recent history of the information economy of the United States, which is a good complement to Beniger and a good resource for the study of competing intelligences within a single economy, though I find this data a bit clouded by the polemical writing.

“Economy” is the key word here. Pure, Arendtian politics and technics have not blended easily, but what they’ve turned into is a self-regulatory system with structure and agency. More than that, the structure is for sale, and so is the agency. What is interesting about the information economy, and I guess I’m trying to coin a phrase here, is that it is an economy of control. The “good” being produced, sold, and bought, is control.

There’s a lot of interesting research about information goods, but I’ve never heard of a “control good”. Yet this is what we are talking about when we talk about software, data collection, managerial labor, and the conflicts and compromises they create.

I have a few intuitions about where this goes, but not as many as I’d like. I think this is because the economy of control is quite messy and hard to reason about.

Horkheimer on engineers

Horkheimer’s comment on engineers:

It is true that the engineer, perhaps the symbol of this age, is not so exclusively bent on profitmaking as the industrialist or the merchant. Because his function is more directly connected with the requirements of the production job itself, his commands bear the mark of greater objectivity. His subordinates recognize that at least some of his orders are in the nature of things and therefore rational in a universal sense. But at bottom this rationality, too, pertains to domination, not reason. The engineer is not interested in understanding things for their own sake or the sake of insight, but in accordance to their being fitted into a scheme, no matter how alien to their own inner structure; this holds for living beings as well as for inanimate things. The engineer’s mind is that of industrialism in its streamlined form. His purposeful rule would make men an agglomeration of instruments without a purpose of their own.

This paragraph sums up much of what Horkheimer stands for. His criticism of engineers, the catalysts of industrialism, is not that they are incorrect. It is that their instrumental rationality is not humanely purposeful.

This humane purposefulness, for Horkheimer, is born out of individual contemplation. Though he recognizes that this has been a standpoint of the privileged (cf. Arendt on the Greek polis), he sees industrialism as successful in bringing many people out of a place of necessity but at the cost of marginalizing and trivializing all individual contemplation. The result is an efficient machine with nobody in charge. This bodes ill because such a machine is vulnerable to being co-opted by an irrational despot or charlatan. Individuality, free of material necessity and also free of the machine that liberated it from that necessity, is the origin of moral judgement that prevents fascist rule.

This is very different from the picture of individuality Fred Turner presents in The Democratic Surround. In his account of how United States propaganda created a “national character” that was both individual enough to be anti-fascist and united enough to fight fascism, he emphasizes the role of art installations that encourage the viewer to stitch themselves synthetically into a larger picture of the nation. One is unique within a larger, diverse…well, we might use the word society, borrowing from Arendt, who was also writing in the mid-century.

If this is all true, then this dates a transition in American culture from one of individuality to one of society. This coincides with the tendency of information organization traced assiduously by Beniger.

We can perhaps trace an epicycle of this process in the history of the Internet. In its “wild west” early days, when John Perry Barlow could write about the freedom of cyberspace, it was a place primarily occupied by the privileged few. Interestingly, many of these were engineers, and so were (I’ll assume for the sake of argument) both materially independent and not exclusively focused on profit-making. Hence the early Internet was not unlike the ancient polis, a place where free people could attempt words and deeds that would immortalize them.

As the Internet became more widely used and commercialized, it became more and more part of the profiteering machine of capitalism. So today we see its wildness curtailed by the demands of society (which includes an appeal to an ethics sensitive both to disparities in wealth and differences in the body, both part of the “private” realm in antiquity but an element of public concern in modern society.)

resisting the power of organizations

“From the day of his birth, the individual is made to feel there is only one way of getting along in this world–that of giving up hope in his ultimate self-realization. This he can achieve solely by imitation. He continuously responds to what he perceives about him, not only consciously but with his whole being, emulating the traits and attitudes represented by all the collectivities that enmesh him–his play group, his classmates, his athletic team, and all the other groups that, as has been pointed out, enforce a more strict conformity, a more radical surrender through complete assimilation, than any father or teacher in the nineteenth century could impose. By echoing, repeating, imitating his surroundings, by adapting himself to all the powerful groups to which he eventually belongs, by transforming himself from a human being into a member of organizations, by sacrificing his potentialities for the sake of readiness and ability to conform to and gain influence in such organizations, he manages to survive. It is survival achieved by the oldest biological means necessary, mimicry.” – Horkheimer, “Rise and Decline of the Individual”, Eclipse of Reason, 1947

Returning to Horkheimer’s Eclipse of Reason (1947) after studying Beniger’s The Control Revolution (1986) serves to deepen one’s respect for Horkheimer.

The two writers are for the most part in agreement as to the facts. It is a testament to their significance and honesty as writers that they are not quibbling about the nature of reality but rather are reflecting seriously upon it. But whereas Beniger maintains a purely pragmatic, unideological perspective, Horkheimer (forty years earlier) correctly attributes this pragmatic perspective to the class of business managers to whom Beniger’s work is directed.

Unlike more contemporary critiques, Horkheimer’s position is not to dismiss this perspective as ideological. He is not working within the postmodern context that sees all knowledge as contestable because it is situated. Rather, he is working with the mid-20th-century acknowledgment that objectivity is power. This is a necessary step in the criticality of the Frankfurt School, which is concerned largely with the way (real) power shapes society and identity.

It would be inaccurate to say that Beniger celebrates the organization. His history traces the development of social organization as an evolving organism. Its expanding capacity for information processing is a result of the crisis of control unleashed by the integration of its energetic constituent components. Globalization (if we can extend Beniger’s story to include globalization) is the progressive organization of organizations of organizations. It is interesting that this progression of organization is a strike against Wiener’s prediction of the need for society to arm itself against entropy. This conundrum is one we will need to address in later work.

For now, it is notable that Horkheimer appears to be responding to just the same historical developments later articulated by Beniger. Only Horkheimer is writing not as a descriptive scientist but as a philosopher engaged in the process of human meaning-making. This positions him to discuss the rise and decline of the individual in the era of increasingly powerful organizations.

Horkheimer sees the individual as positioned at the nexus of many powerful organizations to which he must adapt through mimicry for the sake of survival. His authentic identity is accomplished only when alone because submission to organizational norms is necessary for survival or the accumulation of organizational power. In an era where pragmatic ability to manipulate people, not spiritual ideals, qualifies one for organizational power, the submissive man represses his indignation and rage at this condition and becomes an automaton of the system.

Which system? All systems. Part of the brilliance of both Horkheimer and Beniger is their ability to generalize over many systems to see their common effect on their constituents.

I have not read Horkheimer’s solution to the individual’s problem of how to maintain his individuality despite the powerful organizations which demand mimicry of him. This is a pressing question when organizations are becoming ever more powerful by using the tools of data science. My own hypothesis, which is still in need of scientific validation, is that the solution lies in the intersecting agency implied by the complex topology of the organization of organizations.

is science ideological?

In a previous post, I argued that Beniger is an unideological social scientist because he grounds his social scientific theory in robust theory from the natural and formal sciences, like theory of computation and mathematical biology. Astute commenter mg has questioned this assertion.

Does firm scientific grounding absolve a theoretical inquiry from ideology – what about the ideological framework that the science itself has grown in and is embedded in? Can we ascribe such neutrality to science?

This is a good question.

To answer it, it would be good to have a working definition of ideology. I really like one suggested by this passage from Habermas, which I have used elsewhere.

The concept of knowledge-constitutive human interests already conjoins the two elements whose relation still has to be explained: knowledge and interest. From everyday experience we know that ideas serve often enough to furnish our actions with justifying motives in place of the real ones. What is called rationalization at this level is called ideology at the level of collective action. In both cases the manifest content of statements is falsified by consciousness’ unreflected tie to interests, despite its illusion of autonomy. The discipline of trained thought thus correctly aims at excluding such interests. In all the sciences routines have been developed that guard against the subjectivity of opinion, and a new discipline, the sociology of knowledge, has emerged to counter the uncontrolled influence of interests on a deeper level, which derive less from the individual than from the objective situation of social groups.

If we were to extract a definition of ideology from this passage, it would be something like this: an ideology is:

  1. an expression of motives that serves to justify collective action by a social group
  2. …that is false because it is unreflective of the social group’s real interests.

I maintain that the theories that Beniger uses to frame his history of technology are unideological because they are not expressions of motives. They are descriptive claims whose validity has been tested thoroughly by multiple independent social groups with conflicting interests. It’s this validity within and despite the contest of interests which gives scientific understanding its neutrality.

Related: Brookfield’s “Contesting Criticality: Epistemological and Practical Contradictions in Critical Reflection” (here), which I think is excellent, succinctly describes the intellectual history of criticality and how contemporary usage of it blends three distinct traditions:

  1. a Marxist view of ideology as the result of objectively true capitalistic social relations,
  2. a psychoanalytic view of ideology as a result of trauma or childhood,
  3. and a pragmatic/constructivist/postmodern view of all knowledge being situated.

Brookfield’s point is that an unreflective combination of these three perspectives is incoherent both theoretically and practically. That’s because while the first two schools of thought (which Habermas combines, above; later Frankfurt School writers deftly combined Marxism with psychoanalysis) both maintain an objectivist view of knowledge, the constructivists reject this in favor of a subjectivist view. Since discussion of “ideology” comes to us from the objectivist tradition, there is a contradiction in the view that all science is ideological. Calling something ‘ideological’ or ‘hegemonic’ requires that you take a stand on something, such as the possibility of an alternative social system.

I really like Beniger

I’ve been a fan of Castells for some time but reading Ampuja and Koivisto’s critique of him is driving home my new appreciation of Beniger’s The Control Revolution (1986).

One reason why I like Beniger is that his book is an account of social history and its relationship with technology that is firmly grounded in empirically and formally validated scientific theory. That is, rather than using as a baseline any political ideological framework, Beniger grounds his analysis in an understanding of the algorithm based in Church and Turing, an understanding of biological evolution grounded in biology, and so on.

This allows him to extend ideas about programming and control from DNA to culture to bureaucracy to computers in a way that is straightforward and plausible. His goal is, admirably, to get people to see the changes that technology drives in society as a continuation of a long regular process rather than a reason to be upset or a transformation to hype up.

I think there is something fundamentally correct about this approach. I mean that with the full force of the word correct. I want to go so far as to argue that Beniger (at least as of Chapter 3…) is an unideological theory of history and society that is grounded in generalizable and universally valid scientific theory.

I would be interested to read a substantive critique of Beniger arguing otherwise. Does anybody know if one exists?

intersecting agencies and cybersecurity #RSAC

A recurring theme in my reading lately (such as Beniger’s The Control Revolution, Horkheimer’s Eclipse of Reason, and Norbert Wiener’s work on cybernetics) is the problem of reconciling two ways of explaining how-things-came-to-be:

  • Natural selection. Here a number of autonomous, uncoordinated agents with some exogenously given variability encounter obstacles that limit their reproduction or survival. The fittest survive. Adaptation is due to random exploration at the level of the exogenous specification of the agent, if at all. In unconstrained cases, randomness rules and there is no logic to reality.
  • Purpose. Here there is a teleological explanation based on a goal some agent has “in mind”. The goal is coupled with a controlling mechanism that influences or steers outcomes towards that goal. Adaptation is part of the endogenous process of agency itself.

Reconciling these two kinds of description is not easy. A point Beniger makes is that differences between social theories in the 20th century can be read as differences in the divisions of where one demarcates agents within a larger system.

This week at the RSA Conference, Amit Yoran, President of RSA, gave a keynote speech about the change in mindset of security professionals. Just the day before I had attended a talk on “Security Basics” to reacquaint myself with the field. In it, there was a lot of discussion of how a security professional needs to establish “the perimeter” of their organization’s network. In this framing, a network is like the nervous system of the macro-agent that is an organization. The security professional’s role is to preserve the integrity of the organization’s information systems. Even in this talk on “the basics”, the speaker acknowledged that a determined attacker will always get into your network because of the limitations of the affordances of defense, the economic incentives of attackers, and the constantly “evolving” nature of the technology. I was struck in particular by this speaker’s detachment from the arms race of cybersecurity. The goal-driven adversariality of the agents involved in cybersecurity was taken as a given; as a consequence, the system evolves through a process of natural selection. The role of the security professional is to adapt to an exogenously-given ecosystem of threats in a purposeful way.

Amit Yoran’s proposed escape from the “Dark Ages” of cybersecurity got away from this framing in at least one way. For Yoran, thinking about the perimeter is obsolete. Because the attacker will always be able to infiltrate, the emphasis must be on monitoring normal behavior within your organization–say, which resources are accessed and how often–and detecting deviance through pervasive surveillance and fast computing. Yoran’s vision replaces the “perimeter” with an all-seeing eye. The organization that one can protect is the organization that one can survey as if it was exogenously given, so that changes within it can be detected and audited.
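The baseline-and-deviation monitoring Yoran describes can be sketched very simply. Here is a minimal illustrative sketch (not anything Yoran or RSA proposed concretely; all names, data, and the threshold are hypothetical) that flags members of an organization whose resource-access counts deviate sharply from their own historical baseline:

```python
from statistics import mean, stdev

def flag_anomalies(history, current, threshold=3.0):
    """Flag users whose current access count deviates from their baseline.

    history: dict mapping user -> list of past daily access counts
    current: dict mapping user -> today's access count
    Returns the set of users whose z-score exceeds the threshold.
    """
    flagged = set()
    for user, counts in history.items():
        mu = mean(counts)
        sigma = stdev(counts) if len(counts) > 1 else 0.0
        if sigma == 0.0:
            # No variation in the baseline: treat any change at all as deviant.
            if current.get(user, 0) != mu:
                flagged.add(user)
            continue
        z = abs(current.get(user, 0) - mu) / sigma
        if z > threshold:
            flagged.add(user)
    return flagged

history = {"alice": [10, 12, 11, 9, 10], "bob": [3, 4, 3, 5, 4]}
current = {"alice": 11, "bob": 250}  # bob suddenly accesses far more resources
print(flag_anomalies(history, current))  # → {'bob'}
```

Even this toy version makes the point above concrete: the defender’s power scales with how much of the organization’s normal behavior it can observe and model, which is exactly the all-seeing-eye framing.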

We can speculate about how an organization’s members will feel about such pervasive monitoring and auditing of activity. The interests of the individual members of a (sociotechnical) organization, the interests of the organization as a whole, and the interests of sub-organizations within an organization can be either in accord or in conflict. An “adversary” within an organization can be conceived of as an agent within a supervening organization that acts against the latter’s interests. Like a cancer.

But viewing organizations purely hierarchically like this leaves something out. Just as human beings are capable of more complex, high-dimensional, and conflicted motivations than any one of the organs or cells in our bodies, so too should we expect the interests of organizations to be wide and perhaps beyond the understanding of anyone within it. That includes the executives or the security professionals, which RSA Conference blogger Tony Kontzer suggests should be increasingly one and the same. (What security professional would disagree?)

What if the evolution of cybersecurity results in the evolution of a new kind of agency?

As we start to think of new strategies for information-sharing between cybersecurity-interested organizations, we have to consider how agents supervene on other agents in possibly surprising ways. An evolutionary mechanism may be a part of the very mechanism of purposive control used by a super-agent. For example, an executive might have two competing security teams and reward them separately. A nation might have an enormous ecosystem of security companies within its perimeter (…) that it plays off of each other to improve the robustness of its internal economy, providing for it the way kombucha drinkers foster their own vibrant ecosystem of gut fauna.

Still stranger, we might discover ways that purposive agents intersect at the neuronal level, like Siamese twins. Indeed, this is what happens when two companies share generic networking infrastructure. Such mereological complexity is sure to affect the incentives of everyone involved.

Here’s the rub: every seam in the topology of agency, at every level of abstraction, is another potential vector of attack. If our understanding of the organizational agent becomes more complex as we abandon the idea of the organizational perimeter, that complexity provides new ways to infiltrate. Or, to put it in the Enlightened terms more aligned with Yoran’s vision, the complexity of the system with its multitudinous and intersecting purposive agents will become harder and harder to watch for infiltrators.

If a security-driven agent is driven by its need to predict and audit activity within itself, then those agents will allow a level of complexity within themselves that is bounded by their own capacity to compute. This point was driven home clearly by Dana Wolf’s excellent talk on Monday, “Security Enforcement (re)Explained”. She outlined several ways that the computationally difficult cybersecurity functions–such as anti-virus and firewall technology–are being moved to the Cloud, where elasticity of compute resources theoretically makes it easier to cope with these resource demands. I’m left wondering: does the end-game of cybersecurity come down to the market dynamics of computational asymmetry?

This blog post has been written for research purposes associated with the Center for Long-Term Cybersecurity.

Beniger on anomie and technophobia

The School of Information Classics group has moved on to a new book: James Beniger’s 1986 The Control Revolution: Technological and Economic Origins of the Information Society. I’m just a few chapters in but already it is a lucid and compelling account of how the societal transformations due to information technology that are announced bewilderingly every decade are an extension of a process that began in the Industrial Revolution and just has not stopped.

It’s a dense book with a lot of interesting material in it. One early section discusses Durkheim’s ideas about the division of labor and its effect on society.

In a nutshell, the argument is that with industrialization, barriers to transportation and communication break down and local markets merge into national and global markets. This induces cycles of market disruption where, because producers and consumers cannot communicate directly, producers need to “trust to chance” by embracing a potentially limitless market. This creates an unregulated economy prone to crisis. This sounds a little like venture capital fueled Silicon Valley.

The consequence of greater specialization and division of labor is a greater need for communication between the specialized components of society. This is the problem of integration, and it affects both the material and the social. Specifically, the magnitude and complexity of material flows result in a sharpening division of labor. When properly integrated, the different ‘organs’ of society gain in social solidarity. But if communication between the organs is insufficient, then the result is a pathological breakdown of norms and sense of social purpose: anomie.

The state of anomie is impossible wherever solidary organs are sufficiently in contact or sufficiently prolonged. In effect, being contiguous, they are quickly warned, in each circumstance, of the need which they have of one another, and, consequently, they have a lively and continuous sentiment of their mutual dependence… But, on the contrary, if some opaque environment is interposed, then only stimuli of a certain intensity can be communicated from one organ to another. Relations, being rare, are not repeated enough to be determined; each time there ensues new groping. The lines of passage taken by the streams of movement cannot deepen because the streams themselves are too intermittent. If some rules do come to constitute them, they are, however, general and vague.

An interesting question is to what extent Beniger’s thinking about the control revolution extends to today and the future. An interesting sub-question is to what extent Durkheim’s thinking is relevant today or in the future. I’ll hazard a guess that’s informed partly by Adam Elkus’s interesting thoughts about pervasive information asymmetry.

An issue of increasing significance as communication technology improves is that the bottlenecks to communication become less technological and more about our limitations as human beings to sense, process, and emit information. These cognitive limitations are being overwhelmed by the technologically enabled access to information. Meanwhile, there is a division of labor between those that do the intellectually demanding work of creating and maintaining technology and those that do the intellectually demanding work of creating and maintaining cultural artifacts. As intellectual work demands the specialization of limited cognitive resources, this results in conflicts of professional identity due to anomie.

Long story short: Anomie is why academic politics are so bad. It’s also why conferences specializing in different intellectual functions can harbor a kind of latent animosity towards each other.