Digifesto

Category: economics

Open Source Computational Economics: The State of the Art

Last week I spoke at PyData NYC 2023 about “Open Source Computational Economics: The State of the Art”.

It was a very nice conference, packed with practical guidance on using Python in machine learning workflows, interesting people, and some talks that were further afield. Mine was the most ‘academic’ talk that I saw there: it concerned recent developments in computational economics and what they mean for open source economics tooling.

The talk discussed DYNARE, a widely known toolkit for representative agent modeling in a DSGE framework, and also more recently developed packages such as QuantEcon, Dolo, and HARK. It then outlined how dynamic programming solutions to high-dimensional heterogeneous agent problems have run into computational complexity constraints. Then, excitingly, it showed how deep learning has been used to solve these models very efficiently, which greatly expands the scope of what can be modeled! This part of the talk drew heavily on Maliar, Maliar, and Winant (2021) and Chen, Didisheim, and Scheidegger (2021).

The talk concluded with some predictions about where computational economics is going. More standardized ways of formulating problems, coupled with reliable methods for encoding these problems into deep learning training routines, are a promising path forward for exploring a wide range of new models.

Slides are included below.

References

Chen, H., Didisheim, A., & Scheidegger, S. (2021). Deep Surrogates for Finance: With an Application to Option Pricing. Available at SSRN 3782722.

Maliar, L., Maliar, S., & Winant, P. (2021). Deep learning for solving dynamic economic models. Journal of Monetary Economics, 122, 76-101.

Practical social forecasting

I was once long ago asked to write a review of Philip Tetlock’s Expert Political Judgment: How Good Is It? How Can We Know? (2006) and was, like a lot of people, very impressed. If you’re not familiar with the book, the gist is that Tetlock, a psychologist, runs a 20-year study asking everybody who could plausibly be called a “political expert” to predict future events, and then scores them using a very reasonable Bayesian scoring system. He then searches the data for insights about what makes for good political forecasting ability. He finds it to be quite rare, but correlated with humbler and more flexible styles of thinking. Tetlock has gone on to pursue and publish on this line of research. There are now forecasting competitions, and the book Superforecasting. Tetlock has a following.

What caught my attention in the original book, something somewhat downplayed in the research program as a whole, is that rather simple statistical models, with two or three regressed variables, performed very well in comparison to even the best human experts. In a Bayesian sense, they were at least as good as the best people. These simple models tended towards guessing something close to the base rate of an event, whereas even the best humans tended to believe their own case-specific reasoning somewhat more than they perhaps should have.
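To make that concrete, here is a minimal sketch in Python, with an invented event stream and an invented “expert”; none of this is Tetlock’s actual data or scoring code, and I use a Brier-style rule as one common choice of scoring system. It shows how a forecaster who just repeats the base rate can beat a better-informed but overconfident one:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000
base_rate = 0.10
outcomes = rng.random(n) < base_rate  # True if the event happened

# Simple model: always forecast the base rate.
p_simple = np.full(n, base_rate)

# "Expert": receives a weakly informative signal about each case, but
# translates it into overconfident case-specific probabilities.
signal = outcomes + rng.normal(0, 1.0, n)
p_expert = np.clip(0.5 + 0.45 * np.tanh(2 * (signal - 0.5)), 0.01, 0.99)

def brier(p, y):
    """Mean squared error of probability forecasts (lower is better)."""
    return np.mean((p - y) ** 2)

print("base-rate model:", brier(p_simple, outcomes))
print("overconfident expert:", brier(p_expert, outcomes))
```

The base-rate forecaster’s expected Brier score here is 0.09 (the variance of the event), while the overconfident expert scores noticeably worse, even though the expert’s signal is genuinely informative.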

This could be seen as a manifestation of the “bias/variance tradeoff” in (machine and other) learning. A learning system must either have a lot of concentration in the probability mass of its prior (bias) or it must spread this mass quite thin (variance). Roughly, a learning system is a good one for its context if, and maybe only if, its prior is a good enough fit for the environment that it’s in. There’s no free lunch. So the only way to improve social scientific forecasting is to encode more domain specific knowledge into the learning system. Or so I thought until recently.

For the past few years I have been working on computational economics tools that enable modelers to imagine and test theories about the dynamics behind our economic observations. This is a rather challenging and rewarding field to work in, especially right now, when the field of Economics is rapidly absorbing new ideas from computer science and statistics. Last August, I had the privilege to attend a summer school and conference on the theme of “Deep Learning for Solving and Estimating Dynamic Models” put on by the Econometric Society DSE Summer School. It was awesome.

The biggest, least subtle, takeaway from the summer school and conference is that deep learning is going to be a big deal for Economics, because these techniques make it feasible to solve and estimate models with much higher dimensionality than has been possible with prior methods. By “solve”, I mean determining, for a given model of a bunch of agents interacting with each other through, for example, a market, each with some notion of their own reward structure, what the equilibrium dynamics of that system are. Solving these kinds of stochastic dynamic control problems, especially when there is nontrivial endogenous aggregation of agent behavior, is computationally quite difficult. But there are cool ways of encoding the equilibrium conditions of the model, or the optimality conditions of the agents involved, into the loss function of a neural network so that the deep learning training architecture works as a model solver. By “estimate”, I mean identifying, for a given model, the parameterization of the model that produces results that make some empirical calibration targets maximally likely.
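To illustrate the “loss function as solver” idea, here is a toy sketch in PyTorch. It is my own minimal construction, not the Maliar, Maliar, and Winant architecture: a deterministic log-utility consumption-saving problem whose Euler equation residual is used directly as the training loss.

```python
import torch

torch.manual_seed(0)
beta, R = 0.96, 1.03  # discount factor and gross return (illustrative values)

# Policy network: maps wealth to a consumption share in (0, 1).
policy = torch.nn.Sequential(
    torch.nn.Linear(1, 32), torch.nn.Tanh(),
    torch.nn.Linear(32, 1), torch.nn.Sigmoid(),
)
opt = torch.optim.Adam(policy.parameters(), lr=1e-3)

for step in range(5000):
    w = torch.rand(256, 1) * 10 + 0.1   # sample wealth states
    c = policy(w) * w                   # consumption today
    w_next = R * (w - c)                # budget constraint
    c_next = policy(w_next) * w_next    # consumption tomorrow
    # Euler equation under log utility: 1/c_t = beta * R * (1/c_{t+1}).
    # Its squared residual is the loss, so gradient descent is the solver.
    residual = 1.0 / c - beta * R / c_next
    loss = (residual ** 2).mean()
    opt.zero_grad()
    loss.backward()
    opt.step()

# This problem has the closed-form solution c(w) = (1 - beta) * w, so the
# learned consumption share should approach 1 - beta = 0.04 everywhere.
print(policy(torch.tensor([[5.0]])).item())
```

The same pattern (sample states, evaluate the model’s optimality conditions, minimize the squared residuals) is what scales to the high-dimensional heterogeneous agent problems discussed above.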

But maybe more foundationally exciting than seeing these results — which were very impressive — was the work that demonstrated some practical consequences of the double descent phenomenon in deep learning.

Double descent has been discussed, I guess, since 2018 but it has only recently gotten on my radar. It explains a lot about how and why deep learning has blown so many prior machine learning results out of the water. The core idea is that when a neural network is overparameterized — has so many degrees of freedom that, when trained, it can entirely interpolate (reproduce) the training data — it begins to perform better than any underparameterized model.
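The phenomenon is easy to reproduce in miniature. Below is a sketch using random ReLU features and minimum-norm least squares, a standard stand-in for wide networks in the double descent literature; the exact shape of the error curve will vary with the seed and noise level, but test error typically spikes near the interpolation threshold and then falls again as the model grows past it.

```python
import numpy as np

rng = np.random.default_rng(0)

def f(x):
    return np.sin(2 * np.pi * x)

n_train = 30
x_tr = rng.uniform(-1, 1, n_train)
y_tr = f(x_tr) + rng.normal(0, 0.1, n_train)
x_te = np.linspace(-1, 1, 200)
y_te = f(x_te)

def relu_features(x, W, b):
    # One random feature per column: max(0, w * x + b).
    return np.maximum(0.0, np.outer(x, W) + b)

for width in [5, 15, 30, 60, 300, 3000]:  # 30 = interpolation threshold
    W = rng.normal(0, 1, width)
    b = rng.uniform(-1, 1, width)
    Phi_tr = relu_features(x_tr, W, b)
    Phi_te = relu_features(x_te, W, b)
    # lstsq returns the *minimum-norm* interpolating solution once
    # width > n_train; that implicit regularization is what makes
    # overparameterization benign.
    theta, *_ = np.linalg.lstsq(Phi_tr, y_tr, rcond=None)
    test_mse = np.mean((Phi_te @ theta - y_te) ** 2)
    print(f"width={width:5d}  test MSE={test_mse:.3f}")
```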

The underlying reasons for this are deep and somewhat mysterious. I have an intuition about it that I’m not sure checks out properly mathematically, but I will jot it down here anyway. There are some results suggesting that an infinitely parameterized neural network, of a certain kind, is equivalent to a Gaussian process: a collection of random variables such that any finite collection of them has a multivariate normal distribution. If the best model that we can ever train is an ever larger and more complex Gaussian process, then this suggests that the Central Limit Theorem is once again the rule that explains the world as we see it, but in a far more textured and interesting way than is obvious. The problem with the Central Limit Theorem and normal distributions is that they are not explainable — the explanation for the phenomenon is always a plethora of tiny factors, none of which are sufficient individually. And yet, because it is a foundational mathematical rule, it is always available as an explanation for any phenomenon we can experience. A perfect null hypothesis. Which turns out to be the best forecasting tool available?
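For reference, the result I have in mind is Neal’s classic infinite-width limit; here is a compressed statement (notation and scaling mine):

```latex
% One hidden layer of width H, i.i.d. weights, output variance scaled by 1/H:
f(x) = b + \sum_{j=1}^{H} v_j \, \phi\!\left(w_j^{\top} x\right),
\qquad b \sim \mathcal{N}(0, \sigma_b^2), \quad v_j \sim \mathcal{N}(0, \sigma_v^2 / H).

% As H \to \infty, the Central Limit Theorem applied to the sum implies that,
% for any finite set of inputs, the outputs are jointly Gaussian: f converges
% to a Gaussian process with covariance kernel
K(x, x') = \sigma_b^2 + \sigma_v^2 \, \mathbb{E}_{w}\!\left[\phi(w^{\top} x)\,\phi(w^{\top} x')\right].
```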

It’s humbling material to work with, in any case.

References

Azinovic, Marlon and Gaegauf, Luca and Scheidegger, Simon, Deep Equilibrium Nets (May 24, 2019). Available at SSRN: https://ssrn.com/abstract=3393482 or http://dx.doi.org/10.2139/ssrn.3393482

Kelly, Bryan T. and Malamud, Semyon and Zhou, Kangying, The Virtue of Complexity in Return Prediction (December 13, 2021). Swiss Finance Institute Research Paper No. 21-90, Journal of Finance, forthcoming, Available at SSRN: https://ssrn.com/abstract=3984925 or http://dx.doi.org/10.2139/ssrn.3984925

Nakkiran, P., Kaplun, G., Bansal, Y., Yang, T., Barak, B. and Sutskever, I., 2021. Deep double descent: Where bigger models and more data hurt. Journal of Statistical Mechanics: Theory and Experiment, 2021(12), p.124003.

Research update: to study the economy of personal data

I have not been writing here for some time because of strokes of good luck that have been keeping me busy.

I’ve been awarded a Social, Behavioral and Economic Sciences (SBE) Postdoctoral Research Fellowship (“SPRF” for short) by the National Science Foundation.

This is a lot of words to write out, but they sum up to a significant change in my research and role that I’m still adjusting to.

First, I believe this means that I am a social scientist of some kind. What kind? It’s not clear. If I could have my choice, it would be “economist”. But since Economics is a field widely known for gatekeeping, and I do not have an Economics degree, I’m not sure I can get away with this.

Nevertheless, my SPRF research project is an investigation into the economics of data (especially personal data) using methods that are built on those used in orthodox and heterodox economics.

The study of the economics of personal data grows out of my dissertation work and the ongoing policy research I’ve done at NYU School of Law’s Information Law Institute. Though my work has touched on many other fields — computer science and the design of information systems; sociology and the study of race and networked publics; philosophy and law — at the end of the day the drivers of “technology’s” impact on society are businesses operating according to an economic logic. This is something that everybody knows, but that few academic researchers are in a position to admit, because many of the scholars who think seriously about these issues are coming from other disciplines.

For better or for worse, I have trouble sticking to a tunnel when I can’t see the intellectual daylight at the end of it.

So how can we study the economy of personal data?

I would argue — and this is something that most Economists will balk at — that the tools currently available to study this economy are insufficient for the task. Who am I to say such a thing? Nobody special.

But weighing in my favor is the argument that even the tools used by Economists to study the macroeconomy are insufficient for the task. This point was made decisively by the 2008 Financial Crisis, which blindsided the economic establishment. One of the reasons why Economics failed was that the discipline had deeply entrenched oversimplified assumptions in its economic models. One of these was representative agent modeling, which presumed to model the entire economy with a single “representative agent” for a sector or domain. This makes the economist’s calculations easier but is clearly unrealistic, and indeed it’s the differences between agents that create much of the dynamism and pitfalls of the economy. Hence the rise in heterogeneous agent modeling (HAM), which is explicit about the differences between agents with respect to things like, for example, wealth, risk aversion, discount factor, level of education, and so on.

It was my extraordinary good fortune to find an entry into the world of HAM via the Econ-ARK software project (Carroll et al, 2018; Benthall and Seth, 2020), which needed a software engineer enthusiastic about open source scientific tools at a moment when I was searching for a job. Econ-ARK’s HAM toolkit, HARK, has come a long way since I joined the project in late 2019. And it still has quite a ways to go. But it’s been a tremendously rewarding project to be involved with, in no small part because it has been a hands-on introduction to the nitty-gritty of contemporary Economics methods.
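For a flavor of what this looks like in practice, here is a minimal sketch using HARK’s documented IndShockConsumerType interface: a small population of consumer types that differ only in their discount factor. The parameter values are illustrative, and the API details may vary across HARK versions.

```python
from copy import deepcopy

from HARK.ConsumptionSaving.ConsIndShockModel import IndShockConsumerType

# One consumer type per discount factor: ex-ante heterogeneity in patience.
base = IndShockConsumerType(cycles=0)  # cycles=0 means infinite horizon
types = []
for disc_fac in [0.92, 0.95, 0.98]:
    agent = deepcopy(base)
    agent.DiscFac = disc_fac
    agent.solve()  # backward induction on the consumption-saving problem
    types.append(agent)

# More patient types consume less out of the same market resources.
for agent in types:
    consumption = agent.solution[0].cFunc(2.0)  # at normalized resources m=2
    print(f"DiscFac={agent.DiscFac}: c(2.0)={float(consumption):.3f}")
```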

It’s these tools which I will be extending with insights from my other work, which is grounded more in computer science and legal scholarship, in order to model the data economy. Naturally, the economy for personal data depends on the heterogeneity of consumers — it is those differences that make a difference between consumers that make the trade in personal information possible and relevant. And while there are many notational and conventional differences between the orthodox Economics methods and the causal Bayesian frameworks that I’ve worked in before, these methods in fact share a logical core that makes them commensurable.

I’ve mentioned both orthodox and heterodox economics. By this I mean to draw a distinction between the core of the Economics discipline, which in my understanding is still tied to rational expectations and general equilibria — meaning the idea that agents know what to expect from the market and act accordingly — and heterodox views that find these assumptions to be dangerously unrealistic. This is truly a sore spot for Economics. As the trenchant critiques of Mirowski and Nik-Khah (2017) reveal, these core assumptions commit Economists to many absurd conclusions; however, they are loath to abandon them lest they lose the tight form of rigor which they have demanded to maintain a kind of standardization within the discipline. Rational expectations aligns economics with engineering disciplines, like control theory and artificial intelligence, which makes their methods more in-demand. Equilibrium theories give Economics a normative force and excuses when its predictions do not pan out. However, the 2008 Financial Crisis embarrassed these methods, and now the emerging HAM techniques include not only a broadened form of rational agent modeling, but also a much looser paradigm of Agent-Based Modeling (ABM) that allows for more realistic dynamics with boundedly rational agents (Bookstaber, 2017).

Today, the biggest forces in the economy are precisely those that have marshaled information to their advantage in a world with heterogeneous agents (Benthall and Goldenfein, 2021). Economic agents differ both horizontally — like consumers of different demographic categories such as race and sex — and vertically — as consumers and producers of information services have different relationships to personal data. As I explore in forthcoming work with Salome Viljoen (2021), the monetization of personal data has always been tied to the financial system, first via credit reporting, and later through the financialization of consumer behavior through digital advertising networks. And yet the macroeconomic impact of the industries that profit from these information flows, which now account for the largest global companies, is not understood because of disciplinary blinders that Economics has had for decades and is only now trying to shed.

I’m convinced the research is well motivated. The objection, which comes from my most well-meaning mentors, is that the work is too difficult or in fact impossible. Introducing heterogeneously bounded rationality into economic modeling creates a great deal of modeling and computational complexity. Calibrating, simulating, and testing such models is expensive, and progress requires a great deal of technical thinking about how to compute results efficiently. There are also many social and disciplinary obstacles to this kind of work: for the reasons discussed above, it’s not clear where this work belongs.

However, I consider myself immensely fortunate to have a real, substantive, difficult problem to work on, and enough confidence from the National Science Foundation that they support my trying to solve it. It’s an opportunity of a lifetime and, to be honest, as a researcher who has often felt at the fringes of a viable scholarly career, a real break. The next steps are exciting and I can’t wait to see what’s around the corner.

References

Benthall, S., & Goldenfein, J. (2021, May). Artificial Intelligence and the Purpose of Social Systems. In Proceedings of the 2021 AAAI/ACM Conference on AI Ethics and Society (AIES’21).

Benthall, S., & Seth, M. (2020). Software Engineering as Research Method: Aligning Roles in Econ-ARK.

Benthall, S. & Viljoen, S. (2021) Data Market Discipline: From Financial Regulation to Data Governance. J. Int’l & Comparative Law https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3774418

Bookstaber, R. (2017). The end of theory. Princeton University Press.

Carroll, C. D., Kaufman, A. M., Kazil, J. L., Palmer, N. M., & White, M. N. (2018). The Econ-ARK and HARK: Open Source Tools for Computational Economics. In Proceedings of the 17th Python in Science Conference (pp. 25-30).

Mirowski, P., & Nik-Khah, E. (2017). The knowledge we have lost in information: the history of information in modern economics. Oxford University Press.

Crypto, macro, and information law

Dogecoin is in the news this week because of Elon Musk’s pump and dump in the latest of notable asset bubbles fueled in small part by Internet-informed, perhaps frivolous, day-traders. The phenomenon reminds me of this curious essay about viral art. It concludes:

The doge meme is a Goldsmithian piece, passing ephemerally through a network of peers. In a LaBeoufian moment, Jackson Palmer invented Dogecoin, capturing the meme and using it to leverage networks of power. Now it is art post-LaBeouf in its greatest form: authorless art as economic power, transmitted over networks. As the synthesized culmination of the traditions of economics and Western literature, DogeCoin is one of the greatest achievements in the history of art, if not the greatest.

This paragraph is perhaps best understood, if at all, as an abstruse joke. The essay is credited to “Niklos Szabo”, a name easily conflated with Nick Szabo, one of the deeper thinkers behind cryptocurrency more generally; it is most likely not his work. The real Szabo has written much more seriously and presciently about culture and the economy. As an aside, I believe Szabo’s writings about book consciousness prefigure Hildebrandt’s work on the role of the printing press as a medium contributing to the particular character of text-driven law. However, the enduring success of cryptocurrencies validates Szabo’s economics more than his cultural theory. His 2002 paper “Shelling out: the origins of money” is a compelling history of currency. Notably, it is not a work of formal economic theory. Rather, it draws on historical and anthropological examples to get at the fundamentals of the role currency plays in society. This study leads to the conclusion that currency must be costly to create and transferable with relatively low transaction costs. Bitcoin, for example, was designed to have these qualities.

What Szabo does not discuss in “Shelling out” is the other thing Bitcoin is most known for, which is speculative asset bubble pricing. Cryptocurrency has lurched into the mainstream in fits of speculative enthusiasm followed by crashes and breakdowns. It is risky.

Salome Viljoen and I are writing about financial regulations as part of our “Data Market Discipline” project. One takeaway from this work is that the major financial regulations in the United States were responses to devastating financial crises, such as the Great Depression and the 2008 financial crisis, which were triggered by the collapse of an asset bubble. So while currency is an old invention and the invention of new currencies is interesting, the project of maintaining a stable financial system is a relatively more recent legal project and an unfinished one at that. It is so much more unfinished for cryptocurrencies, which are not controlled by a central banking system, than for national fiat currencies for which, for example, interest rates can be used as a calibrating tool.

These are not idle theoretical points. Rather, they are at the heart of questions surrounding the recovery of the economy from COVID-related setbacks. Money from stimulus checks going to people who have no reason to increase their consumption (cf. Carroll et al., 2020) is perhaps responsible for the influx of retail investment into equities markets and, in particular, Reddit-coordinated asset bubbles such as the ones we’re seeing recently with Gamestop and Dogecoin. The next stimulus package being prepared by the Biden administration has raised alarms in parts of the Economics establishment that it will spur inflation, while Janet Yellen has argued that this outcome can be prevented using standard monetary policy tools such as raising interest rates. Arguably, the recent rising price of Bitcoin is due to this threat to the macroeconomic stability of the dollar-denominated financial system.

I don’t mean any of this conclusively. Rather, I’m writing this to register my growing realization that the myriad Internet effects on culture, economy, and the law are often much more motivated by movements in internationally coupled financial systems than “technology policy” specialists or “public interest technologists” are inclined to admit. We are inclined, because of our training in something else — whether it be computer science, environmental law, political philosophy, or whatever — to seek out metaphors from our own domain of expertise. But many of the most trenchant analyses of why the current technological landscape seems a bit off come down to failures of the price mechanism in the digital economy. I’m thinking of Kapczynski’s (2011) critique of the price mechanism in relation to intellectual property, and Strandburg’s (2013) analysis of the failure of pricing in online services. We have, on the one hand, the increasingly misconceptualized “Silicon Valley”’s commitment to a “free market” and, on the other hand, few of the conditions under which a “free market” is classically considered to be efficient. The data economy does not meet even classically liberal (let alone New, more egalitarian, Liberal) standards of justice. And liberal legal theory is not equipped, Jake Goldenfein and I have argued, to grapple with this reality.

What progress can be made?

Maybe there is something somebody with enormous wealth or institutional power could do to change the situation. I’m not one of those people. However, there is some evidence to support the point that at the root of these problems is a conceptual, intellectual failure to understand what’s going on.

In some recent work with Kathy Strandburg, we are examining the conceptual roots of the highly influential Law and Economics (L&E) branch of legal scholarship. This field absorbs the techniques of neoclassical economics and develops them into actionable policy proposals and legal rules of thumb. It has come under political criticism from the recently formed Law and Political Economy (LPE) movement. Interestingly, it has also been critiqued from a “Law and Macroeconomics” perspective, which argues that L&E should really be called “law and microeconomics”, because of its inability to internalize macroeconomic concepts such as the business cycle or changes in monetary policy.

Among the assumptions at the roots of L&E are notions of optimality and efficiency that make somewhat naive assumptions about the nature of price and money. For example, Kaldor-Hicks efficiency, a relaxation of Pareto efficiency used in welfare economics as applied to L&E, allows for transactions that alter the situations of agents so long as one agent, who gains, could theoretically compensate the other for their losses (see Feldman, 1998). This concept is used to identify a social welfare optimum, resolving the neoclassical problem of the incomparability of individual utilities through an implicit pricing mechanism. This leads L&E to favor “wealth maximizing” policies.
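A toy example (the numbers are invented) makes the logic concrete:

```python
# A policy change gives agent A a gain worth $100 and imposes a $60 loss
# on agent B, with no transfer actually made.
gain_a, loss_b = 100, 60

# Pareto improvement requires that no one be made worse off: fails here.
is_pareto_improvement = gain_a >= 0 and -loss_b >= 0  # False

# Kaldor-Hicks only requires that the winner *could* compensate the
# loser: any hypothetical transfer t with 60 < t < 100 would leave both
# better off, so the change counts as efficient even without the payment.
is_kaldor_hicks_improvement = gain_a - loss_b > 0     # True

print(is_pareto_improvement, is_kaldor_hicks_improvement)
```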

However, grounding legal theory in the idea of a robust price mechanism capable of subsuming all differences in individual preferences is quite naive in a digital economy that is always already at the intersection of many different currencies (including cryptocurrency) and variable and politically vulnerable systems of credit and debt, and characterized by markets that do not have the legal scaffolding needed to drive them towards “true” prices. If Mirowski and Nik-Khah (2017) are correct and Economists have abandoned earlier notions of “truth” for faith in the market’s price as a “truth” derived from streams of information, something is indeed amiss. Data is not a commodity, and regulations that treat data flows as commodity exchanges are not well matched to the reality. In the Hayekian model, price is the signal that combines available information. In the data economy, the complex topology of real data flows belies simplistic views of “the market”.

What tech law needs is a new economic model, one that, just as general relativity in physics showed how classical mechanics was a special case of more complex universal laws, reveals how data, intellectual property, and price are connected in ways that go beyond the classical liberal imagination.

References

Benthall, Sebastian and Viljoen, Salome, Data Market Discipline: From Financial Regulation to Data Governance (January 27, 2021). J. Int’l & Comparative Law (2021).

Carroll, C. D., Crawley, E., Slacalek, J., & White, M. N. (2020). Modeling the consumption response to the CARES Act (No. w27876). National Bureau of Economic Research.

Feldman, A. M. (1998). Kaldor-Hicks compensation. The New Palgrave Dictionary of Economics and the Law, 2, 417-421.

Hildebrandt, M. (2015). Smart technologies and the end(s) of law: novel entanglements of law and technology. Edward Elgar Publishing.

Kapczynski, A. (2011). The cost of price: Why and how to get beyond intellectual property internalism. UCLA L. Rev., 59, 970.

Mirowski, P., & Nik-Khah, E. (2017). The knowledge we have lost in information: the history of information in modern economics. Oxford University Press.

Shekman, David. “Gamestop and the Surrounding Legal Questions.” Medium, Medium, 5 Feb. 2021, medium.com/@shekman27/gamestop-and-the-surrounding-legal-questions-fc0d1dc142d7.

Strandburg, K. J. (2013). Free fall: The online market’s consumer preference disconnect. U. Chi. Legal F., 95.

Szabo, N. (2002). Shelling out: the origins of money. Satoshi Nakamoto Institute.

Szabo, Niklos. “Art Post-LaBeouf.” Medium, Medium, 22 Sept. 2014, medium.com/@niklosszabo/art-post-labeouf-b7de5732020c.

Schumpeter on Marx as Prophet and Sociologist

Continuing to read Schumpeter’s Capitalism, Socialism, and Democracy (1942). As I mentioned in a previous post, I was surprised to find that, in a book I thought would tell me about the currently prevailing theory of platform monopoly and competition, Schumpeter’s first few chapters are devoted entirely to a consideration of Karl Marx.

Schumpeter’s treatment of Marx is the epitome of respectful disagreement. Each chapter in his treatment, “Marx the Prophet”, “Marx the Sociologist”, “Marx the Economist”, and “Marx the Teacher”, is brimming with praise and reverence for the intellectual accomplishments of Marx. Schumpeter is particularly sensitive to the value of Marx’s contributions in their historical context: they exceeded what came before them and introduced many critical new ideas and questions.

Contextualizing it thus, Schumpeter then engages in a deep intellectual critique of Marx, pointing out many inconsistencies and omissions of the doctrine and of the contemporary Marxist or Marxian tendencies of his time.

“Marx the Prophet”

Schumpeter’s first chapter on Marx addresses the question of why Marx has such a devoted following, one that exceeds that of any other social scientist or economist. He does not find it plausible that Marx has been so attractive because of his purely intellectual analysis. The popular following of Marx far exceeds those who have engaged deeply with Marx’s work. So Schumpeter’s analysis is about the emotional power of Marxian thought. This is a humanistic discussion foremost. It is a discussion, quite literally and unmetaphorically, of religion.

In one important sense, Marxism is a religion. To the believer it presents, first, a system of ultimate ends that embody the meaning of life and are absolute standards by which to judge events and actions; and, secondly, a guide to those ends which implies a plan to salvation and the indication of the evil from which mankind, or a chosen section of mankind, is to be saved. We may specify further: Marxist socialism also belongs to that subgroup which promises paradise on this side of the grave.

Schumpeter believes that Marxism is successful because at some necessary points Marx sacrificed logical integrity for what was in effect good marketing to an audience that had an emotional need for his message.

This need came about in part because, with the success of bourgeois capitalism, other religions had begun to wane in influence. “Faith in any real sense was rapidly falling away from all classes of society”, leaving “the workman” literally hopeless. “Now, to millions of human hearts the Marxian message of the terrestrial paradise of socialism meant a new ray of light and a new meaning of life.” This acknowledgement of the emotional power of Marxism is not meant to be dismissive; on the contrary, what made it successful was that “the message was framed and conveyed in such a way as to be acceptable to the positivistic mind of its time.” He succeeded partly by “formulat[ing] with unsurpassed force that feeling of being thwarted and ill treated which is the auto-therapeutic attitude of the unsuccessful many, and, on the other hand, by proclaiming that socialistic deliverance from these ills was a certainty amenable to rational proof.”

Marxism offered certainty of one’s course of action for people who were otherwise despairing, all in a form that seemed consistent with dominant rationalist and scientific modes of thought.

The religious quality of Marxism also explains the characteristic attitude of the orthodox Marxist towards opponents. To him, as any believer in a Faith, the opponent is not merely in error but in sin. Dissent is disapproved of not only intellectually but also morally. There cannot be any excuse for it once the Message has been revealed.

Schumpeter does find an intellectual weakness in Marxism here. He argues that Marxism excites individual feelings and attempts to direct them towards class consciousness, an idea that depends on theoretical assumptions about the logic of social evolution. Schumpeter is doubtful about this logic of class formation. He writes that “the true psychology of the workman… centers in the wish to become a small bourgeois and to be helped to that status by political force.” This question of the structure of social classes is addressed in the next chapter.

Discussion:

Religion is a difficult topic for mainstream research and scholarship today, especially in the United States. In applications for university lecturer positions in the United Kingdom, there is commonly a section devoted to the applicant’s experience with “pastoral care”. The history of universities in the UK is tied up with religious history, and this resonates through the expectation that part of the role of a professor is to tend to the spiritual needs, in the broadest possible sense, of the students.

The equivalent section in applications in the United States is a section asking the applicant to discuss their experiences fostering or representing diversity, equity, and inclusion. These terms have had many meanings, but it requires a special amount of mental density to not read this as relating to the representation and treatment of minorities. This is a striking indication that the prevailing “religion” of education institutions in the U.S. is indeed a form of political progressivism.

There is a wide variance of opinion about how much contemporary progressive ideals are aligned with or indebted to Marxian ones. I’m not sure I can add to that debate. Here, I am simply noting that both Marxism and progressivism have some of these religious traits in common, including perhaps the promise of a new material world order and a certain amount of epistemic closure.

Naturally there are more and less intellectual approaches to both Marxism and contemporary progressivism. There are priests and lay people of every good religion. Schumpeter’s analysis proceeds as intellectual critique.

“Marx the Sociologist”

Schumpeter next addresses the sociological content of Marx. He argues that though Marx was a neo-Hegelian and these philosophical themes permeate his work, they do not dominate it. “Nowhere did he betray positive science to metaphysics.” He brought a powerful command of contemporary social facts to his work, and used them persuasively in his arguments in a way that raised the standard of empiricism in the scholarship of his time. And the result of this empiricism is Marx’s Economic Interpretation of History, according to which economic conditions shape and account for the rise and fall of the world of ideas: religions, metaphysics, schools of art, political volitions, etc. Ideas and values “had in the social engine the role of transmission belts.” This is as opposed to a vulgar interpretation that would assume all individual motives can be reduced to individual economic motives; that is a misrepresentation.

Schumpeter views the term “materialism”, as applied to Marx, as meaningless, and mentions in a footnote his encounters with Catholic radicals who “declared themselves Marxists in everything except in matters related to their faith” with perfect consistency.

Schumpeter instead condenses Marx’s view of history into two statements:

  • “The forms or conditions of production are the fundamental determinant of social structures”: “[T]he ‘hand mill’ creates feudal, and the ‘steam-mill,’ capitalist societies.” Technology thus becomes a driving factor of social change, though technology is understood in its fullness as situated sociotechnical process.
  • The forms of production have a logic of their own. The hand-mill and steam-mill each create social orders which ultimately outgrow their own frame and lead to the practical necessity of the next technological advance.

This smacks of “technological determinism”, which is full-throatedly rejected by more contemporary sociological and anthropological scholars. And Schumpeter points out this weakness as well, in a particular operational form: he notes that many social structures are quite durable, persisting past the technological context of their origins. This is a weakness of Marx’s work. There are historical facts, such as the emergence of feudal landlordism in the sixth century, which run counter to Marx’s analysis. The implication is that unless one is taking Marx religiously, one would take his arguments seriously enough to engage them as positive science, and then refine one’s views in light of contradictory evidence. This all can be done with ample respect for Marx’s work. Schumpeter is warning against a fundamentalist use of Marx.

This is a buildup to his analysis of the next major sociological theme of Marx, the Theory of Social Classes. Schumpeter credits Marx with the introduction of the important idea of social class. The important claim made by Marx is that a social class is not simply a set of individuals that have something in common. Rather, classes are theorized as a social form, “live entities that exist as such”, emergent beings with their own causal force. Marxism rejects methodological individualism.

Once we understand social classes to be social forms in themselves, it becomes sensible to discuss “class struggle”, an important Marxist idea. Schumpeter seems to believe that the strongest form of the idea of class struggle is incorrect, but a weaker version, “the proposition that historical events may often be interpreted in terms of class interests and class attitudes and that existing class structures are always an important factor in historical interpretation”, is a valuable contribution.

“Clearly, success on the line of advance opened up by the principle of class struggle depends upon the validity of the particular theory of classes we make our own. Our picture of history and all our interpretations of cultural patterns and the mechanism of social change will differ according to whether we choose, for instance, the racial theory of classes and like Gobineau reduce human history to the history of the struggle of races or, say, the division of labor theory of classes in the fashion of Schmoller or of Durkheim and resolve class antagonisms into antagonisms between the interests of vocational groups. Nor is the range of possible differences in analysis confined to the problem of the nature of classes. Whatever view we may hold about it, different interpretations will result from different definitions of class interest and from different opinions about how class action manifests itself. The subject is a hotbed of prejudice to this day, and as yet hardly in its scientific stage.”

Schumpeter sees Marx’s own theory of the nature and action of social classes as incomplete and under-specified. “The theory of his chief associate, Engels, was of the division of labor type and essentially un-Marxian in its implications.” Finding Marx’s true theory of social classes is, in Schumpeter’s view, a delicate task of piecing together disjoint parts of Das Kapital.

“The basic idea is clear enough, however. The stratifying principle consists in the ownership, or exclusion from ownership, of means of production such as factory buildings, machinery, raw materials and the consumers’ goods that enter in the workman’s budget. We have thus, fundamentally, two and only two classes, those owners, the capitalists, and those have-nots, who are compelled to sell their labor, the laboring class or proletariat. The existence of intermediate groups, such as are formed by farmers or artisans who employ labor but also do manual work, by clerks and by the professions is of course not denied; but they are treated as anomalies which tend to disappear in the course of the capitalist process.”

This sets up the most fundamental antagonism as that over the private control over the means to produce. The very nature of this relation is strife, or class war.

The crucial question raised by this framing is the question of primitive accumulation, “that is to say, how capitalists came to be capitalists in the first instance.” Here, Schumpeter calls shenanigans on Marx. Marx rejects wholesale the idea that some people became capitalists rather than others due to superior intelligence, work ethic, and saving. Schumpeter believes this ‘children’s tale’, “while far from telling the whole truth, yet tells a good deal of it”. Schumpeter is a believer in entrepreneurial wit, energy, and frugality as accounting for “the founding of industrial positions in nine cases out of ten.” And yet, he agrees that saving alone, as perhaps implied by classical economics predating Marx, does not account for capital accumulation.

Here, Schumpeter begins to work with some economic facts. Some people save. But saving does not in general turn one into a capitalist. Rather, typically an enterprise is begun by borrowing other people’s savings. Banks arise as the intermediary between household savings and entrepreneurial capital investments.

Schumpeter attributes to Marx a bad faith or at least simplistic rejection of this theory–a popularly applauded “guffaw”–that paves the way for an alternative theory: that primitive accumulation was the result of force or robbery. This is a popular theory. But Schumpeter argues that it begs the question. For how is it that “some people acquire the power to subjugate and rob”? Marx’s answer to this is a historical argument: feudalism was a classist regime of force. Feudal inequality gave way to capitalist inequality. The core logic of this idea is considered, skeptically, in several footnotes. For example, in one, Schumpeter asks whether it is more likely that control over cannons gives one power, or if power gives one control over cannons.

Schumpeter remains incredulous, as he sees Marx’s theory of primitive accumulation as avoidant of the main phenomenon that it undertakes to explain. He points to the phenomenon of medium-sized owner-managed firms. Where do they come from? Class positions, he argues, are more often the cause of economic conditions than the other way around, as “business achievement is obviously not everywhere the only avenue to social eminence and only where it is can ownership of means of production causally determine [a] group’s position in the social structure.” Schumpeter also questions the implied hereditary nature of Marx’s theory of social class, as he sees class mobility (both upward and downward) as a historical fact. For Schumpeter, the empirical counterarguments to Marx here are all “obvious”.

Schumpeter then places the value of Marxist theory instead in the propagandist joining of the Economic Interpretation of History and his theory of Social Classes, which together have more tightly deterministic implications than either does individually. Here Schumpeter makes all kinds of heretical points. For example, socialism, “which in reality has nothing to do with the presence or absence of social classes”, became, for Marx, the only possible kind of classless society. Why? It is so by virtue of tautology, given the definitions Marxist theory provides. But this begins to crumble once the strict binary of social classes is eroded into something more realistic. Schumpeter argues that, contra Marx, in normal times the relationship between labor and capital is “primarily one of cooperation and that any theory to the contrary must draw largely on pathological cases for verification.” You wouldn’t have the grounds for antagonism at all, he points out, if you didn’t have so much cooperation to work with; indeed, in Schumpeter’s view the two are inseparable.

Ultimately, Schumpeter believes Marx’s theory of social classes depends on this economic theory, grounded in economic facts. The sociological theory of social classes is compelling to many in its own right, but does not hold up to scrutiny in itself. “Marx the Economist” is the subject of the next chapter.

Discussion:

Schumpeter is treating Marx dialectically, attempting to lay out the scope of his argument in its popularly understood, schematic form and showing how, while tautological in its structure, it depends ultimately on some more nuanced theories of economics which will no doubt be questioned in the next chapter.

Comparing Schumpeter’s analysis of Marx with the contemporary economy, we see all sorts of confusions that seem to violate that Marxian class binary. There are multiple social classes, many of which seem to have a far more ambiguous relationship to capital than either the proletariat or the capitalists. The relatively modern idea that one’s savings, however they are earned, should be invested directly into the stock market (a market for ownership over capital) rather than into a bank that then lends to companies has, it’s been said, given “everyone” with substantial savings a stake in the capitalist economy. What does this mean for Marx?

Economic sociology, such as that of Bourdieu, has since developed a far more nuanced analysis of social classes. It is an empirical question, truly, what sociological theory of social classes is most valid; it is unlikely to be anything simple, given how richly textured the social field is in fact. On the other hand, there is much to be learned from a theory of history that gives weight to economic forces, especially economic forces broadly construed. Schumpeter is asking us to try a little harder to understand what actually happens historically, including the plurality of explanations for a large aggregate social fact, rather than fall for the emotional potency and simplistic tautology Marx provides.

Starting to read Schumpeter

I’ve started reading Schumpeter’s Capitalism, Socialism, and Democracy (1942).

It’s Schumpeter.

Why? Because of the Big Tech anti-trust hearings. I’ve heard that:

(a) U.S. anti-trust policy is based on a theory of monopoly pricing which is not bearing out with today’s Big Tech monopolies,

(b) possibly those monopolies are justified on the basis of Schumpeterian “creative destruction” competition, wherein one monopoly gets upended by another in sequence, rather than having many firms competing all at once on the market,

(c) one of the major shots taken at Amazon in the hearings is that it would acquire companies that it saw as a threat, indicating a strategic understanding of Schumpeterian competition on the part of e.g. Bezos, and also how one can maintain a monopolistic position despite that competition,

(d) this idea of capitalism and entrepreneurship seems fundamentally correct, still somehow formally undertheorized, and tractable with some of the simulation methods I’ve been learning recently with Econ-ARK and NYU’s ABM Lab.

All good signs. But who was Schumpeter and what did he think? I can’t really say I know. So I’m returning to my somewhat antiquated method/habit/hobby of Actually Reading the Book.

A few striking things about the book based entirely on its Prefaces (1942, and the later one from 1946):

  • Schumpeter is quite consciously trying to make accurate descriptive claims without normative policy implications, and he’s kind of annoyed by readers who think he’s doing anything but objective analysis. His enemy is ideology. He apparently gets misunderstood a lot as a result. I think I can hang with this dude.
  • The first section of this book is dedicated to a long treatment of the work of Karl Marx. This opens with the idea that Karl Marx is a great theorist not so much because he’s right or wrong, but because his ideas survive from generation to generation. This view of theoretical greatness prefigures, I think, his view of economic greatness: an evolutionary battle of competing beings whose success is defined by their Darwinian survival. Schumpeter takes on Marx with great respect. I expect him to proceed to a dismantling of Marx, though he agrees with Marx that capitalism ends up destroying itself with its accomplishments. He says this as a pro-capitalist, which is interesting.
  • He points out, somewhat amusingly, that Marx is popular (at the time of his writing, the 1940’s) in Russia, where his work has been misinterpreted by the Bolsheviks, and, for some reason that mystifies him, in the United States, but not in the place most deeply familiar with Marx, which is Germany. German socialists, he notes, reason just like economists everywhere else. Since I find that in academic circles Marxist ideas are still fashionable, but other forms of economics, let alone socialist economics, are less so, I have to see Schumpeter as making yet another enduring point here.
  • In the 1946 preface, he mentions an objection by professional economists to his work: while Schumpeter predicts that profits in capitalism will fall over time, this view is critiqued for not apparently taking into account the return on salesmanship or something like that. Schumpeter then says something interesting: sales is considered as the wages of management. What he’s talking about is the profitability of new goods, new production methods, new processes, etc.: i.e., the sort of stuff that would be actually valuable, directly or indirectly, to consumers. This is interesting. Because given a Herbert Simon view of organizations, management processes are precisely what have been changing so dramatically with the “tech economy”–all this AI stuff is really just about streamlining management processes, sales, etc. SO: what does it mean if Schumpeterian competition winds up being nullified by monopolies of managerial power, as opposed to monopolies of something more substantive? This whole complex of information technology and management being produced and marketed as commodities or securities or something else, what we might in a very extended sense call capital markets, is just the sort of thing that neither Marx nor most early economists would get and what actually dominates the economy now. So, let us proceed.

from morality to economics: some stuff about Marx for Tapan Parikh

I work on a toolkit for heterogeneous agent structural modeling in Economics, Econ-ARK. In this capacity, I work with the project’s creators, who are economists Chris Carroll and Matt White. I think this project has a lot of promise and am each day more excited about its potential.

I am also often in academic circles where it’s considered normal to just insult the entire project of economics out of hand. I hear some empty, shallow snarking about economists about once every two weeks. I find this kind of professional politics boring and distracting. It’s also often ignorant. I wanted to connect a few dots to try to remedy the situation, while also noting some substantive points that I think fill out some historical context.

Tracking back to this discussion of morality in the Western philosophical tradition and what challenges it today, the focal character there was Immanuel Kant, who for the sake of argument espoused a model of morality based on universal properties of a moral agent.

Tapan Parikh has argued (in personal communications) that I am “a dumb ass” for using Kant in this way, because Kant is on the record for writing some very racist things. I feel I have to address this point. No, I’m not going to stop working with the ideas from the Western philosophical canon just because so many of them were racist. I’m not a cancel culturist in any sense. I agree with Dave Chappelle on the subject of Louis C.K., for example.

However, it is actually essential to know whether or not racism is a substantive, logical problem with Kant’s philosophy. I’ll defer to others on this point. A quick Googling of the topic seems to indicate one of two things. Either Kant was inconsistent, and was a racist while also espousing universalist morality, and that tells us more about Kant the person than it does about universalist morality–the universalist morality transcending Kant’s human failings in this case (Allais, 2016). Or Kant actually became less racist during the period in which he was most philosophically productive, which was late in his life (Kleingeld, 2007). I like this latter story better: Kant, being an 18th century German, was racist as hell; then he thought about it a bit harder, developed a universalist moral system, and became, as a consequence, less racist. That seems to be a positive endorsement of what we now call Kantian morality, which is a product of that later period and not the earlier virulently racist period.

Having hopefully settled that question, or at least smoothed it over sufficiently to move on, we can build in more context. Everybody knows this sequence:

Kant -> Hegel -> Marx

Kant starts a transcendental dialectic as a universalist moral project. Hegel historicizes that dialectic, in the process taking into serious consideration the Haitian rebellion, which inspires his account of the Master/Slave dialectic, which is quite literally about slavery and how it is undone by its internal contradictions. The problem, to make a long story short, is that the Master winds up being psychologically dependent on the Slave, and this gives the Slave power over the Master. The Slave’s rebellion is successful, as has happened in history many times. This line of thinking results in, if my notes are right (they might not be), Hegel’s endorsement of something that looks vaguely like a Republic as the end-of-history.

He dies in 1831, and Marx picks up this thread, but famously thinks the historical dialectic is material, not ideal. The Master/Slave dialectic is transposed onto the relationship between Capital and the Proletariat. Capital exploits the Proletariat, but needs the Proletariat. This is what enables the Proletariat to rebel. Once the Proletariat rebel, says Marx, everybody will be on the same level and there will be world peace. I.e., communism is the material manifestation of a universalist morality. This is what Marx inherits from Kant.

But wait, you say. Kant and Hegel were both German Idealists. Where did Marx get this materialist innovation? It was probably his own genius head, you say.

Wrong! Because there’s a thread missing here.

Recall that it was David Hume, a Scotsman, whose provocative skeptical ideas roused Kant from his “dogmatic slumber”. (Historical question: Was it Hume who made Kant “woke” in his old age?) Hume was in the line of Anglophone empiricism, which was getting very bourgey after the Whigs and Locke and all that. Buddies with Hume is Adam Smith who was, let’s not forget, a moral philosopher.

So while Kant is getting very transcendental, Smith is realizing that in order to do any serious moral work you have to start looking at material reality, and so he starts Economics in England.

This next part I didn’t really realize the significance of until digging into it. Smith dies in 1790, just around when Kant is completing the moral project he’s famous for. At that time, the next major figure is 18, coming of age. It’s David Ricardo: a Sephardic Jew turned Unitarian, a Whig, a businessman who makes a fortune speculating on the Battle of Waterloo, who winds up buying a seat in Parliament because you could do that then, and also winds up doing a lot of the best foundational work on economics including inventing the labor theory of value. He was also, incidentally, an abolitionist.

Which means that to complete one’s understanding of Marx, you have to also be thinking:

Hume -> Smith -> Ricardo -> Marx

In other words, Marx is the unlikely marriage of German Idealism, with its continued commitment to universalist ethics, with British empiricism which is–and I keep having to bring this up–weak on ethics. Empiricism is a bad way of building an ethical theory and it’s why the U.S. has bad privacy laws. But it’s a good way to build up an economic materialist view of history. Hence all of Marx’s time looking at factories.

It’s worth noting that Ricardo was also the one who came up with the idea of Land Value Taxation (LVT), which later Henry George popularized as the Single Tax in the late 19th/early 20th century. So Ricardo really is the pivotal figure here in a lot of ways.

In future posts, I hope to be working out more of the background of economics and its connection to moral philosophy. In addition to trying to make the connections to my work on Econ-ARK, there’s also resonances coming up in the policy space. For example, the Law and Political Economy community has been rather explicitly trying to bring back “political economy”–in the sense of Smith, Ricardo, and Marx–into legal scholarship, with a particular aim at regulating the Internet. These threads are braiding together.

References

Allais, L. (2016). Kant’s racism. Philosophical Papers, 45(1-2), 1-36.

Kleingeld, P. (2007). Kant’s second thoughts on race. The Philosophical Quarterly, 57(229), 573-592.

Land value taxation

Henry George’s Progress and Poverty, first published in 1879, is dedicated

TO THOSE WHO, SEEING THE VICE AND MISERY THAT SPRING FROM THE UNEQUAL DISTRIBUTION OF WEALTH AND PRIVILEGE, FEEL THE POSSIBILITY OF A HIGHER SOCIAL STATE AND WOULD STRIVE FOR ITS ATTAINMENT

The book is best known as an articulation of the idea of a “Single Tax [on land]”, a circa 1900 populist movement to replace all taxes with a single tax on land value. This view influenced many later land reform and taxation policies around the world; the modern name for this sort of policy is Land Value Taxation (LVT).

The gist of LVT is that the economic value of owning land comes both from the land itself and from the improvements built on top of it. The value of the underlying land over time is “unearned”–it does not require labor to maintain; it comes mainly from the artificial monopoly right over the land’s use. This can be taxed and redistributed without distorting incentives in the economy.
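A toy calculation (invented numbers, and a hypothetical 5% rate) shows the incentive argument:

```python
# A parcel's assessed value splits into land and improvements.
land_value = 300_000    # location value: "unearned"
improvements = 200_000  # structures: produced by labor and capital
lvt_rate = 0.05         # hypothetical 5% annual tax on land value only

tax_before = lvt_rate * land_value
# The owner builds a $150,000 extension. Under LVT the bill is unchanged,
# so the tax does not penalize development the way a conventional property
# tax (levied on land value + improvements) would.
improvements += 150_000
tax_after = lvt_rate * land_value

print(tax_before, tax_after)  # 15000.0 15000.0
```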

Phillip Bess’s 2018 article provides an excellent summary of the economic arguments in favor of LVT. Michel Bauwens’s P2P Foundation article summarizes where it has been successfully put in place. Henry George was an American, but Georgism has been largely an export. General MacArthur was, it has been said, a Georgist, and this accounts for some of the land reform in Asian countries after World War II. Singapore, which owns and rents all of its land, is organized under roughly Georgist principles.

This policy is neither “left” nor “right”. Wikipedia has sprouted an article on geolibertarianism, a term that to me seems a bit sui generis. The 75th-anniversary edition of Progress and Poverty, published 1953, points out that one of the promises of communism is land reform, but it argues that this is a false promise. Rather, Georgist land reform is enlightened and compatible with market freedoms, etc.

I’ve recently dug up my copy of Progress and Poverty and begun to read it. I’m interested in mining it for ideas. What is most striking about it, to a contemporary reader, is the earnest piety of the author. Henry George was clearly a quite religious man, and wrote his lengthy and thorough political-economic analysis of land ownership out of a sincere belief that he was promoting a new world order which would preserve civilization from collapse under the social pressures of inequality.

A note towards formal modeling of informational capitalism

Cohen’s Between Truth and Power (2019) is enormously clarifying on all issues of the politics of AI, etc.

“The data refinery is only secondarily an apparatus for producing knowledge; it is principally an apparatus for producing wealth.”

– Julie Cohen, Between Truth and Power, 2019

Cohen lays out the logic of informational capitalism in comprehensive detail. Among her authoritatively argued points is that scholarly consideration of platforms, privacy, data science, etc. has focused on the scientific and technical accomplishments undergirding the new information economy, but that really its key institutions, the platform and the data refinery, are first and foremost legal and economic institutions. They exist as businesses; they are designed to “extract surplus”.

I am deeply sympathetic to this view. I’ve argued before that the ethical and political questions around AI are best looked at by considering computational institutions (1, 2). I think getting to the heart of the economic logic is the best way to understand the political and moral concerns raised by information capitalism. Many have argued that there is something institutionally amiss about informational capitalism (e.g. Strandburg, 2013); a recent CfP went so far as to say that the current market for data and AI is not “functional or sustainable.”

As far as I’m concerned, Cohen (2019) is the new gold standard for qualitative analysis of these issues. It is thorough. It is, as far as I can tell, correct. It is a dense and formidable work; I’m not through it yet. So while it may contain all the answers, I haven’t read them yet, which leaves me free to continue thinking about how I would solve the problems myself.

My perspective is this: it will require social scientific progress to crack the right institutional design and settle informational capitalism in a satisfying way. Because computation is really at the heart of what these economic institutions do, computation will need to be included within the social scientific models in question. But this is not particularly new; rather, it is implicitly already how things are done in many “hard” social science disciplines. Epstein (2006) draws the connections between classical game theoretic modeling and agent-based simulation, arguing that “The Computer is not the point”: rather, the point is that the models are defined in terms of mathematical equations, which, by the foundational results of computing, are amenable to being simulated or solved computationally. Hence, we have already seen a convergence of methods from “AI” into computational economics (Carroll, 2006) and sociology (Castelfranchi, 2001).
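
As a toy illustration of Epstein’s point, here is a minimal agent-based sketch of my own; it is not a model from any of the cited papers. The whole “model” is one update rule, a random pairwise exchange, and the computer merely iterates it on an initially equal distribution of wealth.

```python
import random

# A stylized exchange economy: N agents, random pairwise transfers.
# The model is the update rule inside the loop; the computer just iterates it.
random.seed(0)
N, T = 100, 10_000
wealth = [100.0] * N

for _ in range(T):
    i, j = random.sample(range(N), 2)           # two distinct agents meet
    transfer = 0.1 * min(wealth[i], wealth[j])  # bounded exchange
    wealth[i] -= transfer
    wealth[j] += transfer

wealth.sort()
top_decile_share = sum(wealth[-N // 10:]) / sum(wealth)
print(f"Top-decile wealth share after {T} exchanges: {top_decile_share:.2f}")
```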

This position is entirely consistent with Abebe et al.’s analysis of “roles for computing in social change” (2020). In that paper, the authors are concerned with “social problems of justice and equity”, loosely defined, which can potentially be addressed through “social change”. They defend the use of technical analysis and modeling as playing a positive role even according to the particular politics of the Fairness, Accountability, and Transparency research community. Abebe et al. also address backlashes against uses of formalism, such as that of Selbst et al. (2019); this rebuttal was necessary given the disciplinary fraughtness of the tech policy discourse.

What I am proposing in this note is something ever so slightly different. First, I am aiming at a different political problematic than the “social problems of justice and equity”. I’m trying to address the economic problems raised by Cohen’s analysis, such as the dysfunctionality of the data market. Second, I’d like to distinguish between “computing” as the method of solving a mathematical model’s equations and “computing” as an element of the object of study, the computational institution (or platform, or data refinery, etc.). Indeed, it is the wonder and power of computation that one computational process can be modeled within another. This point may be confusing for lawyers and anthropologists, but it should be clear to computational social scientists when we are talking about one or the other, though our scientific language has not yet settled on a lexicon for this. A toy illustration follows.
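
In the sketch below, the object of study is itself an algorithm: a trivial adaptive pricing rule, simulated inside another computation. Both the pricing rule and the buyer’s behavior are invented for this example.

```python
# Computing as object of study: the "firm" here is itself an algorithm,
# a trivial adaptive pricing rule, simulated inside another computation.

def pricing_bot(price: float, sold: bool, step: float = 0.05) -> float:
    """Raise the price after a sale, cut it after a missed sale."""
    return price * (1 + step) if sold else price * (1 - step)

def buyer_accepts(price: float, willingness_to_pay: float = 10.0) -> bool:
    return price <= willingness_to_pay

price = 5.0
for _ in range(50):
    sold = buyer_accepts(price)
    price = pricing_bot(price, sold)

print(f"Price settles near the buyer's willingness to pay: {price:.2f}")
```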

The next step for my own research here is to draw up a mathematical description of informational capitalism, or at least of the stylized facts about it implied by Cohen’s arguments. This is made paradoxically both easier and more difficult by the fact that much of this work has already been done. A simple search of the literature on “search costs”, “network effects”, “switching costs”, and so on brings up a lot of fine work. The economists have not been asleep all this time.
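
To fix ideas, here is one toy way such stylized facts might be encoded; the functional forms and parameters are my own assumptions, not anything taken from Cohen or from that literature. A platform’s value to a user grows with its installed base (a network effect), while leaving incurs a switching cost, so users can stay locked in even when a rival offers a better standalone product.

```python
# A toy model of platform lock-in; all functional forms and parameters
# are illustrative assumptions, not estimates from the literature.

def platform_utility(standalone: float, users: int, alpha: float = 0.01) -> float:
    """User utility = standalone product quality + network effect."""
    return standalone + alpha * users

incumbent_users = 1_000
switching_cost = 5.0

u_incumbent = platform_utility(standalone=1.0, users=incumbent_users)
u_entrant = platform_utility(standalone=4.0, users=0)  # better product, no base

# A user switches only if the entrant beats the incumbent by more
# than the switching cost.
switches = (u_entrant - u_incumbent) > switching_cost
print(f"Incumbent utility: {u_incumbent:.1f}")  # 11.0
print(f"Entrant utility:   {u_entrant:.1f}")    # 4.0
print(f"User switches:     {switches}")         # False: locked in
```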

But then why has it taken so long for the policy critiques of informational capitalism, including those around algorithmic opacity, to emerge? I have two conflicting hypotheses, one quite gloomy and the other exciting. The gloomy view is that I’m simply in the wrong conversation. The correct conversation, the one that has already adequately captured the nuances of the data economy, is happening elsewhere, maybe at an economics conference in Zurich or something, and this discursive field of lawyers and computer scientists and ethicists is effectively twiddling its thumbs, working on poorly framed problems, because it hasn’t caught up, and can’t catch up, with the other discourse.

The exciting view is that synthesizing the fragments of a solution from the various economics literatures with the most insightful legal analyses is an unsolved problem ripe for attention.

Edit: It took me a few days, but I’ve found the correct conversation. It is Ross Anderson’s Workshop on the Economics of Information Security. That makes perfect sense: Ross Anderson is a brilliant thinker in that arena. Naturally, as one finds, all the major results in this space are 10-20 years old. Quite probably, if I had found this one web page a couple of years ago, my dissertation would have been written much differently, and not so amateurishly.

It is supremely ironic to me that, in an economy characterized by a reduction in search costs, my search for answers in information economics has been so costly.

References

Abebe, R., Barocas, S., Kleinberg, J., Levy, K., Raghavan, M., & Robinson, D. G. (2020, January). Roles for computing in social change. In Proceedings of the 2020 Conference on Fairness, Accountability, and Transparency (pp. 252-260).

Carroll, C. D. (2006). The method of endogenous gridpoints for solving dynamic stochastic optimization problems. Economics Letters, 91(3), 312-320.

Castelfranchi, C. (2001). The theory of social functions: challenges for computational social science and multi-agent learning. Cognitive Systems Research, 2(1), 5-38.

Cohen, J. E. (2019). Between Truth and Power: The Legal Constructions of Informational Capitalism. Oxford University Press, USA.

Epstein, J. M. (2006). Generative social science: Studies in agent-based computational modeling. Princeton University Press.

Fraser, N. (2017). The end of progressive neoliberalism. Dissent, 2(1).

Selbst, A. D., Boyd, D., Friedler, S. A., Venkatasubramanian, S., & Vertesi, J. (2019, January). Fairness and abstraction in sociotechnical systems. In Proceedings of the Conference on Fairness, Accountability, and Transparency (pp. 59-68).

Strandburg, K. J. (2013). Free fall: The online market’s consumer preference disconnect. U. Chi. Legal F., 95.