I’ve picked up this delightful book again: David Foster Wallace’s *Everything and More: A Compact History of Infinity* (2003). It is *the* David Foster Wallace (the brilliant and sadly dead writer and novelist you’ve heard of) writing a history of mathematics, starting with the Ancient Greeks and building up to Georg Cantor’s mathematics of infinity.

It’s a brilliantly written book, written to educate its reader without any doctrinal baggage. Wallace doesn’t care whether he comes across as a mathematician or a historian; he’s just a great writer. And what comes through in the book is truly a history of the *idea* of infinity, with all the ways it reflected the intellectual climate and preconceptions of the mathematicians working on it. The book is full of mathematical proofs, blended seamlessly into the casual prose. The whole point is to build up the excitement and wonder of mathematical discovery: just how hard it was to come to appreciate infinity in the way we understand it mathematically today. A lot of this development had to do with the way mathematicians and scientists thought about their relationship to abstraction.

It’s a wonderful book that, refreshingly, isn’t obsessed with how everything has been digitized. Rather (just as one gem), it offers a historical perspective on what was perhaps an even more profound change: that time in the 1700s when suddenly everything started to be looked at as an expression of mathematical calculus.

To quote the relevant passage:

> As has been at least implied and will now be exposited on, the math-historical consensus is that the late 1600s mark the start of a modern Golden Age in which there are far more significant mathematical advances than anytime else in world history. Now things start moving really fast, and we can do little more than try to build a sort of flagstone path from early work on functions to Cantor’s infinicopia.
>
> Two large-scale changes in the world of math to note very quickly. The first involves abstraction. Pretty much all math from the Greeks to Galileo is empirically based: math concepts are straightforward abstractions from real-world experience. This is one reason why geometry (along with Aristotle) dominated mathematical reasoning for so long. The modern transition from geometric to algebraic reasoning was itself a symptom of a larger shift. By 1600, entities like zero, negative integers, and irrationals are used routinely. Now start adding in the subsequent decades’ introductions of complex numbers, Napierian logarithms, higher-degree polynomials and literal coefficients in algebra–plus of course eventually the 1st and 2nd derivative and the integral–and it’s clear that as of some pre-Enlightenment date math has gotten so remote from any sort of real-world observation that we and Saussure can say verily it is now, as a system of symbols, “independent of the objects designated,” i.e. that math is now concerned much more with the logical relations between abstract concepts than with any particular correspondence between those concepts and physical reality. The point: It’s in the seventeenth century that math becomes primarily a system of abstractions from other abstractions instead of from the world.
>
> Which makes the second big change seem paradoxical: **math’s new hyperabstractness turns out to work incredibly well in real-world applications.** In science, engineering, physics, etc. Take, for one obvious example, calculus, which is exponentially more abstract than any sort of ‘practical’ math before (like, from what real-world observation does one dream up the idea that an object’s velocity and a curve’s subtending area have anything to do with each other?), and yet it is unprecedentedly good for representing/explaining motion and acceleration, gravity, planetary movements, heat–everything science tells us is real about the real world. Not at all for nothing does D. Berlinski call calculus “the story this world first told itself as it became the modern world.” Because what the modern world’s about, what it *is*, is science. And it’s in the seventeenth century that the marriage of math and science is consummated, the Scientific Revolution both causing and caused by the Math Explosion because science–increasingly freed of its Aristotelian hangups with substance v. matter and potentiality v. actuality–becomes now essentially a mathematical enterprise in which force, motion, mass, and law-as-formula compose the new template for understanding how reality works. By the late 1600s, serious math is part of astronomy, mechanics, geography, civil engineering, city planning, stonecutting, carpentry, metallurgy, chemistry, hydraulics, optics, lens-grinding, military strategy, gun- and cannon-design, winemaking, architecture, music, shipbuilding, timekeeping, calendar-reckoning; everything.

We take these changes for granted now.

But once, this was a revolution that, as Wallace observes, transformed everything.

Maybe this is the best historical analogy for the digital transformation we’ve been experiencing in the past decade.