Ubiquitous networked computing has changed how we represent knowledge because the semantics of code are guaranteed by the mechanical implementations of its compilers and interpreters.
This introduces a kind of discipline in the representation of knowledge as source code that is not present in natural language or even in formal mathematical notation, which must be interpreted by humans.
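As a minimal illustration of this point (my example, not from the source): the meaning of a program is fixed by its mechanical interpreter, so the same expression yields the same result on every run, with no room for a reader's interpretation. The function name and values here are hypothetical.

```python
def settle(balance: int, payment: int) -> int:
    """Apply a payment to a balance.

    The result is fully determined by the language's semantics,
    which the interpreter enforces mechanically.
    """
    return balance - payment

# Unlike a sentence in natural language, this expression has exactly
# one meaning, and every conforming implementation computes it.
result = settle(100, 30)
print(result)
```

Natural language or informal notation describing the same transaction would still leave room for human judgment; the code does not.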
Evolutionarily, humanity’s innate capacity for natural language is well established. Literacy, by contrast, is a trained skill that requires years of education. As Derrida points out in Of Grammatology, the transition from understanding language as speech or breath to understanding knowledge as text marked a profound shift in the history of knowledge.
We have not yet adjusted institutionally to a world where knowledge is represented as code. Most of the institutions that run the world, such as the legal system and universities, still operate on the basis of written language.
But new institutions that represent knowledge as data, and use software to process it, are becoming more powerful than these older ones.
This power comes from the new institutions’ ability to delegate the work of acting on their knowledge to computing machines, which work tirelessly and integrate well with day-to-day operations. These institutions can process more information, gathered from more sources, than the old ones; they are organizationally more intelligent. Because of this intelligence, they accrue more wealth and power.