The General Theory of Capital: Self-Reproduction of Humans Through Increasing Meanings
A string of figurae forms a meaning, a bundle of meanings forms a context. Since meanings exist in context, they function not as a discrete, but as a continuous set. There is no clearly defined, fixed boundary between figurae and meanings; this boundary is mobile and is determined by the context. The same human movement can, depending on the context, be either part of an action or an independent action with its own meaning—a gesture. Every meaning acquires its specifics in context. Meaning is always a specific action and the result of such an action. An abstract expression is a meaning only insofar as it is found in the context of concrete social actions.
Aristotle said that “art in some cases completes what nature cannot bring to a finish, and in others imitates nature” (Aristotle 1984, vol. 1, p. 340). Completeness or perfection is the main characteristic of a meaning and of culture as a whole, as compared to figurae. A finished biface is a meaning, a stone fragment is a figura. A finished phrase is a meaning, an unfinished phrase is an assortment of figurae. A well-thought-out book is a meaning; an ill-thought-out book is an assortment of figurae. Wisdom is the highest form of completeness of an action, enabling one to begin a fundamentally new action.
The relationship between figurae and meanings in their linguistic (symbolic) forms, a kind of “sense of meaning,” is the basis of the common language of all humans, which enables us to learn new languages in adulthood and to guess the purpose of rubble found during archaeological excavations. “Such non-signs as enter into a sign system as parts of signs we shall here call figurae; this is a purely operative term, introduced simply for convenience. Thus, a language is so ordered that with the help of a handful of figurae and through ever new arrangements of them a legion of signs can be constructed” (Hjelmslev 1969, p. 44).
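Hjelmslev’s point that a handful of figurae yields a “legion of signs” is, at bottom, combinatorial: with n distinct figurae and arrangements of length k, the number of possible sign-strings grows as n to the power k. A minimal sketch, in which the inventory size of 30 is an arbitrary assumption (roughly the size of a small phoneme inventory), not a figure from the source:

```python
# Toy illustration of Hjelmslev's observation: a small inventory of
# figurae, recombined into strings, produces an exponentially large
# space of possible signs.

def possible_signs(n_figurae: int, length: int) -> int:
    """Number of distinct strings of a given length over n figurae."""
    return n_figurae ** length

# Even a modest inventory explodes combinatorially as length grows:
for k in range(1, 6):
    print(f"length {k}: {possible_signs(30, k):,} possible strings")
```

Of course, real languages permit only a fraction of these combinations, but the sketch shows why a finite stock of figurae never limits the stock of signs.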
When we say that bits are “building blocks” of information, and figurae are “building blocks” of meaning, we imply that figurae, unlike bits, have qualitative properties and that the set of figurae can be divided into subsets, or that we can distinguish between basic types of figurae. This applies to meanings as such, but it was first noted for linguistic signs. Hjelmslev considered the identification of these types to be a necessary condition for understanding both the expression and the content of languages.
“Such an exhaustive description presupposes the possibility of explaining and describing an unlimited number of signs, in respect of their content as well, with the aid of a limited number of figurae. And the reduction requirement must be the same here as for the expression plane: the lower we can make the number of content-figurae, the better we can satisfy the empirical principle in its requirement of the simplest possible description” (Hjelmslev 1969, p. 67).
As we saw above, meanings are not reduced to signs and symbols. Meanings manifest themselves in the mental, social and physical existence of a person, but meanings are not born in this existence. Abstractions are not a product of the human intellect, whether in its affirmative form of understanding or in its negative form of reason. Rather, it is understanding and reason that are the result of the evolution of social and material abstractions in action. Meanings only reproduce fundamental definitions, states, relationships, changes, directions in nature and society: “If we’re able to learn language from a few years’ worth of examples, it’s partly because of the similarity between its structure and the structure of the world” (Domingos 2015, p. 37). Hence the universality of meanings, the ability of people to understand each other, to translate each other’s languages—and this after tens of thousands of years of isolated life. During the Age of Discovery, Europeans found a common language with the indigenous peoples of the Americas and Australia. All people act, talk and think in one language—the language of meaning:
“In Leibniz’s view, if we want to understand anything, we should always proceed like this: we should reduce everything that is complex to what is simple, that is, present complex ideas as configurations of very simple ones which are absolutely necessary for the expression of thoughts” (Wierzbicka 2011, p. 380). “…’Inside’ all languages we can find a small shared lexicon and a small shared grammar. Together, this panhuman lexicon and the panhuman grammar linked with it represent a minilanguage, apparently shared by the whole of humankind. … On the one hand, this mini-language is an intersection of all the languages of the world. On the other hand, it is, as we see it, the innate language of human thoughts, corresponding to what Leibniz called ‘lingua naturae’” (ibid., p. 383).
Mathematics as a domain of meaning is also a reflection of the fundamental definitions of the world. The similarities between the world and mathematics make it possible to solve scientific problems. This similarity did not arise overnight. Mathematics is a result of the evolution of meaning from the order of the universe up to the reflection of this order in the minds of people. On the scale of millions and billions of years, the difference between Turing and Wittgenstein disappears: “Turing thought of mathematics as something that was essentially discovered, something like a science of the abstract. Wittgenstein insisted that mathematics was essentially something invented, following out a set of rules we have chosen—more like an art than a science” (Grim 2017, p. 151). In fact, both mathematics and logic in general are the result of cultural evolution that occurs through selection and choice. It could be that a logical contradiction between meanings expresses the historical and practical discrepancy of meanings in relation to the environment and the subject, and that the resolution of such a contradiction reflects the overcoming of this discrepancy.
The simplicity of early meanings did not only concern making. Thinking and communicating were just as simple, relying on crude motions of body and mind. Primitive making has left us its direct results: stones, bones, etc. Unfortunately, the direct products of communicating or thinking no longer exist, so we can only judge them indirectly. In the process of social and then cultural learning, as the norm of first learned and then rational reaction expanded and cultural selection turned into traditional choice, the complexity of meanings and of the culture-society as a whole increased, as did the number of figurae and meanings.
The gradual complication of meanings becomes clear, for example, when we consider the evolution of stone tools: from the simplest Paleolithic choppers to the polished and drilled Neolithic axes, which are characterized by a much higher level of workmanship. Cultural evolution consists in the division of meanings, that is, in the emergence of ever new types of actions and their results. By dividing their activity and knowledge, people specialized in those types of actions in which they had a competitive advantage due to the characteristics of the environment or their active power. Hunting and gathering divided into farming, herding, crafts, trade. Not only making grew in complexity, but also communicating and thinking. Languages became more complex. Learned actions became a more important part of self-reproduction relative to instinctive behaviors, and rational actions grew more vital relative to learned ones.
2. Complexity of meaning
Minimal subject and minimal action
That the complexity of meanings increases as they evolve may be intuitively obvious, but it was only in the middle of the 20th century that the concepts of the quantity of information and information complexity were rigorously substantiated in the works of Claude Shannon and Andrey Kolmogorov.
Shannon introduced the concept of information entropy. According to him, entropy H is a measure of the uncertainty, unpredictability, surprise or randomness of a message, event or phenomenon. In terms of culture, such a message or event is a counterfact. Without information losses, Shannon entropy is equal to the amount of information per message symbol. The amount of information is determined by the degree of surprise inherent in a particular message: