There is a technological war, but whoever thinks it is only technological has already lost it. China now seems to have taken the lead over the United States in the fight for control of communication channels, and many would celebrate it were it not for the fact that 5G only intensifies what was already an overwhelming technological flood.
Not only do we oppose the indiscriminate deployment of technologies but, in a now distant article, we even suggested another line of biophysical research to evaluate the impact of electromagnetic radiation on humans and other living beings. It is clear that the big corporations promoting this deployment care only about market share, but who is to say that in a few years they will not be sued for damages?
We already see the American attempts to blame China for the coronavirus; if they have not already waged a campaign to demonstrate the harmfulness of this coming generation and file trillion-dollar lawsuits, it is simply because 1) the deployment has barely begun and it is too early to accumulate evidence against it, 2) it would clip the wings of their own future developments, and 3) it could equally be applied retrospectively to previous generations and American companies.
And yet one can be sure that there are law firms discussing how a full-blown legal war might be orchestrated. But, leaving aside the vileness of these machinations, I think we would all be more grateful to anyone who showed us whether this kind of radiation is safe or not than to anyone bent on bombarding us with it regardless of the consequences.
Maybe the leadership of 5G is good for China’s strategic interests, but it would be much better if those interests coincided with those of most human beings. And most people do not want more and more technology, but rather some kind of protection against it; anyone should understand that.
China, for example, now hires some 50,000 foreign employees, specialists and engineers to conduct a large part of the research and development that takes place throughout the country. The mastery of 5G technology, a great display of muscle, has drawn on a good deal of that workforce. But surely only a small fraction of that number is needed to make a much more important conquest than that of a fleeting technological advantage.
Today the technological tsunami is synonymous with the digital revolution and with a category, information, which seems to pervade everything. All previous developments in science and technology converge in it and pass through it. Even physicists now think of the universe as a gigantic computer; there was much discussion about whether the world was made of atoms or of stories, but in the end it was decided that it was made of bits, and the case was closed.
Actually, the universe seems to matter very little now. Rockets are almost the same as they were fifty years ago, but in that time the capacity of computers has increased by a factor of billions. Hence the user's feeling that the world wants to get into the computer. The developer, on the other hand, thinks very differently.
Information is not a powerful concept but an extremely gaseous one that fits everything. Today it seems we cannot put limits on it, but limits it has: material, mental and operational, to mention only the most obvious. If it did not, we would not be talking about the information economy.
What is the purpose of information technologies? To reprogram the humans who use them. Science and technology have always done so, but now we have the confirmation as never before.
What is important about information theory is not so much its definitions as the direction it imposes on everything; changing that direction is equivalent to changing the direction of technology as a whole. The unequivocal direction, which obviously inherits from statistical mechanics, is that of decomposing everything into minimal elements that can then be recomposed at will.
For statistical mechanics there is no direction in time: if we do not see a shattered vase recompose itself and return to the table, it is only because we do not live long enough; if the pieces of an eviscerated corpse do not get back together and walk again as if nothing had happened, it is only because we are not in a position to wait 10^10,000,000,000 years or something similar.
Information theory is not concerned with the reality of the physical world at all, but with probability in its constituent elements, or rather with probability within its accounting of those elements. The physical world itself is, on the contrary, a pool of resources for the sphere of computation, which aims to be independent of it.
Is this statistical view a neutral position or is it simply a pretext to manipulate everything unencumbered? There is no single answer to this. Statistical mechanics and information theory are successfully applied in countless cases, and in countless cases they could not be more irrelevant. What is worrying is the overall trend they create.
For my part, I am more than convinced that a shattered corpse is not going to recompose itself, no matter how fabulous the period of time we choose. And this is not rhetoric; rather, large numbers are the rhetoric of probability; an inflated and poor rhetoric, as it ignores the interdependence of things.
Although in principle we should not confuse levels of scientific discourse with others openly ideological, in practice we observe a sort of pre-established harmony between liberalism and, for example, neoclassical economics, the present neo-darwinian synthesis of evolution, or statistical mechanics and information theory. It is not only that power tends to take over the sense of any possible tool; theories like these were creatures hatched in the same nest and under the same watchful eye.
The link is even more explicit and exaggerated in today’s neoliberal environment. In the now booming field that tries to bridge quantum mechanics and information theory, «a resource theory of quantum thermodynamics without a background temperature, so that no states at all come for free» has already been proposed. The goal is to design a quantum system in which information is used as currency to trade between different material resources quantified as states. The initiative comes from University College London, but it could equally have come from the London School of Economics.
It is clear that there is more than affinity; there is consanguinity, the same logic, the same destiny. The real world is only an instrument for computation, and computation for trade. And trade, on this totalitarian scale, exists for the concentration of capital, and capital for the concentration of power. This is a despicable and insane logic, but we have allowed it to be installed by default as our operating system, thanks, among other things, to the fact that we have accepted a more than dubious «neutrality of science».
The problem is much harder to grasp in depth because it does not only affect the soft, descriptive sciences, or even the statistical ones, but has been inscribed since the beginning of the modern world in the utilitarian bias of the harder sciences, such as calculus or mechanics, which create a disposition of the whole from which we can no longer find the exit. In fact, the disposition is the whole for which any number of elements exist.
There are open, natural totalities, and there are artificial and closed totalities gravitating towards death. Today’s globalized marketplace is almost the diametrical opposite of «an open system». The fiction of an «open society» is maintained by a horizontal narrative in terms of competition that is deemed convenient for the «atomic states» of the social body, the individuals. But there is a much more implacable and concretely structured vertical logic that thinks in terms of marketing, social engineering and population ecology —in the exploitation of niches and ecosystems.
The horizontal vision aims to break everything down into parts while the vertical one is conceived by principle in terms of the whole. «Big fish eats little fish» means very different things depending on the coordinate axis we choose.
The narrative in the horizontal axis of an unstructured collection is for the masses; the vertical logic of a structured whole is for the rampant techno-feudalism and financial techno-fascism. That is why we continually hear about «the markets» but never about the power laws that govern the distribution of wealth or the size of companies, and which show inescapably that 80 percent of the pie is owned by 20 percent, 80 percent of that 80 percent by a fifth of a fifth, and so on. Ideally, this law tends towards a singularity.
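The recursion in this law is pure arithmetic and can be sketched in a few lines; a toy iteration of the 80/20 rule, not empirical data:

```python
# Iterating the 80/20 rule: at each level, a fifth of the previous
# holders own four fifths of the previously concentrated share.
def pareto_levels(n_levels):
    """Return (population share, wealth share) after each 80/20 iteration."""
    pop, wealth = 1.0, 1.0
    out = []
    for _ in range(n_levels):
        pop *= 0.20      # a fifth of the previous holders...
        wealth *= 0.80   # ...own 80 percent of the previous share
        out.append((pop, wealth))
    return out

for level, (pop, wealth) in enumerate(pareto_levels(3), start=1):
    print(f"level {level}: {pop:.1%} of people hold {wealth:.1%} of the pie")
# level 1: 20.0% of people hold 80.0% of the pie
# level 2: 4.0% of people hold 64.0% of the pie
# level 3: 0.8% of people hold 51.2% of the pie
```

After only three iterations, less than one percent of the population holds the absolute majority of the pie, which is the singularity the text alludes to.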
Clearly, the logic of globalization has so far been an all-or-nothing totalitarian logic, even if it is now at a standstill. Adding to the haze, mainstream discourses try to confuse the analysis of totality with totalitarian discourse, and promote social atomism as if it were more egalitarian, when the social atom is being shaped from above. Double standards have always existed for this.
A similar duality works for the two extremes of consumer technology development, data mining or those plantations of the mind that are the social networks: developers arrange and shape the whole, consumers interact with the parts. This resonates with Simondon’s modes of minority and majority at the two ends of the technological circuit.
In other articles we have also talked about the double circuit in the scientific worldview, consisting of both hard sciences such as physics, based on mathematics and prediction, but without the ability to describe reality, and descriptive and narrative sciences, such as cosmology or the theory of evolution, which try to fill the great gap between abstract laws, nature and the real world. If the former do the hard work, the latter have a preponderant role in our imaginary.
In current lingo we would say that the former are like hardware and the latter like software, with the statistical aspects, like those of information theory, mediating between both.
These planes of duality overlap and reproduce themselves at various levels and make orientation and judgment about the whole techno-scientific phenomenon extremely difficult. But probably the greatest of all oppositions, between man and nature, is the one that has receded the most in our conscience.
Neither science nor technology alone has created this opposition; on the contrary, science and technology are the channelled development of certain passions. However, this economy of impulses has been transmitted to the interior of the principles and has been perpetuated in «the spirit of the laws» and their values.
What could seem more natural today than Newton’s laws of motion? But these laws of mechanics, as Poincaré well noticed, are neither true nor false, but only a disposition of the whole. We could dispense with the principle of inertia completely, replace it with a principle of dynamic equilibrium, and still have the same knowledge of observable phenomena and their corresponding laws. Why change them, then?
To have another vision of the whole, of that whole from which the parts and individuals receive their form. Apart from the fact that technoscience is not only the study and control of the external nature, but also a symptom of our inner nature and that of power. As in all surface phenomena, there is always a struggle between what wants to be expressed through it and what wants to gain access.
Newton’s laws of mechanics can be summed up as follows: nothing moves unless something else moves it. Which is just an educated way of saying that we are all dead shit, at the expense of being moved by a God, an initial explosion, an invisible hand or whatever. The consequences of this go on until today. It seems incredible that we have been able to calmly subscribe to this for a third of a millennium, but it is what we have.
Everything that is said today about the environmental crisis or climate change pales in comparison with the mechanical disposition of nature that we have accepted. If we do not rebel against this in our innermost selves, we will hardly do anything about everything else, which is only a by-product. Our external relationship with nature always begins with how we understand it. And not only with nature.
What has been applied to nature, with the complicity of the wise and his growing sense of power, then spreads gradually but uncontrollably into the interior of the social apparatus through the increasingly dense technological network. And it is utterly superficial to pretend that the most recent developments in science change the panorama, because they are nothing but a refinement of the same presuppositions in search of greater performance.
Science has not changed in essence, which could be great news if one thinks there is still something left for a surprise. Science cannot stop being a surface phenomenon, but neither can it stop echoing deeper impulses. In it, the relationship between subject and object, as well as our conception of totality, can still be decisively transformed.
To put it bluntly, science has a noble side and a side at the service of power. Which side prevails depends largely on the scientists themselves, but not only on them. They are no less divided than the masses, for serving two masters at once is always more than difficult, and all kinds of currents are interwoven in their midst.
The twin questions are what we want to do with what we know, and what we want to know with what we can do. Science and technology have always formed a continuum, even though today it seems to turn in on itself much more quickly. There is much illusion in this, because the most important concepts have changed very little; and precisely because of this, a grand reversal could be in order. Or not.
Our liberal-materialism, or material liberalism, pretends to «give life» to a passive nature in order to fulfill its virtuous circle of «improving the world» and realizing the self-transcendence of the species. This process that oscillates between a false ideal pole and another false material pole is fulfilled through a recurrent cycle of idealizations and rationalizations thought to be exhaustive.
The theory and philosophy of information cannot hide their idealistic background. No wonder, since even classical mechanics has it, and the only way to «overcome» this heritage is through the compulsive practical application to material objects. The same is true for the designer-user circuit in the development of technologies.
The feedback that theory and practice create in technoscience does not have to be a virtuous circle, though; it can also be a circle of growing restriction and shrinking horizons. In this respect we still have naive notions, which will not allow us to go much further if we do not learn certain basic lessons.
Idealization and rationalization are like the mythical Symplegades rocks that destroyed ships and navigators; only the one who understands their dangers and avoids them will be able to pass to the other side.
No doubt modern neo-Babylonian science is the crudest inversion of Plato and Pythagoras and the «all is number» program, but even inverted it remains their heir. Today numbers no longer seem to exist to understand the world, but to crush and squeeze it, and all of us with it.
In this indescribable situation, in which mathematics has become the last whore for the lowest tasks —for what is so dirty that it can only be ignored through numbers— we need a radically different vision of number and math, of theory and practice, of the whole and the parts, of quantity and quality, of knowledge and rationality.
Since everything is hype in the sphere of information, let us confront it with something even more hyperbolic, which nevertheless starts from a surprisingly simple formula.
An analytical totality: the Riemann zeta function
It has been said that if the Riemann hypothesis were solved, all the keys to cryptography and cybersecurity could be broken; no one has specified how this could happen, but at least it reminds us of the close relationship between a hitherto intractable problem, cryptography and information theory.
Such speculations are only based on the fact that the Riemann zeta function establishes a connection between prime numbers and the zeros of an infinitely differentiable function that provides the most powerful method for exploring this field, prime numbers being the basis of classical cryptography. But the zeta function is far more than any cryptographic application; it seems to be a code in itself, or even a code of codes.
There is even a theorem, due to Voronin, proving that any information of any size can be encoded, to any precision needed, within this function —and not just once, but an infinite number of times. How far that information can be made effective is another matter, on which several studies have been done. This is called the universality of the zeta function. Besides, the Riemann zeta function is just the main case of an infinite number of similar functions.
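The connection with the primes rests on Euler's product formula, zeta(s) = sum over n of 1/n^s = product over primes p of 1/(1 - p^-s); a quick numerical check at s = 2, where both sides approach pi^2/6:

```python
# Euler's product links the zeta function to the primes:
#   zeta(s) = sum 1/n^s = product over primes p of 1/(1 - p^-s)
import math

def primes_up_to(n):
    """Simple sieve of Eratosthenes."""
    sieve = [True] * (n + 1)
    sieve[0] = sieve[1] = False
    for i in range(2, int(n ** 0.5) + 1):
        if sieve[i]:
            for j in range(i * i, n + 1, i):
                sieve[j] = False
    return [i for i, is_p in enumerate(sieve) if is_p]

def zeta_sum(s, terms=100_000):
    """Partial Dirichlet series for zeta(s)."""
    return sum(n ** -s for n in range(1, terms + 1))

def zeta_euler(s, limit=100_000):
    """Partial Euler product over the primes up to `limit`."""
    prod = 1.0
    for p in primes_up_to(limit):
        prod *= 1.0 / (1.0 - p ** -s)
    return prod

exact = math.pi ** 2 / 6          # zeta(2), Euler's Basel result
print(zeta_sum(2), zeta_euler(2), exact)
```

Both truncations agree with pi^2/6 to about four decimal places, the sum running over all integers and the product over the primes alone.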
One could also find any kind of information in white noise; but white noise lacks any structure, while the zeta function has an infinite structure. Had Hegel known both cases he would have spoken of false and true infinity; we will see that these Hegelian apperceptions are not out of place here.
It seems that there is a «zeta code» that could be the object of the algorithmic theory of complexity; but this function, intimately linked to the elementary act of counting, resists being an object. Even being analytic, it cannot be broken down into pieces, which is why all the usual methods have turned out to be pathetically inadequate. The zeta remains irreducible. It is the Great White Hope for all those who do not identify with that or any other color. The white hope of those who do not want to be reduced to white noise.
So far all the non-trivial zeros of the function computed, amounting to many trillions, lie exactly on the critical line, with a real part equal to 1/2; but no certain reason is known why this should be so, and there could still be an infinite number of zeros off the line.
Were we to set all the particles in the universe of the «cosmic computer» that many dream of to computing zeros, with all the time in the world for the task, we could still be waiting infinitely for an answer. The story goes that Turing designed an archaic analog computer to falsify the hypothesis; possibly it came with a crank.
It is inevitable that the word «infinite» appears repeatedly when talking about this problem. The zeta function has a singular pole at unity and a critical line with a real value of 1/2 in which all the known and unknown non-trivial zeros appear, with a duality between the zeros and the pole: we either approach the problem from the point of view of unity, or from the point of view of infinity; but it is only at unity that the function ceases to have finite values. So far mathematicians, who in a very definite sense descend from the tradition of infinitesimal calculus inaugurated by Leibniz, have naturally opted for the second point of view, although it can be assumed that both extremes are equivalent.
A well-known mathematician said that working without knowing the truth of Riemann’s hypothesis is like using a screwdriver, but that with it one would have a bulldozer. Some will never stop thinking about the next real estate development.
But surely the problem has a significance that goes beyond our «powerful methods» and our usual practices. The zeta function questions from top to bottom the relationship we assume between mathematics and reality. As is known, this function presents an enigmatic and totally unexpected similarity to the random matrices that describe subatomic energy levels and other collective signatures of quantum mechanics.
Until now, the mathematical physics from which the entire scientific revolution emerged has been applying mathematical structures to physical problems in order to obtain a partial solution by means of an artful reverse engineering. Calculus itself emerged in this process. On the contrary, what we have here are physical processes that spontaneously reflect a mathematical reality for which there is no known solution.
Since no physical theory explicitly justifies the reproduction of this type of function, the enigma of the underlying dynamics is as transcendental as the very resolution of the hypothesis, although the relationship between both aspects is also mere conjecture. Hence, this problem appeals to physicists and mathematicians alike.
We said that science advances through an alternate process of idealization and rationalization, issuing hypotheses for an idealized case and later carrying that case from coast to coast in imperialistic generalizations. Calculus is a good example: infinitesimals are idealizations, while the concept of limit allows us to rationalize and «give a foundation» to what are, in the end, still heuristic procedures.
Statistical mechanics is another of the most flagrant cases of rationalization, since, after extreme idealizations, it has been given explanatory value for everything; and a theory that explains everything cannot say more clearly that it actually explains nothing.
In philosophy, Hegel gives us the best example of the large-scale sway between idealization and rationalization. In an interesting article, Ian Wright attempts to explain the nature of the Riemann zeta function, with Hegel’s Science of Logic, as the mediation between being and non-being through becoming within numbers. Old-fashioned as this can sound to many, three things can be said:
First, that from the point of view of pure arithmetic, this is admissible since if there is a part of mathematics that can be considered a priori, that is arithmetic; while, on the other hand, the very definition of this function affects the totality of integers, and its extension to real and complex numbers.
Second, that calculus is not pure mathematics like arithmetic, although many specialists seem to think so, but mathematics applied to change or becoming. Were it not, there would be no need to compute zeros. The zeta function is a relationship between arithmetic and calculus as close as it is uncertain.
Third, for experimental sciences such as physics the opposition between being and not being does not seem to have systemic implications, and for formal sciences such as information theory this is reduced at most to fluctuations between ones and zeros; however the distinction between reversible and irreversible, open and closed systems, which is at the very heart of the idea of becoming, is decisive and lies at the crossroads of the issue.
Famous physicists like Michael Berry have long noted that the dynamics behind the function should be irreversible, bounded and unstable. There are even elementary reasons for the first of these, which we have already pointed out elsewhere; and in spite of everything, and of the fact that no Hamiltonian fits this dynamic exactly, physicists still insist on working with the conservative assumptions of fundamental physics. Why? Because for physicists fundamental physics can only be conservative.
The seriousness of a problem such as the zeta demands that the problem of irreversibility be considered in depth. And considering it in depth means precisely including it at the most fundamental level. Conservative systems are still toy models for this.
This breaks all the patterns of the establishment in physics but is necessary to move forward. No doubt in this way both the idea of reversibility and that of exact mathematical representation lose much of their status, but on the other hand, seeing the appearance of reversible behavior emerge from a background of irreversibility is the closest thing to finding the magic ring at the heart of nature. We all know that physics today is not about nature, but about certain laws that affect it.
It is said, for example, that the Riemann zeta function could play the same role for chaotic quantum systems as the harmonic oscillator does for integrable quantum systems, although it also remains unclear what lies at the bottom of the harmonic oscillator model. To get closer to the subject, quantum thermodynamics can be considered in the sense that it has been done, for example, by the Keenan school at MIT, especially with Hatsopoulos, Gyftopoulos and Gian Paolo Beretta.
The Brussels school, led by Prigogine, is much better known to the public almost everywhere, including the United States, than the MIT school, even though the latter is much more sound when dealing with fundamental physics. That such a development, having emerged from MIT, is not more widespread is not a sign of its lack of relevance, but rather the opposite: it affects too much the position of large branches of physics to be easily admitted, and goes against the grain of everything that is so actively promoted.
The quantum thermodynamics of this school is totally opposed to the rationalization of entropy by statistical mechanics. The dynamics is irreversible at the most fundamental level. The number of states is incomparably greater than in the standard model, and only a few of them are selected. The selection principle is very similar to that of maximum entropy, although somewhat less restrictive: it is the attraction in the direction of steepest entropy ascent. No great changes are needed in the usual formalisms; what is transfigured is the sense of the whole.
The equilibrium formalisms are also retained, but their meaning changes completely. The uniqueness of the stable equilibrium states amounts to one of the deepest conceptual transformations of science in the last decades. The approach is contrary to the idealism of mechanics and much more in line with the daily practice of engineers, for whom entropy is a physical property as real as energy. The theory has many more advantages than sacrifices, and it can be applied to the entire domain of non-equilibrium and to all temporal and spatial scales.
But there are other much more basic ways of including irreversibility and entropy in both quantum mechanics and classical mechanics through calculus, as we will see soon.
The connection between entropy and the zeta function is quite recent, and only in the 21st century has it begun to take shape. At a very basic level, one can study the entropy of the sequence of zeros: high entropy would imply low structure, and vice versa. What is observed is high structure and low entropy, and even low-level neural networks have already had success predicting the zeros.
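A toy version of such a study, far more modest than the works cited, can be run on the tabulated imaginary parts of the first ten non-trivial zeros (all with real part 1/2): bin the normalized gaps between consecutive zeros and take the Shannon entropy of the histogram.

```python
import math

# Imaginary parts of the first ten non-trivial zeros (tabulated values);
# each zero is 1/2 + i*t with t listed below.
T = [14.134725, 21.022040, 25.010858, 30.424876, 32.935062,
     37.586178, 40.918719, 43.327073, 48.005151, 49.773832]

gaps = [b - a for a, b in zip(T, T[1:])]
mean = sum(gaps) / len(gaps)
norm = [g / mean for g in gaps]          # normalized spacings

# Histogram the spacings and take the Shannon entropy of the bins.
n_bins = 4
lo, hi = min(norm), max(norm)
counts = [0] * n_bins
for x in norm:
    i = min(int((x - lo) / (hi - lo) * n_bins), n_bins - 1)
    counts[i] += 1

probs = [c / len(norm) for c in counts if c > 0]
H = -sum(p * math.log(p) for p in probs)
print(f"mean gap {mean:.3f}, entropy {H:.3f} (max {math.log(n_bins):.3f})")
```

With only nine gaps this says nothing statistically, of course; it only makes concrete what «entropy of the sequence of zeros» means as a computation.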
There are other, non-extensive types of entropy, independent of the amount of material, such as Tsallis entropy, which can be applied to the zeta; these types of entropy are generally associated with power laws rather than exponential laws. But as in thermodynamics in general, the definition of entropy can vary from author to author and from application to application.
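Tsallis entropy has the simple form S_q = (1 - sum of p_i^q) / (q - 1), and recovers the classical Boltzmann-Gibbs-Shannon entropy in the limit q -> 1; a minimal numerical check on an arbitrary distribution:

```python
import math

def tsallis(p, q):
    """Tsallis entropy S_q = (1 - sum p_i^q) / (q - 1), for q != 1."""
    return (1.0 - sum(pi ** q for pi in p)) / (q - 1.0)

def shannon(p):
    """Classical (Boltzmann-Gibbs-Shannon) entropy, the q -> 1 limit."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

p = [0.5, 0.25, 0.125, 0.125]        # an arbitrary illustrative distribution
for q in (2.0, 1.5, 1.001):
    print(f"q = {q}: S_q = {tsallis(p, q):.4f}")
print(f"Shannon limit: {shannon(p):.4f}")
```

As q approaches 1 the Tsallis value converges to the Shannon value; away from q = 1 it weighs rare and frequent events differently, which is what ties it to power-law statistics.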
In order to unify and generalize so many definitions of entropy, Piergiulio Tempesta has proposed a group entropy. Whether in physical contexts or in complex systems, in economics, biology or the social sciences, the relations between different subsystems depend crucially on how we define correlation. This is also the central issue in data mining, networks and artificial intelligence.
Each correlation law may have its own entropy and statistics, rather than the other way around. Thus, it is not necessary to postulate entropy, but its functional emerges from the kind of interactions that one wants to consider. Each suitable universal class of group entropy is associated with a type of zeta function. The Riemann zeta function would be associated with Tsallis entropy, which contains classical entropy as a particular case. But these correlation laws are based on independent elements, while irreversibility at the fundamental level seems to reject that assumption. Fundamental irreversibility assumes universal interdependence.
The zeta function itself, under other criteria, can be seen as an infinite encyclopedia of correlations and correlation laws.
Riemann’s hypothesis is said to involve the most basic connection between addition and multiplication. So basic that it is not even apprehensible —at least not yet. So, if there is a possibility of getting to the core of the problem, it would have to be through the simplest arguments rather than the more sophisticated and complex ones. Additionally, the overall impact of an idea is all the greater the simpler its nature.
Calculus, or analysis, has developed between the idealization of the infinitesimals and the rationalization of the limit, but it has rejected its foundation stone, the constant differential, which by definition must be a unit interval, 1. Let us look at an example that puts into play both the idea of calculus and that of natural and artificial intelligence.
Think about the problem of knowing where to run to catch fly balls —evaluating a three-dimensional parabola in real time. It is an ordinary skill that even recreational baseball players perform without knowing how they do it, but its imitation by machines triggers the whole usual arsenal of calculus, representations and algorithms.
However, McBeath et al. demonstrated more than convincingly in 1995 that what outfielders do is move so that the ball remains in a constant visual relation —at a constant relative angle of motion— instead of making complicated estimates of acceleration in time, as the calculus-based heuristic model assumed. Can there be any doubt about this? If the runner makes the correct move, it is precisely because he never considers anything like the graph of a parabola.
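The geometry behind this can be checked in a few lines. In the classical drag-free analysis, for a fielder standing at the landing point the tangent of the ball's elevation angle grows exactly linearly in time; holding that optical rate constant is therefore already an interception strategy, with no parabola reconstructed at all. A sketch, with illustrative launch values (the velocities below are assumptions, not data from the study):

```python
# For a fielder at the landing point of an ideal (drag-free) fly ball,
# tan(elevation angle) grows exactly linearly in time: tan(theta) = g*t/(2*vx).
# Holding that optical rate constant is itself the interception strategy.
g, vx, vy = 9.81, 12.0, 20.0          # gravity and illustrative launch velocities
T = 2 * vy / g                         # time of flight
X = vx * T                             # landing point, where the fielder stands

def tan_elevation(t):
    """Tangent of the ball's elevation angle as seen from the landing point."""
    x = vx * t                         # horizontal ball position
    y = vy * t - 0.5 * g * t * t       # vertical ball position
    return y / (X - x)

rates = [tan_elevation(t) / t for t in (0.5, 1.0, 2.0, 3.0)]
print(rates)                           # all equal to g / (2*vx)
```

The rate tan(theta)/t comes out identical at every instant, so a runner who merely keeps that optical quantity steady ends up at the landing point without ever computing a trajectory.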
Although he did not take this example as his starting point, the only one who has put this «non-method» or direct method into practice is Miles Williams Mathis, simplifying as much as possible ideas already latent in the methods of finite differences. Mathis is the first to admit that he has not been able to apply the principle even in many cases of real analysis, let alone complex analysis, which he does not even touch. Needless to say, the mathematical community does not even consider such limited proposals.
However, the constant differential is the naked truth of calculus, without idealization or rationalization, and we should not dismiss something so precious even if we do not succeed in seeing how it can be applied. If a constant differential is not found in the tables of values of the function, we can still estimate the dispersion; and whoever says dispersion can also say dissipation or entropy.
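The spirit of the constant differential shows up in an ordinary table of finite differences over unit intervals: for x^3 the third differences are constant and equal to 3! = 6, with no limits involved, while for a function like e^x no row ever flattens, and the residual spread of a row is a crude version of the dispersion just mentioned. A minimal sketch:

```python
import math

def difference_table(values):
    """Rows of successive forward differences of a list of values."""
    rows = [list(values)]
    while len(rows[-1]) > 1:
        prev = rows[-1]
        rows.append([b - a for a, b in zip(prev, prev[1:])])
    return rows

# A cubic over unit steps: the third differences are constant (3! = 6).
cubic = [x ** 3 for x in range(8)]
rows = difference_table(cubic)
print(rows[3])    # [6, 6, 6, 6, 6] -- the constant differential

# A non-polynomial function never flattens; the spread of a difference
# row is a crude stand-in for the dispersion mentioned in the text.
expo = [math.exp(x) for x in range(8)]
row3 = difference_table(expo)[3]
spread = max(row3) - min(row3)
print(spread)     # stays large: no constant differential exists
```

For any polynomial of degree n the nth differences are constant, which is the table-of-values fact the argument leans on; where no row becomes constant, the spread of the rows is what remains to be measured.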
In this way it is possible to obtain an intrinsic entropy of the function, from the process of calculus itself, rather than from correlations between different aspects or parts. This would be, at the most strict functional level, the mother of all entropies. It should be possible to apply this criterion to complex analysis, to which Riemann made such fundamental contributions, and from which the theory of the zeta function emerged.
The criterion of the constant differential may even remind us of the mean-value methods of elementary calculus or of analytic number theory. But the core of this method is not averages; it is, on the contrary, standard calculus that relies on averages —which, without this reference, lack a common measure. Finite-difference methods have also been applied to the study of the zeta function by leading specialists. And to make the matter even more interesting, we should remember that the two methods do not always yield the same values.
This would simultaneously fulfill several major objectives. It would be possible to connect the simplest and most irreducible element of calculus with the most complex aspects. Something less appreciated, but no less important, is that it puts the idea of function in direct contact with that which is not functional —with that which does not change. Analysis is the study of rates of change, but change with respect to what? It should be with respect to what does not change. This turn is imperceptible but transcendental.
The direct method laughs in the face of the «computational paradigm» and its fervent operationalism. The now prevailing idea of intelligence as «predictive power» is completely reactive and servile; in fact, it is no longer known whether it is the one we want to export to machines or the one humans want to import from them. Perceiving what does not change, even if it does not change anything, gives depth to our field. To perceive only what changes is not intelligence but confusion.
I don’t know if this will help the understanding of the function, but it is certainly the most basic level of the problem that I can conceive of. At least to me, only the simplest can be relevant here. Researchers now work from the other side, that of complexity, where of course there is always so much to be done. The question is to what extent both ends can be connected.
There are also good reasons to think that the zeta function is linked to classical mechanics: the frequent recourse in this area to «semi-classical» physics is just one of them. Quantum mechanics is adamant in affirming its independence from classical mechanics, but it is incapable of telling us where one begins and the other ends.
Mario J. Pinheiro has proposed an irreversible reformulation of classical mechanics, as an equilibrium between energy and entropy that can replace the Lagrangian. Apart from the fact that the Lagrangian is also fundamental in quantum mechanics, it would be of great interest to evaluate the intrinsic entropy of classical trajectories with respect to the various formulations.
In problems such as that posed by the zeta function in relation to fundamental physics, crucial experiments capable of unraveling the relationship between the reversible and the irreversible in systems could soon be conducted, provided that they are expressly sought. For example, a Japanese team very recently showed that two distinct probes of quantum chaos, noncommutativity and temporal irreversibility, are equivalent for initially localized states. As is well known, this function has been studied a great deal from the point of view of non-commutative operators, in which the product is not always independent of order, so this would bring together two areas as vast as they are distant up to now.
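For readers unfamiliar with non-commutative operators, a minimal sketch (the matrices are arbitrary examples of my own choosing, with no physical significance) of a product that depends on the order of the factors:

```python
import numpy as np

# Two 2x2 matrices whose product depends on the order of multiplication.
A = np.array([[0., 1.],
              [0., 0.]])
B = np.array([[0., 0.],
              [1., 0.]])

commutator = A @ B - B @ A   # [A, B]; zero only when the product commutes
print(A @ B)                 # [[1, 0], [0, 0]]
print(B @ A)                 # [[0, 0], [0, 1]]
print(commutator)            # nonzero: order matters
```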
Irreversibility and Reversibility
With the rise of quantum computing and quantum thermodynamics, along with all the associated disciplines, there will be no shortage of opportunities to design key experiments. Until now, the dominant trend has always been to favor reversible aspects, since they are the most manipulable and exploitable; only by changing our logic and duly valuing the role of irreversibility can we reach higher levels of understanding.
The reversible and the irreversible, concepts referring to time, may be intimately related, with non-commutative aspects in between, to that very basic interplay of the additive and multiplicative properties of the sequence of numbers which evades mathematicians confronted with the zeta function. It is as if there were two ways of considering time: according to its sequential ordering, and according to the simultaneous coexistence of all its elements.
The relationship between the reversible and the irreversible is critical in the evolution of complex systems: think only of aging, or of the difference between reversible and irreversible damage to the health of an organism, or to any evolution of events. If we believe that this distinction is irrelevant in fundamental physics, it is only because physicists have carefully segregated thermodynamics so as not to be tainted by problems having something to do with real time. And because the projection of mathematics from above led directly to a fatal misunderstanding.
It is not that hard either. It is said that electromagnetism is a reversible process, but we have never seen the same light rays returning intact to the bulb. The irreversible is primary; the reversible, important as it is, is just a settling of scores.
But physicists have always been fascinated by reversible artifacts, independent of time, which some have seen as the last legacy of metaphysics. And what else could they be? There are no closed, reversible systems in the universe; they are just fictions. Metaphysics is an art of fiction, and physics has continued metaphysics by other means with the perfect excuse that it now descends into reality. And no doubt it has, but at what cost?
Curiously, it is the discovery and application of reversible physical laws that has exponentially increased the possibility of accumulation and with it of progress understood as an irreversible process. This reversibility is as fully in line with the logic of exchange and equivalence as the unrepeatable is alien to it.
No less curiously, only the understanding of the irreversible would allow us to partially reverse certain dynamics, partially because it goes without saying that no one ever steps in the same river twice. But power is always the least interested in the possibility of rectifying anything. It is decidedly committed to accumulation and historical irreversibility, which also means committed to the business of a reversible and interchangeable nature.
Curious, very curious, and even more curious then. Although it was always clear that «there is no way back» is the way to make us jump through the hoops. But in all this business of the reversible and the irreversible we must distinguish between an external and an internal perspective. What distinguishes the two perspectives is the definition of equilibrium as a sum or as a product.
A zero-sum equilibrium, although derived and not necessarily the most important, is the one Newtonian mechanics presents between action and reaction. It defines a system from the most external point of view possible, which usually coincides with our idea of reversible processes.
An equilibrium of densities in which the product equals unity is, for example, the right perspective for continuum mechanics; a perspective internal to a primitively homogeneous medium that is always available even if physics has obscured it with convenient tools like vector calculus and successive, increasingly opaque formalisms (tensors, connections, operators, etc.). In this way reversible processes can also be described, as in fluid mechanics, the theory of elasticity or electromagnetism, the latter clearly exhibiting both reversible and irreversible behaviors.
Suffice it to say here that the definition of equilibrium allows for the establishment of different types of relationships between the part and the whole, between local and global adjustment. Classical entropy, for example, is an extensive quantity, and additive for independent systems without interaction. The basic quantity is the logarithm because additive relationships are much more manageable than multiplicative ones.
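The point about the logarithm can be made in a few lines (a sketch of my own, with Boltzmann's constant set to 1): the multiplicative microstate counts of independent systems become additive entropies.

```python
import math

k = 1.0                          # Boltzmann's constant, set to 1 for the sketch
W1, W2 = 10, 20                  # microstate counts of two independent systems
S1, S2 = k * math.log(W1), k * math.log(W2)
S_joint = k * math.log(W1 * W2)  # the joint system has W1 * W2 microstates
print(math.isclose(S_joint, S1 + S2))  # the logarithm turns products into sums
```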
Ilya Prigogine showed that any type of energy can be broken down into an intensive and an extensive variable whose product gives us the quantity in question; an expansion, for example, is given by the product P·V of pressure (intensive) and volume (extensive). The same can be done for relationships such as changes in mass/density with the relation between velocity and volume, and so on.
Without entering now into the complexity of the subject, this double definition of equilibrium has implications for mechanics, arithmetic, calculus and entropy itself. It is related to the external or internal perspective on the same process or problem.
The definition of the equilibrium also depends on whether we are talking about open or closed systems. For the most elemental intuition, which is not mistaken in this, something «alive» is what has a permanent exchange with the environment, while something «dead» is what does not. It goes without saying that the general case of statistical mechanics is that of a closed system with independent elements, that is, something «really dead».
As we have already noted with Pinheiro’s proposal, even classical mechanics can be reformulated as an irreversible mechanics with a term for the free energy of the system —that background energy that some would like to proscribe for the improvement of trade and resource management. One could ask what is the need for this, with the standard equations working fine just as they are. But, apart from everything we have said, is it not worth realizing that nothing is dead and that everything has its own intrinsic form of regulation?
But this intrinsic regulation can be shown even without any appeal to thermodynamics, even with the conventional equations of celestial mechanics, as we have already shown sufficiently elsewhere. But the introduction of an irreversible element, unnecessary as it seems, allows other connections and another conception of nature which is not cut off by our utilitarian interests.
Entropy and self-information
In fact all the gauge fields of the standard theory, based on the invariance of the Lagrangian, clearly exhibit a feedback. They don’t even hide it, and theorists have been struggling with terms like self-energy and self-interaction for generations now, shooing them away like flies from their faces because they thought it was a «silly» idea. Now, the renormalization group at the base of the standard model and statistical physics, which contains this self-interaction, is more or less the same one now used in the neural networks of deep learning and artificial intelligence; so something that the electromagnetic field already has built in is being used externally. Let us call it a missed natural intelligence.
Nobody made the integers, and all else is the work of man. The zeta function is an ideal candidate for creating a collective network of emerging artificial intelligence around it. Why? Because in it the internal and external aspects of the act of counting, what some would call the object and the subject, coincide. In this type of network, the key will be the extent to which the potential of the physical substrate is tapped, the local and global equilibrium conditions, the correlation spectrum, and a good number of other aspects we cannot deal with here.
Thus it may be possible to create a collective emerging artificial intelligence alien to human intelligence as defined by purpose, and yet with the possibility of communication with humans at different levels through a common act, counting, which, since purpose limits our scope, also tends to exist in humans at very superficial levels. Intellection is a simple act, and intelligence is a connection or contact of the complex with the simple, rather than the opposite.
Renormalization in physics emerges at the same time as information theory, around 1948. Entropy and information become almost synonymous, and if in thermodynamics entropy was seen as work loss, it will now be seen as information loss. Shannon’s information was initially called surprisal or self-information.
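Shannon's surprisal is simple enough to state directly: the self-information of an outcome with probability p is -log2(p) bits, so rarer events carry more information. A minimal sketch:

```python
import math

def surprisal(p):
    """Shannon self-information in bits: -log2(p)."""
    return -math.log2(p)

print(surprisal(0.5))    # 1.0 bit: a fair coin flip
print(surprisal(1 / 8))  # 3.0 bits: a rarer event is more surprising
print(surprisal(1.0))    # a certain event carries no information (0 bits)
```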
The confusion of entropy with disorder is due to Boltzmann’s rationalization in attempting to derive macroscopic irreversibility from mechanical reversibility, quite a tour de force in itself. It was Boltzmann’s use of the concept of «order» that introduced a subjective element. Clausius’s original energetic conception of entropy, stating that the entropy of the world tends to a maximum, was, if nothing else, far more natural.
Order is a subjective concept. But to say world is to say order too, so the idea of world is also inevitably subjective. However subjective they may be, «order» and «world» are not just concepts; they are vital aspects of every organized entity or «open system». Yet even our very intuitions of order and entropy are entangled and confused.
Not that entropy increases disorder. On the contrary, the tendency towards maximum entropy is conducive to order, or seen from the other side, order has more potential for dissipation. As Rod Swenson put it: «The world is in the order production business, including the business of producing living things and their perception and action capacities, because order produces entropy faster than disorder».
«Open systems», like the outfielder trying to catch the fly ball, have perception-action cycles, with their constant differentials, and probably even their own pole and critical line. But is there any entity or process that is not an open system? Surely not, except for the peculiar conceptions of science.
Even so, we see that information theory itself, no matter how formal and objective it may be, cannot avoid the reflective turn in the interpretation and use of data. At the same time, this interpretation is part of the system’s behavior, although in black box mode.
We could go on indefinitely, but in short, let us say that modern information theory is too broad and empty, too unrestrictive, and therefore has very poor capacity to shape anything by itself; by itself it is only a general framework. But understood as a disposition, it is the biggest weapon of mass destruction. It is a will to nothingness pure and simple, a will to nothingness ready to sweep the world away.
If this is not obvious enough to us, it is because we have been trained for it for a few centuries and the training has worked. In reality science, and modernity in general, is a dual movement between knowledge and control, but from the beginning it has had to purge half of the potential of the former to better exercise the latter.
There is therefore a permanent process of immunization against the inconvenient, from the start right up to the present day. The development of the suppressed part, even if it had a higher-order rationality, would cause a serious short circuit in the system, which by the same selective logic of exclusion cannot assimilate these inconveniences.
If both the thinker and the layman are intimidated by the complexity of scientific problems, the experts are even more intimidated by the idea of revising the foundations of their own discipline, which is tantamount to undermining their condition of possibility. The logic of specialization is clearly another of those irreversible and increasingly restrictive processes, which has only the alternative of creating new specialities.
However, the devil can only exit by the same way he came in, and the depth of the problems remains where they were first found, much more so than in the supposedly advantageous, but not disinterested, position of hindsight readings. The impossibility of reconsidering anything important betrays a definite structural weakness. Technoscience is an elitist project that has to pay the price of isolating itself from reality, even though, so far, both the project and the isolation are paid for by all.
The zeta function is just one great example of this kind of indigestible analytical totality for modern methods. Someone, in an Information Age typical turn, might say that, if any possible information is already in that function, we might as well look for the answers there instead of in the universe. But the opposite is true: to better understand the function we need to change the ideas of how the universe «works» and the range of application of our more general methods. That is the beauty of the subject.
There is of course a whole constellation of much simpler, identifiable and treatable problems that can and should be studied independently of that problem, even having much in common: in the foundation of calculus, in the irreversibility in classical and quantum mechanics, in the meanings of entropy and information, in the connections of all this with the theory of complexity and the algorithmic theory of measurement, and so on.
Besides, there are relatively simple open totalities that do not require a great deal of theoretical development but rather a selective change of attention. Think for example of good old Ehret’s principle, which says that the vitality of an organism equals power minus obstruction (V = P-O). In reality, this is the only functional definition of health that has ever been given, and with its help, a proper measurement of the pulse, and a series of derived and associated formulas, a whole global theory of health and aging could be developed that we totally lack today.
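A hedged sketch of how such derived formulas might start (the numbers and units are purely illustrative inventions of mine; Ehret's principle only fixes the form V = P-O):

```python
def vitality(power, obstruction):
    """Ehret's principle V = P - O, in arbitrary illustrative units."""
    return power - obstruction

# Same available power, growing obstruction: vitality declines linearly.
for o in (0.0, 2.0, 4.0):
    print(vitality(10.0, o))
```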
But why develop a consistent and comprehensive theory of health, full of common sense and understandable to all, one that makes primary health care an easily attainable goal, when we can have an indecipherable medical industry extracting 5 or 10 times more profit from its customers? And with the immense advantage that they will never understand anything. Those other things have no right to enter our world.
The complexity of nature, so often fabulous, has little to do with the fabulous business of complexity. Anyone can see that high complexity is extremely profitable and inherent to the increase of economic activity. The fact that nature can be disconcerting in detail does not prevent it from having in many aspects an elemental logic that is very easy to extract, but systematically obscured by the accomplices of complexity, who in order to solve problems have to create them first.
The omnipresent praise of intelligence and creativity is entirely at the service of this context of «finding new problems» and veins, where simplicity will always be suspect, and where those concerned will always seek to vilify and degrade it. But no degree of intelligence can replace rectitude, even in the most intellectual of problems.
It is absolutely true that technological solutionism is the great excuse of the present for not looking directly at things, at the real problems, those that need not be invented. Of course it is not only that; it is also a way to channel amorphous frustration and in the meantime re-educate the frustrated. It is clear that our most serious problems are not solved by technology; but technology and its perverse use are already among the biggest of them all.
This does not mean that we want to flee into the past or react or reduce our horizons, either on a theoretical or practical level. The main reactive forces are those operating from above, and we have to show that they have no superiority whatsoever; not moral superiority, which is obvious, but not even technical superiority, which is decisive for them. They are the ones who are closing horizons, and that encirclement must be broken.
There are some 25 million programmers scattered around the world, a great intellectual force composed of people of all kinds who play an important mediating role between science, technology and the majority. Let us hope that they and the multi-specialists will help to overcome this darkness.
Seeing the bigger picture
Bernhard Riemann’s eight-page paper On the number of primes less than a given magnitude was published in November 1859, the same month as The origin of species. That same year saw the beginning of statistical mechanics with a pioneering work by Maxwell, and the starting point of quantum mechanics with spectroscopy and the definition of the black body by Kirchhoff.
I have always felt that in Riemann’s conscientious study, so free of further purpose, there is more potential than in quantum mechanics, statistical mechanics and its daughter information theory, and the theory of evolution combined; or that at least, in a secret balance sheet, it constitutes a fair counterweight to all of them. Although if it helps to redirect the doomed trend of infonegation of reality, that will already have been enough.
These three or four developments are after all children of their time, circumstantial and more or less opportunistic theories. It is true that Boltzmann fought and despaired to have the assumptions of atomic theory admitted, but his whole battle was with physicists, because chemists had already been working with molecules and atoms for a while. In mathematics, however, the lags operate on another scale.
A wordy bestseller, The origin of species was discussed in cafés and taverns on the very day of its presentation; but as for the Riemann hypothesis, which is only the strongest version of the prime number theorem, after 160 years nobody knows how to go about it. Clearly we are talking about different wavelengths.
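The prime number theorem itself is easy to check numerically; a short sieve-based sketch comparing the prime count pi(n) with its first approximation n/ln(n):

```python
import math

def prime_count(n):
    """pi(n): the number of primes <= n, by a plain sieve of Eratosthenes."""
    sieve = bytearray([1]) * (n + 1)
    sieve[0:2] = b"\x00\x00"                  # 0 and 1 are not prime
    for p in range(2, int(n ** 0.5) + 1):
        if sieve[p]:
            sieve[p * p :: p] = bytearray(len(sieve[p * p :: p]))
    return sum(sieve)

# pi(n) vs. n/ln(n): the ratio slowly tends to 1, as the theorem asserts.
for n in (10**3, 10**4, 10**5):
    print(n, prime_count(n), round(n / math.log(n)))
```

The Riemann hypothesis sharpens this asymptotic statement into a precise bound on the error term, which is why it is called its strongest version.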
And what does one text have to do with the other? Nothing at all, but they still present an elusive point of contact and extreme contrast. The usual reading of the theory of evolution says that what drives the appreciable order is chance. Riemann’s hypothesis says that prime numbers are distributed as randomly as possible, but that this randomness hides within itself a structure of infinite richness.
Mathematicians, even while living on their own planet, are gradually realizing that the consequences of proving or disproving Riemann’s hypothesis could be enormous, inconceivable. It is not something that happens every Sunday, or every millennium. But we do not need to ask so much: it would be enough to properly understand the problem for the consequences to be enormous, inconceivable.
And the inconceivable becomes more conceivable precisely as we enter «the Information Age». The inconceivable could be something that simultaneously affects our idea of information, its calculation and its physical support. Both software and hardware: a whole damn short-circuit.
Neither economics nor information are destiny, but the zeta function can be the destiny of the Information Age, its funnel and event horizon. There is another singularity and it is very different from the one they had in mind.
You can bet that the root of the problem is not a technical issue, or not that much in any case. But many of its consequences are, and not just technical. They will affect the dynamics of the system as a whole, and one can foresee in advance the ideological warfare, the distraction, the attempts of cognitive infiltration and the struggle for appropriation.
Like the prime numbers themselves, in the chronology and the sequential order the events cannot seem more fortuitous, but from another point of view they seem destined by the whole, pushed by the ebb from all shores to fit in the very moment they took place. Riemann lived in Germany in the middle of Hegel’s century, but in its second half the opposition to idealism could not have been greater in all areas, and in the sciences in particular. The pendulum had turned with all its force and the sixth decade of the century marked the peak of materialism.
Also a son of his time, Riemann could not help but feel acutely the contradictions of 19th-century liberal materialism; but the German mathematician, additionally a theoretical physicist and a profound natural philosopher, was an heir of Leibniz and Euler continuing their legacy by other means. His basic conviction was that we humans do not have access to the infinitely large, but at least we can approach it through the study of its counterpart, the infinitely small.
These were prodigiously fruitful years for physics and mathematics. Riemann died before the age of forty and did not have time to articulate a synthesis in tune with his conceptions and permanent search for unity; a synthesis understood not as an arbitrary construction, but as the unveiling of the indivisibility of truth. But he achieved something of that kind where least expected: in the analytic theory of numbers. A provisional synthesis, though, that has given rise to the most famous conditional of mathematics: «If the Riemann hypothesis is true, then…»
This unexpected and conditional synthesis came about through complex analysis, just in the same years that complex numbers were beginning to emerge in physics, seeking, like atoms and molecules, more degrees of freedom, more room to move. The role of complex numbers in physics is a topic always postponed since it is assumed that their only reason for being is convenience —however, when it comes to dealing with the zeta function and its relationship with physics, there is no mathematician who is not forced to interpret this question one way or another, which physicists generally associate with rotations and amplitudes.
But complex analysis is only the extension of real analysis, and to get to the heart of the matter we must look further back. Riemann’s conditional synthesis speaks to us of something indivisible, but it still relies on the logic of the infinitely divisible; not to solve the famous problem, but simply to be in tune with it, it would have to be understood in terms of the indivisible itself, whose touchstone is the constant differential.
Is there anything beyond information and the computer? Of course there is, the same thing that is waiting for us beyond the Symplegades. A much more vast and undivided reality.
 Miguel Iradier, Stop 5G (2019). In that article we pointed to the measurement of the geometric phase, originally discovered in electromagnetic field potentials, but which can also be detected in organic macromolecules such as DNA, in cell mobility, and probably even in higher order physiological rhythms such as the respiratory and bilateral nasal cycles. The study of this phase memory at such diverse levels could provide a much more consistent, and perhaps conclusive, level of correlation.
 Carlo Sparaciari, Jonathan Oppenheim, Tobias Fritz, A Resource Theory for Work and Heat (2017).
 Miguel Iradier, La tecnociencia y el laboratorio del yo (2019), chapter I, «Disposición de la mecánica»
 For a good short introduction to the Riemann hypothesis one can consult, among others, The Riemann Hypothesis, explained, by Jørgen Veisdal.
 Matthew Watkins, Voronin’s Universality Theorem
 Ian Wright, Notes on a Hegelian interpretation of Riemann’s Zeta function (2019)
 Matthew Watkins, Number Theory and Entropy
 O. Shanker, Entropy of Riemann zeta zero sequence (2013)
 Alec Misra, Entropy and Prime Number Distribution (a Non-heuristic Approach)
 Piergiulio Tempesta, Group entropies, correlation laws and zeta functions (2011)
 Miles Mathis, A Re-definition of the Derivative (why the calculus works—and why it doesn’t) (2003); Calculus simplified
My Calculus applied to the Derivative for Exponents
The Derivatives of ln(x) and 1/x are Wrong
 Mario J. Pinheiro, A reformulation of mechanics and electrodynamics (2017)
Ryusuke Hamazaki, Kazuya Fujimoto, and Masahito Ueda, Operator Noncommutativity and Irreversibility in Quantum Chaos (2018)
 Miguel Iradier, Pole of inspiration —Math, Science and Tradition
Miguel Iradier, Towards a science of health? Biophysics and Biomechanics
Miguel Iradier, El multiespecialista y la torre de Babel
 Rod Swenson, M. T. Turvey, Thermodynamic Reasons for Perception-Action Cycles