We have tried to approach a single subject from the most diverse angles, so that each reader has some chance of connecting it as directly as possible with his or her own interests. This is a minimal introduction, relying on a bibliography that is far from exhaustive but necessary for delving into any of the issues presented.
Of course our subject proper was not the continuous proportion, but reciprocity and self-organization in Nature and beyond Nature. The constant φ continues to play a marginal, totally episodic role within modern science, which is mainly focused on calculus.
Thus, our preferred point of view is not so much that of science and objectivity as that of reciprocity itself, to which we attach greater weight; it is an attempt to give more importance to the awareness of reality than to science.
And reality itself resembles a parable. We can say that next to unity, sometimes also symbolized by the point, the circle and the constant π, two other great mathematical constants exist from eternity, e and φ. The constant e, the basis of the natural logarithms and the exponential function, perfectly embodies the analytical exhaustiveness of calculus, as if looking towards the plurality of the world. The constant φ, the natural algorithm of a Nature that completely ignores calculus, looks at unity without knowing it. And unity itself, if it is really unity, cannot show a preference for either.
Now, it is clear that in modern science the weight of e is infinitely greater than that of φ, to the point that we could perfectly dispense with φ without even noticing it. The number e refers us to continuity and infinite divisibility; φ refers us to discrete operations without intended purpose —and it should be understood that infinite divisibility is already a human purpose that can always exceed its operational limits. Hence the need to look back to the contradictory basis of infinitesimal calculus.
As is well known, the number e was first identified by Jacob Bernoulli in 1683 in a problem of compound interest, and it is absolutely consubstantial with the modern spirit of calculus, with all that this entails. The continuous proportion undoubtedly has a much older past but, in the eyes of the moderns, it is hard to see how it could have played a relevant role in the knowledge of antiquity. Euler’s number is at the base of so-called advanced or higher mathematics, while the number closest to the muses can never deny the elemental character of its origin —which is also its greatest charm. Surely this is the reason for its permanent popularity among amateur mathematicians.
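Both constants can in fact be reached by elementary means, in keeping with the contrast just drawn: e as the limit of Bernoulli’s compound-interest problem, φ by a purely discrete iteration that needs no calculus at all. A minimal numerical sketch (the function names are ours):

```python
from fractions import Fraction

# e arises as the limit of interest compounded ever more often:
# (1 + 1/n)**n -> e as n grows (Bernoulli's 1683 problem).
def compound(n: int) -> float:
    return (1 + 1 / n) ** n

# phi, the continuous proportion, emerges from discrete operations
# alone: ratios of consecutive Fibonacci numbers converge to it.
def fib_ratio(k: int) -> Fraction:
    a, b = 1, 1
    for _ in range(k):
        a, b = b, a + b
    return Fraction(b, a)

print(compound(1_000_000))   # approaches 2.71828...
print(float(fib_ratio(30)))  # approaches 1.61803...
```

The asymmetry is visible even here: e requires a limit process, while φ is approached exactly by integer arithmetic.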
But this undeniable circumstance hides another naivety on the part of the spirit of calculus that we should learn to appreciate. It is well known that people like Stevin or Newton still believed that ancient cultures could have had broader knowledge than their contemporaries; if mathematicians of their stature still dared to think so, it must surely be attributed to the incomparable impact that the legacy of Apollonius and Archimedes —the most advanced mathematicians of antiquity— had at that time.
But this already presupposed a totally biased idea about what was advanced in knowledge, which has been perpetuated until today.
It is said that the amount of knowledge has doubled regularly every 15 years since the scientific revolution, which implies that today our knowledge of physics and mathematics is four million times greater than in 1687, when the Principia was published. Yet this magnum opus is already an obscure, hard-to-read treatise that has been, and continues to be, routinely misinterpreted even by the best experts.
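The figure can be checked on the section’s own assumption of one doubling every 15 years; a back-of-the-envelope sketch (the end year 2017 is our choice, picked only because it makes the count come out to exactly 22 doublings):

```python
# Growth under the assumed 15-year doubling period, counted
# from 1687, the year the Principia was published.
def growth_factor(since: int, until: int, doubling_years: float = 15) -> float:
    return 2 ** ((until - since) / doubling_years)

# (2017 - 1687) / 15 = 22 doublings; 2**22 = 4,194,304,
# i.e. roughly the "four million Principia" of the text.
print(growth_factor(1687, 2017))
```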
Let us try to understand four million Principia. We do not walk on the shoulders of giants; the giants advance on our shoulders, although less and less, as is easy to understand. And the logic of accumulated capital is that accumulated capital does not take risks. The theory of relativity and other «revolutions» were adopted following the principle of minimum elimination and maximum conservation of capital. The whole exacerbated search for novelty in theoretical physics is nothing but a forced headlong flight, since a genuine examination of the foundations is not allowed. But the less one eliminates, and the less one renews the foundation, the more inexorably one ages.
Obviously the stratification of knowledge and the branching of specialities follow the logic of continuous interest and of accumulated capital and debt.
It is really curious that two constants as ubiquitous as e and φ, the «two natural fractals», cross each other’s paths so little in a field as unlimited, but redundant, as mathematics. So curious, indeed, that the study of the points of contact and divergence of the two constants should be an area of mathematical research in its own right, full of interest for both pure and applied mathematics. If this has not yet happened, it is because of the one-sided development of all sciences and mathematics on the side of calculus and prediction, putting the rest of the resources at their service.
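One point of divergence between the two constants is already elementary: φ is algebraic, the root of a simple polynomial, while e is transcendental and can only be reached through an infinite analytic process. A minimal sketch of this contrast:

```python
import math

# phi is algebraic and constructive: the positive root of
# x**2 = x + 1, obtainable with nothing beyond a square root.
phi = (1 + math.sqrt(5)) / 2

# e is transcendental: no polynomial with integer coefficients
# has it as a root; here we reach it through the partial sums
# of its exponential series, an intrinsically infinite process.
e = sum(1 / math.factorial(n) for n in range(20))

print(phi, e)
```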
It is understandable that there is, among many mathematicians, a typical allergic reaction to the questions raised by the continuous proportion and the mathematics of harmony. They are seen more as a hindrance than as a guide, since they might imply some sort of discrete, constructive limits to an analysis for which no restriction is wanted. Nothing should measure the one who measures.
It seems that at the time of the emergence of writing and the first great cities, the priests kept the standards of measurement in the temples. But Nature is the perfect temple that keeps everything without hiding anything.
We have said that φ seems to be «the natural algorithm» of a Nature that does not care in the least about calculus. But not having any system of calculus and measurement is, in a sense, equivalent to having all of them and surpassing them all, in the same way that nature’s lack of intention infinitely surpasses human purposes. So the value of this branch of mathematics for the theory of measurement and computation should be beyond question. It is a matter of identifying relevant problems in the domain of interdependence.
In just over three centuries we have had at least five great cuts with the constructive and proportional conception of measurement: the infinitesimal calculus and Newton’s classical mechanics, Maxwell’s theory, set theory, relativity and quantum mechanics. With each of these successive steps, the problem of measurement has become more and more critical and controversial. But it is absolutely superficial to think that only the latest developments count and that measure theory itself is just an aid to our predictions. This is exactly how this tower of Babel was built.
The interest of the continuous proportion would go beyond our modern involvement in complexity and calculus. In this area, new discoveries are always possible, even at the elementary level, so unexpected that we do not even know how to value them. Moreover, its persistent appearance in music —that unconscious arithmetic, as Leibniz said—, in physiological rhythms and in anatomical sequences indicates not only an intuitive but, ultimately, a pre-numerical character.
There is here a great open track, not only for the archaeology of knowledge but for the very activation of an ancient knowledge that seems inconceivable to us today. In the Book of Changes we may see an asymmetric implication algebra. The six lines of a hexagram correspond to the six directions of space at the ends of its three axes, which oppose an I and its circumstance. But here the symmetry of the coordinates is only the external and passive frame; it is the asymmetry that constitutes the internal dynamic spring in which agent and situation interpenetrate.
The Book of Changes is the best possible example of an analogical knowledge that does not depend on calculus, and towards which a certain natural logic of implication converges. What is important is not the 64 cases or the 384 lines, but the plane of synthesis towards which they point.
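The counts just mentioned follow directly from the binary structure of the hexagrams; a minimal sketch, enumerating all the cases:

```python
from itertools import product

# Each hexagram is six lines, each either yin (0) or yang (1),
# giving 2**6 = 64 hexagrams and 64 * 6 = 384 individual lines.
hexagrams = list(product((0, 1), repeat=6))

print(len(hexagrams))       # 64 hexagrams
print(len(hexagrams) * 6)   # 384 lines
```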
If mathematical analysis has its so-called complex plane to operate with any number of dimensions or variables, one can equally conceive an implication logic that is capable of reducing any number of variables to an equally intangible plane of synthesis, which we will call the plane of universal synthesis or universal inclusion.
Of course, in this day and age anyone who hears of «universal inclusion» can only think of universal confusion. That is why we have developed the analysis, because we do not believe in knowledge that is not properly formalized. However, the development of formalization has not contributed to greater intelligibility, but rather the opposite. For us, formal knowledge increasingly projects more shadows than light; shadows we know too well are cast by ourselves. Shadows of power over natural or social processes.
The whole philosophy of the West since Descartes is based on the idea of a separate intelligence. That is why the successive cuts that have come one after another since Newton do not seem a loss to us, since in this line of logic more separation from Nature amounts to more self-affirmation. But any process has its limit, and when there is nothing left to assert oneself against —because Nature has vanished— everything stops making sense, except for the mere exercise of power, which lacks any intrinsic stimulus for knowledge.
Besides, the golden age when the field of fundamental predictions was growing is now far behind us. It will not return, for the simple reason that everything was optimized for prediction and all the low-hanging fruit has already been picked. One can only scrape out diminishing returns in the ugly struggle with complexity. Yet modern science finds it almost impossible to examine its foundations, which is the only place where there could be real novelty; we have the great advantage of our historical perspective, but the very existence of the specialities depends on not reflecting on foundations.
Modern science is not capable of dealing with either transcendence or immanence; for as subjects we can no longer separate ourselves, nor do we know how to see things again from within Nature.
Now, if we look back as we have done, only reordering our perception of the present theories, what does it mean to dispense with the principle of inertia? What is the point of saying that everything is based on self-interaction? What is the point of saying that the only thing we perceive is the Ether?
These are transcendental statements, in the sense that the father of phenomenology, Edmund Husserl, might have given them had he dealt with physics. However, to suspend the principle of inertia puts an end to the false idealism of physics, that fecund contradiction which isolates a rolling ball from the rest of the world except from ourselves. To realize that we only perceive the Ether, because we only perceive in the mode of light, is to realize that we are always right in the middle and that matter and space themselves are transcendental limits.
Finally, to say that the planets or the electrons orbit their centers by self-interaction can only be understood in the sense that the relationship between matter and the medium appears to be reflexive just because the two are not separate.
Separation and reflexivity are appearances both for the subject and for the object, and it is useless to adhere to one part while trying to deny the other, as science has attempted. Intelligence and being coincide —at least from the point of view of intelligence, since intelligence is incapable of perceiving itself. This reflexivity, this intelligibility, is the plane of universal synthesis itself. But this also has a physical translation. The evolution of a vortex in six dimensions in Venis coordinates could be a good example of the intersection of a naturalistic view with the transcendental plane.
These ideas can be applied to both the mediate and the immediate knowledge of Nature. From the operational and formal point of view, all observable knowledge of physics can be included in the principle of dynamic equilibrium. But from the point of view of immediate cognition, one can hardly sustain oneself in the contemplation of the instant without inertia —to such an extent has the ball rolling in the void captured our subjective idea of succession. However, these principles, which now seem much more demanding from the intuitive point of view, since they are much fuller of content, are not based on the separation of nature and therefore make its forced «unification» by man less necessary.
Truly, that transcendental plane is the one in which transcendence with respect to Nature and the return to it coincide —but in truth the only nature that is transcended here is that relative to the inertia of habits, what we call our «second nature». Here we could say with Raymond Abellio: «The perception of relationships belongs to the mode of vision of the «empirical» consciousness, while the perception of proportions is part of the mode of vision of the «transcendental» consciousness».
But, of course, in modern science there are hardly any proportions, because all the units we handle are a heterogeneous and unintelligible jumble of quantities —that’s why we rely more and more on computers and their programs for data crunching.
The term «transcendental» can only have some meaning for those who deal with the intelligibility of knowledge, not for those who are simply content with its formalization to obtain predictions. However, here we are making it descend to the very core of physical principles and of that eternal unknown we call causality; and this can be done both in a qualitative and a quantitative way.
No physicist or mathematician needs to prove the existence of the complex plane or the complex manifolds, however advisable it may be to review its foundations; much less could we prove the existence of a transcendental plane of synthesis, since the word «transcendental» means that it is the condition of knowledge. Motion is shown by walking, and knowledge by understanding.
Moreover, to speak of the «existence» of such a plane in relation to the world of objects and measurements is not only out of place but completely reverses the situation. Santayana said that essences are «the only thing that people see and the last thing they notice»; from a perspective in line with what we have already said, and which of course has little to do with the usual narratives and cosmologies, the entire experience is a transition between an unknowable but measurable matter and a space that is diaphanous to knowledge but immeasurable. Existence itself is that process of awakening, but with dissimilar rhythms for all kinds of systems and entities.
This condition of ours, «between Heaven and Earth», between the diaphanous and the measurable, is not a mere philosophical or poetic remark: it determines the whole range of our possibilities of knowledge, which began with the acts of counting and measuring, of arithmetic and geometry, and which have been successively developing with calculus, algebra and everything else until we got here. And not only can we see that it determines them; we can also appreciate that there is always a double current, a double movement, descending and ascending.
Nor is it to be believed that this plane of essences is reduced to the mathematical aspect; on the contrary, this is only one possible expression of an infinity of modalities. In our time we have come to believe that complexity exists only in numbers, computers and models, but the richness of phenomena has always been infinite, regardless of any quantification. Our perceptions, as our thoughts, are also a fleeting part of the absolute.
Technoscience is a continuum of practice and theory that dictates what seems acceptable to both. On the other hand, it is not ideas that determine our actions —it is what we do and what we want to do that determines our ideas.
For today’s technoscience, to see intelligence as something totally separate from Nature is an indispensable condition for recombining any aspect of nature at will: atoms, machines, biological molecules and genes, and all the possible interfaces between them under the least restrictive criterion of information.
But granting this convenient separation of domains in order to manipulate them with the fewest possible restrictions paradoxically entails unnecessarily restrictive principles, as is already evident in fundamental physics, which is also the founding ground of our overall commerce with Nature.
The reintegration of intelligence into the unity of being is absolutely contrary to the liberal principle of prior separation in order to recombine without restrictions. The mere possibility of an intelligence in Nature threatens to short-circuit this separating intelligence, which has established itself as the ultimate arbiter.
And yet, as we have seen, the idea that there is feedback in the orbit of a planet or an electron is less contradictory than the usual picture that we are dealing with a cannonball trapped in a field. In fact, it is not contradictory at all: it is just absolutely disconcerting, as well as inconvenient, for most of us.
We want to see on the one hand a separate intelligence, and on the other a completely inert matter, and between these two fictions, the evidence of a perfectly impersonal, pre-individual consciousness without qualities becomes totally inconceivable.
There is nothing extraordinary about the fact that apparently heterogeneous bodies seek to attain a condition homogeneous with the environment from which they have emerged, and that they cannot do so without both inner and outer action. The particular aspect of this balance is indiscernible from the intelligence of that entity or system, which cannot but participate in the universal intelligence. Were it not for our involvement in the latter, our very intelligence would exist only subjectively, for ourselves alone.
It is not strange at all, but it is totally inconvenient for the practices in which we are immersed, for the horizontal, indiscriminate remixing that aspires to dissolve all natural boundaries.
In the immediate postwar period, around 1948, under the shadow of the Manhattan Project and other military programs, three new «theories» emerged that consolidated the new «algorithmic» style of the sciences: quantum electrodynamics with its endless loops of calculations, information theory, and cybernetics or modern control theory. To these was added, five years later, the identification of the DNA helix, which would soon be equally reduced to the category of information.
Except for the computational feats that are its only justification, the quantization of the electromagnetic field was, on the theoretical level, a superfluous undertaking that added nothing new to the known equations. The curious thing is that its promoters constantly had to fend off terms like «self-interaction» and «self-energy» popping up in their faces. Such terms sound undesirable for so clean and fundamental a theory as QED; moreover, in physics it is assumed that any kind of feedback can only result in stronger non-linear effects.
Thus, while fundamental physics fought tooth and nail to ward off the idea of feedback, cybernetics had to assume that feedback is a weak, emergent property of highly organized systems made up of «fundamental» blind blocks. At the same time, information theory rebranded Boltzmann’s statistical-mechanical entropy, rather than the irreversible entropy of thermodynamics, as information. To question the foundations of past theories would have been out of place, so theorists were content to generalize heuristic procedures; and it could not be otherwise, since this science-technology continuum demands nothing else.
We have seen that our very idea of celestial mechanics, of calculus, and of the mechanical-statistical interpretation of the Second Law are based on flagrant rationalizations —not to speak of more recent developments. Even the explanation of the functioning of our heart is based on a rationalization that tries to ignore the monumental evidence of the role of the breath in the global dynamics of blood circulation.
The only reason we maintain these mirages is a very powerful one: they reaffirm our idea that we have a separate intelligence, while justifying our indiscriminate intervention in Nature, a Nature that we very conveniently want to reduce to blind laws and random processes. Men of science find it shocking, to say the least, to talk about the transcendental in knowledge, but all modern scientific knowledge is based on an ego pretending to be transcendental that in the end became trivial.