    Who formulated the principle of complementarity? The complementarity principle

    In everyday life, there are two ways to transfer energy through space: by particles or by waves. To knock over, say, a domino balanced on its edge, you can supply the necessary energy in two ways. First, you can throw another domino at it, transmitting a point impulse with a particle. Second, you can build a chain of standing dominoes leading to the one at the edge of the table and topple the first one: the impulse then travels along the chain, the second domino knocking over the third, the third the fourth, and so on. This is the wave principle of energy transfer. In everyday life there is no visible contradiction between the two mechanisms: a basketball is a particle, sound is a wave, and everything is clear.

    Let us summarize what has been said. If photons or electrons are sent into such a chamber one at a time, each behaves like a particle; however, if we collect enough statistics from such single-particle experiments, it turns out that in the aggregate the same electrons or photons are distributed on the back wall of the chamber in the familiar pattern of alternating intensity peaks and troughs, which indicates their wave nature. In other words, in the microworld, objects that behave like particles at the same time seem to "remember" their wave nature, and vice versa. This strange property of micro-objects is called wave-particle duality. Many experiments were performed to "reveal the true nature" of quantum particles: various experimental techniques and set-ups were used, including ones meant to catch the wave properties of an individual particle halfway to the detector or, conversely, to determine the wave properties of a light beam through the characteristics of individual quanta. All in vain. Apparently, wave-particle duality is objectively inherent in quantum particles.
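    The statistical emergence of the fringe pattern from point-like single-particle hits can be sketched numerically. The simulation below is purely illustrative, not a description of any particular experiment; the slit separation, wavelength, and screen geometry are assumed values:

```python
import numpy as np

# Illustrative two-slit statistics: each particle lands at a single point,
# but the landing probability follows the wave-interference intensity.
# All parameters are assumed values chosen only for illustration.
d = 1e-4      # slit separation, m
L = 1.0       # distance to the back wall, m
lam = 5e-7    # wavelength, m

x = np.linspace(-0.01, 0.01, 2001)                # positions on the back wall
intensity = np.cos(np.pi * d * x / (lam * L))**2  # two-slit fringe pattern
prob = intensity / intensity.sum()

rng = np.random.default_rng(0)
hits = rng.choice(x, size=50_000, p=prob)  # 50 000 single-particle events

# Histogram the point-like hits: peaks and troughs emerge only statistically.
counts, _ = np.histogram(hits, bins=100, range=(-0.01, 0.01))
print("max bin:", counts.max(), " min bin:", counts.min())
```

Each individual event is a single dot on the wall; only the accumulated histogram reveals the alternating maxima and minima described above.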

    The principle of complementarity is a simple statement of this fact. According to this principle, if we measure the properties of a quantum object as a particle, it behaves like a particle; if we measure its wave properties, it behaves for us like a wave. The two pictures do not contradict each other at all: they complement one another, which is reflected in the name of the principle.

    As I explained in the Introduction, I believe the philosophy of science has gained incomparably more from this wave-particle duality than it would have from its absence and from a strict division of phenomena into corpuscular and wave. Today it is quite obvious that the objects of the microworld behave in a fundamentally different way from the macroscopic objects we are used to. But why? On what tablets is that inscribed? And just as medieval natural philosophers struggled to understand whether an arrow's flight was "free" or "forced," so modern philosophers struggle to resolve wave-particle duality. In fact, electrons and photons are neither waves nor particles but something altogether special in their inner nature, and therefore they defy description in terms of our everyday experience. If we keep trying to squeeze their behavior into the framework of familiar paradigms, ever more paradoxes are inevitable. So the main conclusion is that the dualism we observe is generated not by properties inherent in quantum objects, but by the imperfection of the categories with which we think.

    THE COMPLEMENTARITY PRINCIPLE is one of the most important methodological and heuristic principles of modern science. It was proposed by N. Bohr (1927) in his interpretation of quantum mechanics: a complete description of quantum-mechanical objects requires two mutually exclusive ("complementary") classes of concepts, each applicable under its own special conditions, and their combination is necessary to reproduce the integrity of these objects. The physical meaning of the complementarity principle is that quantum theory involves recognizing the fundamental limits of classical physical concepts as applied to atomic and subatomic phenomena. However, as Bohr pointed out, "the interpretation of the empirical material rests essentially precisely on the application of classical concepts" (Bohr N., Selected Scientific Works, vol. 2. Moscow, 1970, p. 30). This means that the effect of the quantum postulate extends to the processes of observing (measuring) micro-objects: "the observation of atomic phenomena includes such an interaction of the latter with the means of observation as cannot be neglected" (ibid., p. 37). On the one hand, this interaction makes it impossible to determine the state of the observed system unambiguously ("classically"), independently of the means of observation; on the other hand, no observation that excludes the influence of the means of observation is possible for objects of the microworld. In this sense, the complementarity principle is closely related to the physical meaning of W. Heisenberg's "uncertainty relation": if the momentum and energy of a micro-object have definite values, its space-time coordinates cannot be uniquely determined, and vice versa. A complete description of a micro-object therefore requires the joint (complementary) use of its kinematic (space-time) and dynamic (energy-momentum) characteristics, which, however, must not be understood as their unification into a single picture of the kind found in classical physics. This complementary mode of description is sometimes called the non-classical use of classical concepts (I.S. Alekseev).

    The complementarity principle applies to the problem of "wave-particle duality," which arises when one compares explanations of quantum phenomena based on wave mechanics (E. Schrödinger) with those based on matrix mechanics (W. Heisenberg). The first type of explanation, using the apparatus of differential equations, is analytic; it emphasizes the continuity of the motion of micro-objects, described as generalizations of the classical laws of physics. The second type rests on an algebraic approach, for which the essential point is the discreteness of micro-objects understood as particles, despite the impossibility of describing them in "classical" space-time terms. According to the complementarity principle, continuity and discreteness are accepted as equally adequate characteristics of the reality of the microworld; they are irreducible to some "third" physical characteristic that would "bind" them into a contradictory unity. The coexistence of these characteristics fits the formula "either one or the other," and the choice between them depends on the theoretical or experimental problem facing the researcher (J. Holton).

    Bohr believed that the complementarity principle is applicable not only in physics but has a broader methodological significance. The situation involved in interpreting quantum mechanics "has a far-reaching analogy with the general difficulties of forming human concepts that arise from the separation of subject and object" (ibid., p. 53). Bohr saw analogies of this kind in psychology and, in particular, drew on W. James's ideas about the specifics of introspective observation of the continuous stream of thought: such observation affects the observed process, changing it; therefore, describing the mental phenomena established by introspection requires mutually exclusive classes of concepts, which corresponds to the situation of describing the objects of microphysics. Another analogy Bohr pointed out, in biology, concerns the complementarity between the physicochemical nature of life processes and their functional aspects, between the deterministic and teleological approaches. He also drew attention to the applicability of the complementarity principle to understanding the interaction of cultures and social structures. At the same time, Bohr warned against absolutizing the complementarity principle into a kind of metaphysical dogma.

    Interpretations that treat the complementarity principle as an epistemological "image" of some inconsistency supposedly inherent in the objects of the microworld, reflected in paradoxical descriptions ("dialectical contradictions") of the type "a micro-object is both a wave and a particle" or "an electron both has and does not have wave properties," can be considered dead ends. Developing the methodological content of the complementarity principle is one of the most promising directions in the philosophy and methodology of science. It examines the application of the principle to the relationship between normative and descriptive models of the development of science, between moral norms and the moral self-determination of human subjectivity, and between "criterial" and "critically reflexive" models of scientific rationality.

    Literature:

    1. Heisenberg W. Physics and Philosophy. Moscow, 1963;

    2. Kuznetsov B.G. The Complementarity Principle. Moscow, 1968;

    3. Methodological Principles of Physics: History and Modernity. Moscow, 1975;

    4. Holton J. The Thematic Analysis of Science. Moscow, 1981;

    5. Alekseev I.S. The Activity Concept of Cognition and Reality: Selected Works on the Methodology and History of Physics. Moscow, 1995;

    6. Historical Types of Scientific Rationality, vols. 1–2. Moscow, 1997.

    The fundamental principle of quantum mechanics, along with the uncertainty relation, is the complementarity principle, to which N. Bohr gave the following formulation:

    “The concepts of particle and wave complement each other and at the same time contradict each other, they are complementary pictures of what is happening.”

    The contradictions in the wave-particle properties of micro-objects result from the uncontrolled interaction of micro-objects with macro-devices. There are two classes of devices: in some, quantum objects behave like waves; in others, like particles. In experiments we observe not reality as such but only a quantum phenomenon, which includes the result of the device's interaction with the micro-object. M. Born figuratively remarked that waves and particles are "projections" of physical reality onto the experimental situation.

    First, the idea of wave-particle duality means that any material object possessing wave-particle duality has an energy envelope. A similar energy shell exists for the Earth and also for humans, where it is most often called an energy cocoon. This energy shell can play the role of a sensory membrane, shielding the material object from the external environment and constituting its outer "gravitational sphere." Like the membrane of a living cell, this sphere passes inward only "filtered" signals whose disturbance level exceeds a certain limiting value, and it can likewise pass outward signals that exceed a specific sensitivity threshold of the shell.

    Secondly, the presence of an energy shell in material objects raises to a new level of comprehension the hypothesis of the French physicist L. de Broglie about the truly universal nature of wave-particle duality.

    Thirdly, due to the evolution of the structure of matter, the nature of the particle-wave dualism of the electron can be a reflection of the particle-wave dualism of photons. This means that the photon, being a neutral particle, has a mesonic structure and is the most elementary micro atom, from which, in the image and likeness, all material objects of the Universe are built. Moreover, this construction is carried out according to the same rules.

    Fourthly, wave-particle duality makes it possible to explain naturally the phenomenon of the gene memory of particles, atoms, molecules, and living organisms, and to conceive of mechanisms for such memory, in which a structureless particle "remembers" all its creations in the past and possesses the "intelligence" to select processes of synthesis aimed at forming new "particles" with chosen properties.

    The uncertainty principle is a physical law stating that the coordinates and momentum of a microscopic object cannot both be measured accurately at the same time, because the measurement process disturbs the system. The product of the two uncertainties is always at least of the order of Planck's constant. This principle was first formulated by Werner Heisenberg.

    It follows from the uncertainty principle that the more accurately one of the quantities entering the inequality is determined, the less definite is the value of the other. No experiment can measure both dynamic variables accurately at once; the uncertainty here is associated not with imperfections of experimental technique but with objective properties of matter.
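    A minimal numeric sketch of this trade-off, using the relation Δx·Δp ≥ ħ/2; the confinement length below is an assumed illustrative value:

```python
# Numeric sketch of the uncertainty relation Δx·Δp ≥ ħ/2.
# The confinement length is an assumed illustrative value.
HBAR = 1.054_571_8e-34   # J·s, reduced Planck constant
M_E = 9.109e-31          # kg, electron mass

def min_momentum_uncertainty(delta_x: float) -> float:
    """Smallest momentum uncertainty allowed for a given position uncertainty."""
    return HBAR / (2.0 * delta_x)

# Electron confined to an atom-sized region, ~1 ångström (1e-10 m):
dp = min_momentum_uncertainty(1e-10)
dv = dp / M_E   # corresponding spread in velocity
print(f"Δp ≥ {dp:.2e} kg·m/s, hence Δv ≥ {dv:.2e} m/s")
```

Squeezing the electron into an atom-sized box forces a velocity spread of hundreds of kilometres per second, which is why atomic electrons cannot sit still.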

    The uncertainty principle, discovered in 1927 by the German physicist W. Heisenberg, was an important stage in elucidating the laws of intra-atomic phenomena and constructing quantum mechanics. An essential feature of microscopic objects is their wave-corpuscular nature. The state of a particle is completely determined by its wave function, a quantity that fully describes the state of a micro-object (an electron, proton, atom, or molecule) and, in general, of any quantum system. A particle can be detected at any point in space where the wave function is nonzero; therefore the results of experiments to determine, say, its coordinates are probabilistic in nature.

    Example: the motion of an electron is the propagation of its own wave. Shoot a beam of electrons through a narrow hole in a wall, and a narrow beam passes through. But make the hole smaller still, so that its diameter is comparable to the electron's wavelength, and the beam disperses in all directions. This is not a deflection caused by nearby wall atoms, something that could be eliminated; it is due to the wave nature of the electron. Try to predict what happens next to an electron that has passed through the wall, and you will be powerless. You know exactly where it crossed the wall, but you cannot say what momentum it acquired in the transverse direction. Conversely, to ensure that an electron emerges with a definite momentum in the original direction, you must enlarge the aperture so that the electron wave passes nearly straight through, diverging only slightly by diffraction. But then it is impossible to say exactly where the electron-particle passed through the wall: the hole is wide. Whatever you gain in the accuracy of the momentum, you lose in the accuracy with which the position is known.
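    The slit trade-off described above can be sketched with a rough order-of-magnitude estimate, Δp ≈ h/d; the slit widths below are assumed illustrative values:

```python
# Order-of-magnitude sketch of the slit example: localizing an electron to a
# slit of width d forces a transverse momentum spread of roughly h/d, so the
# narrower the hole, the wider the beam spreads. Widths are assumed values.
H = 6.626e-34  # J·s, Planck's constant

def transverse_momentum_spread(slit_width_m: float) -> float:
    """Rough diffraction estimate: Δp ≈ h / d."""
    return H / slit_width_m

for d in (1e-6, 1e-8, 1e-10):   # progressively narrower slits, in metres
    print(f"d = {d:.0e} m  ->  Δp ≈ {transverse_momentum_spread(d):.2e} kg·m/s")
```

Narrowing the slit by four orders of magnitude widens the momentum spread by the same factor: exactly the position-for-momentum exchange the text describes.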

    This is the Heisenberg uncertainty principle. It played an extremely important role in constructing the mathematical apparatus for describing particle waves in atoms. Its strict interpretation in experiments with electrons is that, like light waves, electrons resist any attempt at measurement with ultimate precision. This principle also changes the picture of the Bohr atom. One can determine exactly the momentum of an electron (and hence its energy level) in some orbit, but its location will then be entirely unknown: nothing can be said about where it is. Hence it makes no sense to draw a definite orbit for the electron and mark the electron on it as a point. At the end of the XIX century, many scientists believed that the development of physics was complete, for the following reasons:

    The laws of mechanics and the theory of universal gravitation had existed for more than 200 years;

    The molecular-kinetic theory had been developed;

    A solid foundation had been laid for thermodynamics;

    Maxwell's theory of electromagnetism had been completed;

    The fundamental conservation laws (of energy, momentum, angular momentum, mass, and electric charge) had been discovered.

    In the late XIX and early XX centuries, W. Röntgen discovered X-rays, A. Becquerel the phenomenon of radioactivity, and J. Thomson the electron. Classical physics, however, failed to explain these phenomena.

    A. Einstein's theory of relativity demanded a radical revision of the concepts of space and time. Special experiments confirmed the validity of J. Maxwell's hypothesis that light is electromagnetic in nature. One could assume that the emission of electromagnetic waves by heated bodies is due to the oscillatory motion of electrons, but this assumption had to be confirmed by comparing theoretical and experimental data.

    For the theoretical analysis of the laws of radiation, the model of an absolutely black body was used: a body that completely absorbs electromagnetic waves of any wavelength and, accordingly, can emit at all wavelengths.

    An example of an absolutely black body in terms of emissivity is the Sun; in terms of absorption, a cavity with mirrored walls and a small hole.

    Austrian physicists J. Stefan and L. Boltzmann established that the total energy E emitted per second from a unit surface of an absolutely black body is proportional to the fourth power of its absolute temperature T:

    E = σT⁴,

    where σ = 5.67·10⁻⁸ W/(m²·K⁴) is the Stefan-Boltzmann constant.

    This relation is called the Stefan-Boltzmann law. It makes it possible to calculate the radiated energy of an absolutely black body from its known temperature.
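    A quick numerical check of the law; treating the Sun's surface as a black body at about 5778 K is an assumption made only for illustration:

```python
# Numerical check of the Stefan-Boltzmann law E = σT⁴.
# Treating the Sun's surface as a black body at ~5778 K is an assumption.
SIGMA = 5.67e-8  # W/(m²·K⁴), Stefan-Boltzmann constant

def radiated_power_per_m2(temperature_k: float) -> float:
    """Energy emitted per second from 1 m² of a black body at temperature T."""
    return SIGMA * temperature_k ** 4

print(f"Sun's surface: {radiated_power_per_m2(5778):.2e} W/m²")

# The fourth-power dependence is steep: doubling the temperature
# increases the radiated power 16-fold (2⁴).
print(radiated_power_per_m2(2000) / radiated_power_per_m2(1000))
```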

    Striving to overcome the difficulties of the classical theory in explaining black-body radiation, M. Planck in 1900 put forward a hypothesis: atoms emit electromagnetic energy in separate portions, quanta, with energy E = hν, where h = 6.63·10⁻³⁴ J·s is Planck's constant.

    Sometimes it is convenient to measure energy and Planck's constant in electron volts; then h = 4.136·10⁻¹⁵ eV·s. (1 eV is the energy that an elementary charge acquires in passing through an accelerating potential difference of 1 V: 1 eV = 1.6·10⁻¹⁹ J.)
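    Planck's relation E = hν can be checked numerically in both unit systems; the wavelength below is an assumed illustrative value:

```python
# Energy of a single quantum, E = hν, computed in joules and electron volts.
# The chosen wavelength (green light, ~550 nm) is an illustrative assumption.
H_JS = 6.63e-34     # J·s, Planck's constant
H_EVS = 4.136e-15   # eV·s, the same constant in electron volt seconds
C = 3.0e8           # m/s, speed of light (rounded)

def photon_energy_ev(wavelength_m: float) -> float:
    """Energy E = hν of one quantum, in electron volts (ν = c/λ)."""
    return H_EVS * C / wavelength_m

def photon_energy_j(wavelength_m: float) -> float:
    """The same energy in joules."""
    return H_JS * C / wavelength_m

print(f"{photon_energy_ev(550e-9):.2f} eV = {photon_energy_j(550e-9):.2e} J")
```

A visible-light quantum carries a couple of electron volts, which is why photon energies in atomic physics are conveniently quoted in eV rather than joules.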

    Thus M. Planck pointed the way out of the difficulties facing the theory of thermal radiation, after which the modern physical theory known as quantum physics began to develop.

    Physics is the main natural science, because it reveals the truths about the relationship of several basic variables that are true for the entire universe. Its versatility is inversely proportional to the number of variables it introduces into its formulas.

    The progress of physics (and of science in general) is associated with a gradual rejection of direct visualizability. It might seem that this conclusion contradicts the fact that modern science, and physics first of all, rests on experiment, that is, on empirical experience conducted under controlled conditions and reproducible at any time, any number of times. But the point is that some aspects of reality are invisible to superficial observation, and visual clarity can be misleading.

    Quantum mechanics is a physical theory that establishes a way of describing and the laws of motion at the micro level.

    Classical mechanics is characterized by the description of particles by specifying their position and velocities, and the dependence of these quantities on time. In quantum mechanics, the same particles under the same conditions can behave differently.

    Statistical laws can be applied only to large populations, not to individuals. Quantum mechanics abandons the search for individual laws of elementary particles and establishes statistical laws instead. On the basis of quantum mechanics one cannot describe the exact position and speed of an elementary particle or predict its future path. Probability waves tell us only the likelihood of encountering an electron at a particular location.

    The importance of experiment has grown in quantum mechanics to such an extent that, as Heisenberg writes, "observation plays a decisive role in an atomic event and that reality differs depending on whether we observe it or not."

    The fundamental difference between quantum mechanics and classical mechanics is that its predictions are always probabilistic. This means we cannot predict exactly where, for example, an electron will land in the experiment considered above, however perfect our means of observation and measurement; one can only estimate its chances of landing in a certain place, and must therefore apply the concepts and methods of probability theory, which serves for the analysis of uncertain situations.
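    This probabilistic character can be sketched with a toy example of the Born rule: a state assigns amplitudes to outcomes, and measurement yields each outcome with probability equal to the squared magnitude of its amplitude. The amplitudes below are invented for illustration:

```python
import numpy as np

# Toy Born-rule sketch: outcome probabilities are squared amplitude magnitudes.
# The two-outcome state and its amplitudes are made up for illustration.
amps = np.array([0.6, 0.8j])       # normalized: |0.6|² + |0.8j|² = 1
probs = np.abs(amps) ** 2          # -> [0.36, 0.64]

rng = np.random.default_rng(1)
outcomes = rng.choice(len(amps), size=100_000, p=probs)

# No single outcome is predictable, but the frequencies converge on probs.
freq = np.bincount(outcomes) / outcomes.size
print(freq)
```

Any single measurement is irreducibly uncertain; only the long-run frequencies are fixed by the theory.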

    In quantum mechanics, any state of a system is described by a so-called density matrix; but, unlike classical mechanics, this description determines the parameters of a future state not with certainty, only with a certain probability. The most important philosophical conclusion from quantum mechanics is the fundamental indeterminacy of measurement results and, consequently, the impossibility of predicting the future exactly.

    This, combined with the Heisenberg uncertainty principle and other theoretical and experimental data, led some scientists to suppose that microparticles have no intrinsic properties at all and that these appear only at the moment of measurement. Others suggested that the experimenter's consciousness plays a key role in the existence of the entire Universe, since, according to quantum theory, it is observation that creates, or partially creates, the observed.

    Determinism is the doctrine of the prior determination of all processes occurring in the world, including all processes of human life, whether by God (theological determinism, or the doctrine of predestination), by natural phenomena alone (cosmological determinism), or specifically by the human will (anthropological-ethical determinism), for whose freedom, as for responsibility, there would then be no room.

    Determinedness here means the philosophical assertion that every event that occurs, including human actions and behavior, is uniquely determined by the totality of causes immediately preceding it.

    Seen in this light, determinism can also be defined as the thesis that there is only one, precisely defined, possible future.

    Indeterminism is a philosophical doctrine and methodological position that deny either the objectivity of causality or the cognitive value of causal explanation in science.

    In the history of philosophy, from ancient Greek philosophy (Socrates) to the present, indeterminism and determinism have acted as opposing positions on the problems of the conditioning of the human will, of choice, and of human responsibility for one's actions.

    Indeterminism treats will as an autonomous force, arguing that the principles of causality do not apply to explaining human choice and behavior.

    The term "determination" was introduced by the ancient Greek philosopher Democritus in his atomistic doctrine, which denied chance, taking it simply for unrecognized necessity. The term derives from the Latin determinare, "to define": the obligatory determinateness of all things and phenomena in the world by other things and phenomena. At first, to determine meant to define an object by identifying and fixing the features that distinguish it from others. Causality was equated with necessity, while chance was excluded from consideration as simply nonexistent. This understanding of determination presupposed a cognizing subject.

    With the emergence of Christianity, determinism found expression in two new concepts, divine predestination and divine grace, and the old principle of free will collided with this new, Christian determinism. For the general ecclesiastical consciousness of Christianity it was from the outset equally important to keep both assertions intact: that everything without exception depends on God, and that nothing depends on man. In the 5th century, in the West, Pelagius raised the question of Christian determinism in the aspect of free will. Blessed Augustine spoke out against Pelagian individualism. In his polemical writings, in the name of the demands of Christian universality, he often carried determinism to erroneous extremes incompatible with moral freedom. Augustine developed the idea that man's salvation depends entirely and exclusively on the grace of God, which is communicated and acts not according to man's own merits but gratuitously, according to the free election and predestination of the Divine.

    Determinism receives further development and substantiation in natural science and materialist philosophy of modern times (F. Bacon, Galileo, Descartes, Newton, Lomonosov, Laplace, Spinoza, French materialists of the 18th century). In accordance with the level of development of natural science, the determinism of this period is mechanistic, abstract.

    Relying on the works of his predecessors and on the fundamental ideas of the natural science of I. Newton and C. Linnaeus, Laplace, in his "A Philosophical Essay on Probabilities" (1814), carried the ideas of mechanistic determinism to their logical conclusion: he proceeds from the postulate that from knowledge of the initial causes one can always unambiguously deduce the consequences.

    The methodological principle of determinism is at the same time a fundamental principle of the philosophical doctrine of being. One of the fundamental ontological ideas laid at the foundation of classical natural science by its creators (G. Galilei, I. Newton, J. Kepler, and others) is the concept of determinism. This concept comprised three basic assertions:

    1) nature functions and develops in accordance with its immanently inherent internal, "natural" laws;

    2) the laws of nature are the expression of the necessary (unambiguous) connections between phenomena and processes of the objective world;

    3) the goal of science, corresponding to its purpose and capabilities, is the discovery, formulation and substantiation of the laws of nature.

    Among the diverse forms of determination, which reflect the universal interconnection and interaction of phenomena in the surrounding world, the cause-and-effect, or causal (from Latin causa, cause), connection stands out; knowledge of it is indispensable for correct orientation in practical and scientific activity. The cause is therefore the most important element of the system of determining factors. And yet the principle of determinism is broader than the principle of causality: besides causal connections it includes other types of determination (functional relations, the connection of states, target determination, etc.).

    Determinism in its historical development has passed two main stages - classical (mechanistic) and post-classical (dialectical) in essence.

    The doctrine of Epicurus about the spontaneous deviation of atoms from a straight line anticipated the modern understanding of determinism; but since in Epicurus this deviation itself is not determined by anything (it is causeless), it can be said without much error that indeterminism originates with Epicurus.

    Indeterminism is the doctrine that there are states and events for which a reason does not exist or cannot be indicated.

    In the history of philosophy, two types of indeterminism are known:

    · The so-called "objective" indeterminism, which denies causality as such completely: not only its objective reality, but even the possibility of a subjectivist interpretation of it.

    · Idealistic indeterminism, which, denying the objective nature of relations of determination, declares causality, necessity, regularity as products of subjectivity, and not as attributes of the world itself.

    This means (in Hume, Kant, and many other philosophers) that cause and effect, like the other categories of determination, are merely a priori forms of our thinking, not derived from practice. Many subjective idealists declare the use of these categories a "psychological habit": a person observes one phenomenon following another and declares the first the cause and the second the effect.

    The stimulus for the revival of indeterministic views at the beginning of the 20th century was the increased role of statistical laws in physics, whose existence was declared to refute causality. However, the dialectical-materialist interpretation of the relationship between chance and necessity and between the categories of causality and law, together with the development of quantum mechanics, which revealed new types of objective causal connection among phenomena in the microworld, showed the failure of attempts to use the presence of probabilistic processes at the foundation of the microworld to deny determinism.

    Historically, the concept of determinism is associated with the name of P. Laplace, although already among his predecessors, for example Democritus and Spinoza, there was a tendency to identify the "law of nature" and "causality" with "necessity" and to regard "chance" as a subjective result of ignorance of the "true" causes.

    Classical physics (in particular, Newtonian mechanics) developed a specific idea of a scientific law. It was taken as obvious that for any scientific law the following requirement must hold: if the initial state of a physical system (for example, its coordinates and momentum in Newtonian mechanics) and the interaction that sets its dynamics are known, then in accordance with the law its state can and must be calculated for any moment of time, in the future or in the past.
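    This classical requirement can be sketched for the simplest case, a one-dimensional harmonic oscillator, whose exact solution lets the state be computed for any time, future or past; the units and parameters below are assumed:

```python
import math

# Classical determinism in miniature: given the initial state (x0, p0) and
# the dynamics, the state at ANY time t, positive or negative, follows
# uniquely. A 1-D harmonic oscillator (assumed unit mass and stiffness).
def state_at(t: float, x0: float, p0: float, m: float = 1.0, k: float = 1.0):
    """Exact state of a harmonic oscillator at time t (t may be negative)."""
    w = math.sqrt(k / m)
    x = x0 * math.cos(w * t) + (p0 / (m * w)) * math.sin(w * t)
    p = p0 * math.cos(w * t) - m * w * x0 * math.sin(w * t)
    return x, p

x0, p0 = 1.0, 0.0
x_future, p_future = state_at(+2.5, x0, p0)   # prediction
x_past, p_past = state_at(-2.5, x0, p0)       # retrodiction
print(x_future, p_future, x_past, p_past)
```

The same formula answers questions about the future and the past alike, which is exactly the symmetry that quantum mechanics, with its probabilistic predictions, gives up.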

    The causal connection of phenomena consists in the fact that one phenomenon (the cause), under certain conditions, necessarily gives rise to another (the effect). Accordingly, working definitions of cause and effect can be given: a cause is a phenomenon whose action brings to life and determines the subsequent development of another phenomenon; an effect is then the result of the action of a certain cause.

    In the determination of phenomena, in the system of their determinateness, conditions also enter along with the cause: those factors without which the cause cannot give rise to the effect. This means that the cause does not operate under all conditions, but only under certain ones.

    The system of determination of phenomena (especially social ones) often also includes an occasion: one or another factor that determines only the moment, the time of onset, of the effect.

    There are three types of temporal directionality of causation:

    1) determination by the past. Such determination is essentially universal, for it reflects the objective law that the cause, in the final analysis, always precedes the effect. This regularity was subtly noted by Leibniz, who gave the following definition of cause: "A cause is that which makes some thing begin to exist";

    2) determination by the present. Studying nature, society and our own thinking, we invariably find that many things, while determined by the past, also stand in a determinative interaction with things coexisting with them. It is no accident that the idea of a simultaneous determinative connection is met in different fields of knowledge: in physics and chemistry (in the analysis of equilibrium processes), in biology (in the consideration of homeostasis), and so on.

    Determination by the present is also directly related to those paired categories of dialectics between which there is a causal relationship. As is well known, the form of any phenomenon is under the decisive influence of its content, but this by no means implies that the content precedes the form in general, or that at its starting point it can be formless;

    3) determination by the future. As a number of studies emphasize, this determination occupies a more limited place among determining factors than the types discussed above, yet it plays a noticeable role. One must also bear in mind the relativity of the term "determination by the future": future events do not yet exist, and one can speak of their reality only in the sense that they are necessarily present as tendencies in the present (and were present in the past). And yet the role of this type of determination is very significant. Let us turn to examples related to topics already discussed.

    Determination by the future underlies the anticipatory reflection of reality by living organisms, discovered by Academician P. K. Anokhin. The meaning of such anticipation, as was emphasized in the chapter on consciousness, lies in the ability of a living being to react not only to objects that directly affect it now, but also to changes that seem indifferent to it at the moment yet are in reality signals of probable future influences. The cause here seems to act from the future.

    There are no causeless phenomena. But this does not mean that all connections between phenomena in the surrounding world are causal.

    Philosophical determinism, as the doctrine of the material, law-governed conditioning of phenomena, does not exclude the existence of non-causal types of conditioning. Non-causal relationships between phenomena can be defined as relationships in which there is interconnection and interdependence, but no direct relation of genetic production and no temporal asymmetry.

    The most typical example of non-causal conditioning or determination is a functional relationship between individual properties or characteristics of an object.

    The connections between causes and effects can be not only necessary and rigidly conditioned, but also random, probabilistic. The cognition of probabilistic cause-and-effect relationships required including new dialectical categories in causal analysis: chance and necessity, possibility and actuality, regularity, and others.

    Chance is the concept polar to necessity. A chance connection between cause and effect is one in which the causal grounds admit the realization of any one of many possible alternative effects. Which variant of the connection is realized depends on a combination of circumstances, on conditions that do not lend themselves to exact accounting and analysis. Thus a chance event occurs as the result of the influence of some among an indefinitely large number of diverse and not precisely known causes. The occurrence of a chance effect is possible in principle but not predetermined: it may or may not happen.

    In the history of philosophy, the view is widely represented that there is really no chance: chance is merely a consequence of necessary causes unknown to the observer. But, as Hegel first showed, a chance event cannot, in principle, be produced solely by the internal laws necessarily inherent in a given process. A chance event, as Hegel wrote, cannot be explained from itself.

    The unpredictability of chance seems to run counter to the principle of causality. But this is not so, because chance events and connections are effects of conditions and causes that, although not known in advance and in detail, do really exist and are sufficiently definite. They arise not chaotically and not out of "nothing": the possibility of their appearance is connected with causal grounds, though not rigidly or unambiguously. These connections and laws are discovered by studying a large number (a flow) of homogeneous chance events, described with the apparatus of mathematical statistics, and are therefore called statistical. Statistical regularities are objective in nature, but differ significantly from the regularities of single phenomena. The use of quantitative methods of analysis and calculation for random phenomena and processes obeying statistical laws made them the subject of a special branch of mathematics: probability theory.
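    How a statistical regularity emerges from a flow of individually unpredictable events is easy to demonstrate. The following sketch is my own illustration: each die roll is a chance event, yet over a large number of rolls the relative frequency of any face settles near the definite value 1/6.

```python
import random

random.seed(0)  # fixed seed only so the sketch is reproducible

n = 100_000
rolls = [random.randint(1, 6) for _ in range(n)]

# No single roll is predictable, but the flow of homogeneous chance
# events obeys a statistical law: relative frequency -> probability 1/6.
freq_six = rolls.count(6) / n
print(f"frequency of a six over {n} rolls: {freq_six:.4f} (1/6 = {1/6:.4f})")
```

The frequency fluctuates from run to run, but its deviation from 1/6 shrinks as n grows; this stabilization of frequencies is the objective statistical regularity described above, and probability theory is its mathematics. Note also that the frequency, like any probability, always lies between 0 (impossible event) and 1 (certain event).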

    Probability is the measure of the possibility of a chance event occurring. The probability of an impossible event is zero; the probability of a necessary (certain) event is one.

    The probabilistic-statistical interpretation of complex cause-and-effect relationships has made it possible to develop and apply fundamentally new and very effective methods for understanding the structure and laws of the development of the world. The modern successes of quantum mechanics, chemistry and genetics would be impossible without recognizing the ambiguity of the relations between the causes and effects of the phenomena under study, and without admitting that the subsequent states of a developing object cannot always be completely deduced from the preceding one.

    THE COMPLEMENTARITY PRINCIPLE

    The principle that Bohr called complementarity is one of the deepest philosophical and natural-scientific ideas of our time, comparable only with such ideas as the principle of relativity or the concept of a physical field. Its generality does not allow it to be reduced to any single statement: it must be mastered gradually, through specific examples. The easiest way (as Bohr himself did) is to begin with an analysis of the process of measuring the momentum p and the coordinate x of an atomic object.

    Niels Bohr noticed a very simple thing: the coordinate and momentum of an atomic particle cannot be measured not only simultaneously, but in general with one and the same device. Indeed, to measure the momentum p of an atomic particle without changing it too much, an extremely light, mobile "device" is needed. But precisely because of its mobility, its position is highly uncertain. To measure the coordinate x we must therefore take a different, very massive "device", one that will not budge when the particle hits it. But however its momentum changes in that case, we will not even notice it.

    When we speak into a microphone, the sound waves of our voice are converted there into vibrations of a membrane. The lighter and more mobile the membrane, the more accurately it follows the vibrations of the air, but the harder it is to determine its position at any given moment. This simplest experimental setup is an illustration of the Heisenberg uncertainty relation: it is impossible in one and the same experiment to determine both characteristics of an atomic object, the coordinate x and the momentum p. Two measurements and two fundamentally different devices are required, whose properties are complementary to each other.

    Complementarity is the word, and the turn of thought, that became available to everyone thanks to Bohr. Before him, everyone was convinced that the incompatibility of two types of devices inevitably entails the inconsistency of their properties. Bohr rejected such straightforwardness of judgment and explained: yes, their properties are indeed incompatible, but both are equally necessary for a complete description of an atomic object, and therefore they do not contradict but complement each other.

    This simple reasoning about the complementarity of the properties of two incompatible devices explains the meaning of the complementarity principle well, but by no means exhausts it. Indeed, we need instruments not in themselves, but only to measure the properties of atomic objects. The coordinate x and the momentum p are the concepts that correspond to the two properties measured with the two instruments. In the familiar chain of cognition

    phenomenon → image → concept → formula

    the principle of complementarity affects primarily the system of concepts of quantum mechanics and the logic of its inferences.

    The point is that among the strict provisions of formal logic there is the "law of the excluded middle", which says: of two opposite statements one is true, the other false, and there can be no third. In classical physics there was no occasion to doubt this law, since there the concepts of "wave" and "particle" are indeed opposite and essentially incompatible. It turned out, however, that in atomic physics both are equally well applicable to describing the properties of the same objects, and for a complete description both must be used at the same time.

    People brought up in the traditions of classical physics perceived these requirements as a kind of violence against common sense, and even spoke of a violation of the laws of logic in atomic physics. Bohr explained that the point is not the laws of logic at all, but the carelessness with which classical concepts are sometimes applied to atomic phenomena without any reservations. Such reservations are necessary, and the Heisenberg uncertainty relation Δx · Δp ≥ ħ/2 is the exact record of this requirement in the strict language of formulas.
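    A minimal numerical sketch (my own illustration, in units where ħ = 1) shows what the uncertainty relation forbids. For a Gaussian wave packet of width σ, the position spread is Δx = σ and the momentum spread of its Fourier transform is Δp = ħ/(2σ), a standard textbook result: squeezing one "device reading" inevitably inflates the other, and the product never drops below ħ/2.

```python
HBAR = 1.0  # natural units, hbar = 1

def spreads(sigma):
    """Position and momentum spreads of a Gaussian wave packet of width sigma."""
    dx = sigma                # spread of |psi(x)|^2
    dp = HBAR / (2 * sigma)   # spread of its Fourier transform (textbook result)
    return dx, dp

for sigma in (0.01, 1.0, 100.0):
    dx, dp = spreads(sigma)
    # Sharpening the coordinate (small dx) forces a large dp and vice versa;
    # the Gaussian packet saturates the bound dx * dp >= HBAR / 2.
    assert dx * dp >= HBAR / 2 - 1e-12
    print(f"sigma = {sigma:>6}: dx = {dx}, dp = {dp}, dx*dp = {dx * dp}")
```

The two "devices" of Bohr's argument correspond to the two extremes of σ: a light, mobile one (good Δp, hopeless Δx) and a massive one (good Δx, hopeless Δp).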

    The reason for the incompatibility of complementary concepts in our consciousness is deep, but explainable. The fact is that we cannot cognize an atomic object directly, with our five senses. Instead we use precise and sophisticated instruments invented relatively recently. To explain the results of experiments we need words and concepts, and these appeared long before quantum mechanics and are in no way adapted to it. Yet we are forced to use them; we have no other choice: we absorb our language and all its basic concepts with our mother's milk, and in any case long before we learn of the existence of physics.

    Bohr's complementarity principle is a successful attempt to reconcile the shortcomings of an established system of concepts with the progress of our knowledge of the world. This principle expanded the possibilities of our thinking, explaining that in atomic physics not only concepts change, but also the very formulation of questions about the essence of physical phenomena.

    But the significance of the principle of complementarity goes far beyond the limits of quantum mechanics, where it originally arose. Only later, in attempts to extend it to other areas of science, did its true significance for the entire system of human knowledge become clear. One can argue about the legitimacy of such a step, but one cannot deny its fruitfulness in all cases, even those far from physics.

    Bohr himself liked to give an example from biology connected with the life of the cell, whose role is quite analogous to that of the atom in physics. If the atom is the last representative of a substance that still retains its properties, then the cell is the smallest part of any organism that still represents life in its complexity and uniqueness. To study the life of a cell means to know all the elementary processes occurring in it and, at the same time, to understand how their interaction leads to a very special state of matter: to life.

    When one tries to carry out this program, it turns out that such analysis and synthesis cannot be combined simultaneously. Indeed, to penetrate into the details of the cell's mechanisms, we examine it through a microscope, first an ordinary one, then an electron one; we heat the cell, pass an electric current through it, irradiate it, decompose it into its component parts... But the more closely we study the life of the cell, the more we interfere with its functions and with the course of the natural processes taking place in it. In the end we destroy it, and therefore learn nothing about it as an integral living organism.

    And yet the answer to the question "What is life?" requires analysis and synthesis at the same time. These processes are incompatible, yet they do not contradict but complement each other, in Bohr's sense. And the need to take both into account simultaneously is only one of the reasons why there is still no complete answer to the question of the essence of life.

    As in a living organism, what matters in the atom is the integrity of its "wave - particle" properties. The finite divisibility of matter gave rise not only to the finite divisibility of atomic phenomena; it also set a limit to the divisibility of the concepts with which we describe these phenomena.

    It is often said that a correctly posed question is already half the answer. These are not just nice words.

    A correctly posed question is a question about those properties of a phenomenon that it really possesses. Such a question therefore already contains all the concepts that must be used in the answer. An ideally posed question can be answered briefly: "yes" or "no". Bohr showed that the question "Wave or particle?", applied to an atomic object, is incorrectly posed. The atom has no such separate properties, and therefore the question admits no unambiguous "yes" or "no" answer, just as there is no answer to the question "Which is greater: a meter or a kilogram?", or to any other question of this type.

    Two additional properties of atomic reality cannot be separated without destroying the completeness and unity of the natural phenomenon, which we call an atom. In mythology, such cases are well known: it is impossible to cut a centaur into two parts, while keeping both a horse and a man alive.

    An atomic object is neither a particle nor a wave, nor even both at once. An atomic object is something third, not equal to the simple sum of the properties of a wave and a particle. This atomic "something" is inaccessible to our five senses, and yet it is certainly real. We have no images or sense organs with which to picture the properties of this reality fully. However, the power of our intellect, resting on experience, allows us to cognize it all the same. In the end (we must admit that Born was right), "...the atomic physicist has now gone far from the idyllic notions of the old-fashioned naturalist who hoped to penetrate the secrets of nature by catching butterflies in the meadow."

    When Heisenberg rejected the idealization of classical physics, the concept of "a state of a physical system independent of observation", he thereby anticipated one of the consequences of the principle of complementarity, since "state" and "observation" are complementary concepts in Bohr's sense. Taken separately they are incomplete, and can therefore be defined only jointly, through each other. Strictly speaking, these concepts do not exist separately at all: we never observe "something in general", but always some definite state. And conversely, any "state" is a thing in itself until we find a way to observe it.

    Taken separately, the concepts of wave, particle, state of a system and observation of a system are abstractions that have no relation to the atomic world but are necessary for understanding it. The simple, classical pictures are complementary in the sense that a harmonious fusion of the two extremes is necessary for a complete description of nature, while within the framework of ordinary logic they can coexist without contradiction only if their domains of applicability are mutually limited.

    After much thought about these and similar problems, Bohr concluded that this is not an exception but a general rule: any truly deep natural phenomenon cannot be defined unambiguously by the words of our language, and requires for its definition at least two mutually exclusive, complementary concepts. This means that, so long as we keep our language and ordinary logic, thinking in the mode of complementarity sets limits on the exact formulation of concepts corresponding to truly deep natural phenomena. Such definitions are either unambiguous but incomplete, or complete but ambiguous, since they include complementary concepts that are incompatible within ordinary logic. Among such concepts are "life", "atomic object", "physical system", and even the very concept of "knowledge of nature".

    It has long been known that science is only one way of studying the world around us. Another, complementary way is embodied in art. The very coexistence of art and science is a good illustration of the principle of complementarity. One can go entirely into science or live entirely in art; both approaches to life are equally legitimate, though each taken separately is incomplete. The core of science is logic and experiment; the basis of art is intuition and insight. But the art of ballet requires mathematical precision, and "...inspiration is as necessary in geometry as in poetry." They do not contradict but complement each other: true science is akin to art, just as real art always includes elements of science. In their highest manifestations they are indistinguishable and inseparable, like the "wave - particle" properties in the atom. They reflect different, complementary aspects of human experience, and only taken together do they give us a complete picture of the world. Unfortunately, the "uncertainty relation" for the conjugate pair of concepts "science - art" is unknown, and hence so is the degree of loss we suffer from a one-sided perception of life.

    Of course, this analogy, like any analogy, is both incomplete and not rigorous. It only helps us to feel the unity and contradictions of the entire system of human knowledge.

    The principle of complementarity is a methodological postulate originally formulated by the great Danish physicist and philosopher Niels Bohr in relation to quantum mechanics. Extending to quantum mechanics Gödel's conclusions on the properties of deductive systems, Bohr formulated the principle approximately as follows: to reproduce the integrity of a phenomenon, mutually exclusive, "additional" (complementary) systems of concepts must be employed. This formulation went down in history as the principle of complementarity in quantum mechanics.

    An example of such a solution to the problems of the microworld was the consideration of light within two theories at once, wave and corpuscular, which led to a strikingly effective scientific result revealing to man the physical nature of light.

    Niels Bohr went even further in his understanding of this conclusion. He attempted to interpret the principle of complementarity through the prism of philosophical knowledge, and it is here that the principle acquired universal scientific significance. The formulation now ran: in order to reproduce a phenomenon in a sign (symbolic) system for the purpose of cognizing it, one must resort to additional concepts and categories. In simpler terms, the principle of complementarity holds that cognition may, and in some cases must, employ several methodological systems in order to obtain objective data about the subject of research. In this sense the principle manifested itself as an acknowledgment of the limited, metaphorical nature of any single logical system of methodology. Thus, with the emergence and comprehension of this principle it was in effect recognized that logic alone is not enough for cognition, and that non-logical moves in the research process are therefore admissible. Ultimately, the application of Bohr's principle contributed to a significant change in the methodology of science.

    Later, Yu. M. Lotman broadened the methodological meaning of Bohr's principle and transferred its regularities to the sphere of culture, in particular to the description of cultural phenomena. Lotman formulated the so-called "paradox of the amount of information", the essence of which is that human existence proceeds mainly under conditions of information deficiency, and that as it develops this deficiency will only grow. Using the principle of complementarity, the lack of information can be compensated by translating it into another semiotic (sign) system. This move, in fact, led to the emergence of computer science and cybernetics, and later of the Internet. The functioning of the principle was later corroborated by the physiological adaptation of the human brain to this type of thinking, owing to the asymmetry of the activity of its hemispheres.

    Another provision mediated by the action of Bohr's principle is the uncertainty relation discovered by the German physicist Werner Heisenberg. Its import can be stated as the recognition that two conjugate characteristics of an object cannot be determined simultaneously with the same accuracy. A philosophical analogue of this conclusion was offered by the author of "On Certainty", who argued that in order to assert the certainty of something, one must doubt something else.

    Thus, Bohr's principle acquired tremendous methodological significance in various fields.