Quantum physics and its main theories – quantum mechanics and quantum field theory – were developed in the first half of the 20th century by scientists such as Max Planck, Albert Einstein, Erwin Schrödinger, Louis de Broglie, Paul Dirac, Niels Bohr, Wolfgang Pauli, Werner Heisenberg, Max Born, and Ludwig Boltzmann. Let’s walk through a brief quantum timeline tracing the development of this new understanding of physics.

The shift is one of thinking: from the traditional principles of Newtonian physics, standing on the pillars of the absolutes 0 and 1, to the principles of Planck’s physics, in which those same 0 and 1 are joined by an endless range of values lying between the two absolute numbers.

A quantum timeline is a timeline or roadmap that outlines the projected milestones and developments in the field of quantum computing and related technologies. It is a visual representation of the progress made in this field over time and serves as a guide for researchers, scientists, and industry experts who are working on advancing the field.

Quantum computing is a field that is still in its early stages, but it has the potential to revolutionize computing by enabling the processing of vast amounts of data in a fraction of the time it takes with classical computers. A quantum timeline typically includes significant milestones such as the development of the first quantum computer, the demonstration of quantum supremacy, and the development of practical quantum applications.

The timeline may also include other related technologies, such as quantum cryptography, quantum communication, and quantum sensing. The purpose of a quantum timeline is to provide a clear picture of the progress made in this field and the expected advancements in the future.

As the field continues to evolve, the quantum timeline will continue to be updated to reflect the latest developments and advancements.


Timeline of Quantum Theories by Year

  • 1864. Theory of the Electromagnetic Field
  • 1900. Quantum Theory
  • 1905. Theory of Relativity
  • 1905. The wave-particle Theory of electromagnetic radiation
  • 1913. Atomic Theory
  • 1916. Quantum Theory of light
  • 1920s. Quantum leap theory
  • 1920s. Quantum Field Theory
  • 1926. New Quantum Theory of Atoms
  • 1950s. Many-body Theory
  • 1952. Hidden-variable Theories
  • 1952. The pilot wave Theory
  • 1957. The Many-worlds (or Multiverse) Theory
  • 1962. Quantum Theory of Stimulated Raman Effect
  • 1968. String Theory (Theory of quantum gravity)
  • 1970s. Gauge Theory
  • 1970s. Quantum trajectory Theory
  • 1974. Lattice quantum field Theory
  • 1980. Quantum complexity Theory
  • 1980s. Quantum Loop Theory
  • 1993. Holographic Universe Theory
  • 2013. Quantum Holonomy Theory

1864. Theory of the Electromagnetic Field

The classical theory of the electromagnetic field, proposed by the British physicist James Clerk Maxwell in 1864, is the prototype of gauge theories. In his paper, Maxwell derived an electromagnetic wave equation with a velocity for light in close agreement with experimental measurements and deduced that light is an electromagnetic wave. Albert Einstein later used Maxwell’s equations as the starting point for his special theory of relativity.
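In modern vector notation (a standard rendering, not Maxwell’s original set of equations), the vacuum equations and the wave speed they imply read:

\[ \nabla \cdot \mathbf{E} = 0, \quad \nabla \cdot \mathbf{B} = 0, \quad \nabla \times \mathbf{E} = -\frac{\partial \mathbf{B}}{\partial t}, \quad \nabla \times \mathbf{B} = \mu_0 \varepsilon_0 \frac{\partial \mathbf{E}}{\partial t} \]

Combining them gives a wave equation whose propagation speed is \( c = 1/\sqrt{\mu_0 \varepsilon_0} \approx 3 \times 10^8 \) m/s, matching the measured speed of light.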

1900. Quantum Theory

The German theoretical physicist Max Planck discovered the quantum of action, now known as Planck’s constant, h. This work laid the foundation for quantum theory.

The Development of Planck’s Quantum Theory:

  • In 1900, Planck made the assumption that energy was made of individual units or quanta.
  • In 1905, Albert Einstein theorized that not just the energy but the radiation itself was quantized in the same manner.
  • In 1924, Louis de Broglie proposed that there is no fundamental difference in the makeup and behavior of energy and matter; on the atomic and subatomic level, either may behave as if made of either particles or waves. This theory became known as the principle of wave-particle duality: elementary particles of energy and matter act, depending on the conditions, like particles or waves.
  • In 1927, Werner Heisenberg proposed that precise, simultaneous measurement of two complementary values – such as the position and momentum of a subatomic particle – is impossible. Contrary to the principles of classical physics, their simultaneous measurement is inescapably flawed; the more precisely one value is measured, the more flawed the measurement of the other value becomes. This theory, written out below, became known as the uncertainty principle, which prompted Albert Einstein’s famous comment, “God does not play dice.”
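Three of these milestones correspond to compact, standard relations:

\[ E = h\nu, \qquad \lambda = \frac{h}{p}, \qquad \Delta x \, \Delta p \ge \frac{\hbar}{2} \]

Planck’s quantum of energy for radiation of frequency ν, de Broglie’s wavelength for a particle of momentum p, and Heisenberg’s bound on simultaneous position-momentum precision, where \( \hbar = h/2\pi \).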

1905. Theory of Relativity

Albert Einstein published the first part of his theory – special relativity – in the German physics journal Annalen der Physik in 1905. He completed his theory of general relativity only after another decade of difficult work. He presented the latter theory in a series of lectures in Berlin in late 1915 and published it in the Annalen in 1916. The theory is based on two key concepts. First, the natural world allows no “privileged” frames of reference. As long as an object is moving in a straight line at a constant speed (that is, with no acceleration), the laws of physics are the same for everyone. It’s a bit like when you look out a train window and see an adjacent train appear to move – but is it moving, or are you? It can be hard to tell. Einstein recognized that if the motion is perfectly uniform, it’s literally impossible to tell – and identified this as a central principle of physics. Second, light travels at an unvarying speed of about 186,000 miles (299,792 kilometers) per second. No matter how fast an observer is moving or how fast a light-emitting object is moving, a measurement of the speed of light always yields the same result.
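A striking consequence of these two postulates is time dilation: a clock moving at speed v relative to an observer ticks more slowly by the Lorentz factor (a standard result of special relativity):

\[ \Delta t = \gamma \, \Delta t_0, \qquad \gamma = \frac{1}{\sqrt{1 - v^2/c^2}} \]

At everyday speeds γ is indistinguishable from 1, which is why the effect escaped notice before Einstein.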

1905. The wave-particle Theory of electromagnetic radiation

In 1905, the German-born theoretical physicist Albert Einstein developed a new theory of electromagnetic radiation, often called the wave-particle theory. It explains how electromagnetic radiation can behave as both a wave and a particle. Einstein argued that when an electron returns to a lower energy level and gives off electromagnetic energy, the energy is released as a discrete packet, which we now call a photon. After Einstein presented his theory, scientists found evidence to support it. For example, double-slit experiments with faint light showed that light arrives as tiny particles whose accumulated detections nevertheless form interference patterns, just as waves do.

1913. Atomic Theory

The Danish physicist Niels Bohr was one of the foremost scientists of modern physics. He proposed a planetary model of the structure of the atom, which later became the basis of quantum mechanics. The Bohr model shows the atom as a small, positively charged nucleus surrounded by orbiting electrons. Combining Rutherford’s description of the nucleus and Planck’s theory of quanta, Bohr explained what happens inside an atom and developed a picture of atomic structure. In his theory of the atom, electrons absorb and emit radiation of fixed wavelengths when jumping between fixed orbits around a nucleus. The theory provided a good description of the spectrum of the hydrogen atom but had to be extended to suit more complicated atoms and molecules.
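For hydrogen, the Bohr model gives the allowed electron energies and the frequency of light emitted in a jump between orbits in closed form (the standard textbook expressions):

\[ E_n = -\frac{13.6 \text{ eV}}{n^2}, \qquad h\nu = E_{n_i} - E_{n_f}, \qquad n = 1, 2, 3, \ldots \]

These formulas reproduce the observed hydrogen spectral lines, which is precisely where the model succeeded and where, for heavier atoms, it broke down.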

1916. Quantum Theory of light

A photon, also called a light quantum, is a minute energy packet of electromagnetic radiation. The concept originated in the German-born theoretical physicist Albert Einstein’s explanation of the photoelectric effect in 1905, in which he proposed the existence of discrete energy packets during the transmission of light. Einstein supported his photon hypothesis with an analysis of the photoelectric effect, a process discovered by Heinrich Hertz in 1887, in which electrons are ejected from a metallic surface illuminated by light. Einstein’s theories of relativity, which established new principles of time, matter, and space, were developed in the same period; together, this knowledge formed the basis of the quantum theory of light, which continues to reach new heights at the modern stage of science and is by no means complete. Einstein’s prediction of the dependence of the kinetic energy of the ejected electrons on the light frequency, based on his photon model, was experimentally verified by the American physicist Robert Millikan in 1916. In 1922, the American Nobelist Arthur Compton treated the scattering of X-rays from electrons as a set of collisions between photons and electrons. His formula matched his experimental findings, and the Compton effect, as it became known, was considered further convincing evidence for the existence of particles of electromagnetic radiation.
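Both results have compact standard forms: the photoelectric equation Einstein predicted and Millikan verified, and the wavelength shift Compton measured:

\[ K_{\max} = h\nu - W, \qquad \Delta\lambda = \frac{h}{m_e c}\,(1 - \cos\theta) \]

Here W is the work function of the metal, K_max the maximum kinetic energy of an ejected electron, m_e the electron mass, and θ the angle through which the photon scatters.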

The 1920s. Quantum leap theory

As we already know, before the turn of the century, the way things worked was explained by Newtonian mechanics, or classical physics. The central feature of Newtonian mechanics was that everything was continuous: things flowed smoothly through space, energy could come in an infinite range of amounts, light undulated in a constant wave, and there was no minimum amount of anything. Quantum mechanics is essentially the mechanics of quantized things, and anything that is quantized comes only in multiples of some small measurable unit. Energy, light, force, and motion all came to be quantized. For something that is quantized, you can’t have just any old amount; you can only have multiples of specific minimum quantities. Nature revealed herself to be somewhat grainy or jerky, jumping from one quantum amount to another and never traversing the area in between. This leads to an uneasy uncertainty about what is going on between those quantum states, or quantum leaps. The abruptness of quantum leaps was a central pillar of the way quantum theory was formulated by Niels Bohr, Werner Heisenberg, and their colleagues in the mid-1920s, in a picture now commonly called the Copenhagen interpretation. Quantum leaps are not purely an internal matter of physics; they also bear on its relation to philosophy and human knowledge in general. Bohr and Heisenberg began to develop a mathematical theory of these quantum phenomena in the 1920s. In 1986, three teams of researchers reported quantum leaps happening in individual atoms suspended in space by electromagnetic fields. In 2007, a team in France reported jumps that correspond to what they called “the birth, life, and death of individual photons.” So everything in the quantum mechanical universe, which is the universe we live in, happens in quantum leaps.
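Planck’s original hypothesis captures this graininess in a single line: an oscillator of frequency ν can hold only whole-number multiples of one minimum quantum of energy:

\[ E_n = n h \nu, \qquad n = 0, 1, 2, \ldots \]

Nothing in between is allowed, which is exactly the “jerkiness” described above.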

The 1920s. Quantum Field Theory

Its development began in the 1920s with the description of interactions between light and electrons, culminating in the first quantum field theory – quantum electrodynamics. The first reasonably complete theory of quantum electrodynamics, which included both the electromagnetic field and electrically charged matter as quantum mechanical objects, was created by the British physicist Paul Dirac in 1927. Thanks to the somewhat brute-force, ad hoc, and heuristic early methods of Feynman, and the abstract methods of Tomonaga and Schwinger, elegantly synthesized by Freeman Dyson in the period of early renormalization, the modern theory of quantum electrodynamics (QED) has established itself. It is still the most accurate physical theory known, the prototype of a successful quantum field theory. Quantum electrodynamics is the most famous example of what is known as an Abelian gauge theory.
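The strength of the electromagnetic interaction in QED is set by the dimensionless fine-structure constant:

\[ \alpha = \frac{e^2}{4\pi \varepsilon_0 \hbar c} \approx \frac{1}{137} \]

QED’s reputation for accuracy rests on predictions such as the electron’s anomalous magnetic moment, where theory and experiment agree to better than one part in a billion.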

1926. New Quantum Theory of Atoms

Bohr’s orbital model of the atom, later improved by Arnold Sommerfeld, was a product of the old quantum theory. The birth of the new quantum theory took place in 1925-1926 and is associated with Werner Heisenberg’s matrix mechanics and the work of Erwin Schrödinger and Paul Dirac. The quantum-mechanical approach acknowledges the wavelike character of electrons and provides the framework for viewing them as fuzzy clouds of negative charge. Two of the rules of quantum theory most important to explaining the atom are the idea of wave-particle duality and the exclusion principle. The French physicist Louis de Broglie first suggested in 1924 that particles could be described as waves. In the same decade, the Austrian physicist Erwin Schrödinger and the German physicist Werner Heisenberg expanded de Broglie’s ideas into formal, mathematical descriptions of quantum mechanics. The Austrian-born American physicist Wolfgang Pauli developed the exclusion principle in 1925. The combination of wave-particle duality and the Pauli exclusion principle sets up the rules for filling electron orbitals in atoms. These rules explain why atoms with similar numbers of electrons can have very different properties and why chemical properties repeatedly reappear in a regular pattern among the elements.
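At the heart of the new quantum theory is the Schrödinger equation, which governs how a particle’s wavefunction ψ evolves in time:

\[ i\hbar \frac{\partial \psi}{\partial t} = \hat{H} \psi \]

where Ĥ is the Hamiltonian (total energy) operator. Solving it for the hydrogen atom reproduces Bohr’s energy levels while replacing his sharp orbits with the fuzzy electron clouds described above.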

1950s. Many-body Theory

The many-body problem is a general name for a vast category of physical problems pertaining to the properties of microscopic systems made of many interacting particles. Microscopic here implies that quantum mechanics has to be used to provide an accurate description of the system. In a quantum system, the repeated interactions between particles create quantum correlations, or entanglement. As a consequence, the wave function of the system is a complicated object holding a large amount of information, which usually makes exact or analytical calculations impractical or even impossible. Thus, many-body theoretical physics most often relies on a set of approximations specific to the problem at hand and ranks among the most computationally intensive fields of science. The goal of quantum many-body theory, or physics, is to understand the emergent properties – probed by thermodynamic, spectroscopic, and linear response functions – of a system of many interacting particles. As early as the 1950s, the methods of quantum field theory were applied to quantum fluids of fermions and bosons. Those efforts culminated in 1957 in the Bardeen-Cooper-Schrieffer theory of superconductivity. In 1963, Alexei Abrikosov, Lev Gor’kov, and Igor Dzyaloshinskii wrote their classic book Methods of Quantum Field Theory in Statistical Physics (Prentice-Hall) on the use of Feynman diagrams to attack many-body problems at finite temperature. Terse but full of insights, the book had an enormous impact and is still used by practitioners.
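The scale of the problem is easy to state: the dimension of the state space grows exponentially with particle number. For N spin-1/2 particles,

\[ \dim \mathcal{H} = 2^N \]

so a system of just 50 spins already requires on the order of 10^15 complex amplitudes to describe exactly, which is why problem-specific approximations are unavoidable.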

1952. Hidden-variable Theories

The uncertainty principle says that you can’t know certain properties of a quantum system simultaneously. For example, you can’t simultaneously know the position of a particle and its momentum. But what does that imply about reality? If we could peer behind the curtains of quantum theory, would we find that objects do have well-defined positions and momentums? Or does the uncertainty principle mean that, at a fundamental level, objects just can’t have a clear position and momentum at the same time? In other words, is the blurriness in our theory, or is it in reality itself? In physics, hidden-variable theories are proposals to provide explanations of quantum mechanical phenomena through the introduction of unobservable hypothetical entities. Historically, the first and most famous of them is the de Broglie–Bohm theory. The emergence of this theory stimulated the appearance of some modifications of von Neumann’s theorem. Most hidden-variable theories are attempts at a deterministic description of quantum mechanics that avoids quantum indeterminacy, but at the expense of requiring the existence of nonlocal interactions. The best-known hidden-variable theory, the “causal” interpretation of the physicist and philosopher David Bohm, first published in 1952, is a nonlocal hidden-variable theory. Bohm unknowingly rediscovered (and extended) the idea proposed (and abandoned) by Louis de Broglie in 1927, so the theory is commonly referred to as the “de Broglie–Bohm theory.”

1952. The pilot wave Theory

The pilot wave theory, also known as the de Broglie–Bohm theory, Bohmian mechanics, Bohm’s interpretation, and the causal interpretation, is an interpretation of quantum mechanics. In addition to the wavefunction, it postulates that an actual configuration of particles exists even when unobserved. A guiding equation defines the evolution over time of the configuration of all particles. The Born rule in de Broglie–Bohm theory is not a fundamental law. Instead, in this theory, the link between the probability density and the wave function has the status of a hypothesis, called the “quantum equilibrium hypothesis,” which is additional to the wave function’s basic principles. The theory was historically developed in the 1920s by de Broglie, who, in 1927, was persuaded to abandon it in favor of the then-mainstream Copenhagen interpretation. David Bohm, dissatisfied with the prevailing orthodoxy, rediscovered de Broglie’s pilot-wave theory in 1952. Since the 1990s, there has been renewed interest in formulating extensions to de Broglie–Bohm theory, attempting to reconcile it with special relativity and quantum field theory, besides other features such as spin or curved spatial geometries.
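For a single particle of mass m, the guiding equation moves the particle’s actual position Q along the flow defined by the wavefunction (in its standard single-particle form):

\[ \frac{dQ}{dt} = \frac{\hbar}{m} \, \operatorname{Im}\!\left( \frac{\nabla \psi}{\psi} \right) \Bigg|_{x=Q} \]

The quantum equilibrium hypothesis then asserts that particle positions are distributed according to ρ = |ψ|², which is what recovers the Born rule’s predictions.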

1957. The Many-worlds (or Multiverse) Theory

Hugh Everett was an American physicist who first proposed the many-worlds interpretation of quantum physics in 1957. According to his work, we live in a multiverse of countless universes, full of copies of each of us. The same idea had occurred to Erwin Schrödinger half a decade earlier. Everett’s version is more mathematical, Schrödinger’s more philosophical. Still, the essential point is that both of them were motivated by a wish to get rid of the idea of the “collapse of the wave function,” and both of them succeeded. Bryce DeWitt popularized the formulation and named it many worlds in the 1960s and 1970s.

1962. Quantum Theory of Stimulated Raman Effect

Raman scattering, or the Raman effect, is the inelastic scattering of a photon by molecules that are excited to higher vibrational or rotational energy levels. It was discovered in 1928 by the Indian physicist C. V. Raman in liquids and independently by Grigory Landsberg and Leonid Mandelstam in crystals. The effect had been predicted theoretically by the Austrian theoretical physicist Adolf Smekal in 1923, followed by theoretical works by Kramers, Heisenberg, Dirac, Schrödinger, and others. Raman scattering can occur in a gas with a change in energy of a molecule due to a transition to another (usually higher) energy level. Chemists are primarily concerned with this “transitional” Raman effect. Raman, whose work was influential in the growth of science in India, received the Nobel Prize for Physics in 1930 for the discovery that when light traverses a transparent material, some of the deflected light changes in wavelength.
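The energy bookkeeping is simple: in Stokes scattering the photon leaves one vibrational quantum behind in the molecule, while in anti-Stokes scattering it carries one away:

\[ \hbar\omega_{\text{scattered}} = \hbar\omega_{\text{incident}} \mp \hbar\omega_{\text{vib}} \]

with the minus sign for Stokes lines and the plus sign for anti-Stokes lines. The shift measures the molecule’s vibrational frequency, which is what makes Raman spectroscopy so useful to chemists.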

1968. String Theory (Theory of quantum gravity)

The Italian theoretical physicist Gabriele Veneziano first formulated the foundations of string theory in 1968, when he discovered that a string picture could describe the interaction of strongly interacting particles. String theory attempts to unify all four forces, and in so doing, unify general relativity and quantum mechanics. At its core is a relatively simple idea: all particles are made of tiny vibrating strands of energy. String theory describes how strings propagate through space and interact with each other. On distance scales larger than the string scale, a string will look just like an ordinary particle, with its mass, charge, and other properties determined by the vibrational state of the string. In this way, all of the different elementary particles may be viewed as vibrating strings. In string theory, one of the vibrational states of the string gives rise to the graviton, a quantum mechanical particle that carries gravitational force. Thus string theory is a theory of quantum gravity.

The 1970s. Gauge Theory

The 1960s and 1970s saw the formulation of a gauge theory now known as the Standard Model of particle physics, which systematically describes the elementary particles and their interactions. A gauge theory is a class of quantum field theory: a mathematical framework involving both quantum mechanics and Einstein’s special theory of relativity, commonly used to describe subatomic particles and their associated wave fields. In short, the structure of the group of gauge transformations in a particular gauge theory entails general restrictions on the way in which the field described by that theory can interact with other fields and elementary particles. For various theoretical reasons, the concept of gauge invariance seems fundamental, and many physicists believe that the final unification of the fundamental interactions (i.e., gravitational, electromagnetic, strong, and weak) will be achieved by a gauge theory.
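Electromagnetism, the Abelian gauge theory mentioned under QED, is the simplest illustration: the physics is unchanged when the potential and the charged field are transformed together. In natural units and one common sign convention,

\[ A_\mu \rightarrow A_\mu + \partial_\mu \chi, \qquad \psi \rightarrow e^{-iq\chi}\,\psi \]

for an arbitrary function χ(x). Demanding this invariance is what forces the photon field to couple to electric charge in the specific way it does.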

The 1970s. Quantum trajectory Theory

In its early days, quantum mechanics was a probabilistic theory, telling us only what we will observe on average if we collect records for many events or particles. To Erwin Schrödinger, whose eponymous equation prescribes how quantum objects behave, it wasn’t meaningful to think about specific atoms or electrons doing things in real time. In other words, quantum mechanics seemed to work only for “ensembles” of many particles. But there is another way to formulate quantum mechanics that speaks about single events happening in individual quantum systems. It is called quantum trajectory theory, and it is perfectly compatible with the standard formalism of quantum mechanics – it is really just a more detailed view of quantum behavior. The standard description is recovered over long timescales after the average of many events is computed. Quantum trajectory theory, mainly developed in the quantum optics community to describe open quantum systems subjected to continuous monitoring, has applications in many areas of quantum physics. The formulation of quantum mechanics through an integral over trajectories was developed in 1948 by Richard Feynman, serving as a basis for developing and completing this theory’s formulation in the 1970s.
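Feynman’s trajectory integral expresses the amplitude for a particle to travel from point a to point b as a sum over every path connecting them, each weighted by the phase of its classical action S:

\[ K(b, a) = \int \mathcal{D}[x(t)] \; e^{\,i S[x(t)]/\hbar} \]

Paths far from the classical one largely cancel through interference, which is how ordinary classical trajectories emerge from quantum behavior.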

1974. Lattice quantum field Theory

In physics, lattice field theory is the study of lattice models of quantum field theory, that is, of field theory on a space-time that has been discretized onto a lattice. An important step in this direction was made by Kenneth Wilson in 1974. He introduced a formulation of quantum chromodynamics on a space-time lattice, which allows the application of various non-perturbative techniques. It should also be pointed out that the introduction of a space-time lattice can be taken as the starting point for a mathematically clean approach to quantum field theory, so-called constructive quantum field theory. Lattice field theory has turned out to be very successful for the non-perturbative calculation of physical quantities.
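The basic move of discretization is to replace continuous space-time with a grid of spacing a, so that derivatives of a field φ become finite differences:

\[ \partial_\mu \phi(x) \;\approx\; \frac{\phi(x + a\hat{\mu}) - \phi(x)}{a} \]

The continuum theory is recovered in the limit a → 0, and the finite lattice makes the theory amenable to direct numerical computation.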

1980. Quantum complexity Theory

Quantum complexity theory is the subfield of computational complexity theory that deals with complexity classes defined using quantum computers, a computational model based on quantum mechanics. It studies the hardness of computational problems concerning these complexity classes and the relationship between quantum complexity classes and classical (i.e., non-quantum) complexity classes. A complexity class is the set of computational problems that can be solved by a computational model under certain resource constraints. The development of quantum complexity theory is related to the concept of the Turing machine, an abstract machine proposed by Alan Turing in 1936 as a mathematical model of computation that formalizes the idea of an algorithm; any intuitive algorithm can be implemented using some Turing machine. The history of quantum computing began in the early 1980s, when the American physicist Paul Benioff proposed a quantum mechanical model of the Turing machine in 1980.
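To make the Turing-machine idea concrete, here is a minimal sketch in Python of a machine that increments a binary number; the state names and transition table are invented for illustration, not taken from any particular source:

    # A minimal Turing machine: increment a binary number by 1.
    # transitions: (state, symbol) -> (next_state, symbol_to_write, head_move)
    TRANSITIONS = {
        ("right", "0"): ("right", "0", +1),  # scan right across the number
        ("right", "1"): ("right", "1", +1),
        ("right", "_"): ("carry", "_", -1),  # hit the blank: start carrying left
        ("carry", "1"): ("carry", "0", -1),  # 1 + carry = 0, carry continues
        ("carry", "0"): ("done",  "1",  0),  # 0 + carry = 1, halt
        ("carry", "_"): ("done",  "1",  0),  # carried past the left end, halt
    }

    def run(tape_str: str) -> str:
        """Run the machine on a binary string and return the incremented result."""
        tape = dict(enumerate(tape_str))     # sparse tape; "_" means blank
        head, state = 0, "right"
        while state != "done":
            state, write, move = TRANSITIONS[(state, tape.get(head, "_"))]
            tape[head] = write
            head += move
        cells = range(min(tape), max(tape) + 1)
        return "".join(tape.get(i, "_") for i in cells).strip("_")

    print(run("1011"))  # prints 1100 (11 + 1 = 12 in binary)

Any algorithm – and, in Benioff’s quantum mechanical model, even the evolution of a quantum system – can in principle be encoded as such a table of states and transitions.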

The 1980s. Quantum Loop Theory

It’s only since the middle 1980s that real progress began on unifying relativity and quantum theory. The turning point was the invention of not one but two approaches: loop quantum gravity and string theory. Neither is yet in final form. The theoretical physicist Lee Smolin, a co-inventor of the approach called loop quantum gravity, is concerned with quantum gravity, “the name we give to the theory that unifies all the physics now under construction.” The roots of this theory are in Einstein’s general theory of relativity and quantum theory. Einstein’s general theory of relativity is a theory of space, time, and gravity, while quantum theory describes everything else that exists in the universe, including elementary particles, nuclei, atoms, and chemistry. These two theories were invented in the early twentieth century, and their ascension marked the overthrow of the previous theory, Newtonian mechanics. They are the primary legacies of twentieth-century physics. The problem of unifying them is the main open problem in physics left for us to solve in this century.

1993. Holographic Universe Theory

The holographic principle is a hypothesis put forward in 1993 by the Dutch theoretical physicist Gerard ‘t Hooft. It is a tenet of string theories and a supposed property of quantum gravity stating that the description of a volume of space can be thought of as encoded on a lower-dimensional boundary of the region – such as a light-like boundary like a gravitational horizon. First proposed by ‘t Hooft, the principle was given a precise string-theory interpretation by Leonard Susskind. The holographic principle was inspired by black hole thermodynamics and resolves the black hole information paradox within the framework of string theory.
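The inspiration from black hole thermodynamics is the Bekenstein–Hawking entropy, which scales with the area of the horizon rather than the volume it encloses:

\[ S_{BH} = \frac{k_B c^3 A}{4 G \hbar} = \frac{k_B A}{4 \ell_P^2} \]

where A is the horizon area and ℓ_P is the Planck length. That a region’s full information content fits on its boundary area is exactly the intuition the holographic principle generalizes.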

2013. Quantum Holonomy Theory

Quantum holonomy theory is a non-perturbative theory of quantum gravity coupled to fermionic degrees of freedom. It arises from an intersection between different branches of modern theoretical physics and mathematics, namely quantum field theory and non-commutative geometry. The theory was developed by two Danish scientists, the theoretical physicist Jesper Møller Grimstrup and the mathematician Johannes Aastrup, as a candidate for a fundamental theory of Nature.
