
The utopian society, in essence, seems like a cage to John.



Much of the inspiration for Brave New World comes from an earlier novel that scrutinises a futuristic utopia in a similar way: We, by the Russian writer Yevgeny Zamyatin. As the Second World War dawned, writers were less interested in depicting utopian societies founded on scientific advancements. The political dystopia, perhaps inspired by the tumultuous European politics of the late 1930s and early 1940s, began to flourish.

In George Orwell's Nineteen Eighty-Four, the future world has been divided into three super-states, Oceania, Eurasia and Eastasia, all of which are embroiled in a perpetual war. In 1958, Aldous Huxley wrote Brave New World Revisited, a reassessment of his most famous novel and a furthering of many of its themes. Brave New World Revisited is an important text in the development of post-war dystopias. From his reading of this text, Anthony Burgess produced two novels that dealt with future worlds where freedoms are challenged: The Wanting Seed and A Clockwork Orange, both published in 1962. In The Wanting Seed, Burgess tackles overpopulation.

Set in a future London that sprawls from the south coast to Birmingham, the novel depicts a society in which reproduction is brutally policed by the Ministry of Infertility and homosexuality is actively encouraged by the State. The story follows the plight of the teacher Tristram Foxe as he witnesses the dissolution of civilisation.

Fortunately, the small but highly sensitive horn antenna that Bell Laboratories had built for satellite communications became available for radio-astronomy use, as the company had decided to abandon that business. While making measurements at a wavelength of 7.35 centimeters, Arno Penzias and Robert Wilson detected more radiation than they could account for.

Moreover, this extra radiation (or temperature), which they believed to be the effect of some sort of background noise, turned out to be constant, no matter which direction the antenna was pointing. The data indicated that the origin of what they were measuring was not in the atmosphere, or the Sun, or even our galaxy. It was a mystery. Having verified that the noise did not come from the antenna itself, the only possible conclusion they could draw was that it had something to do with the cosmos, although they did not know what its cause might be.

The answer to that question came from their colleagues at nearby Princeton University, some of whom, like James Peebles, had predicted that a relic radiation left over from the Big Bang should still pervade the Universe, and were preparing to search for it. Unwittingly, Penzias and Wilson had stumbled upon it first. According to current estimates, the temperature corresponding to that radiation in the microwave realm is around 2.7 degrees Kelvin. It is significant that Penzias and Wilson detected the microwave background at a center dedicated to industrial research, where new instruments were developed and available. It is a perfect example of what we mentioned above: the necessity for more precise instruments and new technology in order to advance our knowledge of the Universe.
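As a quick illustration (mine, not the essay's) of why radiation at that temperature falls in the microwave band, Wien's displacement law gives the wavelength at which a 2.7-degree blackbody emits most strongly:

    # Wien's displacement law: peak wavelength of a blackbody at temperature T
    T = 2.7                  # kelvin, approximate temperature of the background radiation
    b = 2.898e-3             # Wien's displacement constant, metre-kelvin
    wavelength_peak = b / T  # metres
    print(f"Peak emission at about {wavelength_peak * 1000:.2f} mm")  # roughly 1 mm: microwaves

A peak near one millimetre lies squarely in the microwave region, which is why the relic radiation shows up in microwave antennas.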

As such technology became available, the image of the cosmos grew, and this led to more discoveries, two of which I will discuss below: pulsars and quasars. In 1962, Cyril Hazard, an English radio-astronomer working in Australia, precisely established the position of a powerful radio-source called 3C 273. With that data, Maarten Schmidt identified the source with a star-like optical object whose spectral lines turned out to be strongly shifted toward the red, which meant that it was extremely distant. This, in turn, implied that it was an extremely luminous object—more than one hundred times as bright as a typical galaxy. Objects of this type are called quasi-stellar sources, or quasars for short, and are thought to be galaxies with very active nuclei.

Since 3C 273 was discovered, several million more quasars have been found. They constitute ten percent of all light-emitting galaxies, and many astrophysicists believe that many of the most brilliant galaxies pass briefly through a quasar phase. Most quasars are very far from our galaxy, which means that the light that reaches us must have been emitted when the Universe was much younger.
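To make the distance argument concrete, here is a rough sketch of my own, using the commonly quoted redshift of about 0.16 for 3C 273 and an assumed Hubble constant of 70 km/s per megaparsec:

    # Turn a quasar redshift into a rough distance via Hubble's law (v = H0 * d).
    # Only an order-of-magnitude estimate, valid for modest redshifts.
    c = 299792.458            # speed of light, km/s
    H0 = 70.0                 # assumed Hubble constant, km/s per megaparsec
    z = 0.16                  # approximate redshift of 3C 273

    v = c * z                 # recession velocity, km/s
    d_mpc = v / H0            # distance, megaparsecs
    d_gly = d_mpc * 3.262e6 / 1e9   # distance, billions of light-years

    print(f"Recession velocity: about {v:.0f} km/s")
    print(f"Distance: roughly {d_gly:.1f} billion light-years")

Light from such an object has therefore been travelling for a couple of billion years, which is why quasars show us the Universe as it was when it was much younger.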

In 1967, Jocelyn S. Bell, a young radio-astronomer working with Antony Hewish at Cambridge, helped build a new radio telescope. While using it, Bell observed a signal that appeared and disappeared with great rapidity and regularity. Its cycle was so constant that it seemed to have an artificial origin (could it possibly be a sign of intelligent extraterrestrial life?). But what were those highly regular radio sources? Thomas Gold realized that such short cycles (around one to three seconds in the first detected pulsars) could only come from a very small source.

White dwarfs were too large to rotate or vibrate at such a frequency, but neutron stars could. But did the origin of the signals being received lie in the vibration or rotation of such stars? Certainly not their vibrations, because neutron stars vibrate much too fast (around a thousand times a second) to explain the cycles of most pulsars. So pulsars had to be rotating neutron stars. Since then, scientists have discovered pulsars that emit X-rays or gamma rays (and some even emit light in the visible spectrum), so nowadays scientists also accept the possibility of other mechanisms for the production of their radiation emissions, including the accretion of matter in double systems.
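The size argument can be made quantitative with a standard back-of-the-envelope calculation (not spelled out in the essay): for a star to spin with period P without flying apart, gravity at its equator must at least balance the centrifugal acceleration, which requires a mean density greater than about 3*pi/(G*P^2):

    import math

    G = 6.674e-11  # gravitational constant, SI units

    def minimum_density(period_s):
        """Smallest mean density (kg/m^3) a self-gravitating star can have
        and still rotate with the given period without shedding mass."""
        return 3 * math.pi / (G * period_s**2)

    for period in (1.0, 0.033):   # a typical early pulsar, and the Crab pulsar
        print(f"P = {period:5} s  ->  mean density > {minimum_density(period):.1e} kg/m^3")

    # Typical white dwarfs average about 1e9 kg/m^3, far below these limits;
    # neutron stars, at roughly 1e17 kg/m^3, pass easily.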

Besides their astrophysical interest, pulsars serve other functions. In 1974, Russell Hulse and Joseph Taylor discovered a pulsar forming part of a binary system. After various years of continuous observation of that binary system, they were able to conclude that the orbits of its two components vary and are growing closer together. That result was thought to indicate that the system is losing energy due to the emission of gravitational waves (Taylor, Fowler and McCulloch 1979). Since then, other binary pulsar systems have been discovered, but it is still not possible to detect gravitational radiation with instruments built and installed on Earth.


This is extremely difficult, due to the extreme faintness of the effects involved. The gravitational waves that would arrive at the Earth from some part of the Universe where an extremely violent event had taken place would produce a distortion in the detectors of no more than about one part in 10^21. That would be a tiny fraction of the size of an atom. However, there are already devices designed to achieve this: the four-kilometer system of detectors in the United States known as LIGO (Laser Interferometer Gravitational-Wave Observatory).
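To see what such a strain means in practice, here is a small check (my own numbers; the strain value is the commonly quoted order of magnitude, not a figure from the essay):

    # A gravitational wave of strain h changes an arm of length L by dL = h * L.
    h = 1e-21               # assumed strain, the commonly quoted order of magnitude
    L = 4000.0              # LIGO arm length, metres

    dL = h * L
    atom_size = 1e-10       # rough diameter of an atom, metres
    proton_size = 1.7e-15   # rough diameter of a proton, metres

    print(f"Arm-length change: {dL:.1e} m")
    print(f"... about {dL / atom_size:.0e} of an atom's size")
    print(f"... about {dL / proton_size:.0e} of a proton's size")

Measuring a length change far smaller than a proton over a four-kilometre arm is what makes the experiment so demanding.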

Quasars are also very useful for studying the Universe in conjunction with general relativity. About one of every five hundred quasars is involved in a very interesting relativistic phenomenon: the deflection of the light it emits due to the gravitational effect of other galaxies situated between that quasar and the Earth, from which that effect is being observed. The first such gravitational lens, a doubly imaged quasar, was identified in 1979. Since then, the Hubble space telescope has photographed a cluster of galaxies about a thousand million light-years away in which, besides the light of the cluster of galaxies itself, it is possible —though difficult because of their lesser luminosity— to detect numerous arcs (segments of rings).

Those arcs are actually images of galaxies much farther away from us than the cluster, but seen through the effect of a gravitational lens: the cluster acts as a lens, distorting the light coming from those galaxies. Besides offering new evidence supporting general relativity, these observations have the added value that the magnitude of deflection and distortion visible in those luminous arcs is far greater than could be expected if the cluster only contained the galaxies we see in it.
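The way such a mass estimate works can be sketched with the Einstein-ring formula. In the sketch below, only the roughly one-billion-light-year distance to the cluster comes from the text; the arc radius and the distances to the background galaxies are illustrative assumptions of mine:

    G = 6.674e-11
    c = 2.998e8
    ly = 9.461e15        # metres per light-year
    M_sun = 1.989e30

    theta_E = 30 / 206265.0   # assumed Einstein radius of the arcs: 30 arcseconds, in radians
    D_lens = 1e9 * ly         # distance to the lensing cluster (the figure given in the text)
    D_source = 3e9 * ly       # assumed distance to the lensed background galaxies
    D_ls = 2e9 * ly           # assumed lens-to-source distance

    # Einstein-ring relation: theta_E**2 = (4 * G * M / c**2) * D_ls / (D_lens * D_source)
    M = theta_E**2 * c**2 * D_lens * D_source / (4 * G * D_ls)
    print(f"Lensing mass inside the arcs: about {M / M_sun:.1e} solar masses")

A lensing mass of tens of trillions of Suns is far more than the visible galaxies can supply, which is precisely the kind of discrepancy the text goes on to describe.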

In fact, evidence indicates that those clusters contain between five and ten times more matter than we can see. Could this be the dark matter we will discuss further on? For many scientists—at least until the problem of dark matter and dark energy came to the fore—the background radiation, pulsars and quasars discussed in this section were the three most important discoveries in astrophysics during the second half of the twentieth century.

What those discoveries tell us, especially pulsars and quasars, is that the Universe is made up of much more surprising and substantially different objects than were thought to exist in the first half of the twentieth century. These advances also transformed the situation of the general theory of relativity. That theory mainly addressed the Universe, and exploring it would require technological means that did not even exist when it was formulated, not to mention significant financial support.

This problem began fading at the end of the nineteen-sixties, and it can now be said that general relativity is fully integrated into experimental physics, including areas that are not even that close to astronomy, such as the Global Positioning System (GPS). It is not only a part of experimental physics related to astrophysics and cosmology; as we will see further on, it is also a part of high-energy physics.
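The GPS connection can be made concrete with a standard estimate (my own sketch, using round numbers for the satellite orbit): the clocks on board run fast relative to clocks on the ground because they sit higher in the Earth's gravitational field, and slightly slow because of their orbital speed.

    # Approximate relativistic clock corrections for a GPS satellite.
    GM_earth = 3.986e14        # gravitational parameter of the Earth, m^3/s^2
    c = 2.998e8                # speed of light, m/s
    R_earth = 6.371e6          # radius of the Earth, m
    r_orbit = 2.66e7           # approximate GPS orbital radius, m
    v_orbit = 3.9e3            # approximate GPS orbital speed, m/s
    seconds_per_day = 86400

    # Gravitational effect (general relativity): higher clocks tick faster.
    grav = GM_earth / c**2 * (1 / R_earth - 1 / r_orbit)
    # Velocity effect (special relativity): moving clocks tick slower.
    vel = -v_orbit**2 / (2 * c**2)

    total_us_per_day = (grav + vel) * seconds_per_day * 1e6
    print(f"Net clock offset: about {total_us_per_day:.0f} microseconds per day")

An uncorrected offset of tens of microseconds per day would translate into positioning errors of kilometres, which is why the system has to build relativity into its design.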

And here we must mention one of the most surprising and attractive stellar objects linked to general relativity discovered in the last few decades: black holes, whose existence has even reached beyond purely scientific circles and entered the social realm. As I said, these objects belong to the theoretical tenets of general relativity, although their Newtonian equivalents had already been proposed—and forgotten—much earlier by the British astronomer John Michell. Studies leading to black holes began in the nineteen-thirties, when the Indian physicist Subrahmanyan Chandrasekhar and the Russian Lev Landau demonstrated that, in the Newtonian theory of gravitation, a cold body with a mass greater than roughly 1.4 times that of the Sun cannot support itself against its own gravitational attraction.

That result led scientists to ask what general relativity predicted for the same situation. In 1939, Robert Oppenheimer and two of his collaborators, George M. Volkoff and Hartland Snyder, demonstrated that a star with that mass would collapse until it was reduced to a singularity; that is, to a point with a volume of zero and an infinite density (Oppenheimer and Volkoff 1939; Oppenheimer and Snyder 1939). Later, the Soviet physicists Evgenii M. Lifshitz and Isaak M. Khalatnikov argued that such singularities were an artefact of the idealized symmetry of those calculations. Following the work of his Soviet colleagues, the British mathematician and physicist Roger Penrose, later joined by Stephen Hawking, took up the question.

They demonstrated that such singularities were inevitable when a star collapsed, providing certain conditions were met. Such totally collapsed objects eventually received a far catchier name: black holes. The man responsible for this apparently insignificant terminological revolution was the United States physicist John A. Wheeler. He himself explained the genesis of that term in the following manner (Wheeler and Ford 1998): What were those pulsars?

Vibrating white dwarfs? Rotating neutron stars? In my lecture, I argued that we should consider the possibility that, at the center of a pulsar, we might find a completely collapsed gravitational object.



I had been looking for the right term for months, ruminating in bed, in the bathtub, in my car, whenever I had a free moment. Suddenly, the name "black hole" seemed totally correct to me. The name was catchy, and it stuck, but the explanation was mistaken: as I pointed out above, a pulsar is driven by a neutron star. While the history of black holes began with the physics work of Oppenheimer and his collaborators, mentioned above, for some years the field was dominated by purely mathematical studies like the previously mentioned ones by Penrose and Hawking.


The underlying physical idea was that black holes must be very different from any other type of star, even though their origins were linked to ordinary stars. They would occur when, after exhausting its nuclear fuel, a very massive star began to contract irreversibly due to gravitational force.
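How far would such a star have to contract? A standard reference point (my own aside, not the essay's) is the Schwarzschild radius, the critical size below which nothing, not even light, can escape:

    # Schwarzschild radius: r_s = 2 * G * M / c**2
    G = 6.674e-11
    c = 2.998e8
    M_sun = 1.989e30

    def schwarzschild_radius(mass_kg):
        return 2 * G * mass_kg / c**2

    for name, mass in [("the Sun", M_sun), ("a 3-solar-mass stellar core", 3 * M_sun)]:
        print(f"{name}: about {schwarzschild_radius(mass) / 1000:.1f} km")

A core of a few solar masses would have to be squeezed into a region only kilometres across, which is exactly the regime described by the collapse calculations mentioned above.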

The center of a black hole is its point of collapse, where, according to the classical theory, density becomes infinite and the known laws of physics break down. However, there is a possible way out of such a paradoxical situation: the general theory of relativity is not compatible with quantum requirements, but clearly, when matter is compressed into a very small region, its behaviour will follow quantum rules. Thus, a true understanding of the physics of black holes calls for a quantum theory of gravitation, obtained either by quantizing general relativity or by constructing a new theory of gravitational interaction that can be quantized.

At the present time, this has yet to be done, although some steps have been made in that direction, including one by Hawking himself, the grand guru of black holes. As a result, we do not really know what those mysterious and attractive objects are. Do they, in fact, exist at all? The answer is yes. There are ever-greater indications that they do. On 12 December 1970, the United States launched an X-ray astronomy satellite from a platform off the coast of Kenya; it was named Uhuru ("freedom" in Swahili) to mark the anniversary of Kenyan independence, and it catalogued hundreds of celestial X-ray sources. Among the identified sources is Cygnus X-1, one of the most brilliant in the Milky Way, located in the constellation of the Swan.

This source was later linked to a visible super-giant blue star with a mass 30 times that of the Sun and an invisible companion. The movement of the blue star indicated that its companion had a mass 7 times that of the Sun, a magnitude too great to be a white dwarf or a neutron star. It must be, therefore, a black hole. However, some argue that its mass is 3 solar masses, in which case it could be a neutron star.
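How the star's motion pins down the unseen companion's mass can be sketched with the binary "mass function." In the sketch below, the orbital period and velocity amplitude are approximate published values quoted from memory, and an edge-on orbit and a 30-solar-mass supergiant are assumed, so the result is only a minimum estimate:

    import math

    G = 6.674e-11
    M_sun = 1.989e30

    # Approximate orbital parameters of the Cygnus X-1 system (illustrative values):
    P = 5.6 * 86400          # orbital period, seconds
    K = 75e3                 # radial-velocity semi-amplitude of the blue star, m/s
    M1 = 30 * M_sun          # assumed mass of the visible supergiant

    f = P * K**3 / (2 * math.pi * G)       # the "mass function", in kg

    # For an edge-on orbit (sin i = 1): f = M2**3 / (M1 + M2)**2.
    # Solve for the companion mass M2 by simple bisection.
    lo, hi = 0.0, 100 * M_sun
    for _ in range(100):
        mid = 0.5 * (lo + hi)
        if mid**3 / (M1 + mid)**2 < f:
            lo = mid
        else:
            hi = mid

    print(f"mass function: about {f / M_sun:.2f} solar masses")
    print(f"minimum companion mass: about {lo / M_sun:.1f} solar masses")

With those assumptions the companion cannot weigh much less than about seven solar masses, which is how one arrives at the figure quoted in the text.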

Far more massive black holes (the supermassive black holes believed to sit at the centers of galaxies) have also been identified. In over two hundred cases, it has been possible to indirectly determine the masses of those supermassive black holes, but a direct determination has only been possible in a few cases. One of the latter is in our own Milky Way. The study of the Universe is enormously puzzling. Obviously, measuring such basic data as distances, masses and velocities there is extremely complex. With the data then available, there was a time when the model offered by the Robertson-Walker-Friedmann solution to general relativity was sufficient.

It represents a Universe that expands with an acceleration that depends on its mass-energy content. But there were increasingly clear problems with the cosmology of the Big Bang. One of these was the question of whether mass-energy is such that the Universe will continue to expand forever, or if it is large enough that gravitational attraction will eventually overcome the force of the initial explosion, reaching the point where it begins to contract and finally arrives at a Big Crunch. Another problem lay in the considerable uniformity with which mass appears to be distributed throughout the Universe.

This is observable using units of measurement of some hundreds of millions of light-years or more (of course, on a smaller scale the Universe, with its stars, galaxies, clusters of galaxies and enormous intergalactic voids, is not homogeneous).


Background microwave radiation is good proof of this macro-homogeneity. To resolve this problem, the idea of an inflationary Universe was proposed: very shortly after the initial explosion, the Universe would have undergone an extremely brief episode of enormously accelerated, exponential expansion. In other words, the mini-universe must have experienced a growth so rapid that there was not enough time to develop physical processes that would have led to non-homogeneous distributions.

Once that inflationary stage ended, the Universe must have continued evolving according to the classic Big Bang model. Among the scientists responsible for this inflationary theory, we should mention the American Alan Guth. But, more than specific names, what I want to point out is that it is impossible to understand this theory without recourse to high-energy physics—what used to be called elementary-particle physics, which I will discuss further on—especially the Grand Unified Theories (GUT), which predict that there would have to be a phase transition at extraordinarily high temperatures. So, inflation lies at the origin of a uniform Universe.

But then, what caused the minuscule primordial non-homogeneities that, with the passage of time and the effect of gravitational force, gave birth to cosmic structures such as galaxies? One possible answer is that inflation may have enormously amplified the ultramicroscopic quantum fluctuations that occur as a result of the uncertainty principle applied to energies and time.

If that were the case, what better place to look for non-homogeneities than the microwave radiation background? The answer to this question appeared in the work of a team of US scientists led by John C. Mather and George Smoot. In 1982, NASA approved funding for the construction of a satellite—the Cosmic Background Explorer (COBE), which was put into orbit some 900 kilometers above the Earth in the fall of 1989—to study the cosmic microwave background.

The entire project was coordinated by Mather, including the experiment in which he used a spectrophotometer cooled to 1.5 degrees Kelvin to measure the spectrum of the background radiation, which turned out to be that of a nearly perfect black body. Meanwhile, Smoot measured the minuscule irregularities predicted by inflation theory. Ten years later, following the work of over a thousand people and a cost of many millions of dollars, the results were announced (Mather et al. 1990; Smoot et al. 1992): the background radiation did indeed show the tiny ripples that the theory required. Just how thrilled those researchers were when they confirmed their results is clear in a book for lay readers published by Smoot soon thereafter.

In Wrinkles in Time (Smoot and Davidson), he wrote: I was looking at the primordial form of the wrinkles, I could feel it in my bones. Some of the structures were so huge that they could only have been generated when the Universe was born, no later. What was before my eyes was the mark of creation, the seeds of the present Universe. COBE was a magnificent instrument, but it was by no means the only one.

There are many examples of astrophysics and technology working hand in hand, not only with Earth-based instruments, but also with spacecraft. At this point, scientists have been exploring our Solar System for quite some time using probes with refined instruments that send us all sorts of data and images: space probes such as Mariner 10, which observed Venus at close range in 1974; Pioneer 10 and Voyager 1 and 2, which flew past Jupiter, Saturn, Uranus and Neptune between 1973 and 1989; and Galileo, aimed at Jupiter and its moons.

Another is the Hubble space telescope, placed in orbit in 1990. Since it was launched, and especially since its optical defects were corrected in 1993, Hubble has sent, and continues to send, spectacular images of the Universe. Thanks to it, we have the first photos of regions such as the Orion Nebula where it appears that stars are being born. It would not be a complete exaggeration to say that Hubble has revolutionized our knowledge of the Universe.

Thanks to technological advances, scientists are starting to be able to see new aspects and objects in the cosmos, such as planetary systems associated with stars other than the Sun. The first discovery of this sort took place in 1992, when Alex Wolszczan and Dale Frail found that at least two Earthlike planets were orbiting around a pulsar (Wolszczan and Frail 1992). Three years later, Michel Mayor and Didier Queloz announced their discovery of a planet of the same size and type as Jupiter (a gaseous giant) orbiting around the star 51 Pegasi (Mayor and Queloz 1995). Since then, the number of known extrasolar planets has grown considerably.

And if such planets exist, life may have developed on some of them as well. Now, while the biology that addresses the problem of the origin of life supports the possibility that in sufficiently favorable environments combinations of chemicals could produce life through synergic processes, most probably such life would be of a different type from human life. Evolutionary biology, supported by geological data, has shown that the human species is the product of evolutionary chance. If, for example, an asteroid or comet approximately ten kilometers in diameter had not collided with the Earth some 65 million years ago—it hit the Earth at a speed of about thirty kilometers a second, producing energy equivalent to the explosion of one hundred million hydrogen bombs—then an enormous number of plant and animal species might never have disappeared, or certainly not then.

These included the dinosaurs, which had impeded the rise of those small mammals that later evolved into Homo sapiens and other species. It is that element of chance that makes it impossible to be certain there is intelligent life on other planets—in our galaxy or others—or that such a life form might be trying, or have tried, to understand nature, build scientific systems, and attempt to communicate with other living beings that may exist in the Universe.
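The figure of one hundred million hydrogen bombs quoted above is easy to check roughly (my own sketch; the impactor's density and the bomb yield are assumptions, everything else comes from the text):

    import math

    diameter = 10e3            # impactor diameter, m (from the text)
    speed = 30e3               # impact speed, m/s (from the text)
    density = 3000             # assumed rocky density, kg/m^3
    megaton = 4.184e15         # energy of one megaton of TNT, joules
    bomb_yield = 2 * megaton   # assumed yield of a single hydrogen bomb

    volume = (4 / 3) * math.pi * (diameter / 2) ** 3
    mass = density * volume
    energy = 0.5 * mass * speed**2            # kinetic energy of the impactor

    print(f"Impact energy: {energy:.1e} J ({energy / megaton:.1e} megatons of TNT)")
    print(f"Equivalent to roughly {energy / bomb_yield:.1e} hydrogen bombs")

With those assumptions the energy comes out at around a hundred million bomb-equivalents, the same order of magnitude as the figure quoted in the text.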

Still, for quite some time, research programs have been scanning the Universe in search of signs of intelligent life—programs such as the Search for Extra-Terrestrial Intelligence (SETI), which has used million-channel receivers that carry out around twenty thousand million operations per second. But other discoveries relative to the contents of the Universe are a very different matter. For example, we have good reasons to believe that the cosmos contains a large amount of invisible matter that exerts gravitational force.

The most immediate evidence comes from rotating disk-shaped galaxies such as our own Milky Way. When we look at the outer part of such galaxies, we see that their gas moves at a surprising speed—much faster than it should, given the gravitational attraction produced by the stars and gas we can detect inside them. Other evidence comes from the internal movements of galaxy clusters. What this invisible matter actually is remains one of the open problems. It could consist of barely luminous stars (such as brown dwarfs), or exotic elementary particles, or black holes. We cannot really understand what galaxies are, or how they came into being, until we know what this dark matter is.
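A rough version of the rotation-curve argument (my own numbers, using round values for the Sun's orbit in the Milky Way):

    import math

    G = 6.674e-11
    M_sun = 1.989e30
    kpc = 3.086e19             # metres per kiloparsec

    # Orbital speed and radius of the Sun around the galactic centre (round values).
    r_sun = 8 * kpc
    v_sun = 220e3              # m/s

    # Mass enclosed within the Sun's orbit, from v**2 = G * M / r.
    M_inner = v_sun**2 * r_sun / G
    print(f"Mass inside the Sun's orbit: about {M_inner / M_sun:.1e} solar masses")

    # If essentially all the visible mass lay inside that radius, gas at 30 kpc
    # should orbit more slowly, falling off in Keplerian fashion:
    r_outer = 30 * kpc
    v_expected = math.sqrt(G * M_inner / r_outer)
    print(f"Expected speed at 30 kpc: about {v_expected / 1000:.0f} km/s")
    # Observed speeds out there stay near 200 km/s, so much more mass must be present.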

Nor will we be able to know what the ultimate destiny of the Universe is. Along with dark matter, another similar question came to the fore in the last decade of the twentieth century: dark energy. In 1998, observations of distant supernovas showed that the expansion of the Universe is not slowing down but accelerating, which requires some kind of repulsive energy. It had been assumed that the Big Bang must have been driven by a repulsive energy during the creation of the universe, but no one had imagined that such energy could continue to exist in the now-mature Universe.


And since energy is equivalent to mass, that dark energy signified a new contribution to the total mass of the Universe, though not the same as dark matter. In other words: we thought we knew what the Universe is, and it turns out to be practically unknown to us, because we know the nature and make-up of neither dark matter nor dark energy.

One possible explanation of the latter could be found in the term introduced by Einstein in 1917 in his field equations for general relativity. As we saw, when applying his theory of gravitational interaction to the entire Universe, Einstein sought a model that would represent a static Universe. That obliged him to introduce a new term into his equations, the previously mentioned cosmological constant, which actually represents a field of repulsive forces that compensates for the attractive effects of gravitation.

When relativistic cosmology found solutions that represent an expanding Universe, and that expansion was demonstrated by observation (Hubble 1929), Einstein thought that it was no longer necessary to maintain that constant, although it could be included without any difficulty in theoretical expanding models. Now, it seems necessary to resurrect this term, but it will not be enough to include it in relativistic cosmology again; it has to find its place and meaning in quantum theories that attempt to make gravity part of the quantum system.
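For reference (my notation, not the essay's), the cosmological term appears in Einstein's field equations as

    G_{\mu\nu} + \Lambda\, g_{\mu\nu} = \frac{8\pi G}{c^{4}}\, T_{\mu\nu},

where the piece proportional to Lambda behaves, in cosmological models, like a repulsive contribution that can offset ordinary gravitational attraction; reading it as an energy of the vacuum is what links it to dark energy.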

After all, dark energy is the energy of the vacuum, and from a quantum viewpoint the vacuum has a structure. And given that quantum physics has again entered the picture here, let us discuss how the quantum revolution developed and solidified during the second half of the twentieth century. We also saw that it was Ernest Lawrence who found a new way forward, developing instruments called particle accelerators (in his case, cyclotrons), which functioned by accelerating particles to high energy levels and then making them collide with each other or with some predetermined target.

The idea was to examine what was produced by such collisions, that is, what new and smaller components make up such particles, if, in fact, there are any. The physics of elementary particles, also called high-energy physics, as I indicated above, became one of the main protagonists of the second half of the twentieth century.

This is very expensive science (it is the epitome of Big Science, which requires large teams of scientists and technicians and large investments), and it is becoming ever more expensive as the size of accelerators grows, making it possible to reach higher energy levels. After World War II, especially in the United States, high-energy physics drew on the prestige of nuclear physics, which had supplied the powerful atomic bombs. Here, I will mention only the most important accelerators.

In 1952, the Cosmotron entered service at Brookhaven, New York. It accelerated protons and reached energies of over 2 GeV. In Europe, several countries joined forces in 1954 to create CERN, the European Organization for Nuclear Research, near Geneva; CERN now includes more countries (including Spain) and, with its accelerators, it has played an outstanding role in the development of high-energy physics. High-energy physics, however, was recently dealt a serious blow by what had been, until then, its strongest supporter: the United States. The project in question was the Superconducting Super Collider (SSC), a gigantic accelerator that U.S. physicists planned to build in Texas, with a ring-shaped tunnel some 87 kilometers in circumference. Inside that tunnel, thousands of superconducting magnets would guide two proton beams.

After millions of laps, they would reach energies twenty times higher than could be attained with existing accelerators. At various points along the ring, protons from the two beams would collide and enormous detectors would track the results of those collisions. The project would take ten years, and its cost was initially estimated at 6,000 million dollars. Things got off to a rocky start, but excavation of the tunnel got under way.

However, on 19 October 1993, following prolonged, difficult and changing discussions in both houses of Congress, the House of Representatives finally cancelled the project. Other scientific programs—especially in the field of biomedicine—were simply more attractive to American congressmen, senators and, why deny it, to many others. One of the inhabitants of the world of elementary particles revealed by these machines was particularly striking: quarks.


Their existence had been theorized in 1964 by the U.S. physicist Murray Gell-Mann, who initially proposed three of them: up (u), down (d) and strange (s). Until quarks appeared in the complex and varied world of elementary particles, it was thought that protons and neutrons were indivisible atomic structures, truly basic, and that their electrical charge was an indivisible unit. But quarks did not obey this rule, and they were assigned fractional charges. Thus, a proton is made up of two u quarks and one d quark, while a neutron consists of two d quarks and one u. Therefore, they are composite structures. Later, other physicists proposed the existence of three other quarks: charm (c), bottom (b) and top (t). To characterize these quarks, scientists say they have six flavors.
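The fractional charges in question are the standard ones, +2/3 for the u quark and -1/3 for the d quark (values not spelled out in the text above), and the arithmetic works out:

    Q_p = 2\left(+\tfrac{2}{3}\right) + \left(-\tfrac{1}{3}\right) = +1,
    \qquad
    Q_n = \left(+\tfrac{2}{3}\right) + 2\left(-\tfrac{1}{3}\right) = 0.

Two ups and a down thus reproduce the proton's unit charge, while one up and two downs give the neutron's zero charge.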

Moreover, each of the six types comes in three varieties, or colors: red, yellow (or green) and blue. And for each quark there is, of course, an antiquark.


Needless to say, terms like these—color, flavor, up, down, and so on—do not represent the reality we normally associate with such concepts, although in some cases there can be a certain logic to them, as happens with color. This is what Gell-Mann himself had to say about that term: There are three colors, called red, green and blue, like the three basic colors in a simple theory of human color vision (in the case of painting, the three primary colors are usually red, yellow and blue, but when mixing light instead of pigment, yellow is replaced by green).

The recipe for a neutron or a proton calls for a quark of each color, that is, one red, one green and one blue, so that the sum of the colors cancels out. As in vision, where white can be considered a mixture of red, green and blue, we can metaphorically state that neutrons and protons are white. In short, quarks have color but hadrons do not: they are white. We will never be able to observe a free quark. Now, in order for quarks to remain confined, there have to be forces among them that are very different from electromagnetic or other kinds of forces.

About ten years after quarks appeared, a theory, quantum chromodynamics, was formulated to explain why quarks are so strongly confined that they can never escape from the hadron structures they form. With quantum electrodynamics—which, as I already stated, emerged in the first half of the twentieth century—and quantum chromodynamics, we have quantum theories for both electromagnetic and strong interactions. But what about the weak interaction, responsible for radioactive phenomena? After earlier work on weak interactions by physicists such as George Sudarshan, Steven Weinberg and Abdus Salam independently proposed a theory that unified electromagnetic and weak interactions.

Their model included ideas proposed by Sheldon Glashow. The electroweak theory unified the description of electromagnetic and weak interactions. But could it be possible to take a further step on the path to unification, formulating a theory that would also include the strong interaction described by quantum chromodynamics? The affirmative answer to this question was provided by Howard Georgi and Sheldon Glashow in 1974, with the first of the Grand Unified Theories. From the perspective of GUTs, in the beginning there was only one force, which contained electromagnetic, weak and strong forces. However, as the Universe cooled, they began to separate.

Such theoretical tools make it possible to explain questions such as the existence (at least in appearance, and fortunately for us) of more matter than antimatter in the Universe. The Japanese physicist Motohiko Yoshimura used a property of these theories (the fact that they allow processes that do not conserve baryon number) to demonstrate that an initial state in which there was an equal amount of matter and antimatter could evolve into one with more protons or neutrons than their respective antiparticles, thus producing a Universe like ours, in which there is more matter than antimatter.

Thanks to the group of theories mentioned above, we have an extraordinary theoretical framework in which to understand what nature is made of. Its predictive capacity is incredible. These theories accept that all matter in the universe is made up of aggregates of three types of elementary particles: electrons and their relatives (the muon and the tau), neutrinos (electron, muon and tau neutrinos) and quarks, as well as the quanta associated with the fields of the four forces we recognize in nature: photons for the electromagnetic interaction, the Z and W particles (gauge bosons) for the weak interaction, gluons for the strong interaction and, even though gravitation has yet to be included in this framework, the as-yet-unobserved gravitons for the gravitational interaction.
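The same inventory, arranged as a small table-like structure so that the groupings in that long sentence are easier to see (my own arrangement of exactly the particles listed above):

    # Matter particles and force carriers as enumerated in the text.
    matter_particles = {
        "charged leptons": ["electron", "muon", "tau"],
        "neutrinos": ["electron neutrino", "muon neutrino", "tau neutrino"],
        "quarks": ["up", "down", "strange", "charm", "bottom", "top"],
    }

    force_carriers = {
        "electromagnetic": "photon",
        "weak": "W and Z bosons",
        "strong": "gluons",
        "gravitational (not yet part of the framework)": "graviton (as yet unobserved)",
    }

    for family, members in matter_particles.items():
        print(f"{family}: {', '.join(members)}")
    for force, quantum in force_carriers.items():
        print(f"{force} interaction carried by: {quantum}")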

The subset formed by quantum chromodynamics and electroweak theory (that is, the theoretical system that includes relativistic and quantum theories of strong, electromagnetic and weak interactions) proves especially powerful in its balance of predictions and experimental confirmation. It will be remembered—together with general relativity, quantum mechanics, and the unravelling of the genetic code—as one of the most outstanding intellectual advances of the twentieth century.

But much more so than general relativity and quantum mechanics, it is the product of a communal effort. That is inevitable: the history of high-energy physics calls not for an entire book, but for several. Even so, the framework leaves important questions unanswered. Why do those particles have the masses they have? Why, for example, does the tau weigh around 3,500 times as much as an electron? Why are there four fundamental interactions, instead of three, five, or just one? And why do those interactions have the properties they do (such as intensity or range of action)? Let us now consider gravitation, the other basic interaction.

Can it be unified with the other three? A central problem is the lack of a quantum theory of gravitation that has been subjected to experimental testing. There are, however, candidates for this splendid unifying dream: complex mathematical structures called string theories. According to string theory, the basic particles existing in nature are actually one-dimensional filaments (extremely thin strings) in spaces with many more dimensions than the three spatial ones and the single temporal one we are aware of. So what kind of materiality do these one-dimensional theoretical constructs have?

I said before that string theories are complex mathematical structures, and that is certainly true. In fact, the exact equations of these theories are not even known; only approximations to them exist. And even those approximate equations are so complicated that, to date, they have only partially been solved. So it is no surprise that one of the great leaders in this field has been a physicist with a special gift for mathematics.

I am referring to the American Edward Witten. The reader will get an idea of his stature as a mathematician when I mention that, in 1990, he received one of the four Fields medals (alongside Pierre-Louis Lions, Jean-Christophe Yoccoz and Shigefumi Mori) that are awarded every four years and are the mathematical equivalent of the Nobel Prize. In 1995, Witten argued that the five string theories then known were different aspects of a single underlying theory formulated in eleven dimensions. This eleven-dimensional theory, which Witten called M Theory, has yet to be completely developed. But if particles are vibrating strings, what is it, exactly, that vibrates?

After all, a vibration is the oscillation of some sort of matter, but as a permanent structure a string is probably more of a mathematical than a material entity. Physicists would have been working very hard for centuries, or even millennia, only to discover that matter has finally slipped between their fingers, as if through a net, turning into mathematics, that is, into mathematical structures. In sum, string theory unearths age-old problems, and maybe even ghosts: problems such as the relation between physics, the world and mathematics.

Independently of those essentially philosophical aspects of the question, there are others that must be mentioned here. Up to now, string theory has demonstrated very little, especially in light of the fact that science is not only theoretical explanation, but also experiment, in which theory is subjected to the ultimate arbiter: experimental testing. String theories are admired by some, discussed by many, and criticized by quite a few, who insist that their nature is excessively speculative.

Thus, the distinguished theoretical physicist Lee Smolin pointed out in a book about these theories: In the last twenty years, a great deal of effort has gone into string theory, but we still do not know whether it is true or not. Even after all the work that has been done, the theory offers no prediction that can be tested through current experiments, or at least, experiments conceivable at the present time.

The few clean predictions they propose have already been formulated by other accepted theories. Part of the reason why string theory makes no new predictions is that there seem to be an infinite number of versions. Even if we limit ourselves to theories that coincide with some of the basic facts observed in our universe, such as its vast size or the existence of dark energy, there continue to be something like 10^500 different string theories; that is, a one with five hundred zeros behind it, which is more than all the known atoms in the universe.

Such a quantity of theories offers little hope of identifying the result of any experiment that would not fit any of them. Thus, no matter what experiments show, it is not possible to demonstrate that string theory is false, although the opposite is equally true: no experiment can demonstrate that it is true. In that sense, we should remember that one of the most influential methodologies in science continues to be the one put forth by Karl Popper, an Austrian philosopher who wound up at the London School of Economics. Popper always insisted that a theory that cannot be refuted by any imaginable experiment is not scientific.

In other words, if it is not possible to imagine any experiment whose results contradict the predictions of a theory, then that theory is not truly scientific. In my opinion, that criterion is too strict to be invariably true, but it is certainly a good guide. At any rate, the future will have the final say about string theory.


Above, I dealt with the basic aspects of the structure of matter, but science is not limited to a search for the most fundamental, the smallest structure. It also seeks to understand what is closest to us and most familiar. In that sense, we must mention another of the great achievements of twentieth-century physics: the theoretical reconstruction of the processes—nucleosynthesis—that led to the formation of the atoms we find in nature, those of which we, ourselves, are made.

In fact, high-energy physics supplies the basis for nuclear physics, which studies stellar nucleosynthesis. As the universe cooled, the constituent parts of the primordial soup of particles and radiation underwent a process of differentiation. At a temperature of around 30,000 million degrees Kelvin, reached approximately 0.1 seconds after the initial explosion, the universe was a hot plasma of protons, neutrons, electrons, photons and neutrinos; as it cooled further, those protons and neutrons combined to form the nuclei of the lightest elements. Consequently, we believe that the Big Bang generously supplied the universe with hydrogen and helium. But what about the other elements? After all, we know there are many more elements in nature. One does not have to be an expert to know of the existence of oxygen, iron, nitrogen, carbon, lead, sodium, zinc, gold and many other elements.

How were they formed? Even before high-energy physicists began studying primordial nucleosynthesis, there were nuclear physicists in the first half of the twentieth century who addressed the problem of the formation of elements beyond hydrogen and helium. Almost at the very beginning of the second half of the twentieth century, George Gamow and his collaborators, Ralph Alpher and Robert Herman, took another important step (Alpher, Herman and Gamow). They were followed two decades later by Robert Wagoner, working with William Fowler and Fred Hoyle.

Thanks to their contributions—and those of many others—it has been possible to reconstruct the most important nuclear reactions in stellar nucleosynthesis. One of those reactions is the following: two helium nuclei collide and form a nucleus of beryllium, the element that occupies fourth place (atomic number 4) on the periodic table, following hydrogen, helium and lithium (its standard atomic weight is about 9, compared to 1 for hydrogen, 4 for helium, and about 7 for lithium). Actually, more than one type of beryllium was formed, and one of these was an isotope with an atomic weight of 8.

It was very radioactive and lasted barely one ten-thousand-billionth of a second, after which it disintegrated, producing two helium nuclei again. But if, during that instant of life, the radioactive beryllium collided with a third helium nucleus, it could form a carbon nucleus (atomic number 6, atomic weight 12), which is stable. And if the temperatures were high enough, then carbon nuclei would combine and disintegrate in very diverse ways, generating elements such as magnesium (atomic number 12), sodium (11), neon (10) and oxygen (8).

    In turn, two oxygen nuclei could join to generate sulphur and phosphorus.
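A simple way to keep track of these reactions is to check that the numbers of protons and the mass numbers balance on both sides; the little sketch below does that for the helium-to-carbon chain described above (my own illustration):

    # Each nucleus is (name, atomic number Z, mass number A).
    He4 = ("helium-4", 2, 4)
    Be8 = ("beryllium-8", 4, 8)
    C12 = ("carbon-12", 6, 12)

    def check(reactants, products):
        z_in = sum(n[1] for n in reactants)
        a_in = sum(n[2] for n in reactants)
        z_out = sum(n[1] for n in products)
        a_out = sum(n[2] for n in products)
        names_in = " + ".join(n[0] for n in reactants)
        names_out = " + ".join(n[0] for n in products)
        ok = (z_in == z_out) and (a_in == a_out)
        print(f"{names_in} -> {names_out}: charge and mass numbers balance: {ok}")

    check([He4, He4], [Be8])        # two helium nuclei fuse into beryllium-8
    check([Be8, He4], [C12])        # a third helium nucleus turns it into stable carbon-12

The same bookkeeping extends to the later steps, in which carbon and oxygen nuclei combine to yield neon, sodium, magnesium, sulphur and phosphorus.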