David L. Martin

in praise of science and technology

Archive for the month “March, 2016”

28. volcano prediction/tsunami prediction

The volcano is perhaps the quintessential example of mankind versus nature, and the classic caricature of the relationship between people and god(s).  We still can’t stop volcanoes from erupting, but science has given us a great deal of predictive power and many tools to ameliorate their effects.  Aside from providing many valuable signals of a coming eruption, our modern understanding of volcanoes has given us the ability to identify particular sources of danger that would otherwise be overlooked.  For example, some volcanoes have a history of lahars, mudflows that can sweep down rapidly and destroy human lives.  This history is often buried in the landscape, and/or requires scientific knowledge to interpret properly.

[image: volcano]

Many volcanoes exhibit clear signs before a major eruption: rumblings in the earth, reflecting rising magma; deformation of the land in and near the volcano, reflecting pressure from beneath; gas emissions, as magma approaches the surface; and thermal changes in waters associated with the volcano, reflecting increasing heat near the surface.  Every volcano is unique, but there are commonalities.  Of course, many of these signs should be obvious even to unsophisticated observers.  Yet time and again, people living close to volcanoes, like Vesuvius in Italy, have failed to heed the warnings.  Vesuvius in particular goes through long periods of quiescence, often lasting centuries, but inevitably blows its top eventually.  At the end of the 13th century it entered a period of unusual quiet that lasted more than 200 years.  People clustered around the volcano again, cattle even grazed in its crater, and vineyards and orchards adorned its slopes.  But in 1631 it erupted again, killing thousands.  There was plenty of rumbling for months prior to the eruption, as well as other precursors.  But so much time had passed that people had forgotten.  When Vesuvius erupted in the 20th century, the results were far less lethal.

[image: tsunami]

One could argue that it wasn’t so much volcano prediction that improved, but other things – communication, transportation, infrastructure generally, support systems.  In Japan, where many people live in close proximity to active volcanoes, there are many systems to give warning and ameliorate effects.  Lahars are directed through channels instead of sweeping across neighborhoods, and their passage automatically trips sensors that trigger warnings downslope.  More dramatic is scientific success with tsunamis, which historically (and prehistorically) devastated communities with almost no warning.  Because tsunamis often originate hundreds or thousands of miles from the places they devastate, there is often time to give warning.  The Pacific Basin tsunami warning system is highly developed.  A submarine earthquake near Peru will quickly yield warnings in Hawaii and Japan.  Of course one might argue that since a large tsunami is always preceded by a striking drop in the ocean level near shore, tsunamis have a built-in warning system.  But we have seen time and again that people do not interpret this properly.  This illustrates the folly of depending on folk knowledge or common sense to protect people from harm.  Only systematic methods founded on solid science produce dramatic results.
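The warning time exists because tsunami physics is well understood: in the deep ocean a tsunami travels as a shallow-water wave, at a speed of roughly the square root of gravity times water depth.  Here is a minimal sketch of that arithmetic, assuming an average Pacific depth of about 4 km and a Peru-to-Hawaii distance of very roughly 9,000 km (both round illustrative numbers, not official figures):

```python
import math

def tsunami_speed(depth_m, g=9.81):
    """Shallow-water wave speed (m/s) for a given ocean depth."""
    return math.sqrt(g * depth_m)

def travel_time_hours(distance_km, depth_m=4000.0):
    """Rough open-ocean travel time for a tsunami, assuming constant depth."""
    speed = tsunami_speed(depth_m)              # ~198 m/s (about 440 mph) at 4 km depth
    return (distance_km * 1000.0) / speed / 3600.0

print(f"speed at 4 km depth: {tsunami_speed(4000):.0f} m/s")
print(f"Peru to Hawaii (~9,000 km): {travel_time_hours(9000):.1f} hours")
```

Even at jet-airliner speeds, a wave crossing an ocean basin takes on the order of half a day to arrive – exactly the window the warning centers exploit.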

https://en.wikipedia.org/wiki/Prediction_of_volcanic_activity

https://en.wikipedia.org/wiki/Lahar

https://en.wikipedia.org/wiki/Tsunami

27. earthquake prediction

Earthquake prediction is still in its infancy, partly because of the difficulty of getting precise measurements on what is happening deep within the earth.  The only reason we are pretty good at predicting the weather is that we have direct access to the atmosphere, and measure its conditions at frequent intervals in many, many places.  With the earth’s crust we don’t have this advantage.  Some scientists even go so far as to say that earthquake prediction is impossible.  For thousands of years, people have suggested that there are precursors to major earthquakes – signs that an earthquake will soon occur.  Strange animal behavior is perhaps the most widely cited.  None of these have been found to be reliable predictors.

[image: earthquake]

Earthquakes often happen because strain is released along fault lines, places where large sections of crust move against one another.  If we could actually get down into the earth and measure the strain directly at many different points, we would undoubtedly be much better at predicting these kinds of earthquakes.  But even with this advantage, it wouldn’t be simple.  One earthquake in an area immediately changes the strain dynamics, and can lead to a cascade of other earthquakes.  Each of these in turn changes the dynamics further.  Some earthquake experts believe the crust is in a constant state of self-organized criticality, which is to say that any tremor may lead to a cascade of effects, many of which are very sensitive to the exact conditions, and therefore the process is inherently unpredictable.
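Self-organized criticality is usually illustrated not with real fault data but with the Bak-Tang-Wiesenfeld “sandpile” model.  The toy sketch below (purely illustrative, with made-up parameters) drops grains onto a grid one at a time; any cell holding four grains topples and passes one grain to each neighbor, which can set off further toppling:

```python
import random

def sandpile_avalanches(size=20, grains=5000, threshold=4, seed=1):
    """Minimal Bak-Tang-Wiesenfeld sandpile: drop grains one at a time and
    record how many topplings ("avalanche size") each drop triggers."""
    random.seed(seed)
    grid = [[0] * size for _ in range(size)]
    avalanche_sizes = []
    for _ in range(grains):
        r, c = random.randrange(size), random.randrange(size)
        grid[r][c] += 1                          # drop one grain at a random site
        topples = 0
        stack = [(r, c)]
        while stack:
            i, j = stack.pop()
            while grid[i][j] >= threshold:
                grid[i][j] -= threshold          # this cell topples...
                topples += 1
                for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    ni, nj = i + di, j + dj
                    if 0 <= ni < size and 0 <= nj < size:
                        grid[ni][nj] += 1        # ...passing a grain to each neighbor,
                        stack.append((ni, nj))   # which may now topple in turn
                    # grains pushed off the edge are lost
        avalanche_sizes.append(topples)
    return avalanche_sizes

sizes = sandpile_avalanches()
print("drops:", len(sizes), " largest avalanche:", max(sizes),
      " drops causing no toppling at all:", sizes.count(0))
```

A histogram of the avalanche sizes is heavy-tailed – mostly tiny events punctuated by rare enormous ones, with nothing about an individual grain telling you in advance which kind it will trigger.  That is the pattern that leads some researchers to doubt that individual large earthquakes can ever be predicted.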

[image: Golden Gate Bridge]

But even if we can’t predict the occurrence of an earthquake in a given spot, we have been able to map the occurrence of earthquakes on our planet, and we understand quite generally the processes that cause them and intensify their effects.  This at least gives us the ability to say, “This is an area prone to earthquakes,” or “This land is prone to liquefaction in earthquakes.”  We understand the vibrations associated with the events themselves, which has enabled us to design buildings with earthquakes in mind.

https://en.wikipedia.org/wiki/Earthquake_prediction

https://en.wikipedia.org/wiki/Self-organized_criticality

https://en.wikipedia.org/wiki/Soil_liquefaction

26. weather prediction

Perhaps no illustration of the power of scientific prediction is more dramatic than the weather. Before science came along, humans were at the mercy of the weather, and many lives were lost. Tornadoes, hail storms, floods, droughts, hurricanes, blizzards – the seeming capriciousness of the weather was undoubtedly one of the motivators of religious belief, giving people a feeling of control over something that seemed uncontrollable. In many drought-prone places on earth, people through the centuries have prayed for life-giving rains. We may never be able to predict the weather with unerring accuracy, even over short periods like a day. The weather is a chaotic process – its extreme sensitivity to initial conditions quickly drives it to unpredictability. Nevertheless we have gotten pretty good at predicting storms of all kinds, wet periods, and dry periods, and importantly, we can predict the weather for a given point on earth over the short term.
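That sensitivity to initial conditions is easy to demonstrate with the Lorenz equations, a three-variable toy model of convection published by Edward Lorenz in 1963 and now the standard demonstration of chaos. The sketch below (illustrative only – a real forecast model has millions of variables) integrates two copies of the system whose starting points differ by one part in a million and watches the difference grow:

```python
def lorenz_step(x, y, z, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """One forward-Euler step of the Lorenz equations (classic chaotic system)."""
    dx = sigma * (y - x)
    dy = x * (rho - z) - y
    dz = x * y - beta * z
    return x + dx * dt, y + dy * dt, z + dz * dt

# Two "atmospheres" that differ by one part in a million at the start.
a = (1.0, 1.0, 1.0)
b = (1.000001, 1.0, 1.0)
for step in range(1, 5001):
    a = lorenz_step(*a)
    b = lorenz_step(*b)
    if step % 1000 == 0:
        gap = sum((p - q) ** 2 for p, q in zip(a, b)) ** 0.5
        print(f"t = {step * 0.01:4.0f}   separation = {gap:.6f}")
```

The separation starts out a millionth of a unit and grows to the size of the whole attractor within a few tens of time units. In the atmosphere, the same effect means that unavoidable errors in the initial measurements eventually swamp any forecast, however good the model.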

[image: tornado]

Because so much of weather prediction depends on information about processes occurring hundreds, even thousands of miles away, weather forecasting was not really possible before the development of rapid communication. The telegraph was the first technology to make it possible. It also required the establishment of a network of weather stations, carefully and frequently recording conditions – temperature, humidity, barometric pressure, wind speed, and so on. In 1870, the forerunner of the U.S. Weather Bureau was established, and remarkably, we have good data on Atlantic hurricanes dating back to this time. But it was not until the 20th century that numerical weather forecasting came into existence. In the early 20th century, it was not unusual for a single tornado to kill more than 100 people. But with the introduction of weather radar and a better understanding of storm processes, this became a rarity in the late 20th century. Doppler radar technology, introduced into operational forecasting in the late 20th century, allowed forecasters to see storm rotation, not just the intensity of rainfall. Today tornado warnings are often issued based on a radar signature – an indication of strong, localized rotation that frequently precedes the actual formation of a tornado.
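The measurement behind Doppler radar is straightforward: motion toward or away from the radar shifts the frequency of the returned pulse, and because the pulse travels out and back, the radial velocity is the frequency shift times the wavelength divided by two. A small sketch of that arithmetic, assuming a roughly 10 cm (S-band) wavelength typical of weather radars and made-up frequency shifts:

```python
def radial_velocity(freq_shift_hz, wavelength_m=0.10):
    """Radial velocity (m/s) implied by a Doppler frequency shift for a radar
    of the given wavelength; the factor of 2 reflects the two-way path."""
    return freq_shift_hz * wavelength_m / 2.0

# A +600 Hz shift on one side of a storm and -600 Hz on the other implies air
# moving about 30 m/s toward and away from the radar -- the side-by-side
# "velocity couplet" that forecasters read as rotation.
print(radial_velocity(600.0), radial_velocity(-600.0))   # 30.0 -30.0
```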

[image: hurricane track forecast errors]

Meanwhile, the National Hurricane Center has steadily, year by year, gotten better and better at predicting the paths of hurricanes. Much of this is due to the use of increasingly sophisticated computer models. Our understanding of storm surge and the factors that influence it has improved, and surge expectation maps have become increasingly sophisticated and detailed. Even if we weren’t all that good at predicting the movement and behavior of storms, we have so much technology now that allows us to SEE them. An individual can actually observe an approaching hurricane, in his own home, on Doppler radar and satellite imagery. He can see the rain bands, the eyewall, the spots of intense convection. If the storm turns he can SEE this. There is no way for the storm to “sneak up” on us, even if official predictions fail. The same is true of severe thunderstorms. Of course, it is up to us to keep our heads out of our butts and pay attention to the weather when conditions warrant.

https://en.wikipedia.org/wiki/Weather_forecasting

https://en.wikipedia.org/wiki/Atlantic_hurricane_season

https://en.wikipedia.org/wiki/Tropical_cyclone_track_forecasting

25. surgery

Surgery, like many technologies, has been around for a while. Even the ancient Egyptians drilled holes in people in attempts to treat their ailments. Bloodletting is one of those lovely techniques that goes back many centuries, often using leeches. This was so pervasive in some parts of the world that books were eventually published on the subject, recommending specific times during the year to perform it, and advocating it as a cure for everything from infections to psychosis. This is typical of the kind of nonsense that was promulgated in the pre-scientific era. If a treatment was applied, and the person survived, or the ailment abated, a cure was claimed. But a scientist would immediately ask, what would have happened without the treatment?

[image: bloodletting by the leech method]

During the Middle Ages, there were pioneers, some far ahead of their time, who advanced the field. For example, the Persian Muhammad ibn Zakariya al-Razi, “the Islamic Hippocrates,” founded the field of pediatrics in the 9th century. Particularly in the Arab world, the medical knowledge of the ancient Greeks was preserved and expanded upon. This knowledge was in turn picked up by physicians and universities in Europe. But advancement was slow, as it generally is without a scientific approach.

[image: surgical team at work in an operating room]

Modern surgery really begins with John Hunter in the 18th century. He refused to rely on the testimonies of others. Instead, he did his own experimentation and reconstructed surgical knowledge from scratch. This abandonment of authority for its own sake and reliance on ACTUAL EMPIRICAL EVIDENCE is crucial. If you don’t have the basics right, you can’t make systematic, continuous progress. As general anesthesia developed in the 19th century, surgery became far more common. Unfortunately, infection was still poorly understood, and post-operative sepsis became a big issue. Human skin is a formidable barrier to lots of pathogens. When you breach it, it’s party time for septic bacteria. Remarkably, medical science still didn’t understand the role of germs in disease. This had to wait for pioneers like John Snow and Louis Pasteur in the mid 19th century. In the late 19th century, antiseptic procedures were introduced. In the 20th century, surgery advanced by leaps and bounds. Open-heart surgery was unknown until the mid 20th century. Lasers were just being developed. When I was young there was no such thing as a heart transplant. Nowadays, laser surgery, especially for eye imperfections, is commonplace. Even a bullet through the brain is no longer a death sentence. Body parts are routinely replaced, either by transplant from other humans or using prostheses.

https://en.wikipedia.org/wiki/History_of_surgery

https://en.wikipedia.org/wiki/Muhammad_ibn_Zakariya_al-Razi

https://en.wikipedia.org/wiki/John_Hunter_%28surgeon%29

24. analgesics and anesthetics

Pain is a fact of human life. But it is hard to imagine the agonies endured by human beings over the long stretch of history because they had no access to what we take for granted. Often overlooked are the countless cases of agonizing tooth pain suffered by people over the centuries. Opiates of course have been around for a long time. The problem is that opium and opiates tend to be quite addictive. The human suffering that results, not just to the user but to their loved ones, hardly needs to be expounded upon, and remains a big issue to this day. There are plants, such as the toothache tree, that produce natural analgesics, and these have been known about for centuries. And of course alcohol has been used to “deaden the senses,” which often leads to its own set of problems. But modern medicine has given us an array of inexpensive analgesics, such that pain no longer dominates life, at least not in the first world, as it once did.

[image: alchemy]

For most of human history, we had alchemy rather than chemistry. The difference is the difference between stumbling in the dark, largely using trial and error to make progress, and having an enlightened, systematic approach that starts with the basics, without reliance on preconceptions or dogma. Much of alchemy was concerned with finding the legendary “philosopher’s stone,” which could supposedly turn mercury into gold and lengthen human life. Operating from these kinds of assumptions led to a lot of wasted time and effort. Centuries’ worth, in fact.

[image: periodic table]

Then the science of chemistry came along, building from basic observations and testing hypotheses, always with a commitment to be led by evidence, not dogma. The result was that within 2 centuries, 34 chemical elements had been identified, the beginnings of the periodic table were being developed, and modern chemical textbooks had begun to be published. And this was nothing compared to the progress made in the next 2 centuries.

By the late 19th century, salicylate medicines, derived from willow bark, were being isolated to treat a variety of symptoms. But these caused a lot of stomach irritation. An alternative was needed. The result was aspirin, the product of years of painstaking basic pharmacological research. Acetaminophen has a more interesting history. Acetanilide was discovered to be an analgesic in the late 19th century, but it had unacceptable side effects. Another aniline derivative, paracetamol, was discovered, but it too was claimed to have severe side effects. A third aniline derivative, phenacetin, was settled on, and was widely used in the early 20th century. But in 1947, it was discovered that the research claiming severe side effects for paracetamol was flawed – and furthermore, phenacetin was metabolized to paracetamol in the body anyway. The supposedly safe phenacetin that had been sold for decades was turning into supposedly dangerous paracetamol in people’s bodies. Paracetamol was “rediscovered,” and gained new popularity. Its more familiar name – acetaminophen. Of course there are other analgesics, like ibuprofen, that are popular today, each with its own strengths and weaknesses. Today we take for granted that there are chemical and pharmaceutical laboratories all over the world doing this kind of research. But we forget that the methodology is as important as the hard work – for centuries people worked hard at alchemy too.

[image: intubation]

Then there is the problem of cutting people open and repairing them without causing additional suffering. Sedatives have actually been around for a long time. Opiates have been used for centuries for this purpose. In the late Middle Ages, a potion called dwale, which contained opium among other things, was used for general anesthesia. But not until the Enlightenment were substances such as nitrous oxide and ether discovered. Even then, general anesthesia was a tricky business, because the doctor could only expose the patient to the gas, not deliver it directly into the trachea. That changed in the 20th century, with tracheal intubation and other techniques to introduce the anesthetic directly and to carefully control it. And of course we now have an array of both gaseous and intravenous anesthetics. General anesthesia is much safer now than in the past.

https://en.wikipedia.org/wiki/Alchemy

https://en.wikipedia.org/wiki/Analgesic

https://en.wikipedia.org/wiki/Anesthesia

23. mosquito control

Mosquitoes, statistically, are the most dangerous animals on earth. Every year they kill hundreds of thousands by transmitting deadly diseases. Right up until the late 19th century, scientists were struggling to convince authorities that diseases such as malaria and yellow fever were transmitted by mosquitoes. The entrenched dogma that many common diseases were caused by “bad air” was strong. Prior to the mid 19th century, humanity didn’t even understand germs. Viruses are very small – too small to be seen even with the microscopes available at that time.

[image: yellow fever mound]

In the mid 19th century, yellow fever epidemics killed many thousands of people in the U.S. Malaria was also widespread in many parts of the South. The French effort to build the Panama Canal in the 1880s led to many malaria and yellow fever infections. Ironically, the hospitals treating the victims often had potted plants, with pools of stagnant water, which bred more mosquitoes. Right up until the end of the 19th century, medical and political institutions rejected the idea that mosquitoes were the vectors of these diseases. In the 1890s, army physician Walter Reed showed that yellow fever was not contracted by drinking river water. Instead, he noted that many of the soldiers who became infected had a habit of walking trails through swampy woods at night. Finally, in 1900, Reed proved that mosquitoes transmit yellow fever.

[image: CDC]

Around this time, the first mosquito control efforts were instituted. These involved mostly source reduction – the elimination of outdoor water-holding containers, the filling in of mosquito breeding sites, and so on. It was not until World War II that modern mosquito control really came into its own. Since American troops were deployed in many malaria-prone areas, the federal government established the Malaria Control in War Areas program. After the war, this became the National Malaria Eradication Program, administered by the Communicable Disease Center. Its name today – the Centers for Disease Control and Prevention. This illustrates how important mosquitoes are in the history of disease control.

[image: mosquito light trap]

By the mid 20th century, pesticides were being widely used for mosquito control, and the first mosquito control districts were being created. A great deal of basic science was built up concerning mosquitoes. In the late 20th century, Integrated Pest Management was introduced. Today mosquito control is based on sound science. Mosquito populations are monitored, and control efforts are targeted. A wide variety of control methods are used – physical, chemical, and biological. Currently, efforts are underway to produce genetically modified mosquitoes that can be released into wild populations in order to reduce them.

https://en.wikipedia.org/wiki/History_of_yellow_fever

https://en.wikipedia.org/wiki/History_of_malaria

https://en.wikipedia.org/wiki/Mosquito_control

22. contraception

No discussion of technology would be complete without mention of contraception, which has had a profound effect on our society. The preoccupation of previous generations with sex is somewhat understandable, when you realize that without reliable contraception, sex often leads to pregnancy. As with most health-related issues, various folk methods were tried, going back to ancient times. Leaves and various other materials were used in attempts to physically block sperm. Various plants and plant extracts were taken to achieve contraception, or to try to induce abortion. In some cases these were later confirmed to have contraceptive effects.

[image: contraceptives]

Truly reliable contraception had to wait for the scientific understanding of human reproduction. Even then, there were many social impediments. The medical understanding of female sexuality and reproductive function lagged far behind that in other medical fields, because of prejudice and patriarchy. Since women were not allowed to become doctors, and since men in the medical profession often considered matters of childbirth “ungentlemanly,” all aspects of female reproductive health suffered. Midwives were mainly responsible for such matters, and they were looked down upon by the medical profession. In the late 19th century, patriarchal ideas were widespread, particularly in Britain, and the concept of an “ideal wife” was heavily promoted – an obedient, self-sacrificing, baby-making homebody. A big part of this included strong ideas about female purity – sexual modesty, fidelity, and refinement. There was a pervasive cultural taboo about even discussing sexual organs or sexuality. Even horseback riding, for women, had to involve concealment and the minimization of suggestive postures. Needless to say, contraception was widely considered to be counterproductive to the proper role of a woman.

[image: riding sidesaddle]

In the late 19th century, laws were passed in the U.S. that not only banned the distribution of contraceptives, but made it a crime to even EDUCATE people about safer sex and contraception. The enfranchisement of women was one of the most hard-fought of the widening enfranchisements of the last 4 centuries – slavery was outlawed in the U.S. 5 DECADES before women even obtained the right to vote nationally. Finally, in the mid 20th century, the birth control pill was invented. Interestingly, prior to its widespread use, many pundits argued that it would have little effect on birth rates, because women were supposedly genetically programmed to have lots of children. Yet between 1960, when the birth control pill was approved in the U.S., and 1970, the U.S. birth rate dropped 22%, and it has never returned to 1950s levels. Reliable contraception gave women an unprecedented degree of control over their own lives. Women demanded equal rights and entered the work force in unprecedented numbers, delaying having children. Women entered careers previously closed to them. A female doctor was a relative rarity in 1950; now it is commonplace. The power of this technology to alter society is illustrated by the reactionary voices that to this day bemoan its availability. The empowerment of women is widely considered to be one of the most important features in the social advancement of any society – with it comes a whole host of other improvements.

https://en.wikipedia.org/wiki/History_of_birth_control

https://en.wikipedia.org/wiki/Comstock_laws

http://www.infoplease.com/ipa/A0005067.html

21. prenatal and postnatal care/childbirth management

Many people are alive today who wouldn’t be (both mothers and their offspring), were it not for modern prenatal care. Giving birth and getting born are potentially dangerous. How many women died over the centuries in the process of giving birth or immediately after? This has become a rarity in the first world, thanks partly to caesarean sections, but also due to extensive prenatal care. And then there is the fact that many babies are born prematurely (as they always have been) who would have died without 21st century medical technology. Incredibly, a baby born only 6 months after conception now has a better than 50% chance of survival in the U.S. (although many of these children will have long-term health issues).

[image: prenatal care]

In the near future we may see far more dramatic advancements, such as artificial wombs that keep even earlier preemies alive. The advance of science and technology has dramatically changed people’s attitudes toward the subject. In 1900, if a baby was stillborn, it was usually discarded and the parents were expected to immediately let go of their attachment to it. Today such a thing would be unheard of in the first world. This is only one of many ways that children’s lives are more valued today than in generations past, as a result of the march of medical science. One of the interesting quirks of medical history is that during the 19th century, obstetrics underwent a period of stagnation. This happened because the foremost European institutions of medicine considered delivering babies ungentlemanly, and refused to support the field. Of course, women at the time were barred from becoming doctors. This is one of many historical illustrations of the fact that adherence to dogma is inconsistent with scientific advancement.

[image: premature infant]

Science demands that we pursue evidence and reason, even if it makes us uncomfortable or challenges our parochialisms. In the late 19th century, obstetrics began to move forward again, although because of this period of stagnation, it remained behind most other medical fields for many years. It is worth pointing out that many concepts about female anatomy and physiology remained outdated right up through the 20th century, because of a failure to apply scientific principles and a reliance instead on dogma, often theologically based.

https://en.wikipedia.org/wiki/Prenatal_care

https://en.wikipedia.org/wiki/Obstetrics

https://en.wikipedia.org/wiki/Gynaecology

20. medical imaging

Centuries ago, the only way to look inside the human body was to cut it open. That changed with the introduction of x-rays in the late 19th century. X-rays were one of those technologies that could hardly have been predicted, even by visionaries, because they are one of many forms of invisible light that were not even suspected by people before that time. The medieval mind would have dismissed such a thing out of hand. How could there be a vast spectrum of light the human eye could not see? Everything was created by God for man’s purposes. What could God’s purpose be in creating such a useless thing? It would have been contrary to the most basic assumptions of that time.

[image: X-ray]

The scientific revolution was the smashing of centuries of dogma. In the process, whole worlds were opened up for discovery. The universe of time, space, matter, and energy turned out to be far greater in scope than the medieval mind could have envisioned. We can know the chemical composition of distant stars. We can look back to the Big Bang. And we can look into the human body without destroying it. Later came more sophisticated scanning technologies – in the 1970s, computers began to be used in combination with X-rays to create CAT scans, 3-dimensional images built from layers of X-ray images. These provided much better visualization but were primitive by today’s standards. Magnetic Resonance Imaging came into existence around the same time, using very different scanning methods. Here strong magnetic fields and radio waves are used to create contrast between tissues, essentially reflecting the spatial distribution of water in the body. Again, computer techniques are used to build up 3-dimensional images. Also around this time, Positron Emission Tomography (PET) came into use. This technology uses a tracer which emits positrons, the antimatter equivalent of electrons. The positrons annihilate with nearby electrons, emitting pairs of gamma rays which are picked up by detectors. If the tracer is an analogue of glucose, the emission of gamma rays will reflect the amount of metabolic activity in that area. In this way, images of such processes as brain activity can be created. PET scans can produce amazing images but are relatively expensive.
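The idea of building an image out of many one-dimensional X-ray views can be sketched in a few lines. Below is a toy, unfiltered back-projection demo (real CT uses filtered back-projection or iterative reconstruction, and the phantom and parameters here are invented purely for illustration): each simulated projection is smeared back across the image plane, and the smears are summed.

```python
import numpy as np
from scipy.ndimage import rotate

def simple_backprojection(image, angles_deg):
    """Toy CT illustration: take 1-D "projections" of a 2-D slice at many
    angles, then smear each projection back across the plane and sum."""
    size = image.shape[0]
    recon = np.zeros_like(image, dtype=float)
    for angle in angles_deg:
        view = rotate(image, angle, reshape=False, order=1)
        projection = view.sum(axis=0)              # what one detector row would see
        smear = np.tile(projection, (size, 1))     # spread it back across the plane
        recon += rotate(smear, -angle, reshape=False, order=1)
    return recon / len(angles_deg)

# A crude "phantom": a bright square inside an otherwise empty slice.
phantom = np.zeros((64, 64))
phantom[24:40, 24:40] = 1.0
recon = simple_backprojection(phantom, angles_deg=range(0, 180, 5))
peak = np.unravel_index(recon.argmax(), recon.shape)
print("reconstruction is brightest at", peak, "- inside the original square")
```

Even this crude version recovers a blurry image of the object, which hints at why the arrival of affordable computing in the 1970s was what finally made CT practical.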

[image: PET scans]

In recent years, ultrasound images have become familiar to most of us, mainly through their use in prenatal care. The basic principle is no different from that of sonar – the device produces a sound pulse which is reflected back to it. By measuring the strength of the echo and how long it took to return, the device can assign a brightness value to that spot. Repeating this over an area of the body results in an image. Such technology is very safe and produces remarkable images, although it is limited to imaging tissues (or fetuses) fairly close to the surface. The value of being able to visualize internal structures is obvious. It not only helps in diagnosis and treatment, but is invaluable in basic research to understand disease processes. Much of this technology involves science that wasn’t even developed until the 20th century – the understanding of atoms, subatomic particles, antimatter, and so on.
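The echo-timing arithmetic is simple enough to show directly. A tiny sketch, assuming the conventional average speed of sound in soft tissue of about 1,540 m/s (the echo time below is just an illustrative number):

```python
def echo_depth_cm(echo_time_us, speed_m_per_s=1540.0):
    """Depth (cm) of a reflector from the round-trip time of an ultrasound
    echo; the factor of 2 accounts for the pulse travelling out and back."""
    return speed_m_per_s * (echo_time_us * 1e-6) / 2.0 * 100.0

print(f"{echo_depth_cm(65):.1f} cm")   # an echo returning after 65 microseconds ~ 5 cm deep
```

Sweeping the beam across the body and repeating this calculation thousands of times per second is what turns individual echoes into the familiar gray-scale image.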

https://en.wikipedia.org/wiki/Medical_radiography

https://en.wikipedia.org/wiki/Positron_emission_tomography

https://en.wikipedia.org/wiki/Medical_ultrasound

19. germ theory/antibiotics/vaccines

For centuries medicine was groping in the dark for the cause of many common diseases. “Bad air” was often blamed. In fact, the term malaria comes from the medieval Italian mal aria, meaning bad air. Many diagnoses invoked a vague idea of some “imbalance.” Folk remedies were everywhere, and so was a lot of theorizing. Remarkably, many people believed in spontaneous generation – that animals could simply emerge from inanimate matter, like mud. Yet at the same time, most people, even doctors, couldn’t bring themselves to believe in things they couldn’t see. And they couldn’t see germs. If only they had bothered to do simple experiments, with readily available materials, they would have noticed that maggots don’t develop on meat when adult flies can’t get to it. They would have noticed that when you boil broth in a glass container with a long, winding tube connecting it to the outside air, so that dust and other tiny particles never reach the broth, nothing grows in it. This would have clearly shown that there were living things in the air, invisible to the unaided eye.

[image: Louis Pasteur]

This is exactly what Pasteur did in 1857. Microscopic germs, in the air and the water, are the cause of many diseases, not “bad air.” Pasteur went on to develop the first laboratory-produced vaccines. (Smallpox vaccination, pioneered by Jenner decades earlier, finally had a scientific explanation.) Later would come vaccines against other dreaded diseases such as yellow fever. Disinfectants began to be developed; these had huge impacts on the transmission of many diseases. Interestingly, despite the development of numerous disinfectants, ordinary bleach (sodium hypochlorite) remains among the most cost-effective, and is a preferred disinfectant in U.S. hospitals to this day.

[image: penicillin]

In the 20th century came antibiotics, which revolutionized medicine and remain a critical weapon in the arsenal against pathogenic bacteria. Penicillin, once called the wonder drug, was the first, and remains in use today. Medical science is in a constant arms race with pathogenic bacteria, as they develop resistance to older antibiotics and new antibiotics are deployed. Of course, many diseases are not caused by germs, and many germs are not at all harmful. Even so, many major diseases – cholera, typhoid, malaria, yellow fever, measles, smallpox, bubonic plague, and many others that killed millions – did turn out to be germ-caused, and most are now virtually conquered in the first world. It should never be forgotten that it is young children who were, and often still are, the victims of these devastating illnesses, struck in their tender years.

https://en.wikipedia.org/wiki/Germ_theory_of_disease

https://en.wikipedia.org/wiki/Vaccine

https://en.wikipedia.org/wiki/Antibiotics
