Monday, November 25, 2013

Chemophobia Part Three: Today's Society

In the first and second parts of this series of posts, we explored the history of chemophobia, discussing the growth of chemophilia in post one and the development of chemophobia in post two. All of that history has led us to the situation we are in today.

Chemophobia hasn’t gone away. Far from it. Today, chemophobia is a trait of our society that is often overlooked. You don’t have to look far to find products advertising themselves as “all natural” or “chemical free.” Of course, many of those claims are absurd, especially to anyone with enough chemistry knowledge to remember that everything is made of chemicals, so nothing can be free of chemicals. But nonetheless, there are many people who believe that the all-natural or the chemical-free stuff must be better.

And thus far, scientists have been making very little headway against this problem. Again, the chemophobia surrounding vaccines has served as a regrettably excellent example. A single study, now considered fraudulent, that “showed” some link between the MMR vaccine and autism, has led to countless people fearing vaccines, despite the appeals of scientists wielding study upon study refuting those results. Why? Why doesn’t the logic work? If people are shown that they have overreacted, why won’t they change?

Well, the problem is that the concept of chemicals being safe is a new idea to many people. As we saw in post two, chemophobia has been growing steadily for a very long time. It isn’t something that came out of the blue, and so, the notion that chemophobia is irrational and that chemicals are good isn’t just a different idea, but rather, an entirely new one. It requires people to change their worldview, and as Dale Carnegie explores in his book How to Win Friends and Influence People, people rarely want to change once they have already made their choice, because at that point, they would be admitting that they were wrong, something that nobody wants to do. And so, new ideas are rejected if they are pushed only through coercion.


So what can be done? Well, one needs to understand that for many people, a life without chemophobia would be a dramatic change. In a culture that has for so many years embraced a fear of industrial chemicals, produced by those evil corporations, it is very easy to see that fear as the only logical worldview, and many people will not take kindly to having change forced down their throats. And so, to get people to respond to the change in the way we want them to, it must be done not by forcing facts at the disillusioned, but rather, by acting as relatable friends with society’s best interests in mind. Otherwise, as Dale Carnegie explored, in response to the new ideas of the chemophiles, society will shut down and dig in deeper.

Thursday, November 14, 2013

Chemophobia Part Two: The Growth of Chemophobia

As we explored in the last post, chemophilia arose in the early 20th century and persisted through the Second World War. However, as Jon Entine explores, complications soon arose. In 1948, at Donora, Pennsylvania, noxious smog from two industrial plants descended for five days, sickening thousands and leaving the town far worse than it had been before. In 1952, a similar smog event happened in London, sickening 100,000 and killing thousands. This hit society hard, and a steady trend away from chemophilia began.

Through the 1950s, concern grew over these chemical pollutants, and as before, Americans overreacted to compensate for the change. The first real overreaction manifested itself in the Delaney Clause, under which any substance shown to cause cancer at any dose in any species was deemed “unsafe.” This blatantly violated a fundamental principle of toxicology: the dose makes the poison. This amendment, in turn, led to another national overreaction, this one over cranberries near Thanksgiving 1959. These fears subsided when presidential candidates Richard Nixon and John F. Kennedy publicly ate cranberries, but a new crisis would soon develop, and this time, it would be more legitimate.

In 1957, a new “miracle drug” was released in West Germany, advertised as a cure for insomnia, coughs, colds, headaches, and morning sickness. The drug grew in popularity, spreading to other countries. It was then discovered, however, that it was causing birth defects. Thalidomide was never approved by the FDA in the United States, but millions of tablets had nonetheless been sent to American doctors for clinical testing, and in 1961, the drug was pulled from the market. This stunned the country and led to significant reforms, including stricter testing requirements, but it also fed fears: might the government not be able to do enough to protect citizens from these chemical dangers? Once again, Americans were faced with a changing world, and they were getting even more jittery.

While all this was going on, this jitteriness was manifesting itself in a prevailing philosophy of the time: postmodernism. At the time, of course, the nuclear age was just getting underway, with the first nuclear bombs having been dropped in 1945 and unleashing a feeling of horror that soon culminated in an arms race of epic proportions. Anxiety was the prevailing feeling, and that translated into postmodernism, which grew out of those bombs and the newfound fear that science could lead to the extinction of the human race. And so, faced with changes caused by science that it was not prepared for, American society came to look upon that science as something to be anxious about.

In 1962, Rachel Carson published Silent Spring, in which she fought against DDT and essentially launched the environmental movement. A chemophobic perception, that government couldn’t stop chemicals and that corporations wouldn’t, now took hold. After this, the fears and overreactions kept rolling in. In 1969, the cyclamate scare, in which cyclamates were banned on very spurious evidence, once again demonstrated the fundamental insecurity that American society felt. The most striking overreaction was California’s Proposition 65, which created a list of even remotely hazardous chemicals and required businesses that might expose consumers to any chemical on that list to post the now-ubiquitous warning: “This product contains chemicals known to the State of California to cause cancer and birth defects or other reproductive harm.” This, as Entine notes, scared Californians with no measurable benefit.

By the 1990s, chemophobia was well entrenched. A prime example: the now-infamous and still-influential Wakefield Study, in which Andrew Wakefield “conducted” a “study” (both terms used loosely) that, through the magic of fabricated results, claimed that the MMR vaccine was linked to autism and bowel disease. This study, horrendous and fraudulent as it was, remains widely responsible for fears of vaccines and increased chemophobia.


I could list more examples, but the point is this: as society began changing the way it saw chemicals, it overreacted, causing more change and more overreaction. And so, through that positive feedback loop, we have gotten to where we are today.

Sunday, November 10, 2013

Chemophobia Part One: A Chemophilic Society

In the late 19th century, and especially in the early 20th century, the Second Industrial Revolution took off, building off of the successes of the First Industrial Revolution. This phase of the industrial revolution was distinctive in that many of its most significant innovations came from applied science. And so, science continued to grow and continued to improve the standard of living of those in the industrializing areas. From physics came electricity, telecommunications, and the light bulb. From chemistry came fertilizer, new and stronger alloys, and petroleum distillation. Other innovations included the automobile, the bicycle, and paper. I could go on, but you get the point: science was helping to make people’s lives better.

And the prevailing attitudes of the time reflected that. This blog is about how people respond to changes and new ideas, so this time period offers us a near-perfect natural experiment, in which people had their standards of living drastically changed by science. The result was a growing appreciation for science in the society of the time. This change is reflected in an important philosophy of the time: modernism.

Modernism was a philosophy that flourished in the early 20th century, growing out of the massive changes of the Second Industrial Revolution. One of the principles of modernism is the idea that technology and science would serve to better man’s life and help man gain power. As this site explains, that thought was an extension of the experience man had in the modern world, in which technology had expanded the scope of his abilities. Technology, at this time, was seen as a driving factor for positive, not dangerous, changes.

Of course, once society’s views of technology changed for the positive, it was only a matter of time before a change in society’s habits followed, and that is exactly what happened. Especially in the 1920s, with its growing economy and affluence, consumption of this new technology took off. One can even go so far as to say that society overreacted in response to the technological change. (That overreaction is a theme you will be seeing again in the next few posts.) Bakelite, which we mentioned earlier, became extremely popular and was soon used to produce, as this article mentions, just about everything. The alloy technology of the Second Industrial Revolution allowed automobiles to be built and popularized so widely that, in the 1920s, families would choose buying cars, the status symbol of the time, over simple necessities like bathtubs. This was a very high point for views of technology, and of chemistry in particular, as evidenced by DuPont’s popular 1935 slogan: "Better Things for Better Living...Through Chemistry."

This attitude would carry into the beginning of the post-WWII years. As Jon Entine explores, the return of abundance in the postwar years led to increased demand for consumer goods that chemistry could supply. The pharmaceutical industry became more sophisticated in this time. Agriculturalists used pesticides and fertilizers to launch the Green Revolution. But the seeds of chemophobia were being sown, and this level of chemophilia would not survive much longer.

Saturday, November 9, 2013

Chemophobia: An Introduction

Recently, I read a piece in Nature on chemophobia, and it got me thinking. As someone who loves chemistry, it always rubs me the wrong way whenever I hear someone claim that something is “chemical-free” (which is impossible) or “all natural” (which is irrelevant). Still, I never really thought about the issue as a society-wide problem before, treating it instead as a case-by-case cause for annoyance. And so, this article spurred me to do some more research, and what I found, to a degree, surprised me.


In the next few days, I will be writing three posts on this issue of chemophobia. In the second part, I will explore the rise of chemophobia, and in the third, I will discuss the role of that issue in today’s world. A significant part of that discussion will center on the role that change and new ideas have played in developing and perpetuating chemophobia. But first, it’s important to remember that this phobia did not arise in a vacuum, but rather, in an environment where chemistry and its derivative technology were originally hailed as heroes, not villains. That will be the focus of the first post, and it will be coming shortly.