by Clark Wolf, Director of Bioethics, Iowa State University
She felt terrible, with a horrible pain in her gut that cut like a knife, and nausea and fever to match. Usually stoic in the face of pain, my daughter was doubled over and gasping.
When we took her to the hospital, the doctor took one look at her and immediately ordered a scan. Within hours she was in the operating room to have her ruptured appendix removed. After the operation, the surgeon showed us pictures of the process, including a glossy photo of the inflamed appendix and the staple he had used to close off the end from which it had been removed. Almost immediately after surgery, my daughter’s fever diminished. Her post-surgical pain was minimal compared to the searing pain that brought us to the hospital in the first place. As I write this, she is still in the hospital where she will remain for a few more days. But the crisis is over and there is improvement by the hour. By the time you read this, she will probably be home again in her own bed.
In the May 2009 issue of Bioethics in Brief, I discussed the fear of novelty that often leads to skepticism about new technology. I urged that moderate skepticism may be appropriate if it leads us to logically weigh the risks involved in new technologies, and that caution may be appropriate when we are unsure how to evaluate the risks we face.
The other side of this equation, of course, is the benefit that technological advances bring. In my grandparents’ generation, people often died from a ruptured appendix, and surgery was a far less certain undertaking. Today, an appendectomy is a relatively minor procedure. When the surgery is uncomplicated, patients may leave the hospital within a day or so of surgery.
We are grateful for life-saving technologies when we experience their benefits firsthand, and people are typically much less wary of technology—including biotechnology—when their most central interests hang in the balance.
The danger of adopting an unproven technology is that its risks are difficult to weigh. Since we cannot predict every eventuality, we may not know how to weigh a risk until it is too late. But the alternative choice carries dangers of its own: declining to adopt new technologies may also involve serious risks. We may not give proper weight to those risks until we experience the benefits firsthand. Today as I write this, I am vividly aware of the benefits associated with the surgical technologies that saved my child’s life.
Precautionary Principles
How should we evaluate unproven technologies? It is sometimes recommended that we adopt a precautionary approach. The precautionary principle offers a general recommendation that we should be cautious when risks are unknown. Those who dislike the principle often characterize it as a general, blanket condemnation of any new technology simply on the basis of its novelty. In a 2003 New York Times editorial, Clyde Prestowitz memorably represented the precautionary principle as a recommendation that “If we can’t prove absolutely that [a new technology] is harmless, let’s ban it.” (Prestowitz, 2003) Stated in this way, the principle becomes an unfortunate decision criterion. It is never possible to prove absolutely that a novel technology is harmless. If we are ruled entirely by our fears, we will miss the benefits that new technologies offer.
Often, these benefits can be measured in the same terms of life and death, happiness and misery that we may use to weigh risks and costs. Prestowitz is not a fan of the precautionary principle, so his statement of it is intended to make the principle appear ridiculous. While this may score a rhetorical point, his argument would have been more interesting and significant if he had addressed a more defensible statement of the principle.
A more moderate version of the precautionary principle found its way into international law in the 1992 Rio Declaration. That agreement states “Where there are threats of serious or irreversible damage, lack of full scientific certainty shall not be used as a reason for postponing cost-effective measures to prevent environmental degradation.” (Rio Declaration, 1992, Article 15) If Prestowitz’s statement of a precautionary principle is absurdly strong, so that it would prevent acceptance of any new technology, then perhaps the Rio statement is absurdly weak. Of course “lack of full scientific certainty” should not constitute a reason to postpone “cost-effective measures” to prevent harm (or degradation). Empirical science never provides certainty. While one precautionary statement seems to rule out acceptance of any technology at all, the Rio statement is too weak to motivate caution even in cases where caution would be fully justified.
Confusion about the precautionary principle has resulted in the existence of opposing rhetorical camps. Some people reject the principle as obviously excessive while others extol it as a minimal and obviously justified principle for policy choice. If those involved in this discussion have different principles in mind, they may both be correct. But they are talking past each other.
Risky Decisions and New Technologies
I am overwhelmingly grateful to the people who developed and employed the surgical procedures that saved my daughter’s life recently. But the first time these procedures were used, the risks involved were unknown, and there must have been a reasonable expectation that they could fail. In the case of a ruptured appendix, the expected cost of doing nothing is high. Left to follow its natural course untreated, a ruptured appendix can be expected to lead to pain and death. In some cases, new technologies leave us with less dire alternatives than this. The cost of caution is often (though perhaps not always) less immediate and extreme for technologies in agricultural biotechnology.
The question of whether we should choose to err on the side of caution or on the side of optimism will not be settled by reference to either of the simple principles articulated above. We need rationally to consider all of the risks involved in our choices, including the opportunity cost of proceeding with an abundance of caution. These costs are difficult to measure, since they are reflected in the foregone benefits that technologies might have brought. To see that these costs are very real, we would do well to consider the loss we would have experienced if past technologies had not been developed. In some cases, these opportunity costs are reflected in the lives of people who might have been positively affected by the adoption of a new technology, even to the extent of dramatically extending their lives.
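To make the structure of that comparison concrete, here is a minimal sketch in Python with entirely hypothetical numbers (the probabilities, costs, and foregone benefits below are invented for illustration, not drawn from any real case). The point is only that the cost of caution, the benefit we forgo, enters the ledger on the same footing as the risk of adoption.

```python
# Hypothetical illustration only: the probabilities and costs below are invented
# to show the structure of the comparison, not to describe any real technology.

def expected_cost(prob_of_harm, cost_of_harm, cost_if_no_harm=0.0):
    """Expected cost of a choice with one harmful outcome and one benign outcome."""
    return prob_of_harm * cost_of_harm + (1 - prob_of_harm) * cost_if_no_harm

# Adopting the technology: a small chance of a serious harm.
adopt = expected_cost(prob_of_harm=0.01, cost_of_harm=1000.0)

# Declining the technology: the "cost of caution" is the foregone benefit,
# which is incurred with certainty even though it is harder to see.
decline = expected_cost(prob_of_harm=1.0, cost_of_harm=50.0)

print(f"Expected cost of adopting:  {adopt:.1f}")    # 10.0
print(f"Expected cost of declining: {decline:.1f}")  # 50.0
```

On these made-up figures, declining the technology is the costlier choice in expectation; with different figures the balance could easily tip the other way, which is precisely why the weighing has to be done case by case.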
References
Gardiner, S. 2006. A Core Precautionary Principle. Journal of Political Philosophy 14(1): 33–60.
Prestowitz, Clyde. 2003. Don’t Pester Europe on Genetically Modified Food. New York Times, January 25.
Stich, S. 1978. The Recombinant DNA Debate. Philosophy and Public Affairs 7(3): 187–205.
Clark Wolf is the Director of Bioethics and a Professor in the Department of Philosophy at Iowa State University. He is a faculty member in the Graduate Program in Sustainable Agriculture and has a courtesy appointment in the Department of Political Science. He teaches and co-teaches a variety of courses, including Foundations of Sustainable Agriculture, Environmental Ethics, and Bioethics and Biotechnology. Clark gives and organizes thought-provoking talks to diverse audiences at Iowa State, including talks on biotechnology and intellectual property.*
Any opinions, findings, conclusions, or recommendations expressed in this publication are those of the author(s) and do not necessarily reflect the view of the ISU Office of Biotechnology or Iowa State University.
Wolf, Clark. Precautionary Principles and the Cost of Caution. Bioethics in Brief, a Publication of the Iowa State University Office of Biotechnology. May 2010. Volume 12, Number 2.
* Biography composed by Anastasia Bodnar.
You say “We need rationally to consider all of the risks involved in our choices …” above. I’m not convinced this is even possible anymore. “We,” the general public, no longer have the education or even the foundations in basic logical decision making to evaluate risks. Add to that the bombardment of “anybody’s an expert” talking heads in the media and the dime-a-dozen “authority” websites, and people are all too easily misled, fooled, and used. The anti-vax crowd is a prime example of this. Maybe I’m just being too cynical this morning, but I have a hard time seeing a way around this.
Yikes. I wish it were another way, but I too fear that the public is too prone to accepting hyperbole at face value to even start having a rational conversation. Perhaps Clark uses the “royal academic we”?
If you read French, here is an excellent analysis of the French situation/disaster.
Since 2005, the Constitution has included an Environmental Charter with the following Article 5:
“When the occurrence of any damage, albeit unpredictable in the current state of scientific knowledge, may seriously and irreversibly harm the environment, public authorities shall, with due respect for the principle of precaution and the areas within their jurisdiction, ensure the implementation of procedures for risk assessment and the adoption of temporary measures commensurate with the risk involved in order to deal with the occurrence of such damage.”
(translation from the French Assembly website — “deal with” should be replaced by “prevent”)
I wish to quibble over the statement, “Empirical science never provides certainty.” The history of science is littered with prime examples of empirical certainty, and these constitute the class of falsified theories (e.g., phlogiston, the claimed MMR–autism link).
In fact, a la Popper, falsifiability is an essential component of any empirical claim.
In the context of the Precautionary Principle and risk assessment/management generally, it is insufficient — indeed, fundamentally vacuous — to advert to the existence of “unknown risks” inhering in a new technology. Since the claim cannot be falsified, it’s at best a waste of time. (Stated the other way around, the vast majority of unknown risks — verging on unity — are, and will forever remain, unknown because they simply do not exist.)
To be non-vacuous, a statement regarding the risk of a new technology must be expressed as an hypothesis. That is, something testable and falsifiable. If falsified, it’s empirically certain not to be a risk. If validated, the question becomes whether the risk is manageable, or at least acceptable, in light of the benefits of the technology, on a cost-benefit basis.
Seen in this light, the empirical certainty that results from the falsification of actually hypothesized risks, which result from a growing list of environmental and health tests of GM crops, can actually demonstrate — conclusively — that the products of agricultural biotechnology are safe until proven otherwise.
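A minimal sketch, again in Python with made-up numbers, of the acceptance test this comment describes: once a hypothesized risk has been tested and quantified, it is weighed against the technology’s expected benefit rather than invoked as a bare unknown.

```python
# Hypothetical sketch: a validated, quantified risk is assessed on a
# cost-benefit basis rather than treated as an unfalsifiable "unknown risk".

def risk_is_acceptable(prob_of_harm, cost_of_harm, expected_benefit):
    """Accept the risk if the expected benefit exceeds its expected cost."""
    return expected_benefit > prob_of_harm * cost_of_harm

# Invented numbers for illustration only.
print(risk_is_acceptable(prob_of_harm=0.001, cost_of_harm=10_000.0,
                         expected_benefit=500.0))  # True: 500 > 10
```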
The Precautionary Principle should not be applied unless it is certain that to do so will cause no harm.
That’s a great quote. 🙂