Monday, December 12, 2005

Shirley M. Tilghman's lecture...

Strange Bedfellows: Science, Politics and Religion

President Shirley M. Tilghman
Dec. 1, 2005

George Romanes Lecture, presented at Oxford University

It is a great honor to be invited to give the George Romanes Lecture in this historic theater at Oxford. It is also a personal pleasure, as I have several reasons to feel a special kinship with George Romanes. We are not only both Canadians, but he was born in Kingston, Ontario, where I lived for four years while I was studying at Queen's University in the late 1960s. Romanes was the son of a Presbyterian minister, and Presbyterianism greatly influenced the founders of my own institution, Princeton University. Lastly, Romanes was a fellow biologist — a naturalist, as such scientists were called in the 19th century — and was deeply engaged in the study of evolution and natural selection as a close colleague of both Charles Darwin and Thomas Huxley.

Indeed, Thomas Huxley delivered the second Romanes lecture in 1893 on the subject of "Evolution and Ethics." He pointed out in the preface to his published lecture that the Romanes Foundation had laid down one requirement of its lecturers — that they "shall abstain from treating either Religion or Politics." Luckily for the lecturers who came after him, Huxley went on to note that, "Yet Ethical Science is, on all sides, so entangled with Religion and Politics, that the lecturer who essays to touch the former without coming into contact with either of the latter, needs all the dexterity of an egg-dancer."

Today, of necessity, I will be walking on Huxley's metaphorical egg shells, for I wish to explore with you the dangers that arise when science, politics and religion find themselves at cross-purposes on issues of importance to the future. I speak as a scientist, a teacher and a university administrator who believes that for the most part, the contributions that science has made in expanding our understanding of the natural world over the past century have contributed to dramatic improvements in the well-being and the quality of life of most individuals living today. The evidence for this sweeping statement is all around us: in the dramatic increase in life expectancy and particularly the reduction in infant mortality; in the virtual eradication of a disease like smallpox through systematic world-wide vaccination; in the generation of household conveniences that have freed us from punishing manual labor; in the provision of safe drinking water and sanitation; in the availability of world travel and its potential to foster greater understanding among people of different cultures; and in the development of the Internet, a powerful tool that provides global and instantaneous access to everything from the world's great literature and art to mindless chatter on blog sites.

All of this progress, and the economic prosperity it has created, arose from public and private investments in science and technology in many countries. The economic return on investments in science and technology has been documented many times over. In the last 20 years, we have seen the creation of entirely new industries — industries that depended on discoveries such as recombinant DNA, semiconductors, the Internet and lasers. These discoveries form the bases for some of the most powerful drivers of today's economy. What is remarkable is that all of these advances grew out of research in university laboratories and, as often as not, research conducted by students and faculty pursuing knowledge for its own sake, with no commercial application in mind.

I do not mean to suggest that advances in science have always led to unalloyed good. For one thing, progress in science and technology has not benefited all individuals equally. Whether you use metrics within a developed country or between countries, the gap between the rich and the poor has been growing steadily during this scientific revolution, and there remain far too many places in the world where the basic necessities of life — food, housing, security, and health care — are not being met. The development of life-saving anti-retroviral drugs has radically changed the long-term prognosis for individuals infected with HIV in the developed world, turning AIDS from a death sentence into a manageable disease for some. Yet we seem unable to overcome the enormous economic and public health barriers to delivering these same drugs to patients in the developing world.

Furthermore, scientific advances can have unintended negative consequences of their own. Thalidomide was thought to be a savior for pregnant women suffering from morning sickness until it became evident that it was causing birth defects in their newborns. The development of chlorofluorocarbons in the 1920s as a nontoxic replacement for ammonia in refrigerants was considered a breakthrough in technology until it was discovered in the 1970s that ultraviolet sunlight breaks CFCs apart in the upper atmosphere, releasing chlorine atoms that catalytically destroy the ozone layer and thereby allow more harmful ultraviolet radiation to reach the Earth's surface; a simplified version of the chemistry appears below. On the other hand, scientists have also played a critical role in exposing the dangers of products first created in the lab. For example, the phased elimination in the United States of lead additives used as anti-knock agents in gasoline owes an enormous debt to the compelling research — and perseverance — of geochemist Clair Patterson. Blood lead levels in the United States have fallen dramatically since the introduction of unleaded gasoline — arguably one of the greatest strides for public health in recent decades.
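Since the mechanism matters to the story, here is the standard textbook sketch of how chlorine catalytically destroys stratospheric ozone. This block is an editorial addition rather than lecture text, and CFCl3 is used only as a representative CFC:

```latex
% A simplified, textbook-level sketch of the catalytic cycle (editorial
% illustration, not lecture text); CFCl_3 stands in for the CFC class.
\begin{align*}
\mathrm{CFCl_3} + h\nu &\rightarrow \mathrm{CFCl_2} + \mathrm{Cl}
    && \text{(UV photolysis frees a chlorine atom)} \\
\mathrm{Cl} + \mathrm{O_3} &\rightarrow \mathrm{ClO} + \mathrm{O_2}
    && \text{(chlorine destroys an ozone molecule)} \\
\mathrm{ClO} + \mathrm{O} &\rightarrow \mathrm{Cl} + \mathrm{O_2}
    && \text{(chlorine is regenerated)} \\
\text{net:}\quad \mathrm{O_3} + \mathrm{O} &\rightarrow 2\,\mathrm{O_2}
\end{align*}
```

Because the chlorine atom emerges intact from the cycle, a single atom can destroy many thousands of ozone molecules before it is finally sequestered, which is what made CFCs so damaging at such small concentrations.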

The scientific progress we have witnessed in both the United States and the United Kingdom in the 20th century did not happen by chance. It arose out of a social contract between governments on the one hand and research universities and institutes on the other. Although it is hard to imagine it today, prior to the Second World War, the government of the United States did very little investing in fundamental scientific research. In those days, foundations like the Rockefeller Foundation were the most important supporters of research in universities, with state and federal governments providing relatively modest funds. The Second World War changed everything as the federal government turned to academic scientists, particularly in physics, to develop the weapons that would ultimately end the war. National research laboratories were created at Oak Ridge and Los Alamos, and others that already existed were greatly expanded. The impact of academic scientists on the outcome of the war was probably startling at the time, but it helps to explain what happened next. When Vannevar Bush, who had directed the wartime Office of Scientific Research and Development, was asked to advise on postwar science policy, he changed history by writing a highly influential report entitled "Science, the Endless Frontier," delivered in 1945 to President Harry Truman. In it he laid out the principles by which the federal government would link its future investments in fundamental research with education, particularly the education of graduate students. By investing in the young, the system acquired a vitality, an energy and a capacity to change continually that would make it the envy of the world.

The confidence that society placed in scientific progress as the path to prosperity was reflected for decades in everything from surveys that identified science as among the most respected professions to the yearly generous allocation of tax dollars to basic and applied research. In return for this broad support, society rightfully expected the discovery of new knowledge that would lead to better lives for everyone.

Yet from the very beginning, science and politics, especially religiously inspired politics, had the potential to become "strange bedfellows," by which I mean working at cross-purposes with one another rather than in harmony. That potential for conflict seems greater now than at any time in my career, and I would like to explore with you today some underlying causes by focusing on two distinctively American debates that have received considerable attention in the press over the last several years: priority-setting in the national space program and a resurgence in opposition to Darwin's theory of natural selection. While each has features that are unique to it, I believe there is a common thread to these stories, which I will try to highlight.

On January 14, 2004, President George W. Bush announced major new goals for the publicly funded exploration of space, most prominently sending humans back to the moon as early as 2015 and eventually to Mars. This announcement came at a difficult time in the history of the National Aeronautics and Space Administration (NASA), the United States space agency. Its two programs in human space exploration, the International Space Station and the Space Shuttle Program, are both in trouble. The Space Station, originally announced by President Ronald Reagan in 1984 for completion in 10 years, is dramatically behind schedule and over budget, and the Space Shuttle Program, just beginning to recover from the 2003 Columbia shuttle disaster, is slated for retirement in 2010.

The announcement also came at one of the most extraordinarily productive times in the history of astronomy and cosmology, when explorations with satellite space telescopes such as the Hubble Telescope, the Wilkinson Microwave Anisotropy Probe, and the ground-based Sloan Digital Sky Survey, as well as unmanned space missions like Voyager, are providing us with breathtaking insight into the structure of the universe and our solar system. We are learning that our cosmos is much stranger than we thought. Its geometry is flat rather than curved, and it is flinging itself apart at an accelerating rate. To explain these observations, cosmologists have invoked a new force, to which they have given the Darth Vader-like moniker of "dark energy," to counteract the force of gravity that we understand much better. Only 4 percent of the universe can be accounted for by the atoms and molecules we know and understand; the rest is composed of "dark matter" and this strange dark energy. We even have a much more accurate age for the universe — 13.7 billion years, plus or minus roughly two hundred million years. At the same time, we are beginning to fill in remarkable details about our solar system, with new planets and moons being discovered almost monthly, to the point where we are beginning to reconsider what constitutes a planet in our lexicon. These discoveries constitute a golden age of space exploration — but of a very different kind than President Bush is proposing.
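For concreteness, the "only 4 percent" bookkeeping can be written as a one-line budget. The round numbers below are an editorial gloss on the WMAP-era consensus values, not figures quoted in the lecture:

```latex
% WMAP-era round numbers (editorial gloss, not lecture text). In a
% spatially flat universe the density fractions sum to one:
\[
\underbrace{\Omega_{\mathrm{baryons}}}_{\approx\,0.04}
+ \underbrace{\Omega_{\mathrm{dark\;matter}}}_{\approx\,0.23}
+ \underbrace{\Omega_{\mathrm{dark\;energy}}}_{\approx\,0.73}
\;\approx\; 1
\]
```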

This highlights a tension that has always existed between the scientific community and the political process whereby priorities are set. Ideally, priorities should reflect the relative importance and potential impact of competing questions, coupled with a dispassionate assessment of the likelihood that they can be answered by the proposed experimental or theoretical approach. In many fields, including my own, priority-setting has been a "bottom-up" process, in which scientists compete individually or in groups for resources through a peer review system. While government agencies like the National Institutes of Health can, and sometimes do, create set-asides for Congressional priorities like HIV vaccines or bioterrorism prevention (two recent examples of top-down priority setting), the system is always open to new ideas that arise in the minds of individual creative scientists. The underlying philosophy of such a system is that scientists themselves are in the best position to recognize the most interesting problems that are also solvable.

When the views of scientists are ignored in the priority-setting process, scarce resources tend to be wasted. A good example occurred in the 1970s when a handful of powerful members of the United States Congress pressed the National Institutes of Health to form the National Institute on Aging in order to spur research in gerontology, using existing resources to fund it. Given the ages of these legislators at the time, such action qualified as a clear conflict of interest, but the real problem was that there simply were not enough good ideas to justify the dollars set aside. As a result, potentially important research in other areas was sacrificed in favor of mediocre work.

To return to astrophysics, this scientific community has evolved a unique procedure in which the leaders come together once every 10 years, under the auspices of the National Academy of Sciences, and, through an inclusive and collegial process, establish priorities for the next decade. The small size and relative cohesiveness of the field, together with the large price tags attached to individual experiments, drove the evolution of the decadal process. The resulting recommendations are conveyed to NASA for consideration, but they have no binding authority.

In 2002, less than two years before President Bush's announcement, the National Academy of Sciences produced one of these decadal reports entitled "New Frontiers in the Solar System: An Integrated Exploration Strategy." In it, the Academy proposed priorities and recommended substantial investments in space flights like the Voyager missions to the outer planets, as well as Earth-based experiments. It was a comprehensive list of projects and missions that included everything but human exploration. With regard to the exploration of Mars, the report endorsed the current science-driven strategy of remote sensing, in situ measurements from landers and the transfer of samples to Earth as the best strategies for understanding Mars and its astrobiological significance, and for affording unique perspectives about the origin of life on Earth.

There are a number of plausible reasons why the President and NASA chose to ignore the advice of the country's most distinguished scientists. They may have made a practical judgment that the American public will not continue to support large outlays of dollars for "pure science" in which new knowledge is an end in itself, but instead will require the tangible — even romantic — symbols of space science that the Apollo missions have provided. They may have made a military decision that establishing American dominance in space is strategically important, or an economic decision that mining the natural resources in space will be essential to the future prosperity of the United States. President John F. Kennedy rationalized the first space program by saying that "this nation needs to make a positive decision to pursue space projects aimed at enhancing national prestige." George H.W. Bush, or Bush 41 as his son calls him, fully endorsed the Space Station with yet another rationale in his 1989 speech marking the 20th anniversary of the Apollo 11 landing: "Why the Moon? Why Mars? Because it is humanity's destiny to strive, to seek, to find. And because it is America's destiny to lead."

Without judging the persuasiveness of these possible rationales, it is worth noting that if President Bush's proposal to launch manned flights to the moon and, ultimately, Mars goes forward, the United States will repeat the decision-making process that led it to establish the Space Station. Then, as now, the scientific community was highly skeptical of the utility of the Space Station, most especially its scientific value, and was concerned that support for the station would preclude support for what in their view were significantly higher scientific priorities. Scientists then, as now, were anxious that the project not be seen as a scientific priority, or worse, be judged by its scientific accomplishments. Twenty years later, history has proven the skeptics of the 1980s to have been highly prescient.

The Space Station has foundered for many reasons, including the failure of all four administrations that oversaw it to support it fully. As John Logsdon of George Washington University recently pointed out, "NASA continues to work toward completing a space station program first approved in 1984, using a transportation system begun in 1972 and in operation since 1981. This is far from a fast-moving forward-looking effort." But a lesson I would draw from this case study is that top-down, politically driven science projects, especially those that will be enormously expensive, need to be clear about their goals at the outset and are unlikely to be successful in scientific terms unless they have the support of scientists who understand the challenges and likely benefits of the undertaking.

If cosmologists are deciphering the origins of the universe and our solar system in unprecedented ways, biologists are making enormous strides, thanks to the technology that was developed during the Human Genome Project, toward unlocking the origins of life on Earth. Yet here, too, science and politics have found themselves at loggerheads. It is impossible to ignore the increasing assertiveness of elements within American society who have challenged the validity of Darwin's theory of natural selection and have lobbied for an alternative explanation, which they term "intelligent design," to be taught in public schools alongside the principles of evolution. This is deeply disturbing, for the theory of natural selection is one of the two pillars, along with Mendel's laws of inheritance, on which all of modern biology is built. It is virtually impossible to conduct biological research and not be struck by the power of Darwin's theory of natural selection to shed light on the problem at hand. Time and again in the course of my career, I have encountered a mysterious finding that was explained by viewing it through the lens of evolutionary biology. The power of the theory of natural selection to illuminate natural phenomena, as well as its remarkable resilience to experimental challenge over almost 150 years, has led to its overwhelming acceptance by the scientific community.

Today, however, under the banner of "intelligent design," Christian fundamentalists in the United States have launched a well-publicized assault on the theory of evolution, suggesting that the complexity and diversity of nature is not the product of random mutation and natural selection but rather of supernatural intent. Although exponents of intelligent design have been at pains to distance themselves from overtly religious interpretations of the universe, the intellectual roots of intelligent design can be traced to creationism, which holds that the natural world, including human beings in their present form, is the handiwork of a divine designer — namely, God. Biblical creationists contend that the world was created in accordance with the Book of Genesis — in six short days — while the followers of intelligent design eschew this literalism. They say that their goal is to detect empirically whether the "apparent design" in nature is genuine design, in other words, the product of an intelligent cause. They reject out of hand one of the central tenets of natural selection, namely, that biological change arises solely from selection upon random mutations over long periods of time. For those of you who are not conversant with the literature of intelligent design, the argument usually begins with Darwin himself, who said "If it could be demonstrated that any complex organ existed which could not possibly have been formed by numerous, successive, slight modifications, my theory would absolutely break down." From there, advocates such as Michael Behe, a professor of biochemistry at Lehigh University, declare that "natural selection can only choose among systems that are already working, so the existence in nature of irreducibly complex biological systems poses a powerful challenge to Darwinian theory. We frequently observe such systems in cell organelles, in which the removal of one element would cause the whole system to cease functioning."

What is wrong with this view? To begin with, it reflects a fundamental misunderstanding of how evolution works. Nature is the ultimate tinkerer, constantly co-opting one molecule or process for another purpose. This is spurred on by frequent duplications in the genome, which occur at random. Mutations can accumulate in the extra copy without disrupting the pre-existing function, and those that are beneficial have the potential to become fixed in the population. In other instances, entirely new functions evolve for existing proteins. My favorite example is lactate dehydrogenase, which functions as a metabolic enzyme in the liver and kidney in one context, and as one of the proteins that make up the transparent lens of the eye in another. In the first cellular setting, the protein has a catalytic function; in the second, a structural one.
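To make the duplication-and-divergence mechanism concrete, here is a deliberately minimal simulation sketch in Python. It is an editorial illustration of the general idea, not anything from the lecture, and every number in it (gene length, mutation rate, the sites that define a "new function") is an arbitrary assumption:

```python
import random

# A toy model of "duplication and divergence" (editorial illustration;
# all parameters below are arbitrary assumptions, not data).
random.seed(42)

GENE_LENGTH = 30               # sites in the toy gene (0 = ancestral state)
GENERATIONS = 2000             # how long we let the duplicate drift
MUTATION_RATE = 0.01           # per-site, per-generation flip probability
NEW_FUNCTION_SITES = range(5)  # sites that must all mutate to yield a new role


def mutate(gene):
    """Return a copy of `gene` with each site flipped with small probability."""
    return [1 - site if random.random() < MUTATION_RATE else site
            for site in gene]


# Start with two identical copies of a working gene.
original = [0] * GENE_LENGTH
duplicate = list(original)

for generation in range(GENERATIONS):
    # The original copy is under purifying selection: mutations that break
    # it are weeded out, so we model it as unchanged. The duplicate is
    # redundant, so its mutations are tolerated and it drifts freely.
    duplicate = mutate(duplicate)

    if all(duplicate[i] == 1 for i in NEW_FUNCTION_SITES):
        print(f"Generation {generation}: duplicate stumbled on the new function")
        break
else:
    print("No new function arose in the simulated window")

diverged = sum(a != b for a, b in zip(original, duplicate))
print(f"Original intact; duplicate diverged at {diverged}/{GENE_LENGTH} sites")
```

The point of the toy model is simply that redundancy changes the rules: while one copy is held in place by selection, the other is free to wander through sequence space until it happens upon something useful.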

A common weapon that is used to advance the "theory" of intelligent design is to posit that evolutionary biology cannot explain everything — that there remains uncertainty in the fossil record and that there is as yet no consensus on the origin or nature of the first self-replicating organisms. This, too, reflects a basic misunderstanding about how science works, for, in fact, all scientific theories, even those that are approaching 150 years of age, are works in progress. Scientists live with uncertainty all the time and are not just reconciled to it but understand that it is an integral part of scientific progress. We know that for every question we answer, there is a new one to be posed. Indeed, the very word, "theory," is misunderstood by many who take it to mean an "idea" that has no greater or lesser merit than any other idea. The fact that Darwin's "ideas" on natural selection have stood the test of time through keen experimental challenge does not give his theory special status in their eyes. There are also those who exploit the fact that scientists often disagree over the interpretation of specific findings or the design of experiments to argue that nothing is settled and thus anything is possible. The fact of the matter is that fierce disagreement is the stuff of scientific inquiry, and the constant give-and-take is needed to test the mettle of our ideas and sharpen our thinking. It is not, as many would claim, prima facie evidence for deep fissures in the central tenets of natural selection.

Of course, the real test of whether intelligent design is a scientific theory, comparable to Darwin's theory of natural selection and worthy of equal consideration in the biology classroom, is whether it poses testable hypotheses. Here the answer is self-evident — it does not — and therefore it has no place in the science curriculum of America's public schools, which rest on the premise that the state has no constitutional authority to impart supernatural truths. Rather than searching for explanations for the complexity that is surely present in each living organism, intelligent design accepts that this complexity is beyond human understanding because it is the work of a higher intelligence, leading logically to the conclusion that experimentation — the tried and true basis for scientific progress — is pointless. The result is an intellectual dead end. In fact, because there is no prediction that can be tested, the future of intelligent design is dependent on the failure of experiments designed to test other hypotheses. It is ironic that intelligent design's reliance on negative proof exacerbates what religious historians have called the "shrinking God" problem. Each time a natural phenomenon that has been attributed to divine inspiration is explained by scientific exploration, the role for an intelligent designer is diminished. In other words, they are setting up God to fail.

Today the scientific merits of intelligent design are being heatedly debated in school districts, courts, legislatures and churches across America. No lesser figures than the President of the United States, the majority leader of the United States Senate, and a cardinal of the Roman Catholic Church have suggested that in the interests of intellectual diversity, intelligent design should be taught together with natural selection. One United States Senator, Rick Santorum of Pennsylvania, went even further by saying that "intelligent design is a legitimate scientific theory that should be taught in science classes." And he is not alone in this view. Eighty years after John T. Scopes was convicted of teaching the theory of evolution in a Tennessee high school, the majority of Americans are still unsure of the validity of Darwin's theory. Not quite two-thirds of respondents in a recent national CBS poll favored the teaching of both evolution and creationism, while more than a third expressed the view that only creationism should be taught.

There is considerable disagreement within the scientific community regarding the best way to respond to this assault on evolution. One view is to dismiss or trivialize it by pointing out, for example, that everything we know about the human knee would suggest that no intelligent being could possibly have designed it. Another faction argues that the scientific community should ignore the opponents of evolution, for by engaging in the public debate over creationism, one inevitably lends credence to its premises. The third strategy is to enter the public debate on the side of science and evolution, and to do so firmly but respectfully.

My own inclination is to engage, to explicate, and to strive to understand why so many people find Darwin's ideas so difficult to embrace. Of course, scientists have found themselves at odds with the guardians of religious orthodoxy for centuries. The Copernican revolution, which dethroned the Earth from its central place in the universe, was regarded by many as heretical, as Galileo discovered at great personal cost. From the splitting of the atom to the creation of so-called test tube babies, scientists have been accused of usurping the role of the Almighty. As Albert Einstein himself would say of Ernest Rutherford, scientists have "tunneled into the very material of God," and for many men and women, these advances have been both alarming and disorienting. As the pace of scientific discovery quickens and no corner of the cosmos or the human body is exempted from scientific inquiry, the perception that we are distorting — rather than explaining — the natural order of things will only intensify. Inevitably, this perception will find political expression, and thanks to the American system of government, in which power is widely diffused, the views of even a small minority can carry disproportionate weight on Capitol Hill or in the White House.

I would argue, however, that evolutionary biology and its sister science cosmology, which seeks natural explanations for cosmic phenomena, may be special cases. This is because they can appear to conflict with humankind's universal need for a narrative to explain our origin and place in the universe. As my colleague and former President of Princeton, Harold Shapiro, wrote recently, "These narratives, grand and modest, rational or irrational, develop as a response to our own mortality, to our lack of control over aspects of our situation, and to the apparent insignificance of any individual." To the degree that evolutionary biology and cosmology appear to undermine the truth of such old and revered narratives, their findings will be deeply troubling and threatening to some. Creationist literature is full of objections to the idea that natural selection works on a random accumulation of mutations and not according to a guiding hand or discernible goal. It rejects as heresy the notion that if the last 5 million years of history were repeated, if we rolled the dice again, human beings in their current form might not have emerged victorious. The great evolutionary biologist Julian Huxley, who rejected dualism of any kind, captured what creationists find most objectionable when he famously if somewhat irreverently opined that "evolution is what you get when you give an idiot all the time in the world."

A more temperate response is offered by Frank Rhodes, the distinguished scholar of Darwin and evolutionary theory, who has no difficulty reconciling evolutionary theory with belief in a creator. He argues that the "truth is that evolution is neither anti-theistic nor theistic. So far as religion is concerned, evolution is neutral. It does suggest that species arise by natural selection which proceeds by natural laws, but, like all scientific theories, it provides no ultimate interpretation of the origin of the natural laws themselves." Another perspective on this theme has been offered by Kenneth Miller, a professor of biology at Brown University, who wrote that if an exponent of intelligent design "wishes to suggest that the intricacies of nature, life and the universe reveal a world of meaning and purpose consistent with a divine intelligence, his point is philosophical, not scientific. It is a philosophical point of view, incidentally, that I share ... [but] in the final analysis, the biochemical hypothesis of intelligent design fails not because the scientific community is closed to it but rather for the most basic of reasons — because it is overwhelmingly contradicted by the scientific evidence."

This brings me to the joint lesson to be derived from the two case studies I have discussed today. Arguments over the relative value of manned and unmanned space flight or over the content of the biology curriculum in America's public schools may seem remote from one another, but at the center of both tales are dangers that arise when science and politics fall out of alignment and become "strange bedfellows." As Thomas Huxley rightly said, it is naive to think that science can be completely divorced from other aspects of human activity, but the credibility of science can be compromised — sometimes fatally — when it is allowed to be inappropriately co-opted for political and religious purposes. Sending Americans to Mars may be politically astute, and promoting intelligent design in American classrooms may be a source of comfort to those who are threatened by the implications of natural selection, but neither, in my judgment, represents sound science, and to suggest that they do threatens the integrity of the entire scientific enterprise. The ultimate risk is that we lose the trust and respect of the public, on whom we depend for the support of science. It is not that scientists have a monopoly on truth or wisdom — after all, we are human beings, which means we are fallible — but in the scientific method we have a tried and true process to explore natural phenomena based on proposing and testing hypotheses through observation and experimentation. This method has served us well in advancing human knowledge and, ultimately, in helping to improve the lot of our fellow men and women.

The role of the scientist must be safeguarded, not only by society but by scientists themselves. And here I need to take my own profession to task for a moment. All too often, we take refuge in the ivory towers represented by our labs or limit our scientific conversations to our peers and a privileged group of graduate students and post-doctoral fellows — the scientists of tomorrow. There is a lot to be said for the "dreaming spires" of Oxford and the ivied walls of Princeton, but if our work begins and ends there, we will have only ourselves to blame when the public calls for flights to Mars and the teaching of creationism or, conversely, disregards the dangers of global warming or a potential influenza pandemic. And we will have only ourselves to blame when no one offers us a seat at the table where crucial public policies are formulated.

In his farewell address to the National Academy of Sciences this spring, outgoing president Bruce Alberts cautioned that "most people have never encountered a working scientist, nor do they understand how science works or why it has been so successful. Far too many think that we are weird geniuses, when in fact the vast majority of us are neither. . . . I am absolutely convinced that the scientific community will need to devote much more energy and attention to the critical issue of educating everyone in science, starting in kindergarten, if we are to have any hope of preparing our societies for the unexpected, as will be required to spread the benefits of science throughout our nation and the world."

This is not an easy task, but I believe it can be accomplished if we are prepared to venture into the public square and express ourselves clearly, respectfully and passionately. We cannot take refuge in esoteric arguments and bewildering jargon and hope our audience will embrace the positions we espouse. Science need not be difficult or opaque if we choose our words and practical illustrations wisely, and the concepts of evolution or the Big Bang are no exceptions. Neither can we succumb to the arrogance of knowledge. We must listen as well as speak to those who look at the world through a fundamentally different prism and be prepared to acknowledge the legitimacy of their beliefs while drawing a clear distinction between the tenets of science on the one hand and political aspirations and religious beliefs on the other. Finally, we need to invest our message with passion. By passion I mean a willingness to convey the extraordinary beauty of the natural phenomena we study. We can demystify the heavens without destroying the wonder of a meteor shower. We can explain the genesis of the giraffe's extraordinary neck or the millipede's bounty of appendages without reducing these animals to a collection of data sets. We can call evolution and cosmology what they are: the most compelling explanations we currently have for the cosmos and the extraordinary diversity of life that we enjoy on Earth.

Thank you for allowing me to share my thoughts with you today.

© 2005 The Trustees of Princeton University