Science is the most communal of human endeavors. The vast structures of physics and biology assembled in this century were put together piece by piece by countless people whose identity has by this time been forgotten. The major figures, whose names are stamped there perhaps forever, were gifted in being able to figure out where the key pieces would fit, but the pieces themselves came from other minds and hands. Albert Szent-Györgyi once remarked: "Discovery consists of seeing what everybody has seen and thinking what nobody has thought."

It is easier to recall the names of the participants years ago, because there were so few to remember. Science began as a very small enterprise, involving a few amateurs, but even then they were in touch with each other, communicating their findings. Communicating has long been a familiar word in scientific jargon. Papers submitted to the Proceedings of the Royal Society and the National Academy of Sciences are communicated; the word is printed there on the front page under the authors' names. In the case of the Proceedings of the National Academy of Sciences there must also be printed, by federal law, a note at the bottom of that first page stating that the costs of publication of the article have been paid by the author or authors, or by their parent institutions, and that because of this the article must officially be designated as an "advertisement." It is a very small sign of the increasing involvement of government in the style and manners of science, a minor but disturbing signal, to which I shall return.

Today's science has expanded, within just the years of this century, from a small enterprise to an immense industry, from the commitment of a few hundred workers to one involving hundreds of thousands, maybe millions, here and abroad.

And so it is, in the face of the legislated mandates of the various governmental agencies responsible for the sponsorship of their own national science, and despite their zeal to enhance the research and development capacities within their various nations, that there is in being a worldwide community (or nearly worldwide, considering the sidelines position still held in some fields by the Soviet Union) of working scientists who do their work together, across oceans and national borders, without any awareness of national or ethnic or social identities. They make up, in the aggregate, the largest and most cohesive of underground movements to be found anywhere on the globe; subversive in the literal meaning of that word, which is to turn things upside down.

I am not surprised that governments are edgy about the scientific communities under their governance; I am surprised that they are not more worried. Statesmen like to arrange the affairs of state in whatever orderly fashion they find most equable and stable; they do not like the prospect of having the ideas of people under their rule upheaved any day, revolutionized, changed beyond their comprehension. Or, if things like this are to take place, the statesmen would prefer to arrange them in advance, calling the shots for science before science gets round to publishing.

Political doctrines can sometimes be made to conform to scientific truths, I suppose, but only after all the facts are in. It was feasible for a while, in the years after The Origin of Species, to cook up crazy notions about Social Darwinism and to base social policy on those headlong extrapolations, but later on, given more scrutiny of the real facts, the sociological theories lost their underpinnings. Lysenko's pre-cooked ideas about state-controlled plant genetics were attractive to Stalin; the research had been meticulously fabricated beforehand and tucked into place to fit orthodox doctrine, and it failed, predictably and miserably. It is easier, I think, to fudge in politics than in science. I would be glad to take it farther: it is possible to fudge in politics forever, but only for a limited time in science.

This is not to assert that the facts about nature obtained by working scientists and passed along to the world are always and necessarily true, in the sense that they can then stand forever as immutable, enduring pieces of information to be added to the world's knowledge. To the contrary, great masses of the data that keep coming in are accepted for a period of months or years and then seem to drop away from notice, lose their relevance, and are replaced by new sets of information and new theories to explain them. A surprising number of scientific facts turn out to be biodegradable.

This generalization hangs on what is meant by a scientific fact. I tend to call something a fact when the data, taken together, mean something. It was a simple declarative statement, accepted nearly all round in the mid-nineteenth century, that our sun and solar system were several hundred million years old, but no older than that. Lord Kelvin, already a tremendous figure in classical physics, arrived at the fact by calculations based on the known sources of energy and the known rates of energy loss. The numbers were stressful for Charles Darwin, posing what he termed one of his "sorest troubles" in formulating the theory of evolution. A few hundred million years was simply not a long enough stretch of time into which the process of evolution could be compressed. T.H. Huxley pointed out then that the facts of the matter, based on perfectly straightforward mathematics, were perhaps not as solid as they seemed. But it was not until much later, after atomic physics had emerged and it became possible to calculate the earth's age on the basis of a new level of science, that the real facts were found to fit with Darwin's theory.

Kelvin had still another opportunity to get the facts, the meaning of the data, wrong, and he was not alone in this. In a famous address before the Royal Society around the turn of the century, he celebrated the full maturity of his science. Physics, he stated, had now come almost its full distance as a completed intellectual endeavor. All that remained to be done was to tidy up a few loose ends and straighten out some small, bothersome anomalies, and physics would be home and dry. He would not have advised a young ambitious scientist to come into such an almost-finished discipline. Then came radiation, the quantum, relativity, the atomic nucleus, indeterminacy and all the rest of twentieth-century physics, and a new world of facts and theories displaced the old one.

Recently, the eminent British cosmophysicist Stephen Hawking stated publicly that the fundamental laws of physics are nearing full comprehension, and that the deepest problems in modern physics will all be solved before the end of this century. Others have begun nodding their heads in agreement, and when we reach something like a consensus on this important matter, sometime within the next ten or fifteen years, it will be time for us to be upheaved all over again, perhaps into another new kind of universe, with another sort of huge puzzle to solve. For the time being, physics seems to have run out of paradoxes, but this does not mean that there are no paradoxes out there, waiting to be found.


One conventional, nice thing for a scientist to say when being praised for a major discovery is, "I stood on the shoulders of giants." This sounds like the most modest of boasts, although admitting to a measure of agility in getting up there to stand. In real life, though, the shoulders are those of other ordinary-sized scientists, some long dead, others just down the corridor. Once in a while, the shoulders are in fact not stood on, merely glanced over.

Most of the time, however, the information is passed around like free gifts, almost as though science were an enormous, perpetual party. A sizable amount of the pleasure in doing research comes at the moment when it is possible to tell someone about it. It is not true that scientists live in the tense fear that someone else will get to the final answer first, nor that they will kill or die when priority is at stake. A good working scientist, exploring his particular puzzle, will rush out into the corridor, even into the street, if he thinks another scientist will be there to listen to his latest piece of news. Most research is exploring in the first sense of that word: "to cry out on finding."

This is true, at any rate, in the general field of basic biomedical research. It is not so true, for obvious reasons, in applied science, where the commercial stakes usually impose a period of silence.

At this point, it may be appropriate to define the terms. The taxonomy of scientific endeavor has become more complicated in recent years, with much blurring of the formerly sharp line dividing basic from applied science, but the difference is still there. If the research is intended to explore a mechanism in nature for its own interest, in order to find out how it works and what the component parts may be, and if the questions on which the experiments are based are tentative questions, raised as guesses in an atmosphere of uncertainty, and if when a guess turns out to be correct the result comes as an overwhelming surprise, you are doing basic science by my definition.

Changing one's mind about the next experiment, even about the original hypothesis, is typical behavior. It makes no difference if something useful and usable emerges at the end as a product; it is the uncertainty at the beginning and throughout, about how the work will turn out, that makes it basic research. The surprise is not essential for my classification, although it is true that successful basic research is often associated with one exhilarating surprise after another, and that is one of the reasons for liking this occupation.

Applied research is a radically different kind of endeavor. Here, you need at the outset an array of solid scientific facts, most of them uncovered previously by people engaged in basic research. The array must suggest that something useful and usable, a product or a new technology, can be assembled by pursuing a line of inquiry indicated by the data at hand. There needs to be a high degree of certainty about the desired outcome before the work begins. Typically, the process involves teams of investigators and a detailed protocol of the steps to be followed by each participant. There must be a general agreement by all parties that they will do exactly what the protocol says, from start to finish, without changing their minds. If the work turns out to be successful at the end, with the desired product in hand, there is quiet satisfaction all round.

In applied research, the surprise comes if the thing does not work. An elegant example of applied science in medicine was the creation, by Jonas Salk and his collaborators, of the first vaccine against poliomyelitis. Once it was known, from basic research done by others, that there were three distinct antigenic types of polio virus, and only three, and that they could be produced in tissue cultures by the methods devised by John Enders, the vaccine became a certainty, and so it was.

In between these two there has emerged a sort of hybrid science. It might be called basic applied science, or purposive basic research, or targeted guesswork. It has become a dominant style in biomedical science in the past decade, and is best illustrated by what is happening today in research on the phenomenon of cancer.

Two methods, each derived from separate lines of basic biological research, are involved. The first comes from the high technology of recombinant DNA, which has made it possible to identify naturally existing cancer genes in many different varieties of normal cells. When these genes are switched on, or moved about from one part of the genome to another, the normal cells are transformed to cancer cells. The mechanisms responsible for switching them on, and the nature of the gene products that then do the transforming, are entirely unknown, anyone's guess; nor is it at all clear what mechanisms come into play to change the physical and chemical operations of the cell. This can be regarded as basic research at its most profound level of uncertainty, in need of any number of testable hypotheses and, very likely, innumerable experiments bound to end in failure and frustration. But it is also applied science, in the sense that it is targeted toward a product even though the product is still invisible.

The second technique is derived from a piece of basic research begun two decades ago, when it was observed that any two unrelated cells, from the same or different species, can be induced to fuse together, forming a single cell with a single nucleus. The phenomenon was turned to use by cell biologists, who created lines of mouse-human hybrid cells for learning which enzymes in the cell were coded for by the genes on one or another chromosome. Then, some years later, Milstein and his colleagues in England had the brilliant idea of fusing an antibody-forming lymphocyte with a mouse cancer cell, for the purpose of creating live and immortal factories producing absolutely pure antibodies, known as monoclonal antibodies.

This technology is now being applied to the study of oncogenes, in hopes of identifying the specific proteins and other gene products that are produced by cancer genes. It is pure guesswork that the method will accomplish what is intended, and something like total uncertainty as to what kinds of substances are to be looked for by this immunological approach. Meanwhile, the same sorts of pure anti-cancer antibodies are being explored for their potential as therapeutic agents, in hopes that they will seek out all cancer cells in the body and destroy them one by one.

This is basic science at its best, pure research for the purpose of finding out how things work in cancer cells. But it is at the same time a venture in applied science, for no one doubts that as the news comes in we will not only be provided with a genuine comprehension of how cancer cells work, but will very likely know some useful and usable things to do for the reversal and control of the disease.

This line of work provides, as well, an excellent example of the ways in which biomedical science has been carried out as an international venture, moving back and forth across national boundaries. If the problem of human cancer is solved by one or both of these approaches, the ultimate solution cannot be fairly claimed by any nation, nor by any particular laboratory or group of laboratories within a nation. The work has reached its present stage of high promise as the result of the most intricate network of international collaborators, and it will have to progress in the same way if it is to be successful.

To be sure, there will be the usual strident claims on priority by whatever laboratory is successful in putting the final piece of the puzzle in place, but everyone will know (and, I hope, remember) that the puzzle itself could not have been shaped into being without the most intense cooperation within that international network, over a period of at least 30 years. Crucial bits of information, indispensable for today's level of incomplete comprehension, and even more indispensable for framing the questions that still lie ahead, have come from laboratories in Great Britain, France, West Germany, Switzerland, Belgium, Denmark, Norway, Sweden, Japan, Australia, Canada, Israel, Italy and the United States.

The workers in the field have been keeping in such close touch with each other that all of them know the contents of the latest paper months before its publication in a scientific journal. The word gets around these days at almost the speed of light: the results of the latest experiment in Edinburgh or Boston are known to colleagues in Melbourne or Tokyo almost as soon as completed. The mechanism for the international exchange of scientific information is informal and seemingly casual, resembling gossip more closely than any other sort of information system, except that gossip has the reputation of unreliability and this exchange is generally solid and undistorted. It arrives by telephone, or out in the lobbies and bars of the hotels and institutes where international symposia or congresses are being held, or in quick conversations between people of different nationalities crossing a campus lawn in one country or another. For the moment, the international language of science is broken English, but a surprising number of young English and American investigators are coming along partly equipped with at least French and German.

The information is not just passed around automatically, it is literally given away. This is a curious phenomenon in itself, looking something like altruism in the biological sense of that term. It is intuitively recognized by all the participants that this is the only way to keep the game going. If a piece of one's own brand-new information, just fished up in one's own laboratory, is withheld from another laboratory in another place in the interests of secrecy, the flow of essential information from that laboratory's work may also be stalled, and the whole game will slow down, maybe stop altogether.

Altruism may be too strong a term for the exchange, implying the sacrifice of something personally treasured and important. A better word might be symbiosis, for there can be no losers in the long run if the game is played in this way.


Simultaneously with the international exchange of basic scientific information, there has been an even more active transport of young scientists across borders. Up to now, most of this traffic has involved great numbers of Japanese, Indian, Korean, Taiwanese and Iranian scholars at the postdoctoral level, joining laboratories in this country and in the United Kingdom, in Europe and in Canada and Australia. Some of these young scientists plan to go home again after a period of several years' training, but many of them have changed their minds and hope to stay on as semipermanent visitors. In recent years, postdoctoral students from mainland China have also been coming abroad, all of them with the declared intention of returning to the People's Republic.

This kind of exchange, by the way, has nothing at all to do with altruism, and any resemblance to symbiosis is quite unintentional. Two forces are at work. First of all, the young people crossing our borders to work as postdoctoral fellows are coming here to learn science because there are too few opportunities for them in India, Japan, China, Korea or wherever. And second, we are delighted to have them because they are highly motivated, bright and quick to learn, and willing to work extremely hard for relatively low stipends. Also, sad to say, the numbers of equally qualified young Western nationals who want to come into science in their own countries have gradually diminished in recent years, while the demand for more researchers at this level has steadily increased.

I am not sure what the implications of this phenomenon are for the future. Perhaps it will turn out, as it should, all to the good. The young alien scientists, many of them anyway, will go home highly trained for research and potentially very productive and profitable for their home countries, and in the meantime those countries should have strengthened their own universities and scientific institutions to receive them and put them to work.

On the other hand, it may not go this way. The foreign nationals may feel compelled to stay on in the West because of no openings at home, and we will in the end have effectively drained away precisely the talent needed by the home country for its future. It is a problem that we have not been inclined to worry much about at policy levels, not in the American universities and research institutes anyway. We have been glad to have the needed hands and brains, and expect someone else, presumably in Washington, to think about the long-term future. Or if pressed, we can think up ready solutions for someone else to implement, like directing more foreign aid to be targeted to strengthen Indian or Chinese universities, without stopping to think how hard and complicated a task that might turn out to be.

I do not worry in this way about Japan. Indeed, I hope the young Japanese postdoctoral students keep on coming, for they have added greatly to the quality and volume of American biomedical research. Furthermore, basic science is building nicely in Japan, and before long I expect there to be ample opportunities for the returning research fellows to find good jobs there. If their scientific affairs go as well as they should, I look for a time ahead when some of our best American students will hanker to spend a couple of years learning something new in Tokyo or Osaka. But I do not have this optimistic feeling about India, or Pakistan, or Korea, or mainland China or, heaven knows, Iran.

The migrations of talented young investigators within and among the Western countries themselves have been fairly steady and active, with cyclical ups and downs depending on the nature and progress of research in the numerous institutions involved. I do not know the current numbers, but I see no signs of any draining of brains from one country into another, certainly no massive emigration from Europe and Britain into the United States such as occurred in the decade after World War II. I do see a much more encouraging international phenomenon: there are many more travelling scientists of all ages, ranks and nationalities, ambling into laboratories here and abroad for short stays, a few weeks or months, in order to learn a new technique or to bring in a new technique for quick application to an engrossing problem not available in the home laboratory. This is a wonderful thing for the progress of science in general, and the best of ways for scientists to make friends with their counterparts abroad. I commend it to the attention of foundations interested in international amity as well as science, and I commend it especially to those agencies of government with similar concerns.

I wish that something like this could begin to take place with the countries of Eastern Europe. Even more I wish, but with a lot less hope, that we could begin exchanging biomedical information and young (especially young!) biomedical scientists with the Soviet Union. The basic biological and behavioral sciences seem to have slowed down to a near halt in Russia in recent years, or if this is not so there has surely been something appallingly wrong with the existing systems for information transfer. On the available record, there is little evidence that Soviet biology has been caught up in the contemporary biological revolution.

As for the exchange of working scientists, it has never really been tried, for various obvious reasons, and I suppose these are the worst of times to try again. Nevertheless, I wish it would happen. It would not necessarily be the entirely one-way street it is so often said to be. There are some extremely interesting studies in ecology, both theoretical and in the field, now going on in the U.S.S.R., and I keep hearing, and occasionally reading, of interesting Russian research on the relationship of neurologic and immunologic mechanisms. In any case, there is no doubt that young Soviet researchers now at the postdoctoral level would find enormous interest in the kinds of biology now being done in Western laboratories. But there it is, and, I suppose, there it lies for as far ahead as one can see. I just wish it were different, and incidentally I cannot think of a better audience than this to say this to.

I remember a time, in a stretch of almost 20 years before the close-down in Czechoslovakia, when a small cluster of Czech biomedical researchers in Prague had achieved a position of such eminence in my own field, immunology, that we used to hold regular conferences in that lovely city. The liveliest problem for immunology in those years was transplantation biology, and the laboratories at the Czech Academy of Science and Charles University were among the most interesting in the world. During that time the restrictions on travel were easier, and one or another young Czech immunologist was always at work as a visiting scientist in my department at NYU-Bellevue Medical Center. The ones I knew best were doctrinally pure Marxists, quite convinced that their society was on the right track and ours on the wrong, but we became good friends because of what was happening in our laboratories. Prague closed, and my friends have mostly dropped out of favor and out of sight. I wish it were not so, and it makes me melancholy to think that it may never happen again. Please fix it.

I have colleagues in other fields of biomedical science who tell similar stories about their own experiences with collaborators in Poland, East Germany, Hungary and Romania. Agricultural research is of special interest in Bulgaria. All such contacts have dwindled away in recent years, to the deprivation of science in general. We should be making a concerted effort to restore the connections, and I can see no risks of any kind, to either side in the Cold War. I can appreciate, although with deep reservations, the governmental anxieties over the transfer of other kinds of conceivably military technological information: computers, materials chemistry, lasers and all that. But I cannot imagine any risks at all in a free exchange of immunology or molecular biology or neuroscience. What I can imagine, easily, is the creation of close, warm friendships among eager young people whose influence on future judgments in their respective societies may be decisive.


I believe that international science is an indisputable good for the world community, something to be fostered and encouraged whenever possible. I know of no other transnational human profession-and I include here the arts, music, law, finance, diplomacy, engineering and philosophy-from which human beings can take so much intellectual pleasure and at the same time produce so much of immediate and practical value for the species. The problems that lie ahead for the world, endangering not only the survival of our kind of creature but threatening the existence of numberless other species, are proper problems for science. Not to say that science will find answers in time, simply that there is no other way to solve the problems.

Of these, the worst is the impending extinction of more biota than have ever been lost in geological time since the mass extinctions of the Late Cretaceous 65 million years ago, when 52 percent of the world's marine species vanished along with great numbers of terrestrial creatures, including all the dinosaurs. Something on this scale lies immediately ahead, perhaps within the next few centuries, unless man can find ways to avert it. If it occurs, it will be caused by the swarming of human beings over all other ecological niches, and the destruction of those niches. It will come as the result of the deforestation now in progress, the inevitable extension of agriculture everywhere in order to feed the increasing numbers of ourselves, and the simultaneously inevitable depletion of the earth's resources. I plan to deal with this matter in some detail in my third lecture, but I allude to it here because it is, at least in part, a problem for international science, and most urgent of all imaginable scientific problems.

I do not assert that science will solve it. On my blacker days I have a hunch that it is already endgame, beyond fixing. But it needs a try. It is a problem for biologists, even astrophysicists. Maybe especially this last lot, since they command the technology needed most for getting a sustained, close look from the outside at what is happening to our planet.

For work of this kind, in the interest of global habitability, we need a world community of scientists, and we need to have them in constant touch with each other, talking incessantly, telling each other everything they know the moment they learn it.

I have already noted that we have the beginnings of an international network, needing only the joining up of Soviet researchers and their colleagues in the Soviet satellite countries and the development of scientific enterprise in the emerging nations. It is in the nature of scientists who work at fundamental problems in nature to work together, to pass their information around. Now, I am obliged to pull up short of optimism, to acknowledge that what I would like to see happen may not happen at all or may be put off until too late. Some new things have begun to take place, just in the last few years, that may pull the network down.

There are just the faintest, earliest indications that something is about to go wrong. Nationalism may be about to make its appearance as an influence on basic research, a dead hand on science. The Europeans are talking about the need to create a Third Power, based on science, standing as a buffer and equal competitor between the two existing dominant empires. This is a grand idea, one that everyone should be in favor of, all depending on how the Europeans go about it. They seem to be saying that a United States of Europe is a possibility for science, even though it has not really been feasible for agriculture, steel, wine or other aspects of trade. I agree with this, and pray that it comes about.

But they are also talking, especially the French and the West Germans, about the need to "protect" European science, to stop the flow of free information to America, where it becomes a source of enrichment for the United States. This means, judging from the conversations I have been hearing, the introduction of new constraints on research communication: confidentiality, secrecy, and thus an end to the kind of scientific gossip that has been moving science along. The British seem to be expressing similar concerns, having seen, too many times, the loss of their own basic research to development and profits in America. They will never forget penicillin. The French are convinced that there is a special French aptitude for a kind of science possible in no other culture, and they hope to lead Europe along their path. I am hearing more and more of what sound to my ear like anti-American sentiments, and they are beginning to come from the upper reaches of the European science policy establishment. I am apprehensive, still only mildly so, but worried for the future.

Another thing has been happening to basic biomedical research everywhere, and I do not know how it will come out. For the first time in the long history of biology, there is money to be made from science. The famous biological revolution is about to turn into an economic and industrial revolution. The recombinant DNA technology and the production of monoclonal antibodies are perhaps only two examples of what may lie ahead. The working scientists at their benches have suddenly discovered money, with which few of them ever had more than a nodding acquaintance in the past. Small corporations are sprouting in the vicinity of universities everywhere, issuing shares to their academic consultants, paying handsome salaries, converting a generation of people long resigned to living the lives of shabby curates into millionaires overnight, on paper anyway. Giant corporations in the chemical and pharmaceutical industry are beginning to invest huge sums in the basic science being done in universities and institutes, and the old arm's-length, essentially adversarial relationship between industry and the academy is now becoming a partnership.

I have been happy to see this happen, during a time when governmental funds for the support of basic research have been dwindling and are now being cut back. But I am beginning to worry about staying happy. So far, the terms of the new partnerships have seemed both enlightened and generous: the Hoechst and Du Pont investments in molecular genetics and immunology at Harvard, for instance, are accompanied by written assurances that the companies will not have any say in what research is to be done in the laboratories; all that is asked for is a first look at the basic research as the results turn up in the laboratory notebooks. Licensing of a potential product is asserted to involve only minimal delays in publication, and all parties protest that there will be no need for secrecy, much less any gentleman's agreement on prolonged confidentiality. It makes good sense for the university to concentrate its attention on what it can do best in science, which is pure, undifferentiated basic research. It makes even more sense for industry to sponsor this kind of inquiry, since it is the long-attested record that applied science and development cannot occur, ever, in the absence of fundamental information.

What worries me as an academic scientist is the money, much as I admire the direction of flow. I am concerned about what may be going on in the minds of the youngest investigators, people now working for their doctoral degrees and already looking around for postdoctoral opportunities. No matter what assurances they hear about the freedom of scientific inquiry in the university, about their sole mission to discover how nature works at a deep level, they are bound to have it somewhere in their minds that the surest way up the academic ladder will be to learn something useful, something with a chance of turning into a product. If this notion becomes the common one, driving the engine of science along, we may be about to enter a period when basic research will not be carried out for its own sake, out of pure curiosity, driven along by the imagination turned loose, but essentially controlled behind the scenes by money.

And if anything like this begins to happen, there goes the gossip and those long-distance telephone calls from New York to Basel, or Edinburgh to Melbourne. People long accustomed to telling colleagues everything they know, including everything they can guess at, may fall silent in the lobbies of international meetings. The scientific network around the world will then begin to fray and pop, and we will have lost the chance at a supranational community of amiable intelligences.


I hope these things can be prevented. The corporate world has as much at stake as does the world of science, and ought to be taking a very long, close look at the new arrangements. And governmental policymakers should be reviewing in detail the history of that most elegant and successful of all social inventions, the National Institutes of Health in Bethesda. It should not be forgotten that the early support by NIH of international research, with grants to laboratories in Paris, London, Cambridge, Melbourne and other places during the 1950s and 1960s, had a profound influence on the later development of both American and foreign biomedical science.

These days, very little NIH money is being invested in foreign laboratories. This may be as it should be, since other countries have now become interested in supporting their own basic biomedical research, and are quite capable of doing so for the long term. It is the history that needs remembering: the NIH had an important hand in the early construction of today's international network of the sciences concerned with human health. Any governmental agency that can do this sort of thing should be studied closely, and remembered for the future of international amity.

I said a while back that I could not imagine any security risks in the free international exchange of basic biomedical research data, in immunology, say, or in the new neurobiology, or in recombinant DNA research involving bacteria and viruses. The cold war, I was thinking, can never involve my fields of science in the way that is now deeply distressing to some of my cousins in basic physics, chemistry, holography, robotics and computer science. I was thinking that no agency head or commission in Washington is ever likely to urge the biologists to have a care about what they are publishing lest the information turn out to be useful to the Russians. I said that I could not imagine it, and this much is true. But just because I cannot imagine it doesn't mean that someone else will not cook up the notion, sometime in the future, especially if our international relations become even nastier than they are today. Someone will think of biological warfare, and out will come the cautionary memoranda, and we will be in the same boat with our colleagues in the harder sciences.

If I were a Soviet bureaucrat, envious over the stunning progress in fundamental research being made in the outside world, nothing would cheer me up more than to learn that the Western world is beginning to talk about the need for constraints on the communication of basic science. There are ways for governments to slow down research, by interposing too many committees, or by mandating more paperwork in the applications for support, or by trying to call all the shots in advance, predicting which fields in basic research are likeliest to turn out profitable in the future. Any of these can damage science, but only transiently. The one sure way of killing it off, surefire, once and for all, would be to make basic research a secret occupation. If ever the international frontiers are closed down, science will at that moment be dead on its feet.


There are now approximately 4.5 billion members of our species alive, and sometime within the next half-century that number will be almost surely double. Something around one third of us, residing in modern, industrialized societies, enjoy what we should be calling reasonably good health, living out almost the estimated maximum life-span for normal human beings. The rest, the majority of mankind, the citizens of impoverished nations, have less than half a chance at that kind of survival, dying earlier and living miserably for as long as they do live, threatened by constant hunger and an array of debilitating diseases unknown to the lucky third.

The central foreign policy question embedded in these loose statistics is the obvious one: what should the relatively healthy 1.5 billion human beings be doing to bring the other three billion into the twentieth (or twenty-first) century? I shall take it as given that there is an obligation of some sort here.

It is, in the first place, a moral obligation, but one driven by deep biological imperatives as well as by our conventional, cultural view of human morality. We are, like it or not, an intensely, compulsively social species. The reason we have survived thus far, physically fragile when compared with most other mammals, prone to nervous unsettlement by the very size and complexity of our forebrains, and competitively disadvantaged by our long period of absolutely vulnerable childhood, is that we are genetically programmed for social living. I can no more imagine a solitary, lone human being, totally and permanently unattached to the rest of mankind, than I can envision a single termite making it through life on its own, or a hermit honeybee. What holds us together in interdependent communities is language, for which we are almost certainly as programmed by our genomes as songbirds are for birdsong.

I do not mean to suggest that we are very good at this, nor that we have been successful up to now. If that were the case, we would not have swarmed around the earth in our present dense masses, increasing our population by logarithmic increments up to today's risk of crashing, in the fashion of the last several centuries. We are fairly good at family life, allowing for the fact that some families drive their members, literally, crazy. Each one of us has a circle of close friends, trusted and even loved by them, and each of those friends has another circle, and you would think the expanding circles would extend in waves to include everyone, but it is not so.

We have succeeded in working out long periods of survival in tribal units, allowing for the tendencies of tribes to make war against each other. It was in the invention of nation-states that we began to endanger our place in nature, by the implicit violation of all rules of social interliving. Instead of developing as a homogeneous unit of social animals, like an expanding termitarium, we took to splitting into colonies of ourselves and now all of them have become adversaries. Some, by luck or geography or perseverance, have turned out to be rich and powerful, others dirt-poor and weak, and here we all are, in trouble. Mankind is all of a piece, a single species, and our present situation will not do.

The only excuse I can make for us is that we are new at the game and haven't yet learned it. It is a mistake to think that the cultural evolution of humanity has been in any way analogous to biological evolution. We haven't been here long enough to talk about our living habits in the terms used by paleontologists and geologists. In the long stretch of epochs called time by the earth scientists, the emergence and development of social humanity could have begun only a few moments ago. We are almost brand-new. It may even be a presumption to say that we are already a juvenile species. The life of the earth is almost four thousand million years, and the evolution of species is recorded in spans of many million years each.


Although we turned up in something like our present form around one million years ago, we probably did not become what we would call human until we acquired human speech. We do not know when language began, but there is evidence that this may only have happened when there were enough of us living in hunter-gathering or agricultural communities, a few-score thousand years ago, to permit critical masses of the children to be aggregated together and at each other. Derek Bickerton has discovered that the Hawaiian Creole language was strung together, with all its grammar and syntactical rules installed, by the young children of the multi-tongued plantation workers who were brought to the islands in the 1880s. When the language appeared, the adults could neither speak it nor comprehend it. This suggests a biological role for the children of our species, and a justification for the long duration of an otherwise unproductive period of life. We all know about the special gift that children have for learning language, but the possibility that they make language, and the derivative possibility that they may actually have carried the responsibility for inventing it in the first place, thousands of years and thousands of languages ago, is a new thing to think about. If we are held together as a social species by language, as I believe, and if human speech with its unique features of metaphor and ambiguity comprise for us the equivalent of an ecological niche on the planet, then a long childhood is, in strict Darwinian terms, of the greatest selective value for our species. And if this is so, the survival and health of our children is a biological imperative for the survival of our species as homo sapiens.

In another sense, we may all be going through a kind of childhood in the evolution of our kind of animal. Having just arrived, down from the trees and admiring our thumbs, having only begun to master the one gift that distinguishes us from all other creatures, it should perhaps not be surprising that we fumble so much. We have not yet begun to grow up. What we call contemporary culture may turn out, years hence, to have been a very early stage of primitive thought on the way to human maturity. What seems to us to be the accident-proneness of statecraft, the lethal folly of nation-states, and the dismaying emptiness of the time ahead may be merely the equivalent of early juvenile delinquency or the accidie of adolescence. It could be, as some are suggesting, that we will be killed off at this stage, that what we are living through is endgame; I do not know, but if so we will be doing it ourselves, and probably by way of nuclear warfare. If we can stay alive, my guess is that we will someday amaze ourselves by what we can become as a species. Looked at as larvae, even as juveniles, for all our folly, we are a splendid, promising form of life and I am on our side.

I would feel better about our prospects, and more confident for our future, if I thought that we were going to solve the immediate problem of inequity. It is one thing to say that some of us are smarter and more agile than others, more skilled in the management and enrichment of our local societies, therefore bound to be better off in living. It is quite another thing, however, to say that there is anything natural or stable about a world society in which two-thirds of the population, and all the children of that two-thirds, have no real chance at human living while those of us who are well off turn our heads away.


This is not the time, in today's kind of world, for me to be talking about equity in terms of the redistribution of the world's money, nor, surely, is this the place. Nor is this a matter that I possess qualifications for talking about, much less thinking about. But I do see the possibility, at least in technological terms, for doing something about the existing gross differences in the health of peoples in various parts of the earth. Moreover, I believe that this country, and the other countries like it in the so-called industrialized world, are under a moral obligation to do whatever they can to change these inequities, simply because we are members of a social species.

There are also, I suspect, obligations of a political nature, with substantial stakes of self interest in having a stable, predictable world. The disease problems in the undeveloped nations are, in part, the result of poverty and malnutrition, and these in turn are partly the result of overpopulation. But the problem turns itself around: the overpopulation is, in part, the result of the disease, poverty and malnutrition. To get at the situation and improve it, something has to be done about all of these, in some logical sequence. It is not one problem, it is a system of problems. To change it without making matters worse will not be easy. Making well thought-out changes in living systems, as Jay Forrester demonstrated for city models, is a dangerous business. Fixing one part, on one side, is likely to produce new and worse pathological events miles away on the other. The most dangerous of all courses is to begin doing things without even recognizing the existence of a system, and in this case a system in which all people, including the citizens of this rich country, are working parts.

If we should decide to leave matters roughly as they are, and to let nature simply take its course, it is hard for me to see how the future events can be politically acceptable, never mind morally. If a majority of human beings are to continue dying off before having a chance of life, succumbing to diseases that are, at least in principle, preventable, and in many cases dying of starvation with or without the associated diseases, this cannot be kept a secret. Television will be there, and as the disasters and the unhinging of whole countries of dying people become spectacular, the scrutiny by television will become closer and more continual. To say that this will have an unsettling effect on the viewing audiences in affluent countries is to understate the likely reaction. Meanwhile, the efforts will intensify among the billions of afflicted people to get out of wherever they are and to cross borders into any place where they can sense food and a hope of survival. Those left behind will continue the tropical deforestation already in progress, extinguishing immense ecosystems upon which other species are dependent and causing global climatic changes beyond predicting, jeopardizing the very life of the planet.

The greatest danger of all is in our own response. Having let nature take its course, we may someday decide that the problem has become insoluble, that the people knocking at our doors with hands outstretched have become enemies at our gates, who can only be coped with by the traditional method of killing them. This scenario is by no means unthinkable, not after this kind of century. We might even persuade ourselves that this is natural behavior, in aid of the whole species. Other kinds of animals, less equipped with brains and technologies than we are, take steps of their own to reduce their numbers by "crashing" from time to time. A crash is a jargon term used by the ecologists for the catastrophic events which follow, inevitably, when any species has overreached itself, outnumbered the sources of food in its ecological niche and overgrown its allotted space on the planet. But other creatures, the "lower" animals as we like to call them, do not crash selectively, they crash all at once, all together.

Within another century it is likely that we will have swarmed everywhere, pole to pole, covering almost every liveable acre of land-space and water-space. Some people are even talking seriously of space-space, theorizing about the possibility of launching synthetic cities and countrysides enclosed in huge vehicles to sail the galaxy and perhaps colonize other celestial bodies.


What, on a necessarily limited scale, can we do? Specifically, what should we be planning for the improvement of the health of the masses of people who are now condemned by circumstance to lives that are, in the old phrase, nasty, brutish and short? Is it possible to do anything, without running the risk of still another expansion of the human population? If people everywhere could become as reasonably healthy as we are in the United States, with birth rates and death rates approximately the same as ours, would the world become intolerably over-populated? Considering the alternative-a massive population explosion already under way and beyond control, based in large part on the reproductive drive among people now deprived of almost everything except reproduction itself-I am not at all sure. It seems to me worth a try, and I am unable to imagine any other course of action.

When we in the Western world use the word "health," we mean something considerably more than survival and the absence of incapacitating diseases. To be healthy, we count it necessary to be happy and rich as well. But for the present discussion, I prefer to keep to the old-fashioned meaning.

Let us assume, in a flight of imagination, an economic state of affairs in which it is financially possible for the richer countries to export replicas of their entire technology of medical care to the poor nations. This would involve, I suppose, prefabricated versions of the Massachusetts General Hospital and Memorial Sloan-Kettering Cancer Center, to be installed in every major city in middle Africa, Asia and South America, plus their professional staffs, plus duplicates of any top-drawer, accredited American medical school. And money enough to sustain these enterprises for, at the least, a period of 25 years.

I believe the net effect of this munificence would be zero, or something less than zero. The affluent and influential members of whatever establishment exists in the local bureaucracy would doubtless enjoy the new institutions, and would save the air fares now needed to fly them to hospitals in London or New York. But the masses of people, especially those crowded into the slums or still living out in rural areas, would be entirely unaffected, or perhaps even adversely affected because of the investment of all available funds on technologies totally inappropriate to their health problems.

We are worlds apart. In our kind of society, today's enormously expensive health care system was put together in the decades following World War II primarily to cope with the medical concerns of people in their middle years and old age. The improvements in the general health of our populations, which began in the nineteenth century, had by this time reached such a high level that premature death had become less of a genuine day-to-day event and more of a nagging, and to some extent, a neurotic anxiety. Our attention is focused on such diseases as cancer, heart disease and stroke; we do not have to worry about dying early from the things that are killing most people every day in the Third World.

There is no question that our health has improved spectacularly in the past century, but there is a running argument over how this came to be. One thing seems certain: it did not happen because of medicine, or medical science, or the presence of doctors.

Much of the credit should go to the plumbers and engineers of the Western world. The contamination of drinking water by human feces was at one time the single greatest cause of human disease and death for us; it remains so, along with starvation and malaria, for the Third World. Typhoid fever, cholera and dysentery were the chief threats to survival in the early years of the nineteenth century in New York City, and when the plumbers and sanitary engineers had done their work in the construction of our cities these diseases began to vanish. Today, cholera is unheard of in this country, but it would surely reappear if we went back to the old-fashioned ways of finding water to drink.

But long before plumbing, something else had happened to change our health prospects. Somehow, during the seventeenth and eighteenth centuries, we became richer people, here and in Europe, and were able to change the way we lived. The first and most important change was an improvement in agriculture and then in human nutrition, especially in the quantity and quality of food available to young children. As our standard of living improved, we built better shelters, with less crowding and more protection from the cold.

Medicine was only marginally involved in all this. Late in the nineteenth century the role of microbial infection in human disease was discovered, epidemiology became a useful science, chlorination of our water supplies was introduced, and quarantine methods were devised to limit the spread of contagions. The doctors had some voice in these improvements, but they did not invent the technologies. Medical care itself-the visits by doctors in the homes of the sick and the transport of patients to hospitals-could have had no more than marginal effects on either the prevention or reversal of disease during all the nineteenth century and the first third of the twentieth. Indeed, during most of the centuries before this one, doctors often made things worse whenever they did anything to treat disease. They bled seriously ill patients within an inch of their lives and sometimes beyond. They administered leeches to draw off blood, spread blistering ointments over all affected parts of the body, purged the bowels with toxic doses of mercury, all in aid of eliminating what they called congestion of diseased organs, a figment of Galen's first century A.D. imagination.

In retrospect, there is nothing puzzling about the stunning success of homeopathy when it was introduced by Hahnemann in the mid-nineteenth century. Homeopathy was based on two notions, neither of them supported by any kind of science, both pure speculation on Hahnemann's part. The first was what he called the Law of Similars, that "like cures like." If a drug caused symptoms resembling those of a disease, fever or vomiting for example, then that drug should be used for treating that disease. But it was his second notion that assured his success and that of his practice: the drugs should only be given in fantastically small amounts, diluted out to one part in ten billion or more. In effect, homeopathic therapy was no therapy at all beyond reassurance, and a great many patients were thus protected against the conventional medicine of the day. No doubt they felt much better, and had a considerably improved prospect of recovery.

It was not until the early twentieth century that anything approaching rational therapy emerged for human disease, and it was not until the middle of the century that we came into possession of rational and powerful technologies for the treatment and prevention of infection on a large scale. Now that we have them, they are blessings indeed. We no longer need worry, as everyone did when I was a medical student, about tuberculosis. At that time, in the 1930s and 1940s, people in this country were frightened by tuberculosis as they are now fearful of cancer. When tuberculosis occurred in the very young and the very old, the diagnosis was a flat death sentence. We no longer live in fear of tertiary syphilis, which once filled more beds in the insane asylums than schizophrenia; I have not heard of a case of brain syphilis in a New York medical center in the last 15 years. We probably got rid of tertiary syphilis, by the way, not by the skill of public health departments but because of the overuse of penicillin by practicing physicians; it was a sort of accident; by creating a virtual aerosol of penicillin across the land for treating upper respiratory infections, in which penicillin has no real effect, we probably wiped out most of the latent spirochaetes in the tissues of infected people. We still have acute primary syphilis, of course, but the elimination of tertiary disease was an absolute triumph for the country's health.

In the same way, partly by accident and the overuse of antibiotics, acute rheumatic valvular disease, which was once the dominant form of heart disease in our society, has almost disappeared.

Pneumonia is no longer the death threat for young and middle-aged people that it was 50 years ago.

And surgery has been transformed as a profession. Simultaneously with the development of antibiotic therapy-and in large part because of it-surgery underwent a comparable revolution. In the years since, surgical techniques have become vastly more sophisticated and powerful. Along with the control of infection, the surgeons have learned enough about maintaining blood volume and electrolyte balance so that open-heart surgery, organ transplantation, the repair of minute blood vessels, the replacement of severed limbs, and extensive procedures for the removal of cancers once considered unapproachable, have become everyday, routine procedures.

Now, can we package all this technology up, and send it along to the impoverished nations? Should we? Would it be useful, or as the phrase has it these days, would it be cost-effective? I think not.


I believe that what is needed for the health of Third World societies is the same base of general hygiene that was put in place in America and Europe before the introduction of modern medicine. Unless this is done first, the adding on of our highly expensive and sophisticated technologies simply cannot work.

The people who are dying prematurely in Central and South America, and in most of Africa, and large parts of Asia, have a different set of problems. They must raise their families in the near-certainty that half or more than half of their children will die in infancy and early childhood, and accordingly they produce as many as they can as early as they can. The losses are mostly due to diarrheal diseases, caused by contaminated water supplies and by faulty hygiene. The vulnerability of young children to lethal infections is enhanced by inadequate food, and to some extent as well by inadequate information about the selection of food for young children.

Aside from the infant and child mortality from infection and malnutrition, the major health problem for people living in tropical and subtropical regions is parasitic disease.

Here is a set of health problems for which this country can really make itself useful, and may indeed be in the process of doing so. We can perhaps do a certain amount in helping to provide today's methods for the prevention and treatment of parasitic disease, but I am obliged to say quickly that these technologies are only marginally effective at their best, and they involve formidable logistical problems in getting them into the regions where they are needed. There are even more difficult obstacles-bureaucratic, cultural and financial-in seeing to it that sick people actually receive them. We should be trying harder, even so, to do what we can to help.

We should not be trying to transport our high-cost, middle-class, middle-age and geriatric health care systems to our poor neighbors to the south, nor to their poor neighbors in Africa and Asia. They cannot cope with the high and still escalating costs of our kinds of high technology, nor is it what they really need at this stage. Our system, at its present stage, is designed to assure our citizens a chance at old age. What they hope for is a better chance at life itself. If we want to be useful, as we should, we ought to find ways to transfer another kind of governmental instrument which was, in our own laboratory, essential for our protection against contagion, malnutrition and ignorance about health. This was the local health department as we had it during the late nineteenth through the first half of the twentieth century.

By this time, our own local health departments have shrunk to vestigial organs, run out of things to do, and may be at the edge of extinction. But organizations like these, when they were working at top speed and had more missions than they could possibly cope with, are precise models for what an underdeveloped country needs. Not the typical, highly centralized bureaucratic ministry in the capital city, not even the partly decentralized but still too large organizations resembling our state health departments. What I am talking about is the small, old-fashioned Board of Health, as local and autonomous as possible, overseeing at first hand the health affairs of a county, a town or a string of villages. These are instruments that have really worked in our own past.

A few countries in Africa already have networks of organizations somewhat like these, but they are generally underfunded and understaffed by adequately trained personnel. More centers are needed, and the means would readily be found for providing the professionals to do the work and to train up a cadre of local professionals. This, I suggest, is where we might come in. We have not forgotten how to manage a local health department, and we have the people who understand the business.

They are the nurses. The profession of nursing was founded by energetic young women who learned how to handle most of the problems that bring patients to a doctor's office or a hospital clinic. Many of these people, around the turn of the century, left the hospital setting and established themselves in the poorest neighborhoods of our cities. They were called public health nurses, or visiting nurses. Their professional descendants are still being trained (better trained, indeed, than ever before), and are called nurse practitioners or physicians' assistants, attracting now some highly motivated young men as well as women.

If our government does intend to be helpful to its impoverished neighbors in the field of health, I do not believe that deploying American-trained physicians and surgeons in large numbers will meet the needs of foreign societies, even if it could be done. The real health problems are, in a sense, too fundamental for the products of our medical schools, most of them trained at very high cost for at least 12 years in universities and teaching hospitals before being certified, elaborately armed with medicine's highest technologies and aimed toward specialty practices. Very few of these people are either professionally or temperamentally prepared to cope with the day-to-day health problems of an impoverished and primitive society.

Nurses, by and large, come into their profession because of a straightforward ambition to be useful and helpful, and they hope to do this by simply "looking after" people who need them. With two or three years of nursing education (usually laid on these days after two years of undergraduate college education), they can take on the role once played by old-fashioned family doctors in our own rural communities, and they can do some things a good deal better. They can educate people who lack any understanding of hygiene and nutrition, they can organize systems for immunization of whole communities, and they can diagnose, or learn to diagnose, the endemic diseases of the region. When new drugs become available for the treatment or prevention of parasitic diseases, it is nurses who will be best-equipped to see to it that they are properly employed in the field. They are perfectly capable of diagnosing and treating the common bacterial infections with the appropriate antibiotics.

The nurses in this country are also obliged, in the course of their training and early professional experience, to become good administrators. As a physician, I would have no qualms at all in seeing the nurses take charge of the running of local health departments.

Hospitals are needed as well, of course, but not on anything like the scale in this country or Europe. A modest-sized network of small regional hospitals, designed after the fashion of Scotland's cottage hospitals, would be valuable if staffed by a limited complement of physicians and surgeons. Some of these professionals already exist in the countries concerned, and more could be trained in this country if we would turn our minds to it. The present difficulty is that they lack hospital opportunities and adequate incomes in their own countries, and tend to emigrate whenever the opportunity presents itself. An investment in hospital construction and maintenance is obviously necessary, but only on a small scale compared with what we do in this country.

These are what are needed: sanitation, decontaminated (or better still, uncontaminated) water supplies, antibiotics and vaccines, and a distribution system assuring access to these things throughout the population, a chance at access to whatever new agents turn up for treating parasitic infections, plus a network of small hospitals with professional competence in primary medical care, plus a corps of visiting nurses, all of it run by nurse practitioners and physician's assistants trained at the outset in this country and its affluent neighbors-to be later replaced or succeeded by similarly trained nurses from within the developing nations. These are the basic requirements for raising the standards of public health.

It must seem that what I am asking for is a new version of the Peace Corps, but one that must be considerably larger, drawing its professionals from all industrial nations rather than just the United States, and concentrating its attention on hygiene, infectious diseases and nutrition. China has a health care system somewhat like this, already in place.

The rewards for the professionals are there to see, and I need not dwell on them. It is not given to many young people to feel useful in society, and this is incentive enough for those chosen to serve overseas. But the nursing profession is in trouble here at home because of the derisory salaries paid, and to bring more bright young people into nursing schools from high schools and colleges will take more money than we have thus far been willing to pay. This is as true for Europe and the United Kingdom as here at home.

So, one obstacle to what I hope to see happen is money, but it is not a vast sum considering our other expenditures on our connections with the Third World, from defaulted loans to military apparatus. Money, in any case, is not the chief problem.


Our real contribution to the parasitic disease problem, for which we already possess the facilities and talent needed, will be in research. All of the diseases in question represent problems which are essentially unsolved, beyond knowing the taxonomic names of the parasites involved. Our methods for dealing with parasites are really quite primitive when compared with the technologies we have for treating and preventing bacterial and virus infections. Many of the chemicals commonly employed are nearly as toxic for the host as for the parasite, and the few effective ones-such as the current antimalarial drugs-are agents which the parasites quickly learn to resist. There is an enormous amount of pharmacologic and immunologic work still to be done.

The sheer numbers involved in the burden of parasitic disease seem overwhelming. Amebiasis affects ten percent of the world's population, most of it in the South. The population at risk from malaria exceeds 1.2 billion, with an estimated 175 million people actively infected today. African trypanosomes (the cause of sleeping sickness) and American trypanosomiasis hold 70 million people at risk, and infect about 20 million right now. Schistosomiasis, worldwide, afflicts no less than 200 million people, filariasis and Leishmaniasis 250 million, hookworm 450 million, onchocerciasis, a common cause of blindness in the tropics, 20 million.

One thing that should catch the eye immediately in this list of numbers is that these are not mortality figures, but morbidity statistics. The actual deaths that are caused each year by these diseases represent a very much smaller number-probably no more than one million deaths a year for malaria in all of Africa, for example. The disease burden is not so much a dying problem for the poor of the world, it is the problem of living on for a somewhat shortened life span with chronic, debilitating, often incapacitating disease. Trachoma does not kill patients, but blinds more than 20 million. There are about 15 million people with leprosy, one of the most chronic of all diseases. Malaria and schistosomiasis, which between them affect a billion and a half of the world's population, are vastly more important because of the human energy that is drained away each year than because of the lives lost.

This provides an answer, of sorts, to the question I raised earlier: will solving the disease problem in the developing nations simply increase their populations to intolerable levels and make matters worse? It is probably not so. What might be accomplished would be the prospect of reducing the incidence of chronic invalidism, and vastly increasing the energy and productivity of billions of people.

Rough calculations can be made of the human energy costs entailed in certain diseases. A single day of malarial fever consumes more than 5,000 calories, for example. It has been estimated that this one disease represents a loss of about 20 percent of the total energy yield from grain production in the societies affected.

For many years, parasitic diseases have been thought of as problems unapproachable by real science, only to be dealt with by empirical and often exotic therapies. This view is changing rapidly. The cell biologists have recently learned how to cultivate malarial parasites, the immunologists are fascinated by their surface antigens, and the molecular biologists are now about to clone the genes responsible for the surface markers by which the parasites protect themselves against the infected human host. This means that a vaccine against malaria can now be thought of as an entirely feasible prospect for the near future.

The trypanosomes are becoming objects of fascination in contemporary genetics research because of their remarkable capacity to change their surface antigens whenever the host begins to mobilize an immune response, and the genes responsible for these evasions are already being studied at first hand. If a vaccine to prevent trypanosome infection in humans and farm animals can be devised, thus eliminating African sleeping sickness, this step alone would open up for agriculture a fertile African area the size of the United States which is now uninhabitable. My guess is that parasitology will soon become one of the most active fields in advanced biomedical science, and we should soon be finding ourselves in possession of an array of brand-new technologies for both immunization and treatment.

Basic research on tropical infection and parasitism will be of enormous benefit for the health, welfare, survival and economic productivity of the impoverished countries, but there is another area of science which can be of equal importance in the long term. We are just entering a new scientific frontier in agriculture, thanks to the recent advances in molecular genetics and the recombinant DNA technique. There is now a real possibility that genetic manipulation can be used to transform the stress tolerance and disease resistance of current crops and grasslands. It has been predicted by Frank Press, president of the National Academy of Sciences, that "some 40 percent of the world's uncultivated but potentially productive land can be brought into production," if fundamental problems in plant genetics can be solved.

Here is also an opportunity for the Third World nations to begin developing their own science base in biological science and biotechnology. This matter has been the subject of a wrangling debate within the United Nations Industrial Development Organization. At a meeting in Madrid in early September 1983, ministerial delegates from 25 countries formalized an agreement creating, on paper, an international center for research and training in biotechnology, but they were unable to agree on a site for the center. India, Pakistan, Thailand, Tunisia, Bulgaria, Italy, Spain and Belgium each proposed themselves as host countries. The discussion broke down in an argument over the question of the attractiveness of the center's location to world-class scientists. There was the predictable polarization: the representatives of the Third World countries insisted that the center should be based somewhere in their region, while the American, British, French, West German and Japanese spokesmen were uniformly negative. The name of the center was agreed upon, but little else: it will be called the International Center for Genetic Engineering and Biotechnology.

I should think that the dispute could be made less political and more scientific if the objectives of the proposed center were narrowed down and focused sharply on one high priority area of research. There is little need in the Third World, at the moment anyway, for a major biotechnology research installation doing research and development on genetic engineering across the board. The manufacture for profit of products like growth hormone, interferon, insulin or industrial enzymes is unlikely to be of much use to the economies of such societies. On the other hand, the application of genetic manipulation to agricultural research would be directly relevant. Moreover, the logical place to do this work would be in the regions of the planet where, for a variety of reasons, agriculture is technically infeasible or inadequate. The potential crops and feed animals to be improved exist in India and Africa, not in Belgium.

Indeed, rather than having just one center in a single impoverished country, it would be more useful to set up a network of collaborating agricultural research centers in various countries of the Third World. The problems in agricultural research have become of engrossing intellectual interest to many scientists throughout the industrialized world, and I have no doubt as to the feasibility of recruiting investigators to centers where the regional problems are both novel and urgent. Indeed, there is already an informal establishment of excellent Third World scientists trained in Europe and the United States who have expressed enthusiasm for the installation of biotechnology centers in their home countries, and who are confident that these institutions can, in time, become centers of excellence. It is a very different thing from the past (and failed) attempts to introduce heavy industrialization in hopes of transforming a poor country's economy. The scientific improvement of agriculture, and as a result, the transformation of a society's nutrition, has a greater potential for the improvement of human health than any other aspect of modern technology.

In conclusion, the objection usually cited against proposals of this sort, privately if not publicly, is that the survival assured to more children and the longer life span assured to more adults could abruptly increase the population beyond the resources of any conceivable food supply. I do not believe this. My guess is that populations given some confidence that living itself is possible, and that live children are possible, would be stabilized as never before in their histories. With luck, birth control could be accepted as a necessity for living (as it is now in China) and the present disastrous upswing in population might begin to level off. Without such a change in basic health standards, the curves will surely keep ascending, straight up until the final crash. It is worth the risk, I believe, and more than worth the relatively modest investment of money and talent from our side.

But my final argument, in my last ditch, is the simplest, most primitive and perhaps least persuasive of all in terms of foreign policy. We owe it. We have an obligation to assure something more like fairness and equity in human health. We do not have a choice, unless we plan to give up being human. The idea that all men and women are brothers and sisters is not a transient cultural notion, not a slogan made up to make us feel warm and comfortable inside. It is a biological imperative.


We are not finished with great extinctions. The current anxiety in some biological quarters is that the next one may be just ahead, and will be the handiwork of man.

At a national meeting of biologists and biogeographers held in Arizona in August 1983, the history and dynamics of extinction were the topics of discussion. The consensus was that the number and diversity of living species may be on the verge of plummeting to a level of extinction matching the catastrophe that took place 65 million years ago, and that this event will probably occur within the next 100 years and almost certainly before 200 years.

It will be caused, when it occurs, by the worldwide race for agricultural development, principally in the poorer countries, and by the appalling rate of deforestation. Although tropical forests cover only around six percent of the earth's land, they harbor at least 66 percent of the world's biota, animals, plants, birds, and insects. They are currently being destroyed at the rate of about 100,000 square kilometers per year. Elsewhere on the planet, urban development, chemical pollution, especially of waterways and shoreline ecosystems, and the steady increase in atmospheric carbon dioxide are posing new problems for a multitude of species.

The animal species chiefly at risk for the near term is humankind. If there is to be a mass extinction just ahead, we will be the most conspicuous victims. Despite our vast numbers, we should now be classifying ourselves as an immediately endangered species, on grounds of our total dependence on other vulnerable species for our food, and our simultaneous dependence, as a social species, on each other.

But do not worry about the life of the earth itself. No extinction, no matter how huge the territory involved or how violent the damage, can possibly bring the earth's life to an end. Even if we were to superimpose on the more or less natural events now calculated to be heading toward a mass extinction the added violence and radioactivity of a full-scale, general nuclear war, we could never kill off everything. We might reduce the numbers of species of multicellular animals and higher plants to a mere handful, but the bacteria and their resident viruses would still be there, perhaps in greater abundance than ever because of the expanding ecosystems created for them by so much death. The planet would be back where things stood a billion years ago, with no way of predicting the future course of evolution beyond the high probability that, given the random nature of evolution, nothing quite like us would ever turn up again.

If the ecologists are right in their predictions, we are confronted by something new for humanity, a set of puzzles requiring close attention by everyone. It is something more than an international problem to be dealt with by the specialists in each nation who deal with matters of foreign policy. Human beings simply cannot go on as they are now going, exhausting the earth's resources, altering the composition of the earth's atmosphere, depleting the numbers and varieties of other species upon whose survival we, in the end, depend. It is not simply wrong, it is a piece of stupidity on the grandest scale for us to assume that we can simply take over the earth as though it were part farm, part park, part zoo, and domesticate it, and still survive as a species.

Up until quite recently we firmly believed that we could do just this, and we regarded the prospect as man's natural destiny. We thought, mistakenly, that that was how nature worked. The strongest species would take over. The weak would be destroyed and eaten or used in other ways, or pushed out of the way. Nature red in tooth and claw. All that. We are about to learn better, and we will be lucky if we learn in time.


Getting along in nature is an art, not a combat by brute force. It is more like a great, complicated game of skill.

Altruism is one of the strange biological facts of life, puzzling the world of biology ever since Darwin. How can one explain the survival of any species in which certain members must, as a matter of routine, and under what appear to be genetic instructions, sacrifice their own lives in the interests of the group? At first glance, the theory of natural selection would seem to mandate the permanent elimination of any creatures behaving in this way.

Altruism is on its face, a paradox, but it is by no means an exceptional form of behavior. It is extremely interesting to biologists, but not because it is freakish or anomalous. In most of the social species of animals altruism is essential for continuation of the species, and it exists as an everyday aspect of living. It is seen in its most outlandish form in the social insects, where the hive or nest depend for their survival on outright suicide. The sting of a worker honeybee is a barbed spear, and in the act of stinging a predator the bee is necessarily eviscerated. Among the colonies of termites, wasps and ants there are hordes of volunteers ready to rush out in defense of the group, getting themselves killed in the act. It seems like unselfish behavior, hard to explain in terms of the individual insect, but it makes good sense for the species.

Haldane was the first to perceive the genetic mechanism underlying this kind of altruism. Briefly put, the creature sacrificing itself for the group is actually acting in aid of the survival of its own genes, and the extent of this peculiar kind of survival is determined by the degree to which these genes are shared by other members of the group. As Haldane put it, "I would give up my life for 2 brothers or 8 cousins." W.D. Hamilton, years later, turned the intuition into a consistent mathematical formulation, the theory of inclusive fitness, which not only accounts for the survival of the individual altruist's genes but, more important for the evolution of this kind of behavior, predicts the spread of genes for altruism throughout the population.

Altruism is perhaps not so much an everyday aspect of human behavior, and there is no way of proving or disproving a genetic basis for its display when it does occur. Sociobiologists, E.O. Wilson for example, believe that human altruism is genetically governed and exists throughout our species, whether or not in latent or suppressed form. Others, the anti-sociobiology faction, do not believe there is any evidence for altruistic genes at all, and attribute behavior of this kind solely to cultural influences. They do not, of course, deny the existence of human altruism, they simply deny that it is a heritable characteristic. For all I know, either side of the argument could be right, but I would insert a footnote here with the reservation that human culture itself is not all that non-biological a phenomenon. We may not be inheriting genes for individual items of cultural behavior, but surely we are dominated by genes for language, hence for culture itself, whatever its manifestations.

Altruism remains a puzzle, but an even deeper scientific quandary is posed by the pervasive existence of cooperative behavior, all through nature. To explain this we cannot fall back on totting up genes and doing arithmetic to estimate the evolutionary advantages to kinships. And yet it is there, and has been since the beginning of life. The biosphere, for all its wild complexity, seems to rely more on symbiotic arrangements than we used to believe, and there is a generally amiable aspect to nature that needs more acknowledgement than we have tended to give it in the past.

This would be an impossible intellectual situation for biologists, especially the evolutionary biologists, if it were simply left there unexplained. Now, thanks to some work with computers by Axelrod and Hamilton, we can be persuaded that cooperating is not only a nice thing to do, it is the thing if you are looking for ways to get through very long stretches of evolutionary time in the presence of numberless other creatures with whom you are obliged to interact.

Axelrod designed a computer game, a modification of the well-known iterated Prisoner's Dilemma game. The game allows the players to achieve mutual gains from cooperating, but also provides for the possibility that one player will exploit the other by cheating, thus increasing his gain, or the third possibility that neither will cooperate, thus gaining nothing for either side.

The plain existence of cooperation in nature raises three important questions for evolutionary theory: Can cooperation be a successful strategy when applied in a situation where there are multiple encounters over time between two players, and if so, is there a particular stratagem more successful than others? Second, in genetic terms, how could cooperation have started out in the first place? And, third, how could genes for this type of behavior be selected for, in conventional evolutionary terms?

When two players are given a choice between cooperating and defecting, and if they encounter each other on only one occasion in contesting for a prize-food, say, or space, or reproductive success-there is no question as to the advantage of defecting. No computer game is needed for arriving at that conclusion. Intuitively, and also in accordance with game theory, cheating pays, once anyway.

But intuition cannot be applied to the problem of what happens in the case of many encounters. Axelrod conducted a tournament by mail in 1979, sending out requests to some experts in game theory for computer programs to be matched against each other in a game in which cooperating and cheating were to be the sole choices.

Axelrod then played fifteen entries against each other, including a variant game of his own which involved flipping a coin on each move: heads for cooperating, tails for cheating. Each candidate program was played against every other program, including an exact duplicate of itself, 200 times.

The hands-down winner was also the shortest and simplest, a strategy devised by Anatol Rapaport at the University of Toronto. This program, designated by the capital letters that have become a standard part of computer jargon, is called TIT FOR TAT, and the name is entirely appropriate. Nothing could be simpler: the player cooperates on his first move, and thereafter he copies whatever the other player does.

When played out over a sufficiently long period of time, and with enough other players with different programs, TIT FOR TAT defeats all other strategies. It does not win all of the time, but with enough opponents it wins most of the time, and in nature that is real winning. Other programs were set up to be tricky, inserting clever moves to gain an advantage by single, unpredictable shots at cheating here and there in the course of the game, cooperating most of the time but taking the tiniest chances at betrayal. All of these programs lost the game, through what seems to have been the mathematical equivalent of a breakdown of trust. Since the point of the game is that each player has something to gain at each move by playing it, to end the game is to lose it.

Success of a single program in a computer game should not, I suppose, be extrapolated all the distance I would like to take it. But it is, at the least, a small item of comfort to learn that being nice in nature, most of the time, is a successful strategy. Not, mind you, being nice all the time, only nice in return for nice behavior by another. The long-term winners in evolution, indeed the whole animal kingdom and all the other great families of living things, seem to me to behave in this way, with the conspicuous exception of ourselves. Thus far in the evolutionary history of our species, in our assemblages as nation-states, in our dealings with all the other life forms as well as each other, we have tended to exploit, to cheat whenever the occasion seems to provide a short-term advantage from cheating, and-our worst mistake-to ignore the fact that it is bound to be a long, long game. For a computer to tell us this is not, in my view, artificial intelligence, it is intelligence, the real thing.

We are not bound by our genes to behave as we do. Most other creatures-not all, surely, but most of them-do not have the option of introducing new programs for their survival, at will. They behave as they do, cooperate as they generally tend to cooperate, in accordance with rigid genetic specifications. It may be, probably in fact is, that we are similarly instructed, but only in very general terms, with options for changing our minds whenever we feel like it. Our options, and our risks of folly, are made more complicated by the possession of language. Using language makes it easy for us to talk ourselves out of cooperating, but the very changeability of our collective minds gives us a chance at survival. We can always, even at the last ditch, change the way we behave, to each other and to the rest of the living world. Since the time ahead cannot any longer be counted as infinite time, and since we tend to keep talking by our very nature, perhaps we still have time to mend our ways.


There are two immense threats hanging over the world ecosystem. Both of them are of our doing, and if they are to be removed, we-humankind-will have to do the removing.

The first is the damage to the earth we have already begun to inflict by our incessant demands for more and more energy. Although we have not yet changed the earth's climate, it is a certainty that we will do so sometime within the next two centuries, probably sooner rather than later. We are not only interfering with the balance of constituents in the atmosphere, placing more carbon dioxide there than has ever existed before by the way we burn fossil fuels and wood, risking several degrees of increase in the mean temperature of the whole planet. We are also risking a significant depletion of the thin layer of ozone in the outer atmosphere, principally by the nitrogen oxides associated with pollution. It is a telling example of the way we think about global problems that we always talk of the ozone layer as our own personal protection against human skin cancer, as if nothing else mattered. The ecological outcome of a significant depletion of the ozonophere would matter considerably more. A 50-percent increase in the ultraviolet band would increase the amount of UV-B at the higher energy end of the band by a factor of about 50 times. The energy of these wavelengths would have highly destructive effects on plant leaves, oceanic plankton, the immune systems of many mammals, and could ultimately blind most terrestrial animals.

We ought to be learning much more than we know about the day-to-day life of the earth, in order to catch a clearer glimpse of the hazards ahead. One way to begin learning would be to make better use of the technologies already at hand for the world's space programs. Somewhere on the list of the National Aeronautics and Space Administration's projects for the future is the so-called Global Habitability program, a venture designed to make a close-up, detailed, deeply reductionist study of the anatomy, physiology and pathology of the whole earth. The tools possessed by NASA for this kind of close, year-round scrutiny are flabbergasting, and better ones are still to come if the research program can be adequately funded. Already, instruments in space can make quantitative records of the concentrations of chlorophyll in the sea-and by inference the density of life, the acre-by-acre distribution of forests, fields, farms, deserts and human living quarters everywhere on earth, the seasonal movements of icepacks at the poles and the distribution and depth of snowfalls, the chemical elements in the outer and inner atmosphere, and the upwelling and downwelling regions of the waters of the earth. It is possible now to begin monitoring the planet, spotting early on the evidences of trouble ahead for all ecosystems and species, including ourselves.

The Global Habitability program could become an example of international science at its most useful and productive, if it could only be got under way. Right now, the chances of getting it set at the high priority it needs for funding seem slim. It has the disadvantage of offering only long-term benefits, which means political trouble at the outset. It is no quick fix. It is research for the decades ahead, not just the next few years. And it cannot be done on the cheap, which means wrangles over the budget in and out of Congress. And finally, it will require a full-time, steady collaborative effort by scientists from many different disciplines in science and engineering, and from virtually every country on the face of the earth, which means international politics at its most difficult. But it ought to be launched, and soon, no matter what the difficulties, for it would be a piece of science in aid of the most interesting object in the known universe, and the loveliest by far.


I said a moment ago that there were two great threats to the planet's viability as a coherent ecosystem. The second one is not a long-term one. It hangs over the earth today, and will worsen every day henceforth. It is thermonuclear warfare.

It is customary to estimate the danger of this new military technology in terms of the human lives that are placed at risk. We read that in the event of a full-scale exchange in the Northern Hemisphere, involving something around 5,000 megatons of explosives, perhaps one billion lives would be lost outright from blast and heat and another 1.5 billion would die in the early weeks or months of the aftermath. With more limited exchanges, say 500 megatons, the human deaths could be correspondingly reduced. We even hear arguments these days over the acceptable number of millions of deaths that either side could afford in a limited war without risking the loss of society itself, as though the only issue at stake was human survival.

But a lot of other things would happen in a thermonuclear war, more than the general public is aware of or informed about. What we call Nature is itself intimately involved in the problem.

According to a study by a committee of biologists and climatologists for the Conference on the Long-Term Biological Consequences of Nuclear War, recently published in Science, the following are among the probable events that will occur.1 Some of these were made public on television in October, attracting a certain amount of what I fear will be transient public concern.

Assuming that most or all of the detonations take place at ground level, the amount of dust and soot exploded into the atmosphere will darken the underlying earth over the entire Northern Hemisphere for a period of months to one year. The sunlight will be 99 percent excluded, and the surface temperatures in continental interiors will fall abruptly to below -40°C, effectively killing most plants and all forests. In the tropical zones, the loss of forests will destroy a majority of the planet's species. The photosynthetic and other planktonic organisms in the upper layers of the oceans will be killed, and the foundation of most marine food chains eliminated.

The new and extensive temperature gradients between the oceans and land masses will bring about unprecedented storms at all coastal areas, with destruction of many shallow-water ecosystems.

Radioactive fallout in areas downwind from the fireballs is estimated to expose five million square kilometers to 1,000 rads or more, most of this within 48 hours. This exposure is much higher than in any previous scenario, and is enough to kill most vertebrates and almost all forms of plant life in the affected area, including the conifers that comprise the forests in the cooler regions of the Northern Hemisphere.

Later on, months after the event, things will get worse. The ozonosphere will be gone, or nearly gone, and the planet will then be exposed to the full, lethal energy of ultraviolet radiation as soon as the dust and soot have cleared away. It was only because of the protective action of ozone that complex, multicellular organisms were able to gain a foothold in life a billion years ago, and most of these creatures are still as vulnerable as ever to ultraviolet light.

The Southern Hemisphere will be less affected, assuming that the nuclear exchange is confined to the North, but extensive damage is still inevitable throughout the globe, most of it due to chilling.

Bacterial species are less vulnerable to radioactivity and cold than are the higher organisms, but many species in the soil will be lost in the initial heat of fireballs or in the later firestorms and wildfires covering huge areas.

It is not known how many forms of life would be permanently lost. After a period of years, some of the surviving species might reestablish themselves and set up new ecosystems, but there is no way of predicting which ones, or what sorts of systems, beyond the certainty that everything would be changed.

In such an event, the question of the survival of human beings becomes almost a trivial one. To be sure, some might get through, even live on, but under conditions infinitely more hostile to humans than those that existed one or two million years ago when our species first made its appearance.

Civilization, and the memory of culture, would be gone forever. Given the kinds of brains possessed by our species, and the gift of memory, all that might be left to the scattered survivors, staring around, would be the sense of guilt for having done such damage to so lovely a creature, and a poor heritage for a poor beginning.

Lewis Thomas, M.D., is Chancellor of the Memorial Sloan-Kettering Cancer Center in New York City. Dr. Thomas is the author of The Lives of a Cell: Notes of a Biology Watcher, Late Night Thoughts on Listening to Mahler's Ninth Symphony, and other works. This article is adapted from his Elihu Root Lectures, delivered at the Council on Foreign Relations on November 1, 3 and 9, 1983. Copyright (c) 1984, Lewis Thomas.