That scientists today crucially affect decisions on national and international security, and therefore the fate of us all, will come as no news. After radar and jets and the A-bomb and the H-bomb and intercontinental rockets, the statement surely is obvious enough. But what does it mean? Like much else that is obvious, it is not very clear. Just how do the results of scientific research and the methods of science and the scientists themselves actually figure in decisions on arms and arms control? And how is the role of the scientist in such matters related to the more familiar functions of the politician, the military man and the ordinary citizen? Above all, what does "scientist" mean in such statements?

Even partial answers to these hard questions might help us deal with some others that are even harder and trouble us more. If by "science" is meant a difficult and specialized discipline currently accessible only to the few, a trained minority, what does this do to the democratic process? At the end of his term in office, President Eisenhower spoke of the danger that "public policy could itself become the captive of a scientific-technological élite." On the other hand, scientists, it seems, might become the captives. When scientists are drawn into the pulling and hauling of "politics," what happens to the freedom and objectivity of science or scientists? Again, given the partially hostile world in which we live, defense decisions must sometimes be made in secret. Where scientists are involved in such decisions, what does this mean for the vital features of science as a fallible but open, verifiable and self-correcting enterprise?

Especially in the two years or so since Sir Charles Snow's Godkin Lectures, discussion of these and related issues has been intense, sometimes bitter, and I think on the whole useful. But the issues have provided matter for both of Sir Charles' renowned "Two Cultures": exciting literary material and a supply of blunt weapons for the factional quarrels and feuds among scientists. As a result, while there has been some light shed, there has also been much mystification.

EDITOR'S NOTE: This article is based on a longer monograph bearing the same title, presented at a conference of the Council of Atomic Age Studies at Columbia University, of which Mr. Christopher Wright is executive director.

The Godkin Lectures, delivered at Harvard in the fall of 1960, begin with the dark words:

One of the most bizarre features of any advanced industrial society in our time is that the cardinal choices have to be made by a handful of men: in secret: and, at least in legal form, by men who cannot have a first-hand knowledge of what those choices depend upon or what their results may be.

When I say "advanced industrial society" I am thinking in the first place of the three in which I am most interested-the United States, the Soviet Union, and my own country. And when I say the "cardinal choices," I mean those which determine in the crudest sense whether we live or die.[i]

This opening sets C. P. Snow's major theme. He illustrated it, of course, with a dramatic story of the two English scientists, Sir Henry Tizard and F. A. Lindemann, Lord Cherwell, and their role in relation to the vital decisions on air defense and strategic bombing in England just before and during World War II.

If Sir Charles is right, the cardinal choices of the United States, the United Kingdom and the Soviet Union can, it seems, be directly understood only by scientists and yet are and must be, "at least in legal form," made by non-scientists who are exposed to the advice of only a few. Sir Charles, who is at home in both the Scientific Culture and the Literary one, moves so easily from one to the other that we are never quite sure how to take lessons from what he calls his "cautionary" tales. Are they literally true? Or are they literature? The critical response to the Godkin Lectures by less partisan participants in the events they describe suggests that these stories may be fables. None the less, even a fable may contain a useful moral: the troubling questions remain.

Even in the limited sense of Snow's definition of "scientists," the cardinal choices he refers to are not simply, as he suggests, a domain of "science." The decision at the start of World War II to develop a fission bomb, or the decision to use it against Japan, or the decision to develop an H-bomb, or to bomb German cities during World War II, called for much more than natural science and engineering. Such decisions have narrowly technological components, but they involve just as essentially a great many other elements: military operations and counter-operations by an enemy, the economics of industrial production, the social and political effects of bombing on populations, and many others. Some of these other factors are qualitative. Many are quantitative, and in this very broad sense "technical." (They involve numbers and may be related in a numerical model.) However, even these do not fit into any of the traditional disciplines of natural science or engineering. They do not, for example, come under the head of electrical engineering or physical chemistry. And natural scientists and engineers do not normally acquire a professional acquaintance with subjects such as the cost of buying and operating a fighter bomber or the disaster behavior of urban populations. Nor do they ordinarily find these subjects essential in the course of engineering work in developing a bomb.

In fact, in addressing the complex cardinal choices, one of the inadequacies sometimes displayed by natural scientists is that they may ignore, or assume implicitly, or simply receive, or themselves casually estimate without enough study, the values of those variables that fall outside the traditional natural science disciplines. The cardinal choices, in Snow's sense, cannot be well made solely on estimates of the feasibility or infeasibility of some piece of hardware. They are political and military, strategic decisions. Technology is an important part, but very far from the whole of strategy.

Snow is not alone in creating this confusion. It is a very widespread practice among scientists concerned with public policy, and especially among those who direct urgent popular appeals. In the letter that Bertrand Russell sent in 1955 to heads of state enclosing a call for what later became the Pugwash Conferences, he began: "I enclose a statement, signed by some of the most eminent scientific authorities on nuclear warfare." The signers were indeed without exception eminent scientists, but among the ten physicists, chemists and a mathematical logician who were included, not one to my knowledge had done any empirical study of military operations likely in a nuclear war.

Similarly it is usual to find, at the head of petitions advocating some specific nuclear policy, sentences that run: "As scientists we have knowledge of the dangers involved," followed by the signatures of tens or even thousands of scientists, only a few of whom have examined the empirical evidence on more than one or two of the many alternative dangers involved in the policy choice. Simply as a scientist no one has a knowledge of these complex choices.

The bombing controversy in 1942, one of the "cardinal choices" recounted by Snow, was somewhat ill-defined, as is not unusual in such policy disputes. It had to do among other matters with the relative emphasis in air strategy on offense or defense: with how, for example, to allocate resources between the strategic bombing of German towns and the air defense of coastal shipping. This is hardly the sort of thing one would normally submit to a vote of the Fellows of the Royal Society, as Snow suggests, or to the "general population" of natural scientists and engineers. And not simply or in principle because of the difficulties of secrecy.

A good answer to the allocation question depended on a great many things, including, on the one hand, how rapidly bombers would be manufactured, how soon after manufacture they could be made an operational part of the military forces, losses that might be expected from enemy defenses, the expected number of sorties in the operational life of these bombers, the shape and population density of German cities, and the types of building in them, the efficiency of the German fire-fighting services, the reaction of populations to the stress of air raids; and, on the other hand, the effectiveness of these same bombers against enemy ships of war, the allied shipping and supplies likely to be saved, the military worth of these supplies, etc., etc. These are not matters found in physics textbooks. Nor could the Fellows of the Royal Society be expected to qualify for independent judgment on them in the course of a week.
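To make the form of such an allocation question concrete, a simplified comparison of the kind the analysis requires might run as follows. Every number below is an illustrative assumption, not a wartime figure:

```latex
% Hypothetical expected-value comparison over one bomber's operational life.
% All parameters are assumptions chosen only to show the structure of the choice.
%
% Strategic bombing: suppose a bomber survives for $s$ sorties, carries tonnage
% $t$ per sortie, and a fraction $e$ of that tonnage falls effectively on target:
%   E[\text{effective tons on cities}] = s \cdot t \cdot e,
%   \text{e.g. } s = 10,\; t = 3,\; e = 0.2 \;\Rightarrow\; 6 \text{ effective tons.}
%
% Convoy defense: suppose the same aircraft, over the same operational life,
% prevents with probability $p$ the loss of shipping and supplies of tonnage $S$:
%   E[\text{shipping saved}] = p \cdot S,
%   \text{e.g. } p = 0.05,\; S = 100{,}000 \text{ tons} \;\Rightarrow\; 5{,}000 \text{ tons saved.}
%
% The decision then turns on the relative military worth of 6 tons of bombs on
% cities against 5,000 tons of shipping saved, and on the confidence one can
% place in each parameter. None of these quantities comes out of physics alone.
```

Even this toy version makes the essential point of the paragraph above: the binding uncertainties lie in the parameters, which come from production schedules, operational experience, enemy behavior and economics, not from any single natural-science discipline.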

Over a longer period, such questions are open to study, but, and this is a critical point, they are open to study and answer by a much wider group than engineers and natural scientists. And there is little evidence to suggest that the technologists are signally better at such study than others. Not only are some of the principal variables subject matter for the behavioral sciences rather than physics, but the appropriate methods of study also may be closer to the methods of some behavioral sciences. Blackett, writing in 1943, not long after the bombing controversy, pointed out that the mathematical methods he employed are in general use

. . . in those branches of science whose subject matter has similar characteristics. These characteristics are that a limited amount of numerical data is ascertainable about phenomena of great complexity. The problems of analyzing war operations are almost all of this type and are therefore rather nearer, in general, to many problems, say, of biology or of economics, than to most problems of physics, where usually a great deal of numerical data is ascertainable about relatively simple phenomena.[ii]

Perhaps even more important, while the job of gathering and examining relevant empirical data might be very laborious, the gist of the methods for using the data is quite generally accessible. The methods are within the grasp of an intelligent administrator, and, given time, open to his skeptical questioning.

I have stressed the phrase "given time." Without time on these complex questions, a government official is not likely to have a full understanding and may make some poor choices. This is also true, however, of the technologist or the analyst of tactics or strategies. One of the principal differences between our present situation and the circumstances in which decisions had to be made in World War II is that today we frequently have time. It is a salient difference bearing on the question of how technologists, strategists, military and political men may figure in cardinal choices. For the major peacetime decisions are seldom final. In short, given time, the decision-maker without a degree in physics, mathematics, or for that matter, mathematical economics, is quite capable of having a "first-hand knowledge of what those choices depend upon" and what their results may be.[iii] And he often will have time.


In the view of some scientists, it would appear that judgment does not really require time or a great deal of grubby work. It is, according to Snow, more a matter of intuition, an attribute of a few gifted men, a kind of "prescience." This quality evidently is present especially in scientists, who "have something to give which our kind of existential society is desperately short of: so short of, that it fails to recognise of what it is starved. That is foresight."[iv] Foresight is "not quite knowledge," but "much more an expectation of knowledge to come . . . something that a scientist, if he has this kind of sensitivity latent in him, picks up during his scientific experience."[v] Some men other than natural scientists have this gift but, we gather, much more rarely and in a lesser degree.

The popular fantasies relating the pursuits of science to sorcery and an almost superhuman thaumaturgy make such a view of prescience rather widely credible. Not only the layman but statesmen as well may talk in these terms of the mysteries of science: Churchill called the technological race in World War II the "Wizard War." We may not, as Mr. Eisenhower feared, become captive of a scientific élite, but it would seem that scientists, or at least the best scientists, may indeed be The Elect. And many of them have felt charged with a prodigious mission and a great moral urgency. Spurred by an apocalyptic vision of world annihilation, they urge a drastic transformation in the conduct of world affairs in the immediate future. They have been passionately sure that the choices are stark and clear: annihilation on the one hand or a paradise on earth. "Remember your humanity and forget the rest," read the invitation to the first Pugwash Conference. "If you can do so, the way lies open to a new Paradise; if you cannot, there lies before you the risk of universal death."

For many scientists there is very little time. C. P. Snow predicted at the end of 1960 that if events proceed on their present course, nuclear war is "a certainty . . . within at the most ten years."[vi] Which does not leave much time. The clock on the cover of the Bulletin of the Atomic Scientists started so near twelve that a while ago it had to be set back.

In bringing about a new sort of world, the scientists feel that they have a special responsibility. They are free of the insincerities and dubious motives of the traditional actors on the political scene; they are interested only in clarification and truth. Furthermore the coöperative and potentially universal nature of the scientific enterprise is at hand as a model for a future world order, and the scientists can be vital agents in bringing that order about. "Scientists of the World, Unite!", the title of an article by a Princeton physicist appearing immediately after the war, sounds the right note.

This vision of the responsibility of the scientist ("a greater responsibility than is pressing on any other body of men,"[vii] according to Snow) puts him in a very different role from the scientist as technologist or the scientist dealing by tentative and empirical methods with broader questions or cardinal choices. It is fortified, however, by the confusion between technologist and strategist and by the related notion of the scientist as specially endowed: a seer or prophet.

The notion bears a strange resemblance to that of the prophets in the chiliastic and apocalyptic movements that swept Europe centuries ago in times of great disorientation, anxiety and instability. It has some inspirational uses, but a great many disabilities. Like past eschatology, it encourages schismatics, and the feuds among the scientists have been intolerant and implicitly rather bloody. Snow's tale of Lindemann and Tizard unconsciously illustrates the point: Lindemann is the dark angel, sadistic and violent, without the gift of foresight. And Blackett, a passionate battler against the forces of darkness, uses the story in his innumerable present feuds.

But most important, this urgent, tense feeling of mission can sometimes bias the technological studies and, even more, tends to discourage the use of the patient and tentative method of science, as distinct from the authority of science, in assisting the cardinal choices of which Snow speaks. It has led in some cases to a rather surprising anti-rationalism.


Anyone who has searched diligently for a device, which in hostile hands might demolish what he had been building the previous year, is not likely to forget the sickening sensation of finding it. Yet that is the occupational hazard of a working strategist, a conscientious designer of what may be called "conflict-solving systems," that is, systems for keeping the peace or fighting a war, where the opponents' countermeasures must be taken into account. Thus the honest strategist must wear two or more hats, and this can be something of a personal strain. It can actually lead to quarrels among friends and organizations. The inventor of an ingenious measure may come to regard the inventor of an even more ingenious countermeasure with some distaste or even detestation. Whose side does the fellow think he is on?

All of which is true enough for the design of some national or alliance weapon system for possible use in a war. The personal strain and the strain on friendship are likely to be even worse where the system to be designed is an international control system. For while with national defense measures the element of at least partial opposition by an enemy is, or always should be, as plain as can be, it is not so plain in the case of an international system. Here one has an agreement with an adversary, and it is tempting to believe that he will coöperate. A scientist who works on evasion schemes is almost certain to be regarded as a leper. Isn't he opposing the agreement and ruining the possibility of international control? This is a nearly universal attitude. It was frequently voiced in protest against studies of possible ways to evade the test ban. Now it may be that some of the men who find it easiest to work on evasion schemes are those who oppose the agreement. None the less, anyone who is soberly in favor of an agreement with adequate safeguards should systematically and seriously wear both hats all the time. Two illustrations will suffice to show how, in the case of the test ban, each of the two principal factions has found it hard to deal with countermeasures, except where these support a point of view it is propounding anyway.

First, Edward Teller: Dr. Teller in my view has performed an important service in helping to develop a test ban with adequate controls, by thinking ingeniously about the possibilities of evading the various control systems that have been proposed. On the other hand, when it has come to supporting his views on the importance of testing, he has argued that we would lose more than the Russians would if we both stopped testing. As the defender, in contrast to the aggressor, we have a harder job. Therefore, he reasons, testing will enable us to develop the more sophisticated weapons we need for use in defense. However, in this argument he ignores the fact that the Russians will also be developing their weapons of aggression as counters to our defense, and there is no a priori reason for believing that they won't make more rapid strides in their "easier" job than we in our difficult one. In the past the development of nuclear weapons has favored the offense. In short, when it comes to the exploitation of tests in the development of weapons, Dr. Teller ignores countermeasures; they do not suit his argument. He has been extremely ingenious in considering enemy countermeasures to thwart control systems; these countermeasures do suit his argument.

Next, Hans Bethe: Dr. Bethe has been the symmetrical opposite of Dr. Teller on this matter as on others. As far as evasion schemes are concerned, he has said that he was embarrassed at presenting to the Russians the possibility conjured up by another American because it "implied that we considered the Russians capable of cheating on a massive scale. I think that they would have been quite justified if they had considered this an insult."[viii] This suggests that it is all right to set up a police system, but not against potential crooks. His own energies in any case were devoted to the measures rather than the countermeasures. On the other hand, when it came to evaluating the military worth of weapons that might be developed with the aid of testing, such as anti-missile missiles, Dr. Bethe could frequently think of nothing except enemy countermeasures that would reduce their military worth nearly to zero. Dr. Bethe, like Blackett, is, without any extensive study, quite certain that enemy countermeasures like decoys would make a defense against ballistic missiles useless-or even harmful-in any reasonably likely contingency.

There are two points which emerge from this discussion of countermeasures. First, most physical scientists and engineers find it hard to deal with an enemy countermeasure, except where it spoils a system they themselves dislike on other grounds. This, I believe, is sometimes associated with an aversion to putting the fact of hostility in the center of their attention. Many of the articulate scientists, especially when considering arms-control agreements, prefer to think of harmony rather than conflict. The difficulty they have in contemplating countermeasures stems from hostility to the fact of hostility itself. In this way they slip more easily into the role of prophet and agent of a perfectly peaceful world.

The second point is that the evaluation of countermeasures in military conflict systems is likely to be very complicated, requiring painstaking analysis, seldom undertaken by the technologists themselves. It involves for one thing an extensive canvass of potential military operations on both sides and their possible interactions, and sometimes a consideration of allies and more than one adversary; I believe neither Dr. Teller nor Dr. Bethe has done this sort of systematic analysis of the military worth of the weapons they talk about. Both are experts in the basic technology of bomb design, but that is quite another matter.

Questions of military worth are broader than physics and in some ways harder. They of course are not purely military questions any more than they are purely technological. They may involve a forbidding nest of problems including political and economic, as well as military and technological, questions. However, on the questions that have called for systematic analysis, characteristically there has been no experience that was precisely relevant. For these questions relate to a near or distant future affected by novel techniques and political uncertainties. Experts are seldom "expert beyond experience," and analysis is needed, not to replace intuition, but to sharpen and supplement it, and to make it more public and verifiable.

The role of uncertainty in decision-making as well as in system studies to aid decision is so prominent that it is worth dwelling on, especially as it is related in several ways to some recent obscurantism. Attempts to prepare for anything other than the probable events are branded as "paranoid" by Erich Fromm. By this definition, all of us who live in normally fire-safe neighborhoods and houses and none the less take out fire insurance are paranoid. On the contrary, it would be simply irrational to stake everything on a "most likely" event where the uncertainties are so large and intrinsic. This would be true even if we were quite sure we knew which were the "most likely" events and could agree on what are useful objectives in these contingencies.
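The fire-insurance point can be put in elementary expected-value terms. The symbols and numbers below are illustrative assumptions only, a sketch of the reasoning rather than any calculation from the text:

```latex
% Let p be the (small) probability of losing a house of value V, and let
% c be the insurance premium, with c > pV (the insurer must take a margin).
% In purely monetary terms, insurance has negative expected value:
%   E[\text{wealth, insured}] - E[\text{wealth, uninsured}] = -c + pV < 0.
%
% But for a concave utility function u (ruin hurts disproportionately),
% the insured position can still be preferred:
%   u(W - c) \;>\; (1-p)\,u(W) + p\,u(W - V)
% holds for quite small p whenever V is a large share of total wealth W.
%
% Preparing against an improbable catastrophe is therefore not "paranoid"
% but rational: what governs the decision is the size of the stake and the
% spread of outcomes, not only the likelihood of the worst event.
```

The same logic carries over to the strategic case: where the stakes are national survival and the probabilities are intrinsically uncertain, hedging against other than the "most likely" contingency is exactly what a rational decision-maker does.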

The scientists have been very far from agreement. In retrospect, their views since World War II on major strategic issues (the feasibility and usefulness, in deterring or fighting a war, of active or civil defense, of the ability to bomb enemy industry or cities or military forces, of tactical nuclear weapons, of restraint in nuclear war, and many others) show an extraordinary sequence of sudden and repeated reversals.[ix] The principal factions of scientists have remained in opposition, sometimes, however, almost exactly changing place. Moreover, the thought devoted to defining these issues and the evidence gathered for resolving them in no case warranted the certainty with which opposing views were propounded. This is not to say, of course, that "the politicians and the generals," with whom the physical scientists are contrasted, have been right. It would be hard to show, however, that the scientists have been on the whole more realistic or more prescient. Moral certainty and feelings of prescience have been a pretty uncertain guide to the future, even to the immediately next future beliefs of the prophets.

The gift of prescience is not only hard to come by for oneself; it is difficult to identify in others. Snow, who should be a great connoisseur of prescience, has run into difficulties. He derides Lindemann for backing infrared detection: "This seemed wildly impracticable then. ... It seems even more wildly impracticable now."[x] Chinese Communist pilots downed by Sidewinder missiles with infrared homing devices would disagree. This would appear to be a case, in short, where Lindemann's prescience exceeds Snow's present knowledge of what has long since happened. Snow and Blackett take much too literally one of the lessons Snow draws from his cautionary tale: "The prime importance, in any crisis of action, of being positive what you want to do. ... It is not so relevant whether you are right or wrong."[xi]

In fact, serious study of the large uncertainties in the major strategic choices we have had to make suggests the opposite. Bertrand Russell in a better day once said, perhaps overstating the matter a bit: "The opinions that are held with passion are always those for which no good ground exists; indeed the passion is the measure of the holder's lack of rational conviction."[xii] Passionate assurance on these intrinsically uncertain matters is not justifiable on logical grounds. Some technologists who are most articulate on matters of public policy in the defense and arms-control field should worry us most in their moments of boundless conviction, when they assume the role of seers. The tentative and fallible methods they have used professionally seem even more appropriate in the complex and uncertain areas of cardinal choice.

Don Price, in a brilliant article, "The Scientific Establishment," has developed with admirable lucidity the difference between the role of the scientist in the United States and the picture that Snow attributes to the United States, the United Kingdom and the Soviet Union.[xiii] In the United States the scientists have had unmatched opportunities for getting a direct political hearing for their ideas on policy. On every one of the cardinal choices cited by Snow, scientists have been heard, and by top decision-makers. On the other hand, I know of no clear evidence that in the Soviet Union scientists have affected the cardinal choices either on the basis of their prescience or on the basis of systematic study of major alternatives.

In the United States the problem of scientists and strategists is, I think, by and large not so much in being heard as in saying something, that is, saying something that is the result of thought and empirical study.


It is high time that we recognized the extreme implausibility of the notion that war may become "impossible" in the next short space of time. On the other hand, neither is nuclear war inevitable in the next ten years, or many more. Since reducing the likelihood of war will preoccupy us for many years to come, it is appropriate to think of the probable consequences of this persisting preoccupation, some of which are already visible.

Decision-makers are likely to acquire a deep familiarity with these problems in the course of time, and to grow in professional competence in the continuing work on their solution. This is happening today, for example, in the Department of Defense. A year ago The New York Times published a statement of the Secretary on the issues in the choice of strategic bombardment vehicles for the late 1960s and after.[xiv] Whether or not we agree with the specific choice it explains, the document is an impressive one. In its thoughtful treatment of the uncertainties and the essential technological as well as operational and economic problems, it compares very favorably in sophistication with the analyses done by scientists to aid decision during World War II. Moreover, anyone who follows the Congressional Hearings will be quite convinced that such statements are comprehensively understood by a good many current decision-makers. These cardinal strategic decisions in general are made by them.

There is a good deal of hocus-pocus in Snow's pronouncement that the decision-makers "cannot have a first-hand knowledge of what those choices depend upon."[xv] There is, of course, a sense in which nobody can have first-hand knowledge of all the things such decisions depend upon. They depend upon a great many things besides technology, in many fields. However, the choices that Snow dwells on, for example in his cautionary tales, are not all that obscure, and a first-rate Cabinet officer or military man can master the essentials of much more complicated matters, especially if they keep coming up. And they do.

The other side of this picture is that the natural and behavioral scientists, who offer advice or do analyses to assist decision, may experience a growth of professional competence too. Offhand judgments of individuals and crash studies by committees will always be with us, and should be. But expertise and committee activities have limitations. An expert on the whole range of problems involved in even one of these complicated choices is hard to find, and if one is discovered, the way in which he reaches his conclusion may be difficult to reproduce and verify; this in turn affects whether his judgment will be subject to criticism by more than a "handful of men." Inexplicitness is likely to be even worse with committees, since they proceed frequently by bargaining rather than reason. But explicit statement of the way conclusions are reached and of the evidence is part of the normal method of science, and what I mean by "conflict-systems studies" is simply the application of the method of science to the analysis of political-military strategic alternatives.

This suggests a little of the answer to at least one of the large questions with which we began: Both the physical and the behavioral sciences have a role to play in component research on cardinal choices. And in the course of studying strategic alternatives the methods of science can be used to reach conclusions going beyond the skills of any of the individuals involved. The important point is that on these complex cardinal questions, answers are won precariously and intermittently, in the course of hard empirical inquiry into the major factors affecting choice. Intuition and intelligence help, but do not make superfluous the study not only of the vital technologies, but also of the behavior of men and nations using, and affected by the use of, such technologies. No one has the gift of reliable foresight on these cardinal choices. The primary thing, then, is not to be positive. The basic failure of the physical scientists and engineers in their turbulent history during the cold war is not their lack of prescience but their acting frequently as if they had it.

[i] C. P. Snow, "Science and Government: The Godkin Lectures at Harvard, 1960." Cambridge: Harvard University Press, 1961, p. 1.

[ii] "Operational Research," British Association for the Advancement of Science, Burlington House, Piccadilly, London. Reprinted from The Advancement of Science, v. 5, April 1948, p. 29. (Quarterly Journal of the British Association).

[iii] "Science and Government," p. 1.

[iv] Ibid., p. 81.

[v] Ibid., p. 82.

[vi] "The Moral Un-Neutrality of Science," Address to the 1960 meeting of the American Association for the Advancement of Science, reprinted in Science, Jan. 27, 1961, v. 133, p. 259.

[vii] Ibid., p. 259. I have discussed this and related predictions by scientists in "Nuclear Sharing: NATO and the N + 1 Country," Foreign Affairs, April 1961.

[viii] "The Case for Ending Nuclear Tests," The Atlantic Monthly, August 1960, p. 46.

[ix] An account of this history is given in my monograph, previously referred to, which will be published in the fall.

[x] "Science and Government," op. cit., p. 34.

[xi] Ibid., p. 73.

[xii] Quoted by Charles Hussey in "Earl, Philosopher, Logician, Rebel," New York Times Magazine, May 13, 1962, p. 10.

[xiii] Science, June 29, 1962, pp. 1099-1106.

[xiv] New York Times, March 16, 1962, pp. 1 and 12.

[xv] "Science and Government," op. cit., p. 1.
