Rationality In Times of Crisis

This paper focuses on the decision makers during the Cuban Missile Crisis of 1962 and, more importantly, on how catastrophic conflict was averted. The crisis can be seen as a non-event in the history of conflict, in that no direct military action was taken. Instead we can think of it as a psychological war of sorts: the two leaders needed to negotiate and reach a truce through diplomatic dialogue. Yet if the crisis was to be defused with minimal consequences, there were pitfalls in this process that had to be avoided, pitfalls that could have produced very different outcomes. Why and how war was averted can be thought of through this frame: thoughtful individuals strained their rational capacities to overcome their biases and external corrupting pressures, and so sidestepped disaster. Systemic errors in procedure, both at the individual level (thinking) and at the group level (bureaucracies, legislative bodies, and so on), were avoided because key decision makers could navigate them skillfully.

First of all, how close exactly did we come to the brink of nuclear annihilation? Two statements can be synthesized here to estimate it: “the mounting evidence of narrow misses during the crisis suggests that luck played a pivotal role, and that the outcome could easily have been tragic,” and “how close we came to nuclear war is inextricably linked to counterfactual questions about what might have happened.”1 In other words, there were many points in time at which events could have produced the opposite outcome, and it is a matter of educated conjecture what could have happened. Second, how exactly is the Cuban missile crisis related to the subtopic of rationality? The crisis and its outcome are portrayed as the triumph of cool heads under excessive pressure, with an especially favorable light cast on President Kennedy. This presumption needs to be examined more closely. Also, what exactly is meant by the word rationality and its correlates? One can easily understand it by connotation, but what I want to make explicit going in is the denotation. It is a belief in, or reliance on, reason as the best guide for belief and action, above and beyond authority figures, institutions, or wishful thinking. It is closely related to skepticism, which holds a doubting attitude toward knowledge claims. While one can rationalize almost anything, it is the skeptical attitude that holds this in check and helps mitigate human hubris. For our purposes here, rationality can be looked upon as a decision-making approach. That approach can lead to two different and broad ends: the instrumental maximization of a preferred goal, or an end determined by procedures and processes.

Rationality as a process may be a little too vague, so to clarify the subject we can break it down into constituent parts. First, the rational unit, whether an individual or not, has goals it wishes to achieve. The rational agent places these goals in a hierarchical list, since some are preferred over others; this implies a cost-benefit analysis among the goals. The second component is the set of strategies in the unit's mind that can be used to achieve these ends; this includes not just an assessment of possible actions but also the time and energy needed to develop new strategies.2 That is an important point to keep in mind in the context of the Cuban missile crisis. The final component is where the outcomes, or consequences, of each possible choice are weighed against one another and in conjunction with the other two components. For Kennedy and his ExComm staff, this largely meant two things: the wrong action could escalate to nuclear attack, and the intentions of the Soviets, and their possible interpretations of American actions, had to be taken into account.
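These three components can be summarized in the stylized expected-utility form familiar from decision theory (a textbook illustration on my part, not a formula drawn from the sources cited here). The agent's ranked goals supply a utility function over outcomes, each strategy induces beliefs about how likely those outcomes are, and the strategy with the greatest expected value is the rational choice:

$$EU(s) = \sum_{o \in O} P(o \mid s)\, u(o), \qquad s^{*} = \arg\max_{s \in S} EU(s)$$

Here $S$ is the set of strategies the agent has available (or can develop, at some cost in time and energy), $O$ the set of possible outcomes, $u(o)$ the value of an outcome given the goal hierarchy, and $P(o \mid s)$ the agent's estimate of how likely each outcome is under a given strategy. For ExComm, $u(o)$ was dominated by the risk of escalation to nuclear attack, and $P(o \mid s)$ by guesses about how the Soviets would interpret and answer each American move.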

In order to understand and appreciate the events that brought us to the brink of annihilation, we should take stock of the pitfalls in the process of decision making. They could easily snag decision makers who are not careful; to step on one of these mines is to spell ruin. For a start, in a time of crisis one has limited time to make decisions, and this lack of time compounds the stress put on the decision maker. Such stress has a wide range of debilitating effects. As Levy states: “high levels of stress reduce individuals’ tolerance for ambiguity, reduce their sensitivities to others’ perspectives, and increase tendencies towards scapegoating.”3 Worse still, it also “affects search, and results in the dominance of search activity by predispositions, prior images, and historical analogies rather than by a more balanced assessment of the evidence.”4 In other words, lack of time in a situation where decisions must be made induces people to fall back on faulty decision-making mechanisms. These mechanisms point to cognitive errors in the decision-making process; one example is the availability heuristic. A heuristic is a mental shortcut: the option that comes to mind first gains prominence because adopting it means a quick solution (and thus quick relief), which makes it a tempting choice.

While in a context with less at stake such mechanisms provide a convenient release valve, they are incredibly inconvenient in times of crisis when the stakes are high. When people presume of leaders a calm and collected demeanor and a rational mind, it reflects a tacit understanding of this type of pitfall and of the importance of avoiding it. It is important to keep in mind, however, that rational decision making in pursuit of optimal outcomes is still just an ideal even in the best of scenarios. A leader is never guaranteed full knowledge; he or she can only reasonably conjecture at the consequences of any one decision, and ultimately can only control for intended outcomes. This control is increased, at least in presumption, with more heads around the table, so to speak. In this way some stress can be offloaded from the individual, and perhaps new and innovative solutions can be brought to the fore. There is a danger, however, that interests within the groups involved could overtake rational analysis and implementation. The most obvious danger was that the military would push for military action during the crisis; however, “Kennedy kept the Joint Chiefs at arm’s length during the crisis, using Taylor to represent their views and interests on ExComm.”5 On the flip side, dutiful advice from others besides the executive could reflect the most rational decisions. “It is now clear that Khrushchev’s deputy opposed deployment and later in the crisis objected to military action that risked conflict. Brugioni also suggests that there were ‘conservative marshalls’ who had opposed the venture.”6 Yet despite these warnings Khrushchev went ahead and put missiles on the island of Cuba. To summarize, there was a danger in navigating these circles on both ends: adopting too much of one group's views, or failing to take in the right ones.

The bureaucratic apparatus surrounding John F. Kennedy was meant to lighten the load on his decisive shoulders, and also to serve as a check and balance against erroneous decision making.

Crisis management requires reasonably high capabilities to acquire, manage, and process data rationally in accordance with effective theories about how the world works. Neither government demonstrated such capabilities. The United States was surprised that the Soviets placed missiles in Cuba. Its theories about Soviet capabilities and intentions were wrong, and its data about the missiles came in late – almost too late.7

It is important that intelligence gathering operations are able to find the signals in all the noise of data. “In other words, the estimating process was all about data: collecting it, interpreting it, distilling it, and assessing what it meant.”8 However, as the quotation from “The Cuban Missile Crisis and the Limits of Crisis Management” implies, there are systemic issues in agencies and organizations that stymie the effectiveness and purpose they are meant to serve.

Where consistency was a given, inconsistency had to be explained, justified, and defended. Changing a previous estimate required taking a fresh look, marshaling both new and old facts, and laying out what had shifted, and why.9

The burden of putting forth new views and positing the possible future actions of rival countries is mostly avoided, or is crushed in the gears of such agencies. Even though effective intelligence is meant to predict as well as prescribe, it is unlikely to do so. For, “although it is clear that U.S. intelligence officials discovered Soviet missiles in Cuba days before they became operational, it is equally clear that they utterly failed to anticipate the presence of Soviet missiles in Cuba every day before then.”10 Khrushchev’s actions were so far outside the norm that the apparatus failed to foresee them, indeed could not foresee them. The collection and distillation of relevant information is crucial if the rational actor is to choose appropriately, but the intelligence gathering organs seemed to have failed in their duties. This is largely because, to guess at future contingencies, one usually looks at past behavior and extrapolates from there. That the agencies failed to anticipate such behavior may only reflect a radical departure by Khrushchev; his action simply could not be foretold on the basis of past experience.

The point here, however, is not to get entangled in bureaucratic politics or operational process, but to focus on the elite rational actors involved in the crisis. Decisions in matters like these involve more than one rational unit, so intentions need to be communicated properly. This is most true in times of crisis, because a misperception of a signal can – and likely will – lead to an unintended response. A signal might also be received as meaningless noise rather than as information conveying content and meaning. This is why substantive action needs to be taken, and perceived by the other party, in order to signal credibility. “Kennedy’s willingness to take action behind the backs of his North Atlantic Treaty Organization (NATO) allies in a way that risked the cohesion of the alliance indicates his determination to avoid war.”11 Behind even seemingly irrational actions there are in fact reasons. While certain actions are entirely within one's prerogative, or right, an absence of action can speak volumes. “One of the most significant moments of the crisis was when President Kennedy chose not to retaliate against Soviet surface-to-air sites when Major Anderson’s U-2 was shot down on 27 October.”12 However, this could have signaled to the Soviet Union that it need not fear retaliation from the US, or in short, that the US was weak. “Doing nothing, it was feared, would undermine U.S. international standing and embolden the Soviets to seize West Berlin, which they were constantly threatening to do.”13

The important point here is that what might seem like a rational decision can lead to unpredictable outcomes. We must look past these outcomes and down the causal chain of events to why missiles were put in Cuba in the first place. First we should ask: given the risk of nuclear armageddon, why would someone place missiles in a location that would likely provoke a response from a powerful enemy? “Kennedy’s incipient suggestion that the Russians would never fire their missiles, he says, assumes their rationality … something that the ExComm had good reason to wonder at given the great risks the Soviets were running in placing missiles in Cuba to begin with.”14 It is understandable why, on initial appraisal, this seems irrational; however, according to another hypothesis, the “Soviet decision ‘had two principal motivations and purposes’: first, to redress global strategic inferiority, and second, ‘to deter an anticipated US attack on Cuba.’”15 To unpack the argument: Cuba was a contested socio-political space that the US and the USSR were vying over. For the US, Cuba was an illegitimate state, a view evidenced by the Bay of Pigs fiasco the previous year, 1961. The US also feared a domino effect in Latin America, as subsequent countries fell under the pall of the Iron Curtain. For the USSR, Cuba was an ally state that also happened to occupy a strategic geo-political location. To put it simply, missiles were placed in Cuba to deter another Bay-of-Pigs-type operation, to secure Castro's state, and to balance power.

The United States was adding to its arsenal faster than the Soviets. There was a missile gap, and the Soviets were on the wrong side of it, with or without the Cuba deployment. The Soviets were worried about an American first-strike capability.16

Actions like this were meant to signal and communicate to the other side not to use certain strategies or to invade one's allies. Limiting a rival's options is rationally strategic, even if it means aggressive posturing and possibly provoking a response. However, “unless there was a risk of an American invasion, there was no risk of nuclear weapons being used … The readiness state of nuclear weapons did not simply determine the risk of their use.”17 Thus, such actions were entirely viable and rational, even given Khrushchev's nuclear gamble, for a decision maker can never have one hundred percent certainty in every situation. Instead of thinking, or presuming, in terms of deterministic outcomes, we should think in terms of probability. When there are multiple rational actors, one must probe the others, see how they respond, make appraisals, and then adjust future behavior. There is a learning component, which complements the fact that a state may misperceive not just another state's intentions but also its capabilities. “At each of the stages of the game, into the final two days, the players seem not to have understood the intentions of the other side nor its capabilities – particularly the nuclear capabilities of the Soviets, which the United States was to underestimate throughout.”18 But was this learning behavior possible in a crisis so novel? Learning implies iterated interactions; this event, however, marked a historical break, meaning it could not be measured in terms of the past. The misperception that this event could be compared to earlier ones is problematic for rationality advocates, precisely because of its unique position in history.
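One stylized way to express this probabilistic, learning-oriented view (again my own illustration, not notation taken from the sources) is Bayesian updating over hypotheses about the rival's intentions: the decision maker holds a prior probability for each hypothesis, observes an action, and revises those probabilities in light of how likely the action would be under each hypothesis:

$$P(H \mid a) = \frac{P(a \mid H)\, P(H)}{\sum_{H'} P(a \mid H')\, P(H')}$$

where $H$ ranges over possible intentions (deterring an invasion of Cuba, redressing the strategic balance, preparing a first strike), $a$ is the observed action, such as the deployment itself or the response to the quarantine, and $P(a \mid H)$ is how probable that action would be if the hypothesis were true. The difficulty identified above is that, for an unprecedented event, the priors and the likelihoods are themselves little more than educated guesses.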

However, this did not prevent analogical reasoning from taking hold of those sitting around at the White House. These men had shared experiences that shaped their thinking. There was the Second World War example of Munich, as “the lesson from The Second World War shared by all the American participants in the events of the autumn of 1962 was that aggression had to be faced resolutely, and that there could be no shrinking from the use of force to counter opportunism.”19 The events of Pearl Harbor were not that distant either: “[there was] moral outrage felt by the Americans at the surprise attack by the Japanese on Pearl Harbour in 1941 … the Japanese attack was held to be despicable and, thus, America could not launch a surprise, pre-emptive attack on Cuba in similar circumstances.”20 While past experience can be informative and guide our actions, it offers little help in situations that have never occurred before. Past events can only assist in a topical fashion, as most reflection on the past in analogical reasoning goes only so far as what happened rather than why it happened.

The pressure to adopt such views is great because uncertainty is endemic in such situations. However, accounting for all possible choices and their outcomes was precluded by the situation's novel nature. Analogical reasoning from past historical examples was moot, and procedural processes were useless for the same reason. This is true at least normally; however, “when leaders wanted organizations to perform actions that were not part of these organizations’ repertoire, they caused organizational planning to be redirected to expand the repertoire.”21 Thus rigid routines and operating procedures could be overridden by someone like the President. What matters is “the capacity of leaders to negate their constraining effects, provided they take time to learn what lies behind them and to override them personally.”22 So we must be wary of thinking that it was Kennedy and Khrushchev alone who managed this crisis. The bureaucratic apparatus could function to generate options: “organizational rigidity is partly a function of leaders’ lack of effort to monitor and shape planning … the planning process can be responsive to leaders’ desire for more alternatives.”23 This would largely mitigate pressure in times of crisis by spreading the burden. The crisis lasted thirteen days, from initial awareness to its denouement, allowing for the generation of the options needed; indeed, there was an abundance of options, because “the results of this process [consulting with bureaucracy] were used to generate new options almost daily.”24 It might be hasty to generalize about how much pressure was put on the rational capacities of the key decision makers. However, we should be skeptical about how many of these options were true options rather than slight modifications of other plans.

This isn’t to say there are no built in fault-lines to organizations and bureaucracies. There is the potential “politicization of intelligence:”

Intelligence officers might consciously adjust their estimates of adversary intentions or capabilities because they believe that the failure to provide ‘intelligence to please’ might result in the loss of their jobs … or loss of influence.25

In times of crisis and uncertainty, intelligence gathering gains critical importance, but it is tempered and constrained by lack of time and by human fallibility. A great deal can ride on trust in experts and analysts. Regarding the photo analysis, for example, Robert Kennedy said: “‘I for one, had to take their word for it. I examined the pictures carefully and what I saw appeared to be no more than the clearing of a field for a farm or the basement of a home’. He was ‘relieved to hear later that this was the reaction of virtually everyone at the meeting, including President Kennedy.’”26 Who can reasonably claim that these experts were immune from error?27 Pure rationality seems impossible to exercise, and a leap of faith is needed to reach decisions. “This is emphasized by Clausewitz in his concept of the ‘fog of war’ and by game theorists in their concept of ‘incomplete information.’”28 Choices are usually made with just such incomplete information, or with none at all.

Awareness of this lack, especially by an opposing side, can be played to one's advantage. In short, a bluff can be made and asserted through posturing, which can potentially lead to the outcome the decision maker wants. This misrepresentation might also be unintentional and still play into a state's hands. For example, “[there were] exaggerated intelligence estimates of Soviet strategic nuclear capabilities provided by the US intelligence community.”29 It might be suggested that posturing was happening in the US case: “so concerned was President Kennedy … he made explicit public warnings on September 4th and again on September 13th that if the Soviets placed offensive weapons in Cuba, ‘the gravest issues would arise,’ a warning understood to imply potential nuclear confrontation.”30 It can be surmised that this was a ploy to warn the Soviets to back off. However, we can only conjecture whether Kennedy really meant to employ such measures or was simply trying to secure compliance. Beyond posturing and misrepresentation strategies, there was also serious consideration, at least by some, of aggressive military options.

While some see the hawks in Congress and the military as too aggressive, these options can be understood rationally. Military action was only seen as legitimate when framed instrumentally, that is, as politics by other means. In other words, military action and diplomacy are not mutually exclusive but can reinforce each other. This is why there was danger during this event: one side simply could not back off, yet it could not sit idly by and ignore the situation either. To threaten invasion, or to threaten a missile attack, was to induce the other side to bend to one's will. This is the rationalist strategic mindset. In the past this mindset could work, but with the advent of nuclear weapons it was no longer viable, because the stakes and the possible consequences rose to unacceptable proportions.

Kennedy circumvented these issues in a useful manner: “he adopted a Socratic method, proceeding by questioning.”31 This enabled the president to take on the role of facilitator instead of authoritarian decision maker. This would, as one might imagine, smooth the emergence of a rational consensus for action and reveal muddled thinking. Given the historical novelty of the crisis, giving breathing room for solutions to develop was crucial. What was needed was the formulation of a strategy, not the adoption of the same old procedures or routes of action; old ideas could not serve as a panacea, only as guides to specific cases. “The strategy process, then, was emergent, in that the course of action adopted was continually being influenced by, and validated against, the course of events.”32

New paths were needed, and they were forged on the spot, so to speak. “Although many facts were available, nothing was known of the aims and intentions of the Soviets and so the rationality of the discussions was limited, leading to much generative thinking guided by the considerable experience of the participants.”33 I think a common assumption would be that this elite group of decision makers, engaged in deliberation, weighed all options equally and chose the best course of action to maximize the chances of achieving their preferred goal: the defusing of tensions and the removal of the missiles from Cuba. However, the actual process differed from what one would suppose.

Contrary to the conventional decision-making model in which all options are set side by side and compared, with the best option selected based on calculations about expected outcomes, the ExComm’s actual decision-making process involved a succession of yes-no choices on binary options … with each choice shaping the options subsequently encountered.34
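The contrast can be put schematically (my own paraphrase of the two models, not notation from Gibson). The conventional model picks once from the whole option set, whereas the ExComm's process was a path of yes-or-no choices in which each answer reshaped the set of options considered next:

$$\text{conventional: } s^{*} = \arg\max_{s \in S} EU(s) \qquad\qquad \text{sequential: } a_{k} \in \{\text{yes},\text{no}\}, \quad S_{k+1} = f(a_{1}, \dots, a_{k})$$

The choice of the quarantine over an immediate air strike, for example, shaped which escalation options were subsequently on the table at all.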

We need to look beyond the process, then, and elsewhere, to explain how rationality came to the rescue. The role of the key decision makers should not be underestimated. While there were dangers of erroneous decisions at both the individual and the organizational level, once these were compensated for and the executive was given strong options, reaching consensus was no longer necessary. In other words, the executive was then free to act by skirting around the accepted channels of decision making. “To resolve the missile crisis, Kennedy and Khrushchev relied on their own prerogative powers. They used Channels of Trust (doveritel’nyye kanali) … President Kennedy could bypass the ExComm, and Khrushchev could bypass the Presidium as they worked out a deal to trade the missiles in Turkey for the missiles in Cuba.”35 Given the circumstances, this can be seen as completely legitimate. It should even be applauded or admired, given all that could have gone wrong in the process.

To summarize, we have considered the framework of rationality and then attempted to apply it to the unique case study that is the Cuban missile crisis. We had to take into account the problematic nature of being a rational actor given a confluence of contexts. There was the context of psychological, or cognitive, errors that needed to be avoided. Then there were the errors of organizational bodies and their undue influence on decision making; we recognized that such bodies have both positives and negatives. The common view that organizations are rigid and get in the way of decision making was dismantled by pointing out that leaders such as Kennedy can circumvent such inadequacies. There was the recognition of action based on incomplete information and on misperception; we qualified what is meant by misperception and explored how it could be either a nuisance or a strategic boon. This concept was tied to the role of signals and communication, which was also explored. Finally, there was a discussion of Kennedy and the methods he used to defuse potential pitfalls in the decision-making process. The corollary point was that a new path of action was created through the process, and that there were potential problems Kennedy navigated past. To conclude, it is folly to think, or to argue against the strawman, that executive decision making by presidents or others is simple. The process of rational crisis decision making is problematic and full of stumbling blocks for the rational agent. What makes the Cuban missile crisis so interesting is that the executive decision makers were seemingly able to wade through these problems and come to a solution, for things could just as easily have swung the other way.

  1. Zegart, Amy B. “The Cuban Missile Crisis as Intelligence Failure.” Policy Review, no. 175 (November 2012): 28.

  2. Levy, Causes of War, 131. 

  3. Ibid., 156. 

  4. Ibid. 

  5. Scott, “Eyeball to Eyeball: Blinking and Winking, Spyplanes and Secrets,” 352. 

  6. Ibid., 351.

  7. Pious, Richard M. “The Cuban Missile Crisis and the Limits of Crisis Management.” Political Science Quarterly 116, no. 1 (Spring 2001): 92. 

  8. Zegart, “The Cuban Missile Crisis as Intelligence Failure,” 38.

  9. Zegart, “The Cuban Missile Crisis as Intelligence Failure,” 36. 

  10. Ibid., 27.

  11. Scott, Len. “Should We Stop Studying the Cuban Missile Crisis?” International Relations 26, no. 3 (2012): 257. 

  12. Scott, “Should We Stop Studying the Cuban Missile Crisis?”, 263. 

  13. Gibson, David R. “Speaking of the Future: Contentious Narration During the Cuban Missile Crisis.” Qualitative Sociology 34 (2011): 507. 

  14. Gibson, “Avoiding Catastrophe: The Interactional Production of Possibility during the Cuban Missile Crisis,” 388. 

  15. Gibson, “Speaking of the Future: Contentious Narration During the Cuban Missile Crisis,” 507.

  16. Pious, “The Cuban Missile Crisis and the Limits of Crisis Management,” 86.

  17. Scott, “Should We Stop Studying the Cuban Missile Crisis?”, 259. 

  18. Pious, “The Cuban Missile Crisis and the Limits of Crisis Management,” 93. 

  19. Grattan, Robert F. “The Cuban Missile Crisis: Strategy Formulation in Action.” Management Decision 42, no. 1 (2004): 56. 

  20. Ibid. 

  21. McKeown, Timothy J. “Plans and Routines, Bureaucratic Bargaining, and the Cuban Missile Crisis.” The Journal of Politics 63, no. 4 (November 2001): 1165. 

  22. McKeown, “Plans and Routines, Bureaucratic Bargaining, and the Cuban Missile Crisis,” 1177. 

  23. Ibid., 1177.

  24. McKeown, “Plans and Routines, Bureaucratic Bargaining, and the Cuban Missile Crisis,” 1169. 

  25. Levy, Causes of War, 175. 

  26. Scott, “Eyeball to Eyeball: Blinking and Winking, Spyplanes and Secrets,” 346. 

  27. One need only look at the photographic “evidence” the government of George Bush had for Iraq’s WMDs.

  28. Levy, Causes of War, 131. 

  29. Scott, “Eyeball to Eyeball: Blinking and Winking, Spyplanes and Secrets,” 350. 

  30. Zegart, “The Cuban Missile Crisis as Intelligence Failure,” 26.

  31. Grattan, “The Cuban Missile Crisis: Strategy Formulation in Action,” 61.

  32. Grattan, “The Cuban Missile Crisis: Strategy Formulation in Action,” 65. 

  33. Ibid., 67.

  34. Gibson, “Avoiding Catastrophe: The Interactional Production of Possibility during the Cuban Missile Crisis,” 366. 

  35. Pious, “The Cuban Missile Crisis and the Limits of Crisis Management,” 89.