Learning to see

In my experience, the efficiency of an intelligence service depends … on the willingness of those who receive its information to pay attention to it when it contradicts their own opinions.

– Markus Wolf (former East German spy master), 1997

There were situations – Vietnam was an example – in which the US Government, starting ignorant, did not, would not learn. There was in Vietnam a whole set of what amounted to institutional “anti-learning” mechanisms working to preserve and guarantee unadaptive and unsuccessful behaviour.

– Daniel Ellsberg (referring to mid-1967), 2002

OVER THE PAST three years, since 9/11, there has been a greater than usual public interest in the uses of “intelligence” in the making of international security policy, both here and abroad, but particularly in the United States. There has also been a good deal of animated discussion about the uses of intelligence, in the more ordinary sense of the word, that is to say, the capacity for good judgement, in the making of such policy. Can we trust the judgement of George Bush, Tony Blair, John Howard? Is the CIA competent? Is it politicised? Are our own intelligence services too reliant on American analyses? Are our political leaders too compliant with American policy?

There are good reasons for asking all these questions, but those who ask them are seldom well equipped to answer them dispassionately. Above all, most people anxious about intelligence or policy challenges, or perceived failures, are rather hasty in assigning blame and too prone to presume that they know better or would have done better. Seldom is it self-evident that this would be so. For good judgement and the effective use of intelligence reports require cognitive skills and disciplines that run counter to deeply ingrained and universal human proclivities.

No such proclivities are more fundamental, or more insidious, than the tendencies to belief preservation and confirmation bias, which lock us into plausible but false hypotheses, inefficient methods for assessing whether we have things right, and resistance to information that challenges our perceptions or judgements. This is not a matter of stupidity or ignorance or gender. It’s just the way we are wired up as human beings. It is the root of our numerous superstitions, prejudices, ideologies, intractable disputes – and “intelligence failures”. It’s worth taking time out to reflect on this, rather than getting caught up in the politicised blame game.

The key to effective use of intelligence is the capacity of policy makers and decision makers to learn. This means three things. First, being educable as to basic realities. Second, being able to adapt policy as those realities change. Third, being capable of reframing policy itself when intelligence findings cast doubt on fundamental assumptions. These might be summed up as the capacities for awareness, understanding and rethinking. “Intelligence failures” occur when one or more of these conditions are not met, whether due to the deficiencies of the intelligence process or the obduracy of policy makers. Policy disasters occur when the third, in particular, is not met.

Effective intelligence agencies, therefore, must be equipped to educate policy makers in basic realities, to monitor changes in those realities with sufficient sensitivity to allow for adaptation, and to understand the policy framework well enough that they can identify problems with fundamental assumptions and draw these to the attention of policy makers in a clear manner. These tasks all involve technical and cognitive challenges of a high order. They also require institutional processes that facilitate, rather than inhibit, critical analysis and communication.

 

TO ILLUSTRATE ALL this, let me use an example from Australian history: a debate that took place in Canberra, in December 1974, regarding Indonesia and East Timor. It was a meeting of policy and intelligence officers at the First Assistant Secretary level. It included, inter alia, Gordon Jockel, head of the Joint Intelligence Organisation (JIO), Michael Cook, who was to become head of the Office of National Assessments some years later, and Richard Woolcott, who was due to become Ambassador to Indonesia a few months later.

Both Jockel and Cook observed that intelligence reports indicated the East Timorese would not accept incorporation into Indonesia and would mount a stout armed resistance. This, they noted, was contrary to a tacit assumption built into Australia’s policy on the matter and called for a rethinking of that policy. After all, Jockel remarked, such resistance could become “a running sore for Indonesia” and it would be best to attempt to head off the possibility by discouraging Indonesia from armed intervention.

Woolcott’s response was threefold and each aspect of it merits reflection. First, he countered that JIO’s assessment could be in error, since JIO had erred on other matters. In other words, he refused to accept JIO’s description of the basic realities. Second, Woolcott said, the Prime Minister, Gough Whitlam, “wants to see incorporation” – implying that this was not something that should be challenged. Third, if things come to a clash of arms, if they turn nasty, he said, the Prime Minister has “escape clauses” – which is to say, Whitlam could always claim that he had not endorsed and did not condone the use of force.

This is not the place to revisit the history of what then occurred, except to observe that Indonesia did, of course, a year later, resort to force and the East Timorese did mount a stout armed resistance.

The consequence was an island holocaust in which very large numbers – probably around 200,000 – of East Timorese lost their lives, along with a substantial number of Indonesians. The occupation of East Timor did, indeed, become a running sore for Indonesia and an insoluble problem for Australia. JIO had not been in error – but Whitlam certainly invoked the escape clauses to which Woolcott had referred before the event. He has continued to do so to this day.

What occurred in this case was that the intelligence arm of government, doing its primary job well, had a handle on the basic realities, while the policy makers, led by the Prime Minister, chose to construe those same realities according to their own lights. Despite the intelligence analysis indicating what could (and did) go badly wrong, there was no significant adaptation of policy. There was certainly no reframing of that policy – for example, a preparedness to take measures to head off Indonesian armed intervention and resolve the matter by other means.

 

SOMETHING SIMILAR SEEMED to occur in 1999, when the dismal history of East Timor came to a head in a referendum that gave it independence. While the relevant documents have not been declassified in this case, as they have been for 1974-75, the visible outlines of what occurred suggest that, in this case, too, there was a troubling disconnect between intelligence and policy. The policy makers tried to find a way to work with the Indonesians and bent over backwards not to confront them, even as intelligence reports accumulated showing that the Indonesian military had lethal intentions in regard to any East Timorese who voted for independence.

In 1999, of course, the outcome was not as dire as it had been in 1975-76. The numbers killed were two orders of magnitude fewer and East Timor gained its de jure independence, with belated Australian protection and support. Yet terrible things happened that had been fully anticipated by intelligence reports, while their probability had been denied by policy makers for months before the crisis broke. Indeed, even after it had broken, those same policy makers were at pains to deny what intelligence reports had apparently made plain: that the Indonesian military had planned and co-ordinated the mayhem and destruction attributed to pro-Jakarta militias.

Such were the strains produced within the system by this dissonance between intelligence and policy that they triggered the suicide in Washington of Merv Jenkins, a senior Defence Intelligence Organisation (DIO) liaison officer, after security officers harried him for sharing intelligence on East Timor with the Americans; prompted unidentified dissidents within the Government to leak to the press the DIO’s early warnings of what had occurred; and, as we learned in April this year, led an exasperated Lieutenant Colonel Lance Collins to allege that both intelligence and policy were being dominated by a “pro-Jakarta lobby”.

However, in 1999, one could argue, the policy makers did better than in 1974-75. They tried one thing – working with the Indonesians to keep the lid on in East Timor – then, when this became unworkable, adapted their policy. The suggestion for a referendum was a marked adaptation of long-established policy. The creation of the International Force East Timor (INTERFET) was a radical reframing of that policy, the implications of which are considerable. That a sustained effort was made to avoid an open breach with Jakarta over the matter is hardly either surprising or, in itself, culpable.

That reframing occurred at all in Australia’s Indonesia policy in 1999 is notable. The Howard Government, much pilloried on various counts, deserves some credit for what rethinking it did in that case. For such reframing often does not occur at all, or only after inordinate costs have been absorbed. Consider the Israeli occupation of the Gaza Strip and the West Bank after the 1967 Six-Day War, or the Soviet invasion of Afghanistan. Consider the potential costs of a Chinese unwillingness to reframe its policy regarding realities in Taiwan.

 

THOUGH THERE ARE any number of case studies that lend themselves to analysis along these lines, there are few as famous as the Vietnam War – the notorious “quagmire” in which five American presidents wallowed, at enormous cost in blood (especially Vietnamese blood) and treasure. It remains fertile ground for exploring the relationship between intelligence and policy-making, because it was so prolonged, is so well documented and involved so profound a dissonance between intelligence and policy-making.

The case for this last claim is more often assumed than made, but the assumption is well grounded. The dissonance was felt at the highest levels of the American Government even in the early 1960s, before Lyndon B. Johnson escalated the war. It came to be centred in the person of Robert S. McNamara, LBJ’s Secretary of Defence, as the recent documentary, The Fog of War, showed rather well. McNamara was so struck by this dissonance that, in 1967, he commissioned a team of intelligence analysts to re-examine the whole history of decision-making on the Vietnam War. Having read and absorbed its findings, he resigned.

The study became known as “The Pentagon Papers”, when one of the intelligence analysts, Daniel Ellsberg, leaked 7000 pages from it to The New York Times, which serialised it. Ellsberg, a brilliant young decision theorist from Harvard and RAND, described the experience of working on the study as shattering his faith in the corrigibility of policy by rational analysis. For a decade he’d worked on the assumption that bright analysts like himself could ensure that the executive branch had the best possible, the right advice on which to base policy. Looking back at the history of Vietnam policy since 1945, he concluded that such advice had failed abysmally in its purpose.

Ellsberg’s reflections, set out in their initial form more than 30 years ago, in Papers on the War (Simon and Schuster, 1972), and enlarged into a full-length memoir, Secrets (Viking, 2002), are classics, which every policy maker and intelligence analyst should read closely. They are classics because Ellsberg was finely educated in the specific skills of analysis required to make sense of complex policy challenges; because he had been a firm believer in the system and had worked deep inside it; and because he discovered that, in fact, the system was failing at all three levels to induce policy makers to learn.

Indeed, one of the things that seems to have contributed to McNamara’s own decision to initiate the Pentagon study was a memo Ellsberg wrote to him in June 1966 from Saigon. The memo, Ellsberg wrote in 1972, expressed his “long-time concern to understand policy-making”. It read, in part:

“One of my main motivations in leaving John McNaughton’s office [in the Pentagon] to come to Vietnam last August was a feeling that a year of reading official cables from that country had not satisfied my ‘need to know’ about the nature of the problems there. Too many events came to me and, it seemed, to others in the building as surprises, too much behavior seemed puzzling and unmotivated, the reasons for our persistent failures and setbacks there seemed too uncertain. At the end of a year’s work on Vietnam affairs I felt scarcely more educated on the situation than at the beginning. I took the chance to come to Vietnam with General Lansdale as, in large part, an opportunity to reduce that ignorance.

After only a few months, I was fully convinced of what I had suspected before: that official reporting (including Nodis and Eyes Only, Back Channel and what have you) is grossly inadequate to the job of educating high-level decision makers to the nature of the essential problems here. It did not tell them what they needed to know. Nor did official high-level visits to Vietnam (though somewhat better) in practice fill that lack. Nor are there reports to be read in Saigon that answer the questions; to rely entirely on the official reporting to Saigon from the field (as many high officials in Saigon do) is to remain untutored on many critical problems of Vietnam, as I felt and was in Washington.”

 

ELLSBERG FOUND THAT the system was not even fulfilling its most fundamental task – to educate policy makers about basic realities. On the contrary, he found that such realities were being screened out by “progress-hungry” nerve centres as field reports flowed up the chain. There was lying, there was self-deception, there was a plague of optimistically false reporting, there was a failure to even record operational experience or mistakes, there was a rapid turnover in personnel, resulting in the constant deletion of whatever institutional memory was being formed.

All these processes Ellsberg dubbed “anti-learning mechanisms”. Given his faith in the uses of rational analysis, he was appalled by what he found. Indeed, he wrote in 1972: “The process of reaching these conclusions was, quite simply, the most frustrating, disappointing, disillusioning period of my life.” And, he realised, to all these anti-learning mechanisms in the field, “the high-level decision-making process adds the barrier of extreme secrecy”. Such secrecy generated myopia and amnesia inside government, where access to information was highly compartmented, thus screening, and therefore encouraging, mendacity.

Helping the Government to learn, he reflected, had been his vocation. His various studies of crisis decisions, other than those involved in the Vietnam War, had convinced him that “a prerequisite to improving the Government’s performance was that it become self-aware, that it begin systematically to discover and analyse its own ‘hidden history’”. Too many institutional processes and practices inhibited that. As a result, he deduced, policy was not adapting, much less being reframed – with disastrous consequences.

 

THE PENTAGON STUDY was supposed to trigger self-awareness and learning, but Ellsberg found that almost no one was reading it and the default reflex in the Pentagon was to keep it classified because, in McNamara’s words, “they could hang people for what’s in there”. In recent years, McNamara has finally attempted to contribute to some learning, but it was Ellsberg who did the most to ensure that others would read it and learn from it.

In an interesting review of Ellsberg’s memoir in The New Yorker in October 2002, Nicholas Lemann argued that Ellsberg’s faith in inducing rational decisions by informing people, whether presidents or the general public, was misplaced. The fact is, he wrote, that nobody really had been lying to the White House about the probability of success in Vietnam “and they engaged anyway”, and escalated every time things looked bad. This contradicted both Ellsberg’s explanation for what happened and his whole view of the world, Lemann argued.

That view of the world was that “if decision makers can be given good information they will make rational choices”. But in reality, “it’s hard to get people to change a course they’ve set based on bad information, even after you’ve given them better information”. Why? Because they are committed on the basis more of values and emotions and ideological convictions than probability assessments and intelligence estimates. Decisions to go to war are, he asserts, “ideological, not informational” and, therefore, not readily corrigible. “It’s not what we know, but what we believe in that makes all the difference.”

There is an important sense in which Lemann is right, but he misses the main point. Information is always seen and interpreted through the sets of lenses which human beings use to make sense of the world. Those lenses are best understood as mind-sets. They include ideological convictions, religious beliefs, party political loyalties, emotional and psychological dispositions, policy commitments, beliefs about how the natural world works and what have you. The question is, precisely: to what extent are such beliefs corrigible when confronted with information that contradicts them? The answer is, as a rule: not very.

The consequences are in evidence every day in every way. With regard to the making of national security policy, they can have such baneful consequences that it is more than usually important for both intelligence analysts and policy makers to have special sets of lenses to put on when they are considering critical matters. Such lenses are critical-thinking lenses: the disciplined and principled willingness and capacity to do precisely what is hard and does not come naturally: re-examine one’s convictions and opinions and open them to revision. That’s what intelligence is for – in both senses of the word.
