Bolt from the Blue: How Companies Can Handle the Unexpected, from Government Regulation to Cyber Crime

Product Details

ISBN-13: 9781909653313
Publisher: Elliott & Thompson
Publication date: 09/01/2014
Sold by: Barnes & Noble
Format: NOOK Book
Pages: 216
File size: 2 MB

About the Author

Mike Pullen is head of international trade and a partner at DLA Piper, specialising in European Union competition law. He advised a number of eastern European governments on their accession to the EU and his views have been sought by institutions around the world, including the Department of Trade and Industry, the European Parliament and the US Department of Commerce. John Brodie Donald is head of DGIR, the D Group's research operation. He has worked as a geopolitical risk consultant with Aegis and later trained as a mediator. He has more than 20 years' experience in financial services, working with Jardine Fleming and ING Barings.

Read an Excerpt

Bolt from the Blue

Navigating the New World of Corporate Crises


By Mike Pullen, John Brodie Donald, Louis Mackay

Elliott and Thompson Limited

Copyright © 2014 Mike Pullen and John Brodie Donald
All rights reserved.
ISBN: 978-1-909653-31-3



CHAPTER 1

The five key principles of crisis management


'When war does come, my advice is to draw the sword and throw away the scabbard.'
General 'Stonewall' Jackson, speech to cadets at the Virginia Military Institute, March 1861


It is often said that 'generals always fight the last war'. This is because they spend most of peacetime studying history and developing tactics for known threats. However, advances in technology change the nature of the battlefield. This means that their plans are generally obsolete from the moment a new war starts. No plan survives first contact with the enemy, as German General Helmuth von Moltke famously said.

The massed cavalry charge, which was so devastatingly effective in the Napoleonic era, became suicidal in the American Civil War. Smoothbore muskets had been replaced by rifles, which had a much longer range and could cut down a cavalry charge with ease. As a result, the cavalry spent most of their time dismounting and fighting on foot like infantry. These tactics were, in turn, undermined 50 years later as technology further developed.

The vulnerability of infantry was demonstrated in the First World War. Machine guns and fortified trenches gave defenders an overwhelming advantage. Advancing infantry was mown down like reaped wheat in the crossfire from the opposing trenches. The overwhelming strength of fortified defences was a lesson learnt at appalling expense, and paid for with the lives of millions of foot soldiers. French military planners, having learnt this lesson, built the Maginot Line in the 1930s: an impenetrable string of concrete fortifications and interconnected bunkers that ran the length of the Franco–German border. Unfortunately, it was trounced by Germany's 'blitzkrieg' tactics in the Second World War. German panzer divisions simply went around the end of the Maginot Line through the Ardennes and Belgium. So, despite this impenetrable wall, France fell within six weeks. This was an inconceivable outcome to any soldier who had spent the First World War engaged in static trench warfare, just one generation previously.

Generals always fight the last war. At school my beleaguered history teacher, faced with an unruly class, would often quote an even more timeworn adage: 'those who do not study history are condemned to repeat their mistakes'. But it seems that those who do study history are also condemned to making mistakes, particularly when a war or crisis looms. It's not just generals who make this error. Several years on from the financial crisis of 2007, economic growth in the West is far from robust (particularly in Europe) despite interest rates being held at unprecedentedly low levels for an extraordinarily long period. This situation prompts many commentators to quip that 'economists are fighting the last depression'. Through quantitative easing, central banks have flooded the market with cheap money by buying back government bonds. High interest rates were seen as the cause of the Great Depression of the 1930s. Central bankers, having studied history, have floored interest rates with unprecedented vigour but it doesn't seem to be working. They may well be fighting the previous war.

As with generals and economists, so with corporate risk management. Every major company has a contingency plan in its bottom drawer to deal with a crisis such as the kidnapping of the CEO. It is sitting there, prepared in exhaustive detail by the business continuity and corporate security departments, ready to be pulled out at a moment's notice. But the last time a CEO was kidnapped was in the mid-1970s. Technology has changed and risks have moved on since then. How many companies have a cyber security plan? Given the magnitude of this type of threat, the answer is – not enough. The death of a CEO, though tragic, is not fatal for the organisation. There is always a new CEO waiting in the wings. In contrast, the theft of critical intellectual property in a cyber attack could be a far more serious blow.


Ignoramus et ignorabimus

At a press briefing on 12 February 2002, the US Secretary of Defense Donald Rumsfeld was addressing the absence of evidence of weapons of mass destruction (WMDs) in Iraq. He offered the following argument in support of the decision to go to war:

Reports that say that something hasn't happened are always interesting to me, because as we know, there are known knowns; there are things we know we know. We also know there are known unknowns; that is to say, we know there are some things we do not know. But there are also unknown unknowns – the ones we don't know we don't know.


This tortured and convoluted language is reminiscent of the muscular writhings of an octopus in a confined space; not unlike the tight spot that Rumsfeld found himself in. It prompted much hilarity and ridicule in the press and he was awarded the 2003 Foot in Mouth award as a result. But the point he was trying to make is both valid and important. It is more pithily summarised in the Latin phrase 'ignoramus et ignorabimus', meaning 'we do not know and we cannot know'.

A CEO never knows where the next crisis is coming from. Crises always come out of the blue, which is a tautological statement because it is its very unexpectedness that makes a crisis a crisis. So there can be plenty of contingency plans in the bottom drawer dealing with the 'known knowns' or even the 'known unknowns' but the crisis will blow in from the third area: the 'unknown unknowns' – the event that there is no plan for. In fact, an even more dangerous fourth area exists, which Rumsfeld overlooked. In order to untangle the writhing octopus, however, we need to express it in a different way.

The word 'unknown', used in this context, can mean two things: either that the observer is unaware of an event or that a particular outcome is unpredictable. So we can recast Rumsfeld's statement using the two concepts of 'predictability' and 'awareness', as illustrated in Figure 2.

In the first quadrant are the 'known knowns'. These are issues that a CEO is aware of and whose outcomes are fairly predictable. This is the quadrant where business planning is most effective and is the normal focus of management attention. A good example is next quarter's revenue figures. Although some uncertainty always exists, most companies can predict these figures with a reasonable degree of accuracy. The issues affecting the outcome are fairly well known: new product launches, consumer demand forecasts, advertising budgets, the strength of competitors, major new customers on the horizon and marketing incentives are all somewhere in the mix. These factors are both well understood by management and fairly predictable and so can be modelled in a spreadsheet to produce a plausible forecast.

The second quadrant, 'known unknowns', contains issues of which management is aware but that are inherently unpredictable. Examples include the risk of a catastrophic flood, a kidnapping, a terrorist attack, an earthquake destroying a key subcontractor's facility or a coup in a foreign country. These types of risk are quite binary. The likelihood of them occurring is very low but if they did happen they would cause major disruptions. These issues would not normally be factored into a model forecasting future revenues. Rather, they would be addressed by business continuity planning. Risks are clearly higher here, but can be mitigated to some extent by examining a number of possible outcomes and conducting sensitivity analysis for each potential issue. In this way, some of the darkness of uncertainty can be partially illuminated.

The third quadrant, 'unknown unknowns', contains issues that no one has even thought of yet. They are not just unpredictable but, to make matters worse, no one is even aware of them. It is the third quadrant that Rumsfeld was trying to focus the media's attention on. His argument was that, even though it was uncertain whether or not Saddam Hussein had WMDs, the risk warranted the decision to go to war. Invasion of Iraq was justified by the 'unknown unknowns'; in other words, the threats of an unspecified nature that are not even suspected to be there.

A moment's reflection will show that this is a pretty flimsy argument. First, since WMDs were actually specified, this must surely be a known threat: it belongs in the second quadrant not the third. Second, an 'unspecified and unknowable' threat is surely the opposite of justification because the latter implies a set of facts both specified and known. Put the other way round, the third quadrant in Rumsfeld's logic could be used to invade any country in the world at whim, which may well have been his intention.

It is the neatest of ironies that Rumsfeld was undone by events in the fourth quadrant: the quadrant he did not even recognise or mention. These are the 'unknown knowns': facts that are known at some lower level in the organisation but of which the CEO is unaware. It was the scandal caused by the torture and abuse of prisoners at Abu Ghraib that led to his resignation. How fitting that the cause of his demise occurred in the place he least expected it – not among the 'unknowns' but among the 'knowns' he had overlooked.
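
For readers who like to see the framework made concrete, the four quadrants can be summed up in a few lines of code. The sketch below is purely illustrative: the mapping follows the descriptions above, while the example issues fed into it are loose illustrations drawn from this chapter's examples rather than a formal risk register.

```python
# Minimal sketch of the awareness/predictability grid described above.
# The example issues are illustrative only, loosely based on this chapter's examples.

def quadrant(aware: bool, predictable: bool) -> str:
    """Place an issue on the grid, including the fourth quadrant Rumsfeld overlooked."""
    if aware and predictable:
        return "Q1 known known     - routine business planning"
    if aware and not predictable:
        return "Q2 known unknown   - business continuity planning"
    if not aware and not predictable:
        return "Q3 unknown unknown - where the crisis blows in from"
    return "Q4 unknown known    - facts hidden lower down the organisation"

issues = {
    "next quarter's revenue":                  (True, True),
    "earthquake at a key subcontractor":       (True, False),
    "a threat no one has even thought of yet": (False, False),
    "abuse known only to local managers":      (False, True),
}

for issue, (aware, predictable) in issues.items():
    print(f"{issue:42s} -> {quadrant(aware, predictable)}")
```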


RULE 1. Do not deny anything before you are in full possession of the facts

From a CEO's perspective, the fourth quadrant is the most dangerous. Though the crisis will enter in the third quadrant, it is in the fourth quadrant that most damage will occur. The natural and most instinctive response to a crisis is denial, usually before the full facts have been established. When facts subsequently emerge that contradict the CEO's initial statements, that CEO is dead meat. The general public will normally forgive an unforeseen event, but they will not forgive a cover-up. So for a CEO to deny things before they are in full possession of the facts is extremely dangerous. The first response in a crisis should always be to investigate the 'unknown knowns': what facts are known at a lower level in the organisation of which the CEO is unaware? What have the divisional managers and subordinates been hiding from them?


The egg cup and the pea

You will notice in the above diagram that there is a key difference between the two axes. The 'awareness' axis lies inside management control while the 'predictability' axis does not. A CEO can always become more aware of the facts through better internal communications, red-flagging protocols and more thorough management reporting. However, this is not the case with the 'predictability' axis because, as Danish physicist Niels Bohr once wryly observed, 'prediction is difficult, especially about the future'. The problem is compounded when you are dealing with a 'non-linear' system.

Most of the mathematical tools in common usage are linear in nature; in other words, the output is predictably proportional to the input. So for a simple equation like y = 3x + 1, once you know the input (x), you can calculate the output (y) by multiplying by three and adding one. This holds true regardless of whether the value of x – the input – is six or a million. In a linear system, the initial conditions are unimportant.

Everything you learn on an MBA course is based on linear mathematics, including accounting, probability theory, demand modelling, optimal pricing, profit forecasting and investment analysis. Unfortunately, the real world is non-linear. Here, initial conditions are very important and outcomes are unpredictable. The best way to illustrate this concept is to imagine a dried pea sitting at the bottom of an egg cup. This is an inherently stable linear system. If you randomly knock the pea in any direction, it will rattle around a bit but ultimately end up back at the bottom of the egg cup exactly where it started. Even though a bit of randomness occurs at the beginning, it is quite easy to predict the final outcome. Now imagine that you turn the egg cup upside down and carefully balance the pea on top. This is an inherently unstable, non-linear system. If you randomly knock the pea in any direction, it will roll down the side of the egg cup, bounce across the table, fall onto the floor and end up in the far corner of the room. The chance of you being able to predict where the pea will land is almost zero. A slight randomness in the input leads to an unpredictably wide range of possible outcomes. The pea inside the egg cup is on the left-hand side of the diagram (quadrants 1 and 4); the pea balanced on top of the egg cup is on the right-hand side (quadrants 2 and 3).
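
The egg cup and the pea can also be put in numbers. The sketch below is purely illustrative: a simple damped rule stands in for the pea inside the cup, and the logistic map (a standard textbook example of a non-linear rule, not one taken from this book) stands in for the pea balanced on top. Two starting positions a millionth apart stay together under the first rule and end up far apart under the second.

```python
# Start two trajectories 0.000001 apart and compare them after 40 steps.
# The rules below are illustrative stand-ins, not models from the book.

def pea_inside_cup(x):
    return 0.5 * x            # stable, linear: everything settles back towards the bottom

def pea_on_top_of_cup(x):
    return 4 * x * (1 - x)    # logistic map: a standard non-linear, chaotic rule

def gap_after(rule, x0, x1, steps=40):
    for _ in range(steps):
        x0, x1 = rule(x0), rule(x1)
    return abs(x0 - x1)

print("linear (pea inside the cup):    ", gap_after(pea_inside_cup, 0.2, 0.200001))
print("non-linear (pea on top of cup): ", gap_after(pea_on_top_of_cup, 0.2, 0.200001))
# The first gap shrinks to practically nothing; the second grows until the two
# trajectories bear no relation to each other - the outcome is unpredictable.
```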

To use a slightly more complicated analogy, think of a game of billiards. In order to knock the black ball into the pocket, you must strike the cue ball so that it moves in a particular direction at a particular speed. You could theoretically calculate the correct angles and momenta involved using Newton's laws, although, of course, a skilled player is doing precisely that instinctively by eye. The point is that a game of billiards is a linear system. Once you know the direction and speed of the cue ball, everything else is predictable. It is also repeatable. If, sometime later, you put all the balls back in exactly the same positions and played precisely the same shots, the outcome would be the same.

A non-linear system is a billiard table on a yacht. As the boat is gently rocked by the waves, the billiard table tilts in an unpredictable way. The shot you play is now very dependent on the initial conditions; are you playing slightly uphill or downhill? It is no longer predictable or repeatable. You have to take account of the slight slope on the table at the exact moment you strike the ball. Let's compound the difficulty by supposing that the billiard table is not flat. Imagine the green baize covering an undulating landscape of shallow bumps and valleys. That is yet another complex environmental factor to take into account. The system is now so complicated that the only way to guarantee that the ball will end up in the desired place is to steer it there by continuously pushing it across the table.

By steering the ball across the table rather than just striking it once, you have introduced a control mechanism. The only way to get the required outcome from an unpredictable, non-linear system is by establishing continuous control based on some sort of feedback mechanism. That knowledge will not come as a surprise to any pragmatic CEO. It's just a long-winded way of saying that complex systems, like companies, need steering to get the right results. Sometimes, in fair weather, only a light touch now and again on the tiller is needed. But, in heavy weather, steering the right course requires constant attention, making manual adjustments moment by moment in response to local conditions. That, in engineering terms, is what is meant by a feedback control loop.

In a control system driven by 'active feedback', the output is connected in some way to the input. The output from the system is monitored and used to amplify or attenuate the input. The best example is the high-pitched yowl from a live mic on stage. The microphone is picking up the sound coming from the speakers and passing it to the amplifier, which then pumps it out to the speakers again, so completing the feedback circuit. The result is an ever-increasing spiral of amplified white noise, which you hear as a high-pitched atonal screeching (of course, at a heavy metal concert, that's the lead singer and he is supposed to sound like that).
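
The runaway nature of that loop is easy to show with a toy calculation. In the sketch below the gain and clipping level are invented numbers rather than a model of real audio equipment; the point is simply that when the combined gain around the loop exceeds one, the signal grows on every pass until the amplifier saturates.

```python
# Toy positive feedback loop: the speaker output is picked up by the microphone
# and re-amplified on every pass. All numbers are invented for illustration.

LOOP_GAIN = 1.5       # microphone pickup x amplifier gain; above 1 means runaway
CLIP_LEVEL = 100.0    # the amplifier can only go so loud before it saturates

signal = 0.01         # a tiny bit of ambient noise is enough to start the loop
for n in range(1, 41):
    signal = min(signal * LOOP_GAIN, CLIP_LEVEL)
    if signal >= CLIP_LEVEL:
        print(f"saturated (the familiar howl) after {n} passes round the loop")
        break
else:
    print(f"signal still growing after 40 passes: {signal:.2f}")
```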

Microphone feedback is an unintended error, but feedback is normally a beneficial process and vitally important for controlling complex, non-linear systems. Cruise control in a car is a good example. If you want your car to travel at a constant speed, the simplest solution is to hold the throttle in a fixed position. The problem is that, when you start to go up a hill, the car will slow down. So a better solution is to constantly monitor the speed of the car and then adjust the throttle to keep the speed constant; opening it when going up a hill and closing it when going down. This is what a cruise control system does. The output (the speed of the car) is used to control the input (the throttle) through a feedback loop. Of course, anyone actually driving a car without cruise control uses the throttle in this way automatically, without even thinking about it; the active feedback controller is the driver themself.
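
A cruise controller of this kind amounts to only a few lines of code. The sketch below is a simplified illustration under assumed numbers: the toy car model, the gain and the hill are all invented for the example, and a real controller would be considerably more sophisticated.

```python
# Toy cruise control: measure the output (speed) and adjust the input (throttle).
# The car dynamics, gain and hill are invented purely to illustrate the feedback loop.

TARGET = 70.0            # desired speed, mph
BASE_THROTTLE = 0.35     # throttle that happens to hold 70 mph on the flat in this toy model
GAIN = 0.05              # extra throttle applied per mph the car falls below target

speed = 70.0
for t in range(60):                                              # one step per second
    hill = 3.0 if 20 <= t < 40 else 0.0                          # a hill between 20 s and 40 s
    error = TARGET - speed                                       # monitor the output...
    throttle = min(max(BASE_THROTTLE + GAIN * error, 0.0), 1.0)  # ...and adjust the input
    accel = 30.0 * throttle - 0.15 * speed - hill                # invented car dynamics
    speed += 0.2 * accel                                         # advance the model one step
    if t % 10 == 0:
        print(f"t={t:2d}s  speed={speed:5.1f} mph  throttle={throttle:4.2f}")
```

On the hill the speed sags by only a couple of miles per hour because the loop keeps opening the throttle; hold the throttle fixed instead and, in the same toy model, the car would settle far below the target.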

You may be wondering by now whether this lengthy technical discussion about active feedback loops has a point since the conclusions so far seem fairly banal. But there is a key engineering principle at the heart of control theory that has important implications for crisis management. It concerns the controllability of the system and can be easily demonstrated with a simple practical experiment.

For this experiment you need a pencil and a long-handled broom. Try balancing a pencil on the end of one finger. You will find it almost impossible. Within less than a second it will have fallen off. Now try with the broom. If you can, remove the brush from the end and just use the broomstick. After a bit of practice, you should find it quite easy to do for more than ten seconds or so.
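
The standard physics behind the difference is that a small tilt of a rod balanced on its end grows on a time scale of roughly the square root of its length divided by gravity, so a long broomstick topples far more slowly than a short pencil and gives the feedback controller of eye and hand time to react. A back-of-the-envelope sketch of that calculation, with assumed lengths for the pencil and the broomstick:

```python
import math

# How quickly does a small tilt grow for a uniform rod balanced on its end?
# Standard result: the tilt grows e-fold roughly every sqrt(2L / 3g) seconds.
# The lengths below are assumptions chosen for illustration.

G = 9.81  # gravitational acceleration, m/s^2

def tilt_time_constant(length_m):
    """Time scale in seconds on which a small tilt of a uniform rod grows e-fold."""
    return math.sqrt(2 * length_m / (3 * G))

for name, length in [("pencil, 0.18 m", 0.18), ("broomstick, 1.4 m", 1.4)]:
    print(f"{name:18s}: tilt grows e-fold roughly every {tilt_time_constant(length):.2f} s")
```

The broomstick's time constant comes out close to three times the pencil's, and crucially it is longer than a typical human reaction time, which is why the feedback loop of eye and hand can keep up with the broom but not with the pencil.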


(Continues...)

Excerpted from Bolt from the Blue by Mike Pullen, John Brodie Donald, Louis Mackay. Copyright © 2014 Mike Pullen and John Brodie Donald. Excerpted by permission of Elliott and Thompson Limited.
All rights reserved. No part of this excerpt may be reproduced or reprinted without permission in writing from the publisher.

Table of Contents

Introduction: The unexamined risk
Chapter 1: The five key principles of crisis management
Chapter 2: What's in my burger?
Chapter 3: Says who? Extraterritorial legislation
Chapter 4: Rogue employees: The threat from within
Chapter 5: The rock and the whirlpool
Chapter 6: NGOs and organisational activism
Chapter 7: The cyber threat
Conclusion
Notes
Index
Copyright
