Earlier this year I read Richard Rumelt’s book Good Strategy Bad Strategy, much acclaimed when it was published in 2011. And you can see why: it is lucid, well-written, and largely free of jargon, which already marks it out from the average business book. It also has a clear view of what strategy is (and what it is not), which is welcome, given how much the word is abused. And the business stories he tells illuminate his argument.
Rumelt is entertaining on the differences between bad strategy and good strategy – and I’ll come back to the bad strategy later. Good strategy, he says, is composed of a kernel of three elements (p77):
- A diagnosis that defines or explains the nature of the challenge. A good diagnosis simplifies the complexity by identifying the critical aspects of the situation.
- A guiding policy for dealing with the challenge.
- A set of coherent actions that are designed to carry out the guiding policy.
In particular, I found his advice on diagnosis valuable. A good diagnosis “should replace the overwhelming complexity of reality with a simpler story, a story that calls attention to its crucial aspects.” This is, in effect, a sense-making exercise. And a good strategic diagnosis does a second critical thing: “it also defines a domain of action.” Good strategy can then be built on a diagnosis that points to areas of leverage over outcomes.
In the first part of this post, I looked at the impact of the economy, and its business history, on HMV’s collapse. In this second part, I’m going to turn my attention to changes in the music market, the impact of the internet (there are two stories here, not one), and the business’s strategic response.
The received wisdom about the collapse of the British entertainment chain HMV and its acquisition by the distress specialists Hilco is that it didn’t see the internet coming. And doh! Actually, the truth has a lot more to do with economics and the way finance dominates business. This long post is broken into two parts: part 2 is here.
The immediate cause of HMV’s collapse, of course, was the British recession, which has gone on longer than anyone expected; the economy is now teetering on the edge of an unprecedented triple-dip recession. Here’s the NIESR chart showing comparative GDP since the pre-recession peak for the past six recessions. The black line at the bottom is the current recession, and yes, this chart should be on the wall of every economic policymaker in the UK.
In my last post I wrote about how Art Kleiner’s idea of the Core Group helped us to understand Barclays’ recent history. In this post I am going to develop that idea. If Bob Diamond survived in his position despite his division breaching the rules in 1998 because he’d become a member of the Core Group, we still need to understand why. So in the rest of this post I’m going to go further into the history.
How the idea of the ‘Core Group’ helps us understand the Barclays’ rate-fixing scandal
Much of the writing on Barclays’ rate-fixing scandal – which has forced the resignations of both the chief executive, Bob Diamond, and the chairman, Marcus Agius – has focused on issues of culture, but without asking where culture inside organisations comes from or how it is set, beyond the over-simple idea that it comes ‘from the top’. Of course, some of the interest in culture was prompted by Bob Diamond himself, in the BBC lecture he gave last year, which I imagine seemed like a good idea at the time. Andrew Rawnsley quoted a couple of lines in an article in the Observer earlier this month:
“Culture,” he [Diamond] said in a BBC lecture last year, “is difficult to define. But for me the evidence of culture is how people behave when no one is watching.”
So it seems a good idea to look at this through the lens of Art Kleiner’s neglected book, Who Really Matters: The Core Group.
A few months ago, I wrote about risk – suggesting that the ‘discipline’ of risk management tended to focus on risks which were (a) understood and (b) for which probabilities could be estimated, and that this led to far too narrow a view of risk. In particular, this meant that companies were usually very poor at assessing their blindspots. This second post has taken longer to write than I expected, but in this one I’m going to look at the factors which mean that companies tend to remain blinded by their blindspots.
It’s hard to know where to start with the BP oil disaster. Commentary has been gushing out almost as quickly as oil. We know the scale of the pollution, and have read ecologists who say the conditions are unlike any other seen on earth, certainly in the Anthropocene period. Dark humour is one response; angry satire is another (The Onion: ‘Massive Flow Of Bullshit Continues To Gush From BP Headquarters’). We have a good idea that BP’s conduct over the drilling was somewhere between careless and reckless (and that sooner or later a court is likely to decide on which), and that regulatory agencies were compliant or ineffective. One area that seems to deserve more thought – especially from a futures perspective – is the way in which essentially man-made disasters such as this are to a significant extent produced by a limited set of ideas about risk, both the way it gets assessed and the way it is managed.
It’s notable that in the past week or so the murmurs about Facebook’s slack approach to privacy have gone from a whisper to a scream. And at least some of the noise has been coming from very select members of the digerati; Wired, Gizmodo, danah boyd, Jeff Jarvis, and David Weinberger have all joined in. They seem to be playing to an enthusiastic crowd. But why now, when Facebook’s slackness on privacy has been known for years (it’s one of the reasons I’m not a member)? I think it comes down to two things: firstly, speed of change, and secondly, scale.
The start and the end of the documentary The End of The Line, which has now been released on DVD, at least in the UK (and which I blogged about when it was first shown in the cinema), are dominated by traditional ‘National Geographic’ type images. You know the sort of thing: sunlight streams through the water, showing the richness and diversity of the sea and illuminating the many different and brightly coloured species below. It’s filmed in one of the few Marine Protected Areas, which together comprise about 1% of the ocean area, where fishing is not allowed. For the rest of the ocean, the story is dismal.
Now that the froth from the iPad launch has blown past, it’s worth stepping back a bit. For me, the most telling comments were not the ones which talked about functionality, but those which looked at what the iPad proposition told us about the state of the device and app market. Which is this: the computer technology market is now moving out of its technology-led phase.
A moment of theory might help. I’m quite influenced by the work of the economic historian Carlota Perez, who has tracked five long phases, or surges, of technology innovation, going back to 1771. Each phase runs for 50-60 years and follows a common pattern (there’s more detail in the diagram below). There’s an ‘installation’ phase, in which the new technology platform spreads in visibility and usage (device penetration increases, underlying infrastructure is developed). There’s a bubble and a crash, in which investors get over-excited about the prospects. And then there’s a deployment phase, in which the applications associated with the technology platform deepen and broaden, and the underlying impacts on society become more profound. The ICT surge started in 1971, with the invention of the microprocessor. We’ve finished the installation phase, we’ve had the crash (the dotcom crash, not the global financial crash, though the two may be linked), and now we’re several years into the deployment phase. The iPad launch was another confirming sign of this.