Monday, December 20, 2004
Predictions, extrapolations and forecasting disasters
Posted by David Smith at 05:56 PM

There are many reasons to regard this time of year with a deep sense of trepidation. This is the season of overcrowded and overheated shops, office Christmas parties, long-lost relatives – and those who you pray would get lost – and false bonhomie.

But it is also, and this is much worse than any of that, the forecasting season. This is the time when normally sensible people cannot resist the urge to “look into the crystal ball”. In newspapers and magazines all over the country, journalists are putting together features on the outlook for 2005 and beyond, safe only in the knowledge that by about January 10 everybody will have forgotten them.

Nor are they operating in a vacuum. Just about now every other press release contains so-and-so’s “top 10 HR predictions for 2005”, or simply “Outlook 2005”. Given that China remains the hot economics story, there will be plenty of predictions of the “What will the year of the rooster bring?” variety. The Economist, which seemingly cannot make room for all its predictions in a normal issue, has for some years produced a separate publication. “The World in 2005” is available in all good branches of W.H. Smith.

For connoisseurs of forecasting, however, there is one problem with the current wave of predictions. Too many people these days are so unwilling to risk putting their foot in their mouth that there is a dull uniformity to this huge output of forecasts. Most are so blindingly and boringly obvious that they have little chance of achieving the only real accolade in this area – being recognised by future historians as a truly awful forecast.

Collectors of truly awful forecasts have had no shortage of material over the years. For economists, Malthus’s Essay on Population, which had the effect of making economics the “dismal” science, was published just over 200 years ago but has enduring power. Malthus, of course, predicted that the world would run out of food. “Population, when unchecked, increases in geometrical ratio,” he wrote. “Subsistence only increases in an arithmetical ratio. A slight acquaintance with the numbers will show the immensity of the first power in comparison with the second.”
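
The arithmetic behind Malthus’s claim is easy to check. A minimal sketch, using the illustrative 25-year periods from the essay (population 1, 2, 4, 8... against subsistence 1, 2, 3, 4...), shows how quickly the two series diverge; the series are Malthus’s own, while the loop and output format are mine, for illustration only:

    # Malthus's illustration: population doubles each 25-year period
    # (geometrical ratio) while subsistence grows by a fixed increment
    # (arithmetical ratio).
    population, subsistence = 1, 1
    for period in range(1, 9):
        population *= 2    # geometrical: doubles each period
        subsistence += 1   # arithmetical: adds one unit each period
        print(f"After {period * 25} years: population {population}, subsistence {subsistence}")
    # After eight periods the ratio is 256 to 9 - the "immensity of the
    # first power in comparison with the second".

After two centuries the population index stands at 256 against a subsistence index of 9, which is the whole of Malthus’s argument in one loop.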

Malthus failed to take account of sharp improvements in agricultural productivity and methods and was wrong. Even today, when there are millions of starving people in the world, the problem is not a shortage of food but its distribution.

That has not stopped Malthus-type forecasts from appearing over the years. Just over 30 years ago an international think tank called the Club of Rome published The Limits to Growth, which argued that the global economy would be forced to slow down because of a shortage of natural resources. Even now, after a period in which oil prices have hit a record high of $50 a barrel, there are predictions that the world is about to run out of oil. It is not, although we may have to get used to paying a bit more for it.

Laura Lee, an American journalist, wrote a whole book, Bad Predictions, on the subject of truly awful forecasts. Many, like those infamous Y2K predictions of five years ago, when we were all supposed to face disaster at the hands of the millennium bug, involve technology.

Ever since the Roman engineer Sextus Julius Frontinus declared in AD 10 that mankind had run out of things to invent, even forecasters paid to know about these things have got it wrong on technology. In 1952, famously, IBM predicted a total market for computers of 52 units. Thirty years later, with the advent of the PC, it had sensibly raised this to 200,000. That is roughly the number it ships every week today.

Perhaps IBM was just being modest about its product, as was Alexander Graham Bell when he predicted that one day there would be a telephone in every American city. A common forecasting error is to assume, like Sextus Julius Frontinus, that no further progress is possible, or likely. John von Neumann suggested in 1949 that we might have reached the limits of computer technology, while Dr Arthur L. Samuel wrote in the New Scientist in 1964 that computers were unlikely to get any faster. Bill Gates, the great computer visionary, is reported to have said in 1981: “640K ought to be enough for anybody.”

Transport is another favoured area for the truly awful forecast. Thomas Tredgold, the British railway designer, told us in 1835 that the prospect of any system conveying passengers at a speed exceeding 10 miles an hour was “extremely improbable”. In 1902 Harper’s Weekly told its readers: “The actual building of roads devoted to motor cars is not for the near future, in spite of many rumors to that effect.” A year later the president of the Michigan Savings Bank told Henry Ford’s lawyer not to risk investing in Ford’s company, because “the horse is here to stay; the automobile is only a novelty”.

Forecasters and futurologists fall into two timing traps. The first is to pitch things too far into the future. Science Digest opined in August 1948 that it would take mankind 200 years to solve the technical problems and enable a moon landing to take place. In the event it took just over 20. The other trap is to exaggerate the pace of change.

Many of us grew up expecting that by now we would be commuting in flying cars and routinely holidaying on other planets. Some of us have probably imagined a future in which travel would consist of being beamed across continents. These things may happen, but not for a while. Arthur C Clarke predicted in an article in Vogue in 1966 that by 2001 houses would be made of ultra-lightweight material and be capable of flying. “Whole communities may migrate south for the winter,” he said. Perhaps he was just having a bit of fun at the expense of the fashionistas.

It is easy to scoff but it gets a little uncomfortable when it becomes too close to home. Some of the strongest candidates for the roll of honour of truly awful forecasts relate to the job market and the way we work. And not all of them date back to the golden age of futurology, the 1950s and 1960s.

John Philpott, the Chartered Institute of Personnel & Development’s chief economist, has one recent favourite on his bookshelves. “Without, I hope, being too defensive, serious economists seldom espouse absolute nonsense,” he says. “This is mostly confined to popular futurologists. Remember Jeremy Rifkin’s 1994 ‘classic’ ‘The end of work’? This was published just as the US economy started to move back to full employment. Rifkin seems to have transferred his attention to ‘the coming environmental crisis’ though I guess his earlier work will be given a retread in the light of equally absurd talk in the US at present of permanent ‘jobless growth’.”

Where futurologists – and some employers – have got it most wrong is on how working hours would evolve. Automation, labour-saving technology and the inexorable decline in the average working week from more than 65 hours in the 1850s appeared to have only one possible result.

Computer scientist Dr Christopher Evans, in a 1978 piece for Science Fact called “Computers and Artificial Intelligence”, predicted: “By 1990 people will be retiring at 40 or thereabouts.” We can laugh, but in the 1990s, with occupational pension schemes apparently in permanent rude health, many companies had policies of retiring people at 50.

The godfather of forecasters, the man for whom the truly awful forecast came very easily, was Herman Kahn. Kahn, a robust and quirky individual who is said to have been the model for Dr Strangelove, was a man of strong views and boundless imagination, particularly when it came to predictions. His 1967 book ‘The Year 2000’, still available on Amazon, genuinely is a classic.

Kahn did not do too badly when it came to some of his predictions, getting it more or less right on home computers, mobile phones, video recorders and satellite dishes, although overdoing it with his forecasts of underwater cities, new forms of energy and housecleaning robots.

He probably thought his safest predictions were on work. By 2000, he suggested, nobody would be working more than 30 hours a week and 13 weeks of annual holiday would be the norm even in workaholic America. Like many futurologists, Kahn thought the challenge in advanced societies would be filling the many hours of leisure created by technological and economic advance, although his solution was less conventional than most. He predicted that by 2000 humans would routinely hibernate through the darkest and coldest months of winter. That may be how it sometimes feels, but it is some way from reality.

On a more mundane level, there is a phenomenon well known in newspaper offices. If a journalist knows one person who is doing something different to the norm, that person is just out of the ordinary. If the journalist knows two, it becomes an interesting phenomenon worthy of note. When there are three, it is a trend.

As Philpott puts it: “Many current scares are really versions of myths that have been around for a long time. These include: the ‘end of the job for life’; the death of permanent full-time employment; and the end of the job as we know it and the rise of self-employed portfolio careers for all. These myths derive from over-extrapolating the experience of some individuals or groups in the economy. In some quarters this is known as ‘doing a Charles Handy’ – good for book sales but not well grounded in reality.”

Perhaps the job market trips us up so often because, more than some parts of the economy, it is so intimately tied up with human behaviour. Behaviour can and does change, and often in ways we do not expect.

In late 1992, Gordon Brown was emerging as a heavy-hitting senior politician and the scourge of the Conservatives. Unemployment had risen strongly through the “Tory recession” of 1990 to 1992, although the pain was not severe enough to give Labour its expected election victory in April 1992. Most forecasters believed unemployment would climb above the symbolically important three million level, as it had in the 1980s. Brown certainly did and, while falling short of a promise to eat his hat if it did not, was left with egg on his face when, instead of continuing to rise, the total began to fall. The “jobless recovery” gave way to a period of employment growth that continues to this day. Employers were not as gloomy as the politicians and pundits.

Brown got his own back on the Tories and most economists a few years later. Ahead of Labour’s 1997 election victory and the introduction of the national minimum wage, most economists thought it would lead to higher unemployment. Michael Howard, leading the charge for the Tories, said it would destroy two million jobs. Employers in certain sectors, particularly retailing and catering, warned of the dire consequences of the policy.

As far as it is possible to tell, however, the minimum wage has had a negligible impact on jobs or, if it has, the effect has been swamped by that of a generally strong job market. The direst of the predictions were based on the belief that workers further up the income scale would seek to maintain differentials with those benefiting from the minimum wage. That has not happened.

On a more fundamental level, few of us expected the steady fall in unemployment that has characterised Labour’s period in office. And in many ways we were right to be sceptical.

“Any structural break in behaviour can make models based on past relationships unreliable,” says Philpott. “Having observed the unusually early shakeout of jobs and associated strong productivity growth when the UK economy went into recession in 1990-91 – interpreting this as a sign of a more flexible ‘hire and fire’ labour market – I expected a similar response and some rise in unemployment when the economy slowed in the late 1990s. In the event, unemployment continued to fall, mainly because employers preferred to hoard rather than fire staff in what by then was a relatively tight labour market.”

The job market has also caught out economists in another way. For years an essential weapon in their armoury was the notion of a certain level of unemployment at which wage settlements, and therefore inflation, would start to take off. This, the clumsily named non-accelerating inflation rate of unemployment (Nairu), was thought to be 7 or 8 per cent of the workforce, two million or more on the wide Labour Force Survey jobless measure. But unemployment has come down to 5 per cent on the LFS measure, and under 3 per cent on the claimant count (meeting the traditional definition of full employment), without triggering a new bout of inflation.
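
The Nairu idea can be written down very simply. In textbook form, with the caveat that the notation below is the standard classroom shorthand rather than any forecaster’s actual model, it sits inside an “accelerationist” Phillips curve:

    Δπ = -β(u - u*)

where Δπ is the change in inflation, u the unemployment rate, u* the Nairu and β a positive coefficient. Whenever unemployment falls below u*, inflation should keep accelerating. On the old assumption of a Nairu of 7 or 8 per cent, unemployment of 5 per cent should therefore have meant steadily rising inflation. It did not, which is the puzzle behind the dispute Philpott describes.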

“This version of doom and gloom again proved wrong,” says Philpott, “though there are still disputes as to whether the Nairu theory was wrong, the Nairu was below two million to start with, or whether it has since fallen.” On such matters big decisions can rest. The Bank of England, in the early days of independence after 1997, was convinced that unemployment was getting close to levels at which inflation problems would re-emerge.

So we have got it wrong, over and over again, although sometimes we have got it right. Where are we likely to be getting it wrong when it comes to the current batch of predictions about the future?

One useful rule of thumb is to beware the herd. When everybody expects something, there is a good chance it will turn out to be wrong. The trouble is that it passes unnoticed when the consensus is right, or we forget that there was a healthy debate at the time. Take the collapse in dot.com and other technology shares five years ago. Hindsight tells us that we all got sucked into this mood of irrational exuberance and that nobody questioned it. In fact, many economists and market analysts warned repeatedly of the danger – and some fund managers lost their jobs by staying out of technology shares – but failed to convince enough investors, including many of the people paid to look after our money, of their arguments.

There is a similar healthy debate about the housing market now. If there is a crash, nobody can say that they were not forewarned. The fact that until recently people chose to ignore the warnings was their prerogative.

Are we getting it wrong on some of the really big things? Another of those healthy debates is in full swing on the question of global warming and climate change. While most of the scientific establishment now argues that the case is unanswerable, others take a different view. Bjorn Lomborg, author of The Sceptical Environmentalist, has assembled a group of Nobel prizewinning economists and others to argue that, even if we accept the science of global warming, it is far from clear that this should be a priority alongside the other challenges for mankind.

What about the consensus that says Europe is destined to become an economic backwater, shackled by its ageing population and inability to embrace economic reform? A high-level group chaired by Wim Kok, the former Dutch prime minister, and including our own Will Hutton, head of the Work Foundation, as its rapporteur, recently added its voice to the gloom on Europe. The EU, it said, had failed to respond to the reform initiatives launched at Lisbon four years ago. A failure to make the EU economy more flexible and responsive would, it said, threaten Europe’s very civilization.

The consensus may be right. One worry for the new EU entrants from eastern Europe, says Philpott, is that they are adopting “the full panoply of EU laws and employment regulations” which will survive and hold them back long after their labour cost advantage has gone.

But these things change. At the end of the 1980s, America was regarded as a lumbering giant, beset with low productivity growth and a lack of dynamism. It was only a matter of time before the United States would be overhauled by dynamic, rapidly growing Japan. But the 1990s was the American decade, with a powerful and prolonged economic upturn, and a rediscovery of economic dynamism and optimism. For Japan, in contrast, it was the “lost” decade of weak economic growth, and a loss of individual, business and political confidence.

What about one of those shifts that concern us all in the world of work? In a few short years, the consensus has moved from a position where earlier and earlier retirement seemed possible, indeed preferable, and certainly affordable. Now, as the government embraces EU age discrimination legislation and digests Adair Turner’s recommendations, the ground is being steadily prepared for retirement at 70. In the space of just four or five years, many people have moved from the expectation of retirement at 50 to the realization that it could be 70.

It is hard to argue with the logic of later retirement as things stand. But perhaps the shift has been overdone. The great futurologists, even those who gave us some of those truly awful forecasts, all believed that technical progress and rising productivity would enable shorter working hours and earlier retirement. Maybe they were not entirely wrong. And maybe some of the gloom about “working till you drop” is overdone. Most forecasts, after all, are wrong.

Top five forecasters’ excuses

1. “The statistics were wrong.” A favourite one, this, particularly among economists. If recent history is clouded in uncertainty, even the most talented forecaster will struggle to get the future right, or so they say. But statistical revisions often work to the forecaster’s advantage. Funny, they never seem to mention that.

2. “It was a bolt from the blue.” Or the act of God, or the “all bets are off” excuse: an unexpected shock so big that it would throw any forecast off its tracks. Sounds reasonable enough, except that these bolt-from-the-blue events are often the excuse for more forecasting howlers. After the 9/11 attacks on America almost everybody predicted a deep recession. The US economy, in fact, had already been in recession for several months and the action taken in the wake of 9/11 – lower interest rates and a big government spending boost – lifted it out.

3. "They heeded my warnings.” Otherwise known as the Y2K excuse, the consultants, technical experts and IT companies who made billions collectively out of fear of the millennium bug argued that without this expenditure there would indeed have been a technological disaster when the clock struck midnight on December 31 1999. It looks like an uncheckable claim, except for the fact that countries like Italy, which did not spend, also avoided Y2K disaster.

4. “I’ll be right in the end.” The old forecasters’ adage – give a forecast or a date, but never the two together – is rather too close to the truth for comfort. It is a feature of the predictive art that a forecast is rarely wrong, just ahead of its time.

5. "If the facts change, I change my mind. What do you do, Sir?” The language, as you might expect from somebody as cultured as John Maynard Keynes, is elegant. Behind it, however, is the perfect catch-all forecasters’ excuse – things turned out differently than I expected. People did not behave as our forecasting model said they would but never mind, we will know next time.

From People Management magazine
