Where have all the numbers gone?
19-May-14, by Annie Byrne
We live in an age where Google AdWords analysis means that every single statistic about every promotional campaign is quoted to the nth degree. Every medium generates more and more numbers about what it can achieve, how it can prove its efficacy, and what you get for your money.
And where does the exhibition industry stand? Facing in exactly the opposite direction.
For two years now (and for the first time since 1985) the industry has not been able to produce the facts, or anything to take their place. Since senior EMAP staff asked Melville to stop producing industry-wide analyses of visitor numbers in 2008, even the best-informed organisations have shied away from producing anything with which the industry might present itself.
Over the past eight years the Association of Event Organisers and the Events Industry Alliance have recorded data on 1,023 separate events. In 2008, organisers and venues provided visitor information on 505 events. By 2012 that number had fallen to 399, of which only 332 gave any information on either net or gross square metres. Of the events we know about, fewer than a third are providing any sort of information.
The first graph shows the rapid decline in the number of events that report any data at all (Chart A).
There were only 253 events which provided information in both 2011 and 2012. The various graphs show some of the interpretations that can be drawn from these 253 events. Average visitor numbers rose, for instance – from 5,791 to 6,125, just under six per cent (see Chart B). This number is slightly misleading – consumer shows were largely flat or declining slightly, and the gain was almost all in trade shows.
And, looking at every show for which there was reasonable information, total sold net square metres appeared to be static – at 5,345sqm (Chart C). Both of these findings are encouraging and suggest that the tide of recession since 2008 has finally turned.
The graph showing the Average Change in Visitors to Reporting Trade Shows (Chart D) is particularly interesting – 2012 showed the biggest increase in visitors for over a decade. It should be stressed that this graph only compares like-for-like each year (e.g. the 2005 number compares against exactly the same shows as in 2004), so the sample changes each year. Having said that, the largest 40 or so shows do report most years, adding a degree of consistency.
There are two main reasons why our data is, bluntly, consistently unreliable. The first is survivor bias. The period we have gone through has seen the demise of a lot of large shows, so year on year big events have been dropping out of the comparisons.
In the year after the Motor Show disappeared, if every other event had welcomed exactly the same number of visitors, the industry as a whole would still have lost eight per cent of its visitors. But if the maths were done on a simple “like for like”, then the Motor Show would not have been counted at all (because it had disappeared from the records) and the change would have been exactly zero per cent. Both numbers – eight and zero per cent – are mathematically correct. But which one you choose changes how our industry is perceived.
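The arithmetic above can be sketched in a few lines. The visitor figures below are invented purely for illustration (they are not the real Motor Show numbers), chosen so that the vanished show accounts for eight per cent of the total:

```python
# Toy illustration of survivor bias: all figures are invented.
year1 = {"Motor Show": 80_000, "Show B": 500_000, "Show C": 420_000}
year2 = {"Show B": 500_000, "Show C": 420_000}  # Motor Show has gone

# Whole-industry change: count every event that ran in each year.
total1, total2 = sum(year1.values()), sum(year2.values())
industry_change = (total2 - total1) / total1 * 100

# "Like for like": compare only shows that reported in both years.
survivors = year1.keys() & year2.keys()
lfl1 = sum(year1[s] for s in survivors)
lfl2 = sum(year2[s] for s in survivors)
lfl_change = (lfl2 - lfl1) / lfl1 * 100

print(f"industry: {industry_change:+.1f}%")      # -8.0%
print(f"like-for-like: {lfl_change:+.1f}%")      # +0.0%
```

Both figures come from the same raw data; the choice of denominator is what changes the story.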
The second reason is that organisers and, to a lesser extent, venues have become more selective about the information they provide.
In times when visitor and square metre numbers are rising, everyone is generally happy to publish their information. In the deep recession we have seen since 2008, this has not been the case. I am not suggesting organisers lie – but what happens is a selective approach. Shows that are growing are reported. Shows that are declining are simply not mentioned at all. The result is an inevitable upward (but incalculable) bias in the numbers.
A comparable example may be chain stores which give annual “like for like” sales. Over a 12-month period they may have closed 100 stores, but by only comparing the ones that were open in both 2013 and 2012 they create a double bias – the best ones have been kept open so are more likely to have grown, and the total sales in the 100 closed stores have magically “disappeared” from the calculation.
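The double bias in that chain-store comparison can be made concrete with a small sketch. The sales figures are invented: three strong stores stay open and grow, two weak ones close, and the headline "like for like" number points the opposite way to the chain's actual takings:

```python
# Invented chain-store figures (in £k): D and E close before 2013.
sales_2012 = {"A": 120, "B": 110, "C": 100, "D": 60, "E": 50}
sales_2013 = {"A": 126, "B": 115, "C": 104}

# Headline "like for like": only stores open in both years count.
lfl = sales_2013.keys() & sales_2012.keys()
lfl_growth = (sum(sales_2013[s] for s in lfl)
              / sum(sales_2012[s] for s in lfl) - 1) * 100

# The chain's actual total sales, closures included.
total_growth = (sum(sales_2013.values())
                / sum(sales_2012.values()) - 1) * 100

print(f"like-for-like: {lfl_growth:+.1f}%")   # +4.5%
print(f"total sales:   {total_growth:+.1f}%")  # -21.6%
```

The £110k sold in the closed stores simply vanishes from the like-for-like calculation, and the survivors were kept open precisely because they were the strongest performers.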
This has been compounded by the AEO’s decision to abandon the requirement for every member to audit their shows. With the economic collapse in 2008, two of the four largest organisers told the AEO that they would not continue as members if they were obliged to audit every event – the cost of a full audit was around £1,200. The AEO had little choice – its role would change dramatically if the largest players left over such an issue, and there would have been little incentive for the others to keep auditing.
And so we are left with a situation where the information that we are able to assemble is, to say the least, questionable.
Just two examples. Of the largest 100 events in 2012, nine claimed to have sold exactly 12,774sqm – and six of these sold exactly the same in 2011.
And of the largest 100 events, numbers 71 to 80 all claimed an attendance of exactly 20,000.
Indeed, 54 of the top 100 claimed to have an attendance which was a round 1,000. Possible, but rather unlikely.
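"Rather unlikely" can be quantified with a back-of-envelope check. Assume (and this is an assumption, not a measured fact) that a genuinely counted attendance is about as likely to land on any final three digits as any other, so the chance of hitting an exact multiple of 1,000 is roughly 1 in 1,000 per show:

```python
import math

# Hedged assumption: a genuine count lands on a round multiple of
# 1,000 with probability about 1/1000.
p = 1 / 1000
n, k = 100, 54  # 100 largest events, 54 round attendances claimed

expected = n * p  # round numbers we'd expect by chance

# Binomial probability of seeing 54 or more round numbers out of 100.
prob = sum(math.comb(n, i) * p**i * (1 - p)**(n - i)
           for i in range(k, n + 1))

print(f"expected round numbers: {expected:.1f}")
print(f"P(54 or more by chance): {prob:.2e}")
```

Under that assumption you would expect roughly a tenth of one round number among the top 100, and the odds of 54 arising by chance are vanishingly small – the figures are estimates, not counts.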
At the moment it seems unlikely that we will start producing comprehensive information again. In today’s marketplace this is both frustrating and, for some, lucrative. There are at least 15 private equity companies looking hard at the various businesses up for sale – the NEC and two of our larger exhibition groups – and, perhaps for the first time in their lives, they are stunned to find that they cannot obtain any reliable information. Interesting times.
This was first published in the May issue of EN. Any comments? Email Annie Byrne