IT analysts produce crap - what to look for in analyst "research"
Crap Factoids are pure B.S. that almost sound like fact, and get repeated so often that everyone comes to believe them. Let us take a closer look at a classic Crap Factoid where the results were deliberately skewed, then hyped up by marketing people, and the resulting Crap Factoid thrown to the winds like GM seed. It is time people took analysts to task for this stuff, because we all suffer the consequences when decision makers fall for it.
As I said in a recent comment on this blog, my concern with much of the research published is that the 'research' is
- commissioned to prove a point, like cancer research paid for by the tobacco industry but with fewer observers ready to scream "foul"
- created as a revenue-generating exercise, so the results need to be useful, attention-getting and self-serving (grow the market)
- often anecdotal and opinion-based
- often asked of the wrong person: "How brilliant were you...?" "Did you make the right decision to...?" "What ROI have you had from your spending...?"
- lacking transparency (and hence impossible to reproduce): what was the methodology? what questions were actually asked? how was the sample derived? what controls were there (generally none)? what were the raw results?
- not peer-reviewed ('cept blogs like this). Where are the academic and professional journals and conferences with real review boards?
I think IT should be renamed Information Engineering, and should be held up in comparison with the traditional engineering disciplines, where it compares very badly. If a bunch of post-grad engineers set themselves up as self-appointed experts and wrote a paper on how 86% of Chief Engineers surveyed agreed that bamboo is the material of choice for bridges in 2008 (and sold it for $3000 a copy), they'd be torn to ribbons by peer review.
As an example, let us look at a case study: "66% of organisations surveyed around the globe have engaged with the Information Technology Infrastructure Library (ITIL)", according to a survey released in February 2008 by Dimension Data. This survey was the subject of a previous Crap Factoid Alert. It is neither the best nor the worst around, but it seems nicely typical as an example of what to look for when detecting Crap Factoids.
- Dimension Data commissioned the survey from Datamonitor. These are not professional scientific researchers; they are professional market researchers, which is not the same thing. But at least they know how to construct questions and analyse results.
- "The research surveyed over 370 CIOs from 14 countries across five continents." But how did it select them? As DD customers? How did it bias the sample? Not stated, but see below.
- Look at the spin put on the press release: "Two-thirds of Enterprises Engage with ITIL – Is Your Company an IT Service Management Laggard?" The intent of the exercise is evident, which puts the credibility of the research into question. Scientists set up a hypothesis but they try to be impartial about its veracity.
- Look at what the survey measured: people's opinions. "What do you believe to be ...?" "In your opinion, what is the potential impact..."
At least the ITPI research that we have been debating on this website actually measures some hard metrics from the sites. This opinion-based 'research' is not worth the self-aggrandising wind of the respondents who produced it. They might as well ask "How clever were you...?"
- Then we get graphs like "What do you believe to be the primary inhibitors to adoption of ITIL / ITSM best practices?". Well, what were the options they chose from? We are only told the "six strongest". My local Residents' Association recently surveyed the village and asked something like:
Which would you least like to see in The Bay?
- walking paths
- beach improvements
- big ugly housing developments
- gambolling unicorns
- fairies in the dell
"99% of residents agree the last thing they want to see..." (1% = me)
- Note that all the way through the Datamonitor paper they argue strongly from a pre-assumed position. The intent is clear: to talk up ITSM frameworks in general and ITIL in particular. Remember that analysts are parasites on an industry: they sell information and opinion on it. If the industry grows, they grow. If the industry withers, they have to go start again somewhere else - expensive. Analysts have a clearly defined motivation to pump up an industry they have invested in.
And now the doozy, from Datamonitor's own results document: "Admittedly, this Datamonitor study deals with a somewhat self-selecting sample, as the screener question probed for those that have evaluated, although not necessarily adopted, ITSM frameworks. Methodological nuances notwithstanding, the survey results indicate that over two-thirds of the enterprises interviewed claim that they have engaged with ITIL".
"screener question"? to select the 370 or to select the responses that made it into the results? the graphs show "n=372" so I'd say the 372 were deliberately selected to be already predisposed to ITIL. Either way Datamonitor are freely admitting the results were deliberately skewed. Then they cavalierly brush this aside as "Methodological nuances". Deliberate distortion of data, I'd call it.
- This jaunty approach to statistical science is repeated elsewhere, such as this one on p11: "Granted, the statistical significance of a 10 percentage point differential could be the subject of further scrutiny. Nevertheless, the swing testifies to the positive experience of those that have implemented ITIL and corroborates qualitative evidence in favour of ITSM approaches in general and the ITIL best practice framework in particular... Those that have engaged with ITIL are more optimistic regarding its actual impact"
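That "further scrutiny" is not hard to perform. Here is a sketch of a two-proportion z-test; the split of the 372 respondents and the percentages are hypothetical, since the report publishes neither, but with a plausible split a 10-point swing sits right on the edge of significance:

```python
# A sketch of the "further scrutiny" Datamonitor waves away: a two-proportion
# z-test on the 10-point swing. All subgroup sizes and percentages here are
# hypothetical, because the report publishes none of them.
from math import sqrt, erf

def two_proportion_z(p1: float, n1: int, p2: float, n2: int):
    """Return z statistic and two-sided p-value for a difference of proportions."""
    pooled = (p1 * n1 + p2 * n2) / (n1 + n2)              # pooled proportion under H0
    se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))  # standard error of the difference
    z = (p1 - p2) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2)))) # two-sided normal tail
    return z, p_value

# Guess: 248 respondents "engaged with ITIL", 55% of them optimistic,
# versus 124 others at 45% -- the claimed 10-percentage-point swing.
z, p = two_proportion_z(0.55, 248, 0.45, 124)
print(f"z = {z:.2f}, p = {p:.3f}")  # z = 1.82, p = 0.069: not significant at 0.05
```

Rather than spend five minutes on that sum, they simply declare that the swing "testifies" to ITIL's benefits.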
- Since the survey questions, the methodology and the raw data are not published, we cannot draw proper conclusions. This is a classic attribute of pop-knowledge Crap-Factoid fluff, and it is what most strongly distinguishes it from scientific research: you can't check it out for yourself.
Remember, if you want your Crap Factoid to propagate, exact numbers give it credibility.
[The other major factor in establishing credibility is the name of the source organisation. On the IT Skeptic's CF Name Drop Scale, Gartner scores a factor of 2. Datamonitor scores about 0.5]
Datamonitor said "over two thirds"
Dimension Data's press office said "More than 65% of respondents" [turn it into a number]
but the Dimension Data press release itself said "66% of organisations" and headed the whole thing "TWO-THIRDS OF ORGANISATIONS..."
The real number seems to be in Figure 4 (p11), but we'll never know, as they don't publish the data. If I'm right, Datamonitor's grasp of basic maths is a bit shaky: when I went to school, 66% was less than two thirds, just.
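For the record, the arithmetic. The n=372 comes from the published graphs; the count of 248 is purely illustrative, since the raw numbers are unpublished:

```python
# Sanity-check the rounding. n = 372 comes from the published graphs;
# the 248 is illustrative only, since the raw counts are unpublished.
n = 372
print(f"two thirds     = {2/3:.4f}")    # 0.6667
print(f"reported 66%   = {0.66:.4f}")   # 0.6600 -- just under two thirds
print(f"248 out of 372 = {248/n:.4f}")  # 0.6667 -- exactly two thirds
```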
The IT Skeptic said some time ago that the IT analyst industry badly needs a code of practice to reduce this kind of pop-knowledge crap. Please spread this article around, get the word out, put some pressure on them.
The analysts survive on their credibility. Based on the bilge they produce, they don't deserve it. If we undermine it, they'll have to do something to improve: to deliver real scientific research. If we don't, they'll keep shovelling this stuff at our managers and we'll all live with the results.
Remember: Chokey the Chimp hates Crap Factoids!