A while back I emitted a rant about the use and abuse of quasi-statistical rankings: corruption indices and the like. My main beef is that the media almost always treat these rankings as hard numbers that can be added, subtracted, and compared across time, ignoring the compilers’ own warnings against doing exactly that, because it’s easy and seems harmless.

So I was interested to read that one of these lists bit the dust last year because, as the compiler admitted, “indices rarely change things.” The Global Integrity organization said its decision to stop compiling the list was “a conscious attempt to reinforce a key belief that we have come to embrace after many years of carrying out this kind of fieldwork: indices rarely change things. Publishing an index is terrific for the publishing organization in that it drives media coverage, headlines, and controversy. We are all for that. They are very effective public relations tools. But a single number for a country stacked up against other countries has not proven, in our experience, to be a particularly effective policy making or advocacy tool.”

Good job, Global Integrity. Score one for common sense.

The trouble with data, statistics, and all that in the media is that (a) the general public is (apparently) woefully ignorant about this stuff, and (b) the media, aware of and complicit in (a), tend to feed the public simplified, often simply wrong conclusions based on a weak grasp of the material. Because it’s easy, saves thinking about the real value of statistical data, and seems harmless.

It’s a shame, because properly applied statistical research can do a power of good. This was brought home to me by an article in Foreign Policy about Patrick Ball, a statistician who testified for the prosecution in the trial of Slobodan Milošević at the Hague tribunal. Ball presented reams of data which he interpreted as showing that the deaths of thousands of Kosovar Albanians during the 1999 war were due to Serbian forces rather than NATO bombs or other causes.

Ball’s methods and conclusions were attacked by another statistician, a witness for the successful defense of former Serbian President Milan Milutinović, and in the end the judges in the Milošević trial did not consider his findings, so the reliability of his work remains unproven. Yet this kind of work at least shows the possibility of using data analysis to unravel the chain of cause and effect in events as complex as wars and genocides. Consider how much easier it would be for deniers of the Nazi Holocaust if the murderers themselves had not collected so much data on their crimes! Data itself is neutral.

“Forensic statistics” may yet turn out to be a powerful tool in bringing criminals like Milošević to justice, after the fact. What about using data sets to stop crimes before they happen? What if, say, the people who have set up a whistleblowers’ group in Bosnia had data on the types of corporate and official malpractice, where they occur, and the amounts of money involved?

There are efforts to do this under way in several countries. A bribe-monitoring site called I Paid a Bribe in India has inspired imitators in Pakistan and Kenya, The New York Times reports. What these folks are doing is still a long way from producing testable data, but their work could help show the way. Corruption is notoriously hard to quantify, and thus to attack, so new ways of analyzing it are welcome.
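If a group like the Bosnian whistleblowers did collect such reports, even a crude tally would begin to make patterns visible. Here is a minimal sketch in Python; the categories, cities, and figures are invented purely for illustration and come from no real data set:

```python
from collections import defaultdict

# Hypothetical crowdsourced reports: sector, city, and amount involved (in euros).
# Every value below is invented for illustration only.
reports = [
    {"sector": "public procurement", "city": "Sarajevo", "amount": 12000},
    {"sector": "public procurement", "city": "Banja Luka", "amount": 8500},
    {"sector": "health care", "city": "Sarajevo", "amount": 300},
    {"sector": "health care", "city": "Sarajevo", "amount": 150},
    {"sector": "permits and licensing", "city": "Mostar", "amount": 700},
]

# Count reports and sum the money involved for each sector/city pair,
# so the biggest clusters of alleged malpractice stand out.
counts = defaultdict(int)
totals = defaultdict(int)
for r in reports:
    key = (r["sector"], r["city"])
    counts[key] += 1
    totals[key] += r["amount"]

# Print the clusters, largest sums first.
for (sector, city), n in sorted(counts.items(), key=lambda kv: -totals[kv[0]]):
    print(f"{sector} in {city}: {n} report(s), roughly {totals[(sector, city)]} EUR reported")
```

Even something this simple turns a pile of anecdotes into figures that can be compared, questioned, and checked against other sources.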

Ky Krauthamer

Ky Krauthamer is a senior editor at Transitions Online. Email: ky.krauthamer@tol.org
