March 8, 2011

Reporting on Experts

Justin Katz

Theodore Gatchel notes a perpetual problem facing a public that wishes to be informed:

There are so many experts on virtually every subject imaginable that anyone who relies on them for information is faced with the problem of determining which experts to trust. Unfortunately, almost everyone falls in that category. Investors rely on experts for market information, patients rely on doctors, governments depend on intelligence agencies, and everyone listens to the weather report.

As experts proliferate, so do the differences of their opinions. President Eisenhower once said about the reports he received concerning the French in Indochina, "There are almost as many judgments as there are authors of messages." The problem then becomes one of determining which experts to believe. Eisenhower's complaint is every bit as applicable today as it was when he made it.

Gatchel suggests a report card system for experts to enlighten readers as to how particular experts' "predictions have panned out in the past." The problem, it seems to me, is that any such attempt does little but create another topic on which experts can proliferate.

Consider a generic weekly columnist for a major national newspaper: the number of claims and implied predictions in his work would quickly become so plentiful, with so much of their accuracy subject to legitimate debate, that it would become easy work to distort his overall success by selecting particular predictions and interpreting real-world outcomes in a particular way. The result would be the translation of opinion into ostensibly objective data — like a PolitiFact score sheet for the honesty of public figures.

Comments, although monitored, are not necessarily representative of the views of Anchor Rising's contributors or approved by them. We reserve the right to delete or modify comments for any reason.

In his most recent EconTalk interview, Nassim Taleb told a humorous anecdote about an economics panel he sat on, at which a representative of the International Monetary Fund stood up and authoritatively delivered precise economic forecasts, based on IMF data, for 2011, 2012, and 2013. Nobody questioned him. Taleb, enraged, stood up and advised the audience that they would be fools to listen to the man, pointing out that the IMF had been grossly mistaken in its forecasts for 2008, 2009, and 2010. Similarly, most of the government experts at the Fed, Fannie, and Freddie predicted a healthy housing market through 2007 and 2008, right up until the collapse. Another anecdote, from Taleb's book The Black Swan, describes the Department of Defense forecasting the price of oil 50 years into the future. When the price unexpectedly soared in the 1970s, it became clear that the predictions were terribly off target. Rather than admit the inherent foolishness of the endeavor at that point, the forecasters simply revised their estimates and issued new 50-year forecasts based on the new numbers.

Experts know more than the average person, but they are also far more prone to confirmation bias and very poor at delineating what they do not or cannot know.

A good illustration of this quant-jock hubris appears in a recent contribution over at RIFuture, in which "Frymaster" repeatedly touts the reliability of his own selected data and his self-proclaimed expertise to commenter jgardner03:

"The facts make opinions irrelevant."

"You seem to dabble in these things; I'm a full on wonk. You need more facts and fewer feelings in these discussions."

"That's what the FM Global data say. The "why's" of that question are as irrelevant as the data are indisputable."

"I have DATA THAT PROVE higher co-pays are counterproductive in every way."

Content of the argument aside, the inherent problems here should be obvious to any detached observer. There is no room for reasonable discussion when one side claims a complete monopoly on facts, expertise, and understanding, and the potential for confirmation bias and overreach is enormous. Frymaster acknowledges no limitations of his own data or error margins, yet he astoundingly purports to be able to solve every problem from health care to education if only he is given the requisite amount of top-down control.

This is not to say that we should ignore science, data, or experts - quite the contrary. But we should be wary of experts (especially the self-proclaimed kind) who fail to acknowledge the basic limitations of their own data and models, especially when insurmountably complex systems such as health care (in its totality), the stock market, and housing are involved. The error rates in such systems become monstrous almost immediately out of the gate. Anyone who claims to be able to model and forecast such systems in their totality should be met with the most severe skepticism, especially when he is incapable of acknowledging his own basic biases and assumptions.

Posted by: Dan at March 8, 2011 11:17 AM

Every lawyer knows about examining "experts." Before you can begin to challenge their opinions, you have to challenge their assumptions. Are they relying on "facts" that are not among the facts on which they claim to opine?

"An expert is anyone from more than 40 miles away." - A. Lincoln

Posted by: Warrington Faust at March 8, 2011 3:18 PM