Tag Archives: DAGGRE

Who’s predicting the next big thing?

SciCast comprises more than 7,000 science and technology experts and enthusiasts from universities, the private sector, and professional organizations such as AAAS, IEEE, and ACS. The SciCast team thought it would be fun to find out more about what motivates SciCasters to predict the next big thing.

Meet SciCaster Ted Sanders, 26, who resides in Stanford, CA, and is pursuing his PhD in Applied Physics at Stanford University.

Q: How did you get involved as a SciCast participant?

I learned about SciCast when it evolved out of the DAGGRE project, which I had joined after reading Robin Hanson’s blog. However, I was not active on SciCast until recently, when SciCast announced gift card prizes and the College Bowl competition. My participation also stems from a desire to support the legalization of prediction markets in the United States.

Q: What do you find most interesting about SciCast?

Continue reading

Decision Analysis Journal Article: Probabilistic Coherence Weighting for Optimizing Expert Forecasts

We’re excited to announce that the Decision Analysis journal has published “Probabilistic Coherence Weighting for Optimizing Expert Forecasts,” describing some work from last year related to DAGGRE.

It’s natural to want to help forecasters stay coherent as we ask related questions. For example, what is your confidence that:

1. “Jefferson was the third president of the United States.”

2. “Adams was the third president of the United States.”

People are known to be more coherent when these questions are immediate neighbors than when they appear on separate pages with many unrelated questions in between. (Since at most one of the statements can be true, coherent confidences should sum to no more than 1; answering 0.7 to the first and 0.5 to the second, say, is incoherent.) So it’s natural to think it’s better to present related questions close together.

We found that’s not necessarily a good idea. On a large set of general knowledge questions like these, we got more benefit by allowing people to be incoherent, and then giving more weight to coherent people.  At least on general knowledge questions, coherence signals knowledge. We have yet to extend this to forecasting questions.
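To make the weighting idea concrete, here is a minimal Python sketch of coherence weighting under simple assumptions: each forecaster answers a pair of mutually exclusive statements like the ones above, incoherence is measured as how much the pair overshoots a total of 1, and the weight function 1 / (1 + incoherence) and the toy data are illustrative choices, not the paper’s exact method.

```python
# Toy illustration of coherence weighting (an assumption-laden sketch,
# not the paper's algorithm). Each forecaster gives probabilities for two
# mutually exclusive statements, so a coherent pair should sum to at most 1.

def incoherence(p_a, p_b):
    """How far a pair of probabilities for mutually exclusive events
    violates the constraint p_a + p_b <= 1."""
    return max(0.0, p_a + p_b - 1.0)

def coherence_weighted_average(judgments):
    """Aggregate forecasts for statement A, weighting coherent forecasters more.

    judgments: list of (p_a, p_b) pairs, one per forecaster.
    The weight 1 / (1 + incoherence) is an illustrative choice.
    """
    weights = [1.0 / (1.0 + incoherence(p_a, p_b)) for p_a, p_b in judgments]
    return sum(w * p_a for w, (p_a, _) in zip(weights, judgments)) / sum(weights)

# Three forecasters: the first is coherent, the other two are not.
forecasters = [(0.90, 0.05), (0.70, 0.60), (0.60, 0.55)]
print(coherence_weighted_average(forecasters))  # ~0.743, vs. ~0.733 unweighted
```

In this toy run the coherent forecaster gets the most influence, which is the “coherence signals knowledge” effect described above.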

We found other cool and interesting things, too. Here’s the abstract; be warned, it gets technical:

Methods for eliciting and aggregating expert judgment are necessary when decision-relevant data are scarce. Such methods have been used for aggregating the judgments of a large, heterogeneous group of forecasters, as well as the multiple judgments produced from an individual forecaster. This paper addresses how multiple related individual forecasts can be used to improve aggregation of probabilities for a binary event across a set of forecasters. We extend previous efforts that use probabilistic incoherence of an individual forecaster’s subjective probability judgments to weight and aggregate the judgments of multiple forecasters for the goal of increasing the accuracy of forecasts. With data from two studies, we describe an approach for eliciting extra probability judgments to (i) adjust the judgments of each individual forecaster, and (ii) assign weights to the judgments to aggregate over the entire set of forecasters. We show improvement of up to 30% over the established benchmark of a simple equal-weighted averaging of forecasts. We also describe how this method can be used to remedy the “fifty–fifty blip” that occurs when forecasters use the probability value of 0.5 to represent epistemic uncertainty.
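The “extra probability judgments” in the abstract can be pictured with the simplest possible case: eliciting both p(A) and p(not A), which a coherent forecaster must make sum to exactly 1. The sketch below, again an illustration rather than the paper’s full procedure, adjusts an incoherent pair by least-squares projection onto that constraint and reports how far the pair had to move, a natural incoherence score for the weighting step.

```python
# Simplest coherentization case (an illustrative sketch, not the paper's
# full method): a forecaster reports p(A) and p(not A), which should sum to 1.

def coherentize(p_a, p_not_a):
    """Least-squares projection of (p_a, p_not_a) onto the line p + q = 1.

    Returns the adjusted probability for A and the size of the shift,
    which can serve as an incoherence score for weighting.
    """
    excess = (p_a + p_not_a - 1.0) / 2.0
    return p_a - excess, abs(excess)  # adjusted p(A) = (p_a + 1 - p_not_a) / 2

# An incoherent forecaster: 0.7 + 0.4 = 1.1, so each judgment shifts by 0.05.
adjusted, shift = coherentize(0.7, 0.4)
print(adjusted, shift)  # 0.65 0.05
```

Note that a forecaster who answers 0.5 everywhere stays coherent on a complement pair like this one; the fifty-fifty remedy in the paper draws on richer sets of related judgments, so see the article for those details.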

Read the article!

Christopher W. Karvetski, Kenneth C. Olson, David R. Mandel, and Charles R. Twardy. Probabilistic coherence weighting for optimizing expert forecasts. Decision Analysis, 10(4):305–326, 2013.

[ At Decision Analysis | Local PDF ]
