The final SciCast annual report has been released! See the “About” or “Project Data” menus above, or go directly to the SciCast Final Report download page.
Executive Summary (excerpts)
Registration and Activity
SciCast has seen over 11,000 registrations and over 129,000 forecasts. Google Analytics reports over 76K unique IP addresses (roughly 7 unique visitors per registered user) and 1.3M pageviews. The average session duration was 5 minutes.
The SciCast 2015 Annual Report has been approved for public release. The report focuses on Y4 activities, but also includes a complete publication and presentation list for all four years. Please click “Download SciCast Final Report” to get the PDF. You may also be interested in the SciCast anonymized dataset.
Here are two paragraphs from the Executive Summary:
We report on the fourth and final year of a large project at George Mason University developing and testing combinatorial prediction markets for aggregating expertise. For the first two years, we developed and ran the DAGGRE project on geopolitical forecasting. On May 26, 2013, we renamed ourselves SciCast, engaged Inkling Markets to redesign our website front-end and handle both outreach and question management, re-engineered the system architecture and refactored key methods to scale up by 10x–100x, engaged Tuuyi to develop a recommender service to guide people through the large number of questions, and pursued several engineering and algorithm improvements, including smaller and faster asset data structures, backup approximate inference, and an arc-pricing model and dynamic junction-tree recompilation that allowed users to create their own arcs. Inkling built a crowdsourced question-writing platform called Spark. The SciCast public site (scicast.org) launched on November 30, 2013, and began substantial recruiting in early January 2014.
As of May 22, 2015, SciCast has published 1,275 valid questions and created 494 links among 655 questions. Of these, 624 questions are open now, of which 344 are linked (see Figure 1). SciCast has an average Brier score of 0.267 overall (0.240 on binary questions), beating the uniform distribution 85% of the time, by about 48%. It is also 18–23% more accurate than the available baseline: an unweighted average (an unweighted linear opinion pool, or ULinOP) of its own “Safe Mode” estimates, even though those estimates are informed by the market. It beats that ULinOP about 7 times in 10.
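For readers curious how comparisons like these are computed, here is a minimal sketch in Python of a mean Brier score for the market forecast, a uniform baseline, and a ULinOP of individual estimates. The question data and helper names are made up for illustration; this is not the project’s actual evaluation code.

```python
# Illustrative sketch: mean Brier score for the market forecast, a uniform
# baseline, and an unweighted linear opinion pool (ULinOP) of individual
# estimates. Question data and names are made up for illustration.

def brier(forecast, outcome_index):
    """Brier score for one question: sum of squared errors over all outcomes."""
    return sum((p - (1.0 if i == outcome_index else 0.0)) ** 2
               for i, p in enumerate(forecast))

# Each entry: (market forecast, individual "Safe Mode" estimates, resolved outcome index)
questions = [
    ([0.80, 0.20], [[0.70, 0.30], [0.60, 0.40], [0.90, 0.10]], 0),
    ([0.30, 0.70], [[0.50, 0.50], [0.40, 0.60]],               1),
]

market, ulinop, uniform = [], [], []
for market_p, individual_ps, outcome in questions:
    n = len(market_p)
    market.append(brier(market_p, outcome))
    # ULinOP baseline: unweighted average of the individual estimates.
    pooled = [sum(p[i] for p in individual_ps) / len(individual_ps) for i in range(n)]
    ulinop.append(brier(pooled, outcome))
    uniform.append(brier([1.0 / n] * n, outcome))

print("mean Brier, market :", sum(market) / len(market))
print("mean Brier, ULinOP :", sum(ulinop) / len(ulinop))
print("mean Brier, uniform:", sum(uniform) / len(uniform))
```

Lower Brier scores are better; under this sum-of-squares form, the uniform distribution scores 0.5 on a binary question, which is the baseline the market is beating.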
You are welcome to cite this annual report. Please also cite our Collective Intelligence 2014 paper and/or our International Journal of Forecasting 2015 paper (if it is published; it is currently under review).
Congratulations to the winners of the Combo Edits Contest! We’ve awarded $16,000 proportionately to our top forecasters for their efforts during a month-long period. This contest encouraged SciCasters to add their own links.
SciCast’s unique feature is the ability to make conditional edits, also called assumptions or combinatorial (combo) edits. Now you’ll have a chance to compete for big prizes by making at least 25% of your forecasts conditional.
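If you are wondering what a conditional edit actually does to the numbers, here is a toy sketch over a two-question joint distribution. The joint probabilities and update rule are illustrative only, not SciCast’s market engine.

```python
# Toy sketch of a conditional ("combo") edit over two binary questions A and B.
# The joint table and update rule are illustrative, not SciCast's market engine.

# Joint distribution P(A, B), keyed by (a, b) truth values.
joint = {
    (True, True): 0.30, (True, False): 0.20,
    (False, True): 0.10, (False, False): 0.40,
}

def conditional(joint, a_value, b_value):
    """P(A = a_value | B = b_value) read off the joint table."""
    p_b = sum(p for (a, b), p in joint.items() if b == b_value)
    return joint[(a_value, b_value)] / p_b

def apply_conditional_edit(joint, a_value, b_value, new_prob):
    """Set P(A = a_value | B = b_value) = new_prob while leaving the
    marginal P(B) and the other branch of B untouched."""
    p_b = sum(p for (a, b), p in joint.items() if b == b_value)
    updated = dict(joint)
    updated[(a_value, b_value)] = new_prob * p_b
    updated[(not a_value, b_value)] = (1.0 - new_prob) * p_b
    return updated

print("P(A | B) before edit:", conditional(joint, True, True))    # 0.75
joint = apply_conditional_edit(joint, True, True, 0.90)            # assume B, raise P(A | B)
print("P(A | B) after edit :", conditional(joint, True, True))    # 0.90
```

In SciCast terms, this is a forecast made “assuming” the outcome of another question: you move P(A given B) without touching your forecast on B itself.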
Q: Expected value? So, if I’m betting that A happens, shouldn’t I just make P(A)=99%?
A: If you knew exactly when we would calculate the expected value, this would be a profitable strategy. But (a) we will randomize the timing, and (b) many of the selected questions are still likely to resolve, so there is some risk.
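To see the risk in numbers, here is a rough expected-value calculation using a log market scoring rule as a stand-in (an illustrative assumption; SciCast’s exact point accounting may differ). If the question resolves before you can reverse the edit, pushing the price to 99% has a negative expected point change whenever your true belief is lower.

```python
# Rough expected-value calculation under a log market scoring rule
# (an illustrative assumption; SciCast's exact point accounting may differ).
from math import log

def expected_gain(belief, market_p, new_p):
    """Expected score change for moving the market from market_p to new_p,
    given that your true probability for the event is `belief`."""
    return (belief * log(new_p / market_p)
            + (1 - belief) * log((1 - new_p) / (1 - market_p)))

belief, market_p = 0.70, 0.60   # you think 70%; the market says 60%
print("edit to your belief (0.70):", round(expected_gain(belief, market_p, 0.70), 3))  # positive
print("edit to 0.99              :", round(expected_gain(belief, market_p, 0.99), 3))  # negative
```

The expected gain is maximized exactly when you move the market to your honest probability, which is why forecasting what you actually believe, rather than betting 99%, is the points-maximizing strategy when resolution timing is uncertain.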
SciCast Calls for Science, Technology Experts to Make Predictions
Largest sci-tech crowdsourcing forecast site in search of professionals and enthusiasts to predict future events
FAIRFAX, Va. (June 19, 2014) – SciCast, a research project run by George Mason University, is the largest known science- and technology-focused crowdsourced forecasting site. So what makes a crowdsourced prediction market more powerful? An even bigger crowd. SciCast is launching its first worldwide call for participants to join the existing 2,300 professionals and enthusiasts, ranging from engineers to chemists and from agriculturists to IT specialists.