Tag Archives: Charles Twardy

So Long, and Thanks for All the Fish!

SciCasters:

Thank you for your participation over the past year and a half in the largest collaborative S&T forecasting project ever. Our main IARPA funding has ended, and we were not able to finalize things with our (likely) new sponsor in time to keep question management, user support, engineering support, and prizes running uninterrupted. Therefore we will be suspending SciCast Predict for the summer, starting June 12, 2015 at 4 pm ET. We expect to resume in the fall with the enthusiastic support of a big S&T sponsor. In the meantime, we will continue to update the blog and provide links to leaderboard snapshots and important data.

Recap

Over the course of this project, we’ve seen nearly 130,000 forecasts from thousands of forecasters on over 1,200 forecasting questions, averaging more than 240 forecasts per day. We created a combinatorial engine robust enough to allow crowdsourced linking, resulting in the following rich domain structure:

Near-final question structure on SciCast, with most of the live links provided by users.

Some project highlights:

  • The market beat its own unweighted opinion pool (from Safe Mode) 7 out of 10 times, by an average of 18%, measured by mean daily Brier score on a question (a short Brier-score sketch follows this list)
  • The overall market Brier score was about 0.29
  • The project was featured in The Wall Street Journal and Nature and many other places
  • SciCast partnered with AAAS, IEEE, and the FUSE program to author more than 1,200 questions
  • Project principals Charles Twardy and Robin Hanson answered questions in a Reddit Science AMA
  • SciCasters weighed in on major news stories like the Philae landing and Flight MH370
  • SciCast held partner webinars with ACS and with TechCast Global
  • SciCast hosted questions (and provided commentary) for the Dicty World Race
  • In collaboration with the Discovery Analytics Center at Virginia Tech and Healthmap.org, SciCast featured questions about the 2014-2015 flu season
  • SciCast gave away BIG prizes for accuracy and combo edits
  • Other researchers are using SciCast for analysis and research in the Bitcoin block size debate
  • MIT and ANU researchers studied SciCast accuracy and efficiency, and were unable to improve on it using stock machine learning, a testament to our most active forecasters and their bots. [See links for Della Penna, Adjodah, and Pentland 2015, here.]
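
For readers who want the scoring rule spelled out, here is a minimal sketch of the binary Brier score and the mean-daily-Brier comparison used in the first bullet. The question, daily prices, and outcome below are invented for illustration; they are not SciCast data.

```python
# A quick Brier-score refresher (illustrative numbers only). For a binary
# question, the Brier score of a forecast p against the 0/1 outcome y is
# (p - y)^2: 0 is perfect, 0.25 matches a constant 50% forecast, 1 is the
# worst possible.

def brier(p: float, outcome: int) -> float:
    """Squared error between a probability forecast and the 0/1 outcome."""
    return (p - outcome) ** 2

def mean_daily_brier(daily_forecasts, outcome):
    """Average the Brier score over one forecast per day for a question."""
    return sum(brier(p, outcome) for p in daily_forecasts) / len(daily_forecasts)

# Hypothetical question that resolved "yes" (outcome = 1):
market_prices = [0.55, 0.62, 0.71, 0.80, 0.90]   # market's daily prices
pool_average  = [0.50, 0.55, 0.60, 0.65, 0.70]   # unweighted opinion pool

print(mean_daily_brier(market_prices, 1))   # ~0.096 (lower is better)
print(mean_daily_brier(pool_average, 1))    # ~0.165
```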

What’s Next?

Prizes for the combo edits contest will be sent out this week, and we will be sharing a blog post summarizing the project. Although SciCast.org will be closed, this blog and the user group will remain open. Watch for announcements about the future of SciCast.

Once again, thank you so much for your participation!  We’re nothing without our crowd.

Contact

Please contact us at [email protected] if you have questions about the research project or want to talk about using SciCast in your own organization.

Webinar Recording: SciCast, Cybersecurity Markets and the Near & Far Future of AI

On May 7, 2015, SciCast participated in the TechCast Webinar Series session Forecasting in Turbulent Times: Tools for Managing Change and Risk.

The webinar covered The SciCast Prediction Market (Charles Twardy), Cybersecurity Markets (Dan Geer), and the Near and Far Future of AI (Robin Hanson). Read the full description. There were a few questions after each segment, and some more at the end. (Hanson fans: note that Robin’s talk was not about markets this time, but about a scenario extrapolation using economic reasoning from some strong initial assumptions, the subject of his forthcoming book.)

SciCast, Cybersecurity Markets and the Near & Far Future of AI

Please join us for a live webinar tomorrow, May 7, at 12 PM ET. SciCast, Cybersecurity Markets and the Near & Far Future of AI is the second installment of TechCast Global’s webinar series. In the course of one hour, we will feature three thought-provoking segments and give attendees an opportunity to ask questions and interact with our panelists.

SciCast WSJ Coverage: U.S. Intelligence Community Explores More Rigorous Ways to Forecast Events

SciCast has been featured in a Wall Street Journal article about crowdsourced forecasting in the U.S. intelligence community. We’re excited to share that SciCast now has nearly 10,000 participants, a 50% increase in the last two months - an important achievement for a crowdsourced prediction site.

Join SciCast for a Reddit Science AMA and an ACS webinar this week!

Have you ever wondered what the next ‘big thing’ in technology will be? What if you could tap the collective wisdom of your peers, those interested in the same topics as you, with global reach?

Don’t miss two unique opportunities to learn more about how you can do this on SciCast (www.scicast.org), the largest known science and technology-focused crowdsourced forecasting site.

SciCast will be the featured topic in a Reddit Science AMA and an American Chemical Society webinar this week! These are great opportunities to share your SciCast expertise and weigh in on the discussion, and we encourage you to share the information with your friends and colleagues.

SciCast Calls for Science, Technology Experts to Make Predictions

Contact:
Lynda Baldwin – 708-703-8804;
[email protected] 

Candice Warltier – 312-587-3105;
[email protected]

FOR IMMEDIATE RELEASE 

SciCast Calls for Science, Technology Experts to Make Predictions 

Largest sci-tech crowdsourcing forecast site in search of professionals and enthusiasts to predict future events 

FAIRFAX, Va. (June 19, 2014) – SciCast, a research project run by George Mason University, is the largest known science and technology-focused crowdsourced forecasting site. So what makes a crowdsourced prediction market more powerful? An even bigger crowd. SciCast is launching its first worldwide call for participants to join the existing 2,300 professionals and enthusiasts, ranging from engineers to chemists and from agriculturists to IT specialists.

Can crowdsourcing help find the missing Malaysia Airlines flight #MH370?

On SciCast, we’ve posted three questions about the missing plane. Can crowdsourcing help to locate it?

Dr. Charles Twardy, Project Principal, explains the different ways to crowdsource a search. “When a community turns out to help look for a lost child, that’s crowdsourcing,” he says. “The community volunteers typically aren’t as well-prepared as the search teams, but when directed by experienced Field Team Leaders, they can greatly extend the search effort. Similarly, experimental micro-tasking sites like TomNod.com let volunteers help search piles of digital images. Call it the effort of the crowd. SciCast is about the wisdom of the crowd: weighing the vast amounts of uncertain and conflicting evidence to arrive at a group judgment of, say, the relative chances of several regions or scenarios. This could be as simple as an average, a robust method with much to recommend it when judgments are independent. Or it could be something more advanced, like SciCast’s combinatorial prediction market. A market reduces double-counting, and may be better suited to the case where most of us are just mulling over the same information, but a few have real insight. The trick is to find a large and diverse crowd, and persuade them to participate.”
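
To make the contrast in that quote concrete, here is a toy Python sketch placing a simple unweighted average next to a sequential logarithmic market scoring rule, the family of mechanisms behind Hanson-style prediction markets. The forecasters, regions, and numbers are invented, and the single-question rule shown is a generic illustration, not SciCast’s combinatorial engine.

```python
import math

# Three hypothetical forecasters give probabilities over three search regions.
judgments = {
    "ann":   [0.50, 0.30, 0.20],
    "bob":   [0.55, 0.25, 0.20],
    "carol": [0.20, 0.10, 0.70],    # the one forecaster with real insight
}

# 1) Wisdom-of-the-crowd baseline: unweighted average of the judgments.
pool = [sum(p[i] for p in judgments.values()) / len(judgments) for i in range(3)]
print("opinion pool:", [round(x, 2) for x in pool])    # ~[0.42, 0.22, 0.37]

# 2) Sequential logarithmic market scoring rule (Hanson-style): each trader
#    moves the market distribution to their belief, and at resolution earns
#    b * (log p_new[w] - log p_old[w]) for the winning region w, so only
#    improvements over the standing estimate are rewarded.
b = 100.0
market = [1 / 3, 1 / 3, 1 / 3]       # market maker's starting distribution
ledger = []                          # (name, old quote, new quote)
for name, belief in judgments.items():
    ledger.append((name, market, belief))
    market = belief                  # the market now quotes this belief

print("market quote:", market)       # the most recent trader's belief

winner = 2                           # suppose the plane is found in region C
for name, old, new in ledger:
    payoff = b * (math.log(new[winner]) - math.log(old[winner]))
    print(f"{name:>5}: {payoff:+.1f}")   # carol profits; ann loses points
```

In this naive sequential form, the market quote is simply whatever the most recent trader set it to; the point is the payoff rule, which rewards only genuine improvements over the standing estimate rather than repeated votes for the same information.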

Following are the questions. Click any of them to make your forecast (register or log in first). Also, see the discussion and background tabs of each question for more details and links to news sources.

Where will the Malaysia Airlines Flight MH370 be found?

What happened to Malaysia Airlines Flight MH370?

Where will Malaysian Air Flight MH 370 be found (extended version)?

The extended search region uses this map.

See this blog post for info on how to explore conditional probabilities.

Click here to read more about approaches to crowdsourcing Search & Rescue.

Decision Analysis Journal Article: Probabilistic Coherence Weighting for Optimizing Expert Forecasts

We’re excited to announce that the journal Decision Analysis has published Probabilistic Coherence Weighting for Optimizing Expert Forecasts, describing some of last year’s work related to DAGGRE.

It’s natural to want to help forecasters stay coherent as we ask related questions. For example, what is your confidence that:

1. “Jefferson was the third president of the United States.”

2. “Adams was the third president of the United States.”

People are known to be more coherent when these are immediate neighbors than when on separate pages with many unrelated questions in between.  So it’s natural to think it’s better to present related questions close together.

We found that’s not necessarily a good idea. On a large set of general-knowledge questions like these, we got more benefit by letting people be incoherent and then giving more weight to the coherent ones. At least on general-knowledge questions, coherence signals knowledge. We have yet to extend this to forecasting questions.
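
To make the idea concrete, here is a toy Python sketch of the intuition: forecasters whose probabilities on mutually exclusive statements better respect the probability axioms get more weight in the pool. The weighting function and numbers are invented for illustration; the paper’s actual approach (eliciting extra judgments, adjusting each forecaster, and then weighting) is more involved.

```python
# Toy coherence weighting for the Jefferson/Adams example. The statements are
# mutually exclusive, so a coherent forecaster's two probabilities sum to at
# most 1; we penalize any excess and down-weight accordingly.
# (Illustration of the intuition only -- not the method in the paper.)

forecasters = {                     # (P(Jefferson was 3rd), P(Adams was 3rd))
    "coherent_expert": (0.90, 0.05),
    "hedger":          (0.40, 0.80),   # sums to 1.2: incoherent
    "fifty_fifty":     (0.50, 0.50),   # uses 0.5 for "don't know"; still coherent
}

def incoherence(p_a, p_b):
    """How far a pair violates P(A) + P(B) <= 1 for exclusive A and B."""
    return max(0.0, p_a + p_b - 1.0)

def weight(p_a, p_b):
    """Down-weight incoherent forecasters (any decreasing function would do)."""
    return 1.0 / (1.0 + incoherence(p_a, p_b))

total = sum(weight(*p) for p in forecasters.values())
unweighted = sum(p[0] for p in forecasters.values()) / len(forecasters)
weighted = sum(weight(*p) * p[0] for p in forecasters.values()) / total

print({name: round(weight(*p), 2) for name, p in forecasters.items()})
print("P(Jefferson) unweighted:", round(unweighted, 2))          # 0.60
print("P(Jefferson) coherence-weighted:", round(weighted, 2))    # ~0.61
```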

We found other things, too – cool and interesting things.  Here’s the abstract, but be warned, it gets technical:

Methods for eliciting and aggregating expert judgment are necessary when decision-relevant data are scarce. Such methods have been used for aggregating the judgments of a large, heterogeneous group of forecasters, as well as the multiple judgments produced from an individual forecaster. This paper addresses how multiple related individual forecasts can be used to improve aggregation of probabilities for a binary event across a set of forecasters. We extend previous efforts that use probabilistic incoherence of an individual forecaster’s subjective probability judgments to weight and aggregate the judgments of multiple forecasters for the goal of increasing the accuracy of forecasts. With data from two studies, we describe an approach for eliciting extra probability judgments to (i) adjust the judgments of each individual forecaster, and (ii) assign weights to the judgments to aggregate over the entire set of forecasters. We show improvement of up to 30% over the established benchmark of a simple equal-weighted averaging of forecasts. We also describe how this method can be used to remedy the “fifty–fifty blip” that occurs when forecasters use the probability value of 0.5 to represent epistemic uncertainty.

Read the article!

Christopher W. Karvetski, Kenneth C. Olson, David R. Mandel, and Charles R. Twardy. Probabilistic coherence weighting for optimizing expert forecasts. Decision Analysis 10(4):305–326, 2013.

[ At Decision Analysis | Local PDF ]
