[Linkpost] “Assessing Near-Term Accuracy in the Existential Risk Persuasion Tournament” by Forecasting Research Institute

EA Forum Podcast (All audio) - A podcast by EA Forum Team

This is a link post.

Forecasting Research Institute just released a new report: Assessing Near-Term Accuracy in the Existential Risk Persuasion Tournament.

In June–October 2022, we convened 169 people to participate in the “Existential Risk Persuasion Tournament” (XPT). The XPT participants included both superforecasters with proven forecasting track records and domain experts with subject-matter expertise. The tournament incentivized accurate forecasting and persuasive argumentation about long-term risks humanity may face, including risks from artificial intelligence (AI), climate change, nuclear war, and pandemics.

This report analyzes respondents’ forecasting accuracy on 38 near-term questions that resolved by mid-2025. The study finds overall performance parity between superforecasters and domain experts, with both groups underestimating AI progress and overestimating improvements in climate technology. Both superforecasters and domain experts substantially outperformed a baseline of educated members of the general public.

Read the full report here: https://forecastingresearch.org/near-term-xpt-accuracy

---

First published: September 2nd, 2025

Source: https://forum.effectivealtruism.org/posts/fp5kEpBkhWsGgWu2D/assessing-near-term-accuracy-in-the-existential-risk

Linkpost URL: https://forecastingresearch.org/near-term-xpt-accuracy

---

Narrated by TYPE III AUDIO.