ClimateCheck - Scientific Fact-checking of Social Media Posts on Climate Change
2026 edition
ClimateCheck 2026: Shared Task on Scientific Fact-Checking and Disinformation Narrative Classification of Climate-related Claims
Hosted as part of the NSLP 2026 Workshop at LREC 2026 on 12 May 2026 in Palma de Mallorca, Spain
Motivation
The rise of climate discourse on social media offers new channels for public engagement but also amplifies mis- and disinformation. As online platforms increasingly shape public understanding of science, tools that ground claims in trustworthy, peer-reviewed evidence are necessary. The 2026 iteration of ClimateCheck builds on the results and insights from the 2025 iteration (run at SDP 2025/ACL 2025), extending it by adding training data, a new task on classifying disinformation narratives in climate discourse, and a focus on sustainable solutions.
The following tasks are available:
- Task 1: Abstract retrieval and claim verification. Given a claim and a corpus of publications, retrieve the top 5 most relevant abstracts and classify each claim-abstract pair as ‘supports’, ‘refutes’, or ‘not enough information’. Evaluation: Recall@K (K = 2, 5) and B-Pref for retrieval, plus weighted F1 for verification, based on gold data; additional unannotated documents will be evaluated automatically. In addition, we will ask participants to use CodeCarbon (https://codecarbon.io/) to assess emissions and energy consumption at test inference.
- Task 2: Disinformation narrative classification. Given a claim, predict which climate disinformation narrative it expresses, according to a predefined taxonomy. Evaluation: macro-, micro-, and weighted-F1 scores based on annotated documents.
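To make the retrieval metric concrete, here is a minimal sketch of Recall@K as described for Task 1: the fraction of gold-relevant abstracts that appear among the top K retrieved. This is an illustrative toy implementation with made-up abstract IDs, not the official scorer.

```python
def recall_at_k(retrieved, relevant, k):
    """Fraction of gold-relevant abstracts found in the top-k retrieved list."""
    if not relevant:
        return 0.0
    top_k = set(retrieved[:k])
    return len(top_k & set(relevant)) / len(relevant)

# Hypothetical example: one claim with two gold-relevant abstracts.
ranked = ["a3", "a7", "a1", "a9", "a2"]  # system ranking, best first
gold = {"a7", "a2"}                      # gold-relevant abstract IDs
print(recall_at_k(ranked, gold, 2))  # 0.5 -> only "a7" is in the top 2
print(recall_at_k(ranked, gold, 5))  # 1.0 -> both are in the top 5
```

The official evaluation will additionally use B-Pref, which unlike plain recall does not penalise documents that were never judged.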
Important dates:
- Release of datasets: December 15, 2025 (task 1); December 19, 2025 (task 2)
- Testing phase begins: January 15, 2026 (Codabench link TBA)
- Deadline for system submissions: February 16, 2026
- Deadline for paper submissions: February 20, 2026
- Notification of acceptance: March 13, 2026
- Camera-ready papers due: March 30, 2026
- Workshop: May 12, 2026
We encourage and invite participation from junior researchers and students from diverse backgrounds. Participants are also highly encouraged to submit a paper describing their systems to the NSLP 2026 workshop.
https://nfdi4ds.github.io/nslp2026/docs/climatecheck_shared_task.html
2025 edition
NFDI4DS partners organize shared tasks at SDP 2025, co-located with ACL 2025.
Social media facilitates discussions on critical issues such as climate change, but it also contributes to the rapid dissemination of misinformation, which complicates efforts to maintain an informed public and create evidence-based policies. In this shared task, we emphasise the need to link public discourse to peer-reviewed scholarly articles by gathering English claims from social media about climate change, as well as a corpus of about 400K abstracts of publications from the climate science domain. Participants will be asked to retrieve relevant abstracts for each claim (Subtask I) and classify the relation between the claim and abstract as ‘supports’, ‘refutes’, or ‘not enough information’ (Subtask II).
Subtask I: Abstracts Retrieval
- Task: given a claim from social media about climate change and a corpus of abstracts, retrieve the top K most relevant abstracts.
- Evaluation: MAP and B-Pref, which reward the retrieval of relevant abstracts while not penalising unjudged documents.
Subtask II: Claim Verification
- Task: given the claim-abstract pairs produced in the previous subtask, classify their relation as ‘supports’, ‘refutes’, or ‘not enough information’.
- Evaluation: F1 score based on judged documents from gold data; unjudged documents will not be included in computing the score.
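Verification in both editions is scored with an F1-style metric over the three relation labels. The sketch below shows how a weighted F1 (per-label F1 averaged by label frequency) can be computed in plain Python; the labels are from the task, but the toy gold/prediction data are invented and this is not the official scoring script.

```python
from collections import Counter

LABELS = ["supports", "refutes", "not enough information"]

def f1_for_label(gold, pred, label):
    """F1 for one label, treating it as the positive class."""
    tp = sum(g == p == label for g, p in zip(gold, pred))
    fp = sum(p == label and g != label for g, p in zip(gold, pred))
    fn = sum(g == label and p != label for g, p in zip(gold, pred))
    prec = tp / (tp + fp) if tp + fp else 0.0
    rec = tp / (tp + fn) if tp + fn else 0.0
    return 2 * prec * rec / (prec + rec) if prec + rec else 0.0

def weighted_f1(gold, pred):
    """Per-label F1 weighted by how often each label occurs in the gold data."""
    support = Counter(gold)
    return sum(f1_for_label(gold, pred, l) * support[l] for l in LABELS) / len(gold)

# Hypothetical toy data: four judged claim-abstract pairs.
gold = ["supports", "refutes", "supports", "not enough information"]
pred = ["supports", "supports", "supports", "not enough information"]
print(round(weighted_f1(gold, pred), 2))  # 0.65
```

Macro F1 would instead average the three per-label scores with equal weight, and micro F1 pools true/false positives across labels; libraries such as scikit-learn expose all three via the `average` argument of `f1_score`.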
Find detailed information on the workshop page: https://sdproc.org/2025/climatecheck.html
