We've noticed an issue in the AAAI 2026 Main Technical review system:
the score distribution deviates from a realistic distribution.
This deviation is likely because we do not yet have a large-scale benchmarking dataset in place. Our team has been actively monitoring the distribution and qualitatively identified this inconsistency.
We are now working to locate the root cause and implement a fix. Please stay tuned for updates.
Importantly, the review content itself is reliable. For now, however, we recommend taking the numeric "Rating" with a grain of salt until the issue is resolved.
Recently, it has come to light on multiple platforms (e.g., Xiaohongshu) that AAAI-26 is in the process of retracting a large number of previously issued desk reject decisions. This appears to be a widespread phenomenon this year, especially related to desk rejects caused by issues with checklists, appendices, and possible contradictions between the official template and website instructions.
A screenshot circulating online shows an email from the AAAI-26 team via OpenReview, which states:
"We noticed that some desk rejection notifications were sent out by mistake. We sincerely apologize for any inconvenience caused by these incorrect notifications. Our AAAI-26 team is currently re-evaluating all submissions that were determined to be desk rejected. We are working hard to resolve this issue as quickly as possible.
... Notification regarding erroneously issued desk rejects will be provided after the re-evaluation process concludes."
This means some submissions previously marked as desk rejected may actually still be under consideration, and affected authors should expect follow-up communication from the organizers soon.
Why Did This Happen?
This year, there were unusually many desk rejects related to checklist/appendix issues, with several reports suggesting a mismatch between the official template and website requirements.
Multiple authors have shared that their desk rejects are being reversed after further review.
The AAAI-26 team has acknowledged this and is conducting a comprehensive re-review of all impacted submissions.
What Should You Do?
If your submission received a desk reject from AAAI-26, do not panic!
Monitor your email (and your OpenReview dashboard) closely for follow-up notifications regarding the status of your paper.
Be prepared to respond to further requests from the AAAI-26 committee if necessary.
Source
https://www.xiaohongshu.com/user/profile/64a7a1df000000001001d871
Dear CSPaper Review Users,
We are happy to announce the latest enhancements to CSPaper Review (https://review.cspaper.org/)!
What's New
WACV 2026 Main Track Review Now Available
The reviewing interface and AI-assisted tools are now live for the WACV 2026 main conference track. You can submit reviews, run consistency checks, and explore automated summaries as before.
Improved Related Work Retrieval
We have expanded the search space for identifying missing related work. The tool now returns 2× more candidate papers potentially missing from a submission's bibliography, improving coverage and citation integrity.
Expanded Quality and Correctness Checks
Based on community feedback and real-case evaluations, we've significantly strengthened our automated paper quality checks. The system now rigorously detects serious errors like:
Invalid or overstated statistical claims, including unverifiable identifiability conditions (e.g., necessity/sufficiency misstatements).
Experimental methodology flaws such as hyperparameter tuning on test sets or improper evaluation protocols.
Misleading claims of novelty or significance unsupported by theory or empirical evidence.
Incorrect convexity or optimization claims lacking justification.
Papers that are incoherent, or that appear to be drafted by LLMs with low overall quality.
These additions improve early detection of desk-reject-worthy issues, supporting reviewers in catching critical errors and improving overall review quality.
Pre-Announcements:
1. Sign-In Requirement Before Review Submission
To ensure backend stability and facilitate secure review deletion upon request, we will soon enforce mandatory sign-in before submitting a review. This change will take effect next week, and we'll post a follow-up update here once it's live.
2. PDF Page Limitation
Currently, we automatically truncate PDF files to the first 15 pages if the uploaded document exceeds this length. This behavior often results in incomplete review reports, which can negatively affect the perceived quality of our evaluations. To address this, we will enforce a hard limit on the maximum number of PDF pages that can be processed.
We acknowledge that this limitation may feel restrictive, so we are considering increasing the limit to a more generous threshold, such as 20 pages or more. The final decision will be informed by data analysis to strike a balance between system performance, cost, and user experience.
GPT-5 Benchmarking Underway
With OpenAI's release of GPT-5, we've initiated benchmarking to evaluate its suitability as a CSPaper Review agent. Our benchmarks will help determine whether to enable GPT-5 for automated reviewing and summarization. We will share detailed findings in the coming weeks.
Thank you for using CSPaper Review!
– The CSPaper Team
Also ref: https://cspaper.org:443/post/322
We have received more than ten requests to delete review data and have processed them quickly. Since data deletion is clearly a frequent need, we are considering adding a self-service deletion function to the "My Review" page.
If you are not in a hurry, you can wait for this function to be released on https://review.cspaper.org
If you need the review record and/or data deleted as soon as possible, please write email to support@cspaper.org
Important: when you are writing us email about data deletion, please specify the following:
Did you perform the review as a guest or as a signed-in user?
What email (or user name) did you use to sign in?
Do you wish to delete all reviews you submitted or specific ones?
Thank you!
Welcome! Feel free to register (verified or anonymous) and share your thoughts or story – your voice matters here!

Try it now at review.cspaper.org
-
Using CSPaper Review Tool: Questions, Feedback & Ideas
AI-powered paper reviews for top CS conferences – fast, targeted insights to help boost your acceptance odds. Discuss anything related to the CSPaper Review Tool at review.cspaper.org: ask questions, report issues, or suggest improvements.
-
Peer Review in Computer Science: good, bad & broken
Discuss everything about peer review in computer science research: its successes, failures, and the challenges in between.
-
Anonymous Sharing & Supplementary Materials
Anonymously share data, results, or materials. Useful for rebuttals, blind submissions and more. Only unverified users can post (and edit or delete anytime afterwards).