Dead On Arrival
The science you trust might already be dead — we track every retraction so you don't cite a ghost.
Niche: Automated tracking, analysis, and plain-English explanation of retracted scientific papers, paper mill fraud, and “zombie papers” that keep getting cited after death
Target audience: Graduate students, researchers, science journalists, research integrity officers, policy analysts, curious science-literate public. ~5M+ people globally who actively read/cite papers + millions more who consume science news
Why now: Scientific retractions hit 4,500+ in 2025 alone — an all-time high. Paper mill fraud is “growing at an alarming rate” (PNAS). Hindawi retracted 8,000+ articles. Wiley shuttered journals. Over 50% of retractions now come from Chinese institutions. The Crossref API just integrated Retraction Watch data in Jan 2025, making automated analysis possible for the first time. Meanwhile, retracted papers continue accumulating citations — the “zombie paper” crisis is accelerating. The market leader (Retraction Watch) is a text blog with no data visualizations, no automated analysis, no trend dashboards. The gap is enormous.
Content Example
🧟 This Week’s Zombie Papers — March 31, 2026
The Dead That Won’t Stay Down: 7 Retracted Papers That Gained 200+ New Citations This Quarter
The scientific record has a haunting problem: papers that have been officially killed keep shambling through the literature, infecting new research with undead data. This week, Dead On Arrival tracked 847 newly retracted papers across 12 publishers — but the real horror is what happened to the already dead.
The Worst Offender: A 2019 Cardiology Study That Should Have Stayed Buried
When BMJ retracted Chen et al.’s cardiac stem cell therapy paper last week after data sleuths flagged image “mismatches,” it joined a growing graveyard of compromised medical research. But here’s what nobody’s telling you: according to our analysis of OpenAlex citation data, this paper was cited 47 times after the expression of concern was published in 2024 — including in three clinical trial design documents.
That’s not an outlier. Our automated scan of Crossref and OpenAlex this week found 312 retracted papers that accumulated new citations in Q1 2026. The median time since retraction? 2.3 years. These aren’t obscure methodological notes — 41% are in biomedical fields where citation trails can influence treatment decisions.
By the Numbers This Week:
- 🪦 847 new retractions across all indexed journals
- 🧟 312 previously retracted papers gained new citations
- 🏭 189 retractions flagged as likely paper mill products (matching known manipulation patterns)
- 🔬 Top field: Materials Science (112 retractions) — the paper mill capital of science
- 🌍 Top country of origin: China (463, 54.7%), followed by India (89, 10.5%)
- 📈 Fastest-growing retraction category: AI-generated text/images detected (up 340% YoY)
The Paper Mill Corner: Heliyon’s Ongoing Purge
Elsevier’s mega-journal Heliyon continues its mass retraction event that began in February. This week: 34 more papers tagged as paper mill products, bringing the 2026 total to 412. Our network analysis shows a cluster of 23 of these papers sharing three “authors” who appear on 80+ publications across 6 journals — a classic mill signature. [Interactive network graph below]
Why This Matters to You
If you’re a researcher: run your bibliography through our free Bibliography Scan tool (coming next month). If you’re a science journalist: stop citing retracted papers — 6% of science news stories reference at least one retracted source without noting the retraction. If you’re a patient: that supplement study your naturopath cited? We’ll tell you if it’s still alive.
Dead On Arrival scans 4 million journal articles weekly using the Crossref, OpenAlex, and PubMed APIs. Every number has a source. Every claim has a receipt.
Data Sources
- Crossref REST API — Primary retraction feed. Filter `update-type:retraction` for new retraction notices. Free, unlimited polite use. Also hosts the full Retraction Watch database CSV via GitLab.
- OpenAlex API — Citation tracking for retracted papers (zombie detection). Filter `is_retracted:true`, track `cited_by_count` changes over time. Free, no auth, 100K works/day.
- PubMed E-Utilities — Biomedical retraction notices, abstracts, MeSH terms, author affiliations. Free with API key.
- Semantic Scholar API — Citation graph analysis, paper influence scores. Free tier: 100 req/5min.
- Retraction Watch Database (Crossref GitLab) — Full historical CSV with reason codes, subjects, countries. Updated periodically.
- RetractBase (CSIC) — Cross-reference for European retractions.
- PubPeer API — Flag papers with post-publication comments/concerns.
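The two primary feeds can be queried with plain URL builders. A minimal sketch in TypeScript: the `mailto` parameter opts requests into Crossref's polite pool (OpenAlex honors the same convention); the contact address and default page sizes here are placeholders, not project decisions.

```typescript
const CONTACT = "team@example.org"; // hypothetical contact address

/** Crossref: new retraction notices indexed since `fromDate` (YYYY-MM-DD). */
function crossrefRetractionsUrl(fromDate: string, rows = 1000): string {
  const filter = `update-type:retraction,from-update-date:${fromDate}`;
  return `https://api.crossref.org/works?filter=${encodeURIComponent(filter)}` +
         `&rows=${rows}&mailto=${CONTACT}`;
}

/** OpenAlex: all works flagged as retracted, paged by cursor. */
function openalexRetractedUrl(cursor = "*", perPage = 200): string {
  return `https://api.openalex.org/works?filter=is_retracted:true` +
         `&per-page=${perPage}&cursor=${cursor}&mailto=${CONTACT}`;
}
```

Cursor paging on the OpenAlex side (start at `*`, follow `meta.next_cursor`) avoids the deep-offset limits that page-number paging hits on large result sets.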
Automation Pipeline
- Schedule: GitHub Actions runs daily at 06:00 UTC (main scan) + weekly deep analysis every Monday
- Collect:
- Daily: Query Crossref API for new retractions (last 48h, with overlap for reliability)
- Daily: Query OpenAlex for citation count changes on all tracked retracted papers
- Weekly: Download fresh Retraction Watch CSV, diff against previous for new entries
- Weekly: PubMed E-Utilities scan for retraction notices in biomedical journals
- Process:
- Deduplicate across sources (DOI-based primary key)
- Categorize by field, country, publisher, retraction reason
- Calculate “zombie score” — post-retraction citation velocity
- Detect paper mill patterns (shared authors across suspicious clusters, image reuse signals)
- AI step: Generate plain-English analysis for each major retraction (why it matters, who’s affected)
- AI step: Write weekly trend analysis with narrative arc and hot takes
- AI step: Identify the “retraction of the week” — most impactful or scandalous
- Generate:
- D3.js network graphs for paper mill clusters (auto-rendered via Puppeteer)
- Chart.js bar/line charts for weekly trends
- Heatmap SVGs for country/field distributions
- “Zombie gauge” custom SVG component for each tracked paper
- Open Graph images for social sharing (auto-composed from charts + headline)
- Publish:
- Astro build → GitHub Pages deploy
- RSS feed auto-generated
- Newsletter email auto-sent via Buttondown/Mailchimp API
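Two of the Process steps above — DOI-keyed deduplication and the "zombie score" — can be sketched as pure functions. The record shape and the citations-per-year formula are illustrative assumptions, not a published metric:

```typescript
interface RetractionRecord {
  doi: string;
  retractedOn: string;            // ISO date of the retraction notice
  citationsSinceRetraction: number;
}

/** Merge feeds, keeping the first record seen for each normalized DOI. */
function dedupeByDoi(records: RetractionRecord[]): RetractionRecord[] {
  const seen = new Map<string, RetractionRecord>();
  for (const r of records) {
    const key = r.doi.trim().toLowerCase();
    if (!seen.has(key)) seen.set(key, r);
  }
  return [...seen.values()];
}

/** Zombie score: post-retraction citations per year (citation velocity). */
function zombieScore(r: RetractionRecord, asOf: Date): number {
  const retracted = new Date(r.retractedOn).getTime();
  // Floor the window at one day so fresh retractions don't divide by ~0.
  const years = Math.max(
    (asOf.getTime() - retracted) / (365.25 * 24 * 3600e3),
    1 / 365.25,
  );
  return r.citationsSinceRetraction / years;
}
```

Lowercasing DOIs before comparison matters because Crossref treats DOIs as case-insensitive but the feeds are not consistent about casing.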
Tech Stack
- Static site: TypeScript + Astro (content collections for each retraction/analysis)
- Data processing: Node.js scripts in GitHub Actions
- Image generation: D3.js server-side rendering via Puppeteer, Chart.js for statistical charts, custom SVG templates for zombie scores and OG images
- Data collection: Axios/fetch against Crossref, OpenAlex, PubMed APIs
- Data storage: JSON files in repo (one per week, ~50KB each), SQLite for historical queries during build
- CI/CD: GitHub Actions (daily + weekly cron)
- Hosting: GitHub Pages (free) or Cloudflare Pages
- Newsletter: Buttondown (free tier: 100 subscribers) or Mailchimp
- Search: Pagefind (static search, built into Astro)
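The "zombie gauge" custom SVG component mentioned above could be as simple as a template-string renderer, which keeps image generation deterministic and dependency-free. A sketch, using the navy/amber palette listed under Visual Identity; dimensions and layout are illustrative:

```typescript
/** Render a horizontal gauge for a paper's zombie score as an SVG string. */
function zombieGaugeSvg(score: number, max = 100, width = 240): string {
  const clamped = Math.min(Math.max(score, 0), max);
  const fill = (clamped / max) * (width - 20); // 10px padding each side
  return [
    `<svg xmlns="http://www.w3.org/2000/svg" width="${width}" height="40">`,
    `  <rect x="10" y="12" width="${width - 20}" height="16" fill="#1a1a2e" rx="8"/>`,
    `  <rect x="10" y="12" width="${fill.toFixed(1)}" height="16" fill="#e94560" rx="8"/>`,
    `  <text x="10" y="10" font-family="Fira Code, monospace" font-size="9" fill="#fff">` +
      `zombie score: ${clamped.toFixed(1)}</text>`,
    `</svg>`,
  ].join("\n");
}
```

Because the output is plain SVG text, the same function can feed both the static site build and the Open Graph image compositor without going through Puppeteer.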
Monetization Model
- Tier 1 — Donations/Tips: “Keep the lights on at the morgue” — Ko-fi, GitHub Sponsors, Buy Me a Coffee. Research integrity community is donation-friendly (Retraction Watch sustains on donations).
- Tier 2 — Premium Newsletter: Weekly “Autopsy Report” with deep-dive analysis, paper mill network maps, zombie paper alerts. $5/month via Buttondown or Substack. Target: researchers who need this for their own citation hygiene.
- Tier 3 — Institutional API Access: Universities, publishers, and research integrity offices pay for API access to zombie scores and paper mill risk ratings. $50-200/month per institution.
- Tier 4 — Affiliate: Academic writing tools (Zotero, Mendeley alternatives), research integrity courses, academic editing services.
- Tier 5 — Sponsorship: Research integrity software companies (iThenticate/Turnitin, Proofig, Imagetwin), academic publishers wanting to signal integrity commitment.
- Projected month-1 revenue: $50-150 (early donations from Twitter/academic community)
- Projected month-6 revenue: $800-2,000 (200+ newsletter subscribers at ~10% premium conversion + growing donation base + first sponsorship conversations)
- Projected month-12 revenue: $3,000-8,000 (1,000+ newsletter, institutional interest, regular sponsors)
Growth Mechanics
- SEO: Target long-tail keywords: “is [journal name] trustworthy”, “retracted papers in [field]”, “paper mill journals list 2026”, “how to check if a paper is retracted”. These queries have almost zero quality competition.
- Social/viral: Retraction stories go VIRAL on Twitter/X and Reddit. Academic Twitter loves drama. Each weekly report has shareable chart images with provocative stats (“This week 312 dead papers were cited by the living”).
- Newsletter capture: Free weekly summary email, premium tier for deep analysis.
- Community: Enable comments/discussion on each retraction (moderated). Build a community of integrity sleuths.
- Partnerships: Cross-promote with Retraction Watch (complementary, not competitive — we do data viz, they do investigative journalism). Guest appearances on academic podcasts.
- Tools: Free “Zombie Check” — paste a DOI, check if it’s retracted. Free “Bibliography Scan” — upload a reference list, flag retracted papers. These tools drive organic traffic and newsletter signups.
- Telegram channel: Weekly retraction roundup with star tips enabled.
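The Zombie Check tool's core decision can be sketched against Crossref's `filter=updates:<doi>` query, which returns the notices (corrections, errata, retractions) that update a given DOI. The minimal item shape below is an assumption about the fields we'd read from the parsed response, not the full Crossref schema:

```typescript
interface CrossrefUpdateItem {
  DOI: string;                                    // DOI of the notice itself
  "update-to"?: { DOI: string; type: string }[];  // what the notice updates
}

/** True if any notice retracting `doi` appears among the update items. */
function isRetracted(doi: string, items: CrossrefUpdateItem[]): boolean {
  const target = doi.trim().toLowerCase();
  return items.some(item =>
    (item["update-to"] ?? []).some(u =>
      u.DOI.toLowerCase() === target &&
      u.type.toLowerCase().includes("retraction")));
}
```

The Bibliography Scan tool is then just this check mapped over every DOI extracted from an uploaded reference list, with results cached so repeat lookups don't touch the API.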
Channel Soul & Character
Name: Dead On Arrival (D.O.A.)
Mascot: A cartoon forensic pathologist owl wearing a lab coat and holding a magnifying glass over a paper — “Dr. Hoot.” Wide eyes. Always slightly annoyed at the state of science. Logo: an owl silhouette over a retraction stamp.
Visual Identity:
- Color palette: Deep navy (#1a1a2e), forensic teal (#16213e), warning amber (#e94560), clean white
- Typography: Fira Code for data/stats (monospace = forensic report feel), Inter for body
- Design language: “Forensic report meets science magazine” — clean, data-dense but beautiful
- Every page has the retraction stamp watermark element subtly in the background
Voice: A weary but sharp forensic scientist who’s seen too many bodies. Dry humor, dark puns (“today’s body count”), but dead serious about the implications. Think a noir detective who reads PubMed instead of case files. Occasionally exasperated (“how is this paper STILL being cited?!”), always backed by data. Not preachy — amused and appalled in equal measure.
Opinion/Stance:
- Paper mills are organized crime and should be treated as such
- Publishers who profit from retracted papers without refunding APCs are complicit
- The “zombie paper” crisis is a bigger threat to science than most people realize
- Transparency > punishment — the system needs reform, not just finger-pointing
- AI-generated fraud is the next wave, and nobody’s ready
Running Segments:
- 🧟 Zombie of the Week — The retracted paper that gained the most new citations
- 🏭 Mill Report — Paper mill cluster analysis with network graphs
- 🪦 Fresh Graves — This week’s notable new retractions
- 📊 The Numbers — Weekly stats dashboard
- 🔥 Hot Take — Dr. Hoot’s editorial opinion on a retraction story
- 🛡️ Integrity Win — Highlighting good retraction practices and whistleblowers
Launch Complexity: 3/5 (APIs are free and well-documented; data processing is straightforward; D3.js charts require upfront design work but are templatable; main challenge is making the AI analysis genuinely insightful rather than generic)
Content Quality Score: 5/5 (Unique data nobody else visualizes. Crossref + OpenAlex = authoritative sources. “Zombie paper” tracking is genuinely novel. The sample content above demonstrates real utility — researchers NEED this)
Automation Score: 4/5 (Daily data collection is fully automated. AI writing needs good prompts and quality checks to avoid hallucinating paper details. Chart generation is deterministic. Newsletter dispatch is automated. Human review recommended for first month, then can go hands-off)
Revenue Potential: 5/5 (Academic/institutional market has real money. Research integrity is a $100M+ industry. Even small penetration = significant revenue. Newsletter premium tier is high-value for researchers. Sponsorship from integrity tool companies is natural fit)
Total: 17/20
Why This Will Work:
The psychology is perfect: researchers are AFRAID of citing retracted papers — it’s career-damaging. Right now, checking is manual and painful. A beautiful, automated, weekly digest that tells you “here’s what died this week, here’s what’s still being cited from the grave” solves a real anxiety. The “zombie paper” angle is emotionally compelling — it’s horror storytelling meets data journalism. Academic Twitter amplifies retraction stories reliably. The tools (Zombie Check, Bibliography Scan) create organic traffic loops. And the institutional market (universities, publishers, funding agencies) provides a real revenue ceiling that most content sites don’t have.
The timing is impeccable: Crossref integrated Retraction Watch data into their API in January 2025. OpenAlex added retraction flags. For the first time ever, you can build fully automated retraction analysis at scale. The data infrastructure literally didn’t exist 18 months ago.
Risk & Mitigation:
- Retraction Watch sees us as competition — Mitigation: We’re complementary (they do investigative journalism, we do automated data analysis). Reach out early, propose cross-linking.
- AI analysis could hallucinate paper details — Mitigation: All claims must cite specific DOIs/data from APIs. Build verification layer that cross-checks AI output against source data.
- Legal risk from naming fraudsters — Mitigation: Only report what’s already public record (retractions are official publisher actions). Link to primary sources. No original accusations.
- API rate limits could throttle data collection — Mitigation: Crossref and OpenAlex are generous (100K+ requests/day with polite pool). Cache aggressively. Incremental updates.
- Academic audience is small — Mitigation: The “zombie paper” and “paper mill exposé” angles appeal to general science-curious audience too. Science fraud stories regularly hit r/all with 50K+ upvotes.
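The hallucination-mitigation verification layer can start as something very small: reject any AI-written copy that mentions a DOI absent from that week's source dataset. A sketch; the regex is a pragmatic approximation of common DOI forms, not the full Crossref pattern:

```typescript
// Matches strings like 10.1234/abc-def; stops at whitespace and
// common closing punctuation. Approximate by design.
const DOI_RE = /10\.\d{4,9}\/[^\s"'<>)\]]+/g;

/** DOIs mentioned in `aiText` that do not appear in the source data. */
function unverifiedDois(aiText: string, knownDois: Set<string>): string[] {
  const known = new Set([...knownDois].map(d => d.toLowerCase()));
  const found = aiText.match(DOI_RE) ?? [];
  return [...new Set(found.map(d => d.toLowerCase()))]
    .filter(d => !known.has(d));
}
```

A non-empty return value blocks publication and routes the draft to human review, which keeps the "every claim has a receipt" promise enforceable in CI rather than aspirational.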