Capcom’s long-awaited Resident Evil: Requiem has become the latest, clearest example of a new and rapidly worsening problem in games media and e-commerce: AI-generated content presented as legitimate criticism and guidance. What started as a controversy over an AI-written review that briefly appeared on Metacritic has metastasized into a flood of cheaply produced “official” strategy guides on large marketplaces, many of them containing fabricated screenshots, AI-created covers, shallow text, and outright plagiarism. The result is an information market that misleads players, dilutes genuine editorial work, and rewards volume-driven, SEO-first bad actors over real creators.
Background / Overview
Resident Evil: Requiem shipped in late February 2026 with major publisher support and broad technical requirements targeted at modern Windows 11 PCs. Capcom’s launch and technical notes set expectations for reviewers and players alike, but industry reaction was complicated by two coincident trends: the abrupt reduction of staff at some legacy games outlets and the proliferation of generative-AI outputs masquerading as human work. The combination produced an incident that forced Metacritic to remove an AI-written review and sever a partnership with the publication that posted it — a turn of events reported by multiple outlets.
At the same time, consumer storefronts — especially Amazon’s huge ecosystem of third‑party publishers and print‑on‑demand services — rapidly populated with dozens of low‑quality, AI‑produced game guides: short PDFs and print-on-demand books with pixel-scraped gameplay images, AI‑generated art covers, and boilerplate text that fails to reflect the lived experience of playing the game. Independent reporting and community investigations have documented examples across multiple upcoming and newly released titles, showing this is not a one-off problem limited to a single franchise.
This article lays out what happened, why it matters, how these systems work in practice, and what players, publishers, and platform operators should do next.
The immediate flashpoint: an AI review that shouldn’t have counted
What happened
A review of Resident Evil: Requiem published on VideoGamer — a long‑running UK site — was identified as being AI‑generated. That review briefly appeared on Metacritic and contributed to the aggregate score before being pulled; Metacritic later removed the review and indicated it would investigate the publisher relationship. Multiple reporting outlets documented both the review’s automated origins and the broader staffing changes at the publication that preceded the incident.
Why this matters: review aggregators like Metacritic are used by readers, buyers, and even companies as shorthand for critical consensus. When an AI‑generated piece — especially from a site that lacked access to a review code or direct playtime — is absorbed into an aggregator, it creates the false impression that a real critic played and evaluated the game. That undermines the aggregator’s purpose and damages the trust economy that publishers, platforms, and creators rely on.
Two independent confirmations
Two reputable outlets quickly covered the episode and verified its core claims: Kotaku called out the AI‑generated review and the apparent staffing changes that enabled it, and PC Gamer followed up with an independent account explaining how an AI review briefly made it onto Metacritic before removal. Those separate confirmations matter: this was not a rumor in a forum thread — it was a verifiable failure in editorial process and aggregator controls.
The parallel problem: AI "game guides" flooding marketplaces
Anatomy of the fake guide
E-commerce listings labeled as “official” or “complete” game guides are appearing for both released and unreleased titles. They typically share a pattern:
- AI‑generated or AI‑composited cover art that looks superficially plausible but is visually inconsistent or uncanny.
- A small page count (often under 200 pages) filled with shallow, generic walkthrough prose, reused bits of publicly available text, or chatbot-regurgitated synthesis of pre-release PR and early previews.
- Screenshots that are either AI‑generated fakes, low‑resolution scraped images, or outright copied images from independent walkthrough sites.
- Misstated or generic technical details (e.g., listing difficulty options or save systems based on franchise history rather than the actual game) or sections that end mid‑scenario — evidence the “author” never played the full game.
- Publication dates that sometimes precede the game’s release or the lifting of embargoes, implying no author had full access.
Game pages cited by community reporting include not only Resident Evil: Requiem but also titles not yet released or with no official guide, such as Pragmata and various indie releases — a signal that this is a platform‑level economics problem, not a Capcom-only quirk.
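The red flags above lend themselves to simple automated screening. As a hypothetical illustration (the field names, thresholds, and example dates are assumptions for this sketch, not any storefront's actual detection rules), a rule-based scorer might look like:

```python
# Hypothetical rule-based screen for suspicious "guide" listings.
# Field names, thresholds, and dates are illustrative assumptions,
# not any storefront's actual detection logic.
from datetime import date

KEYWORD_SPAM = {"official", "complete", "ultimate"}

def red_flag_count(listing: dict) -> int:
    """Count how many of the common red flags a listing trips."""
    flags = 0
    # 1. Suspiciously thin for a "complete" strategy guide.
    if listing.get("page_count", 0) < 200:
        flags += 1
    # 2. Keyword-spam title ("OFFICIAL", "COMPLETE", ...).
    title_words = set(listing.get("title", "").lower().split())
    if title_words & KEYWORD_SPAM:
        flags += 1
    # 3. Published before the game itself was released.
    if listing.get("published") and listing.get("game_release"):
        if listing["published"] < listing["game_release"]:
            flags += 1
    # 4. Pseudonymous publisher account with no track record.
    if listing.get("publisher_listing_count", 0) <= 1:
        flags += 1
    return flags

suspect = {
    "title": "OFFICIAL COMPLETE Resident Evil Requiem Guide",
    "page_count": 84,
    "published": date(2026, 2, 10),       # illustrative dates
    "game_release": date(2026, 2, 27),
    "publisher_listing_count": 1,
}
print(red_flag_count(suspect))  # trips all four checks: 4
```

No single signal is conclusive (plenty of legitimate books have short page counts), which is why a scorer like this would only be useful for routing listings to human review rather than for automatic removal.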
Cases and community reporting
Independent outlets and community threads have cataloged multiple examples. Gamereactor flagged a cluster of Pragmata guides that appear to be AI‑produced, and long Reddit threads show players spotting dozens of bogus guides for recent and upcoming releases. Players have also called out poor artwork, mismatched screenshots, and reprinted sections lifted from smaller walkthrough sites. The pattern is large enough to be observed across different game communities and storefront searches.
Why marketplaces are so vulnerable
Marketplace incentives: low friction + SEO rewards
Amazon’s massive third‑party publishing ecosystem and print‑on‑demand capabilities make it trivially easy to publish a paperback or digital PDF under a pseudonym. For low-cost sellers, the marginal cost of flooding product pages is little more than time and a few cents of hosting. That’s amplified by a few web realities:
- SEO-first titles and product descriptions designed to capture search traffic (e.g., "COMPLETE GAME GUIDE," “WALKTHROUGH,” “OFFICIAL”) often rank well for spikes in searches around big releases.
- Amazon’s ranking systems and search relevancy favor conversion and historical sales signals more than editorial authenticity. A cheaply priced guide that converts clicks into purchases will surface quickly.
- Print‑on‑demand services remove financial risk for bad actors — there’s no warehouse of unsold books to worry about. If a copy sells, the seller makes profit; if not, the financial exposure is minimal.
At the same time, Amazon has itself embraced generative AI features on the storefront, like AI Shopping Guides, which show how AI can be productized inside commerce. When the platform blurs the line between human-curated and machine-generated content, it becomes much harder for consumers to distinguish legitimacy at a glance.
The collapse of editorial gatekeeping
Traditional editorial processes — dedicated reviewers, proofreaders, art directors — impose a time and labor cost that made large-scale, low-quality content projects uneconomical. But when publications downsize or pivot to automation, that gatekeeping disappears. We saw its downstream effects in the VideoGamer/Metacritic incident, and the same dynamic now applies to “publishing” on marketplaces: there is virtually no barrier to putting AI output into book form and slotting it into a relevant product category.
The real harms
Consumers and parents
For everyday players the harm ranges from minor annoyance to real financial waste. Buyers expecting practical, accurate help will receive shallow, sometimes incorrect guidance. The problem is particularly worrying for younger players and families; kids whose guardians buy a “help” book expecting safe, accurate walkthroughs can be misled by fake images and content designed to look authoritative. Community reporting identified multiple questionable guides for games popular with younger audiences — a notable social harm.
Independent creators and small sites
Small walkthrough sites and dedicated strategy writers lose out twice: their work is scraped, repackaged, and sold without permission, and the resulting listings siphon off the traffic and income that work would otherwise earn. In some documented cases, entire walkthrough sections and screenshots were copied from indie sites and republished in these “guides” without attribution. That is a direct economic injury and an erosion of the incentive to create careful, original guides.
Publishers and platform trust
For publishers, the emergence of fraudulent guides and automated reviews raises reputational risks. When fake covers and fake screenshots circulate, they can confuse marketing, dilute messaging, and create a poor first impression for potential buyers. Aggregators and storefronts that display or monetize that content risk losing consumer trust — and once trust erodes, it’s extremely hard to rebuild.
How these fake guides are constructed (a technical breakdown)
- Source aggregation: crawlers and scrapers pull together public info — trailers, PR blurbs, early previews, and user comments.
- Prompted generation: an LLM is given a prompt to produce a "complete walkthrough" or "official guide" and is fed scraped text to synthesize.
- Image generation or compositing: generative image models produce cover art and fake screenshots; alternatively, automated compositing blends real screenshots with AI art to avoid obvious duplication flags.
- Formatting and packaging: the output is run through basic layout templates, converted to PDF or print-ready formats, and uploaded to print-on-demand platforms under pseudonymous publisher accounts.
- SEO tuning: title, subtitle, and keyword spam are optimized to capture search traffic on release spikes.
This process is cheap, automated, and scalable. It’s optimized not for buyer satisfaction but for capturing first-moment search traffic and converting a small fraction of viewers into buyers.
Enforcement, legality, and platform responsibility
Copyright and scraped content
When these guides reproduce screenshots, walkthrough text, or images from other sites without permission, they create clear copyright issues. But enforcement is expensive and reactive: indie creators must file takedown notices, and platforms must be incentivized to act quickly. The speed of the AI‑guide churn — new listings popping up faster than takedown notices can be processed — means copyright enforcement alone is insufficient.
Marketplace policies and detection
Platforms have three levers:
- Preventive measures: strengthen onboarding for publishers, require identity verification for print-on-demand sellers who publish in high‑sensitivity categories, and implement review thresholds before allowing content to be listed as “Official Guide.”
- Reactive measures: faster DMCA and IP takedown workflows, automatic detection for mass‑generated content, and manual audits for high‑traffic titles around big releases.
- Economic disincentives: reduce visibility for simplistic “SEO dumps” and prioritize established, verified publishers or publisher-supplied official guides.
Platforms can and should do more — but commercial incentives make meaningful, proactive policing unlikely unless regulators or major publishers push for it.
Why the industry must care now
AI‑assisted content creation will only get better and cheaper. If the current pattern continues, we risk normalizing a content ecosystem where:
- Low‑effort AI outputs dominate search results and storefront categories.
- Genuine creators are squeezed out economically and their work is repurposed without consent.
- Consumers — especially young ones — are increasingly at risk of misinformation packaged as a purchasable product.
We already saw the consequences when editorial teams were reduced or replaced and AI was used to produce critical content. The marketplace phenomenon is simply the commerce mirror of that failure: if you can automate a review, you can also automate a “book.” The effect is to flatten nuance, reduce diversity of opinion, and make discovery a race for the cheapest, most clickable output.
Practical steps forward — what to do now
For players and buyers
- Inspect listings carefully: check page count, look for publisher identity, and read early reviews. Be suspicious of claims like “official” or “complete” when they appear from unknown publishers.
- Prefer official or publisher‑endorsed guides: when available, official guides are produced under license and go through editorial and legal review.
- Use community verification: rely on trusted reviewers, forums, and community guides rather than single retail listings for deep assistance.
For small creators and independent sites
- Use trusted watermarks on screenshots and metadata to make screenshot scraping more detectable.
- Register works with DMCA‑ready processes and prepare a simple takedown template for repeated misuse.
- Consider offering official, small-scale downloadable guides through your own site to undercut low-quality alternatives.
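To make the "prepare a simple takedown template" step concrete, here is a minimal sketch of a notice generator. The structure follows the standard elements a DMCA notice is expected to contain (identification of the work, the infringing material, a good-faith statement, and a signature), but the exact wording and all URLs are illustrative, not legal advice.

```python
# Minimal DMCA-style takedown notice generator for scraped guide content.
# The elements below mirror the standard requirements of a takedown
# notice; the wording is illustrative and not legal advice.
from textwrap import dedent

def takedown_notice(original_url: str, infringing_url: str,
                    owner: str, contact: str) -> str:
    """Fill a reusable takedown template with the details of one case."""
    return dedent(f"""\
        To whom it may concern,

        I am the copyright owner of the work published at:
        {original_url}

        The listing at the following address reproduces that work
        without authorization, and I request its removal:
        {infringing_url}

        I have a good-faith belief that this use is not authorized by
        the copyright owner, its agent, or the law. Under penalty of
        perjury, the information in this notice is accurate and I am
        authorized to act on behalf of the owner.

        /s/ {owner}
        Contact: {contact}
        """)

notice = takedown_notice(
    "https://example.com/walkthrough/chapter-3",   # hypothetical URLs
    "https://example.com/listing/12345",
    "Jane Doe",
    "jane@example.com",
)
```

Having the template ready matters because, as noted above, new listings appear faster than notices can be processed; turning each new sighting into a filed notice should take minutes, not hours.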
For publishers and game companies
- Supply an official factsheet or a publisher‑verified guide for major releases, even a small free PDF FAQ, to reduce demand for low-cost third‑party substitutes.
- Work with marketplaces to institute verification badges (publisher‑verified guide) for licensed physical guides.
For platforms (Amazon and others)
- Implement higher onboarding standards for “guide” categories at the point of publishing: require proof of play, publisher verification, or a period of review before a product is allowed to use "official" or "complete" branding.
- Improve detection of mass-generated content by combining heuristics (short page counts, repeated template text, cover art anomalies) with human review for high‑impact titles.
- Offer a visible verification label for officially licensed guides that shows the publisher or IP owner has reviewed the product.
The broader ethical and economic question
There is a larger debate here that extends beyond bad books and bad reviews: what is the role of generative AI in creative economies, and how do we prevent it from becoming a vector for displacing human expertise? When outlets replace human reviewers with algorithmic processes, or when marketplaces accept machine-produced “help” as equivalent to expert-created guides, the social contract between creators and consumers breaks down.
Generative AI can and should be a tool: for idea generation, for assisting creators, and for lowering the costs of certain creative tasks. But when it becomes a substitute for verification, labor, and editorial judgment — and when platforms reward volume over quality — it becomes a mechanism for extraction rather than creation.
Conclusion
The Resident Evil: Requiem episode — an AI-written review briefly counted in an aggregator and a wave of AI-generated, productized “game guides” on open marketplaces — is the most visible symptom of a deeper problem. The incentives that drive marketplace visibility, combined with weak editorial gatekeeping and the near-zero marginal cost of publishing AI output, have produced a fertile environment for scams, misattribution, and consumer confusion.
Solving this won’t be easy, but it is tractable: platforms must change incentives, publishers must supply official alternatives and verification, and consumers and communities must remain skeptical and informed. If we fail to act, the steady commoditization of culture into cheap, SEO‑optimized artifacts will continue — and with it, the erosion of the hard, careful work that makes reviews, guides, and creative writing meaningful in the first place.
Source: FRVR