Foundations

The 60% overlap: SEO work that already improves your AI visibility

If you've been doing serious SEO for the last five years, you've already done about 60% of the work for AI search visibility — you just haven't measured the second result. That number isn't optimistic positioning. It's a structural reality of how modern AI engines retrieve and rank content. This is the practical breakdown of what SEO work doubles as GEO, what doesn't, and how to audit your current programme for the dual-purpose wins you're already producing.

By Gareth Hoyle · Read time: 11 min
TL;DR

Seven SEO disciplines produce AI search visibility almost as a side effect: technical accessibility, schema markup, comparison content, editorial backlinks, content quality and E-E-A-T, internal linking, and site speed. The four areas where SEO is genuinely insufficient: Wikipedia and entity work, Reddit and community presence, sentiment frame management, and live-retrieval-specific optimisation. Mature SEO programmes are typically 60% of the way to strong AI visibility before they consciously start. The gap to close is smaller than most teams assume.

Why this matters now

Most marketing leaders we talk to assume GEO is an entirely new discipline requiring an entirely new programme. They imagine ripping up the SEO budget and reallocating it. They imagine learning new tools, hiring new people, briefing new agencies.

That's mostly wrong. The infrastructure of how AI engines retrieve, rank, and synthesise content shares an enormous amount with how Google does it. Both are crawler-based. Both reward authoritative content. Both punish thin content. Both look for structured data. Both factor in domain authority, link signals, content quality, and topical depth.

The implication: if your SEO programme is mature, you've been quietly building AI visibility for years without knowing it. The work needed to make that visible — and to close the genuinely-distinct GEO gaps — is much smaller than starting from scratch.

Here's the breakdown, ranked by leverage.

The seven SEO disciplines that produce AI visibility as a side effect

Discipline 01

Technical accessibility (the floor)

AI engines crawl the web the same way Google does. They need to fetch your pages successfully, parse the HTML, and extract content. The technical foundation that lets Googlebot reach and read your site is the same foundation that lets GPTBot, ClaudeBot, PerplexityBot, and Google-Extended do their work.

Specifically: server response time under 600ms, working server-side rendering for content (or at least content visible in initial HTML), no aggressive bot-blocking that catches AI crawlers in its net, working internal redirects, no broken links cascading into 404 storms.

If your SEO team has fixed these things, AI engines benefit identically. The only AI-specific addition: explicitly check that you haven't accidentally blocked the AI crawlers in robots.txt. Some hosting providers and CDNs include AI bots in default block lists. Check yours. Five minutes of audit, often a substantial gain.
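That five-minute audit can be scripted. Here's a minimal sketch using Python's standard-library `urllib.robotparser` to flag which AI crawlers a given robots.txt blocks — the bot list is representative rather than exhaustive, and the sample file is invented for illustration:

```python
from urllib.robotparser import RobotFileParser

# AI crawlers worth checking explicitly (a representative set, not exhaustive)
AI_BOTS = ["GPTBot", "ClaudeBot", "PerplexityBot", "Google-Extended", "CCBot"]

def blocked_ai_bots(robots_txt: str, path: str = "/") -> list[str]:
    """Return the AI user agents that this robots.txt blocks for `path`."""
    parser = RobotFileParser()
    parser.parse(robots_txt.splitlines())
    return [bot for bot in AI_BOTS if not parser.can_fetch(bot, path)]

# Illustrative example: a CDN default that blocks GPTBot site-wide
sample = """\
User-agent: GPTBot
Disallow: /

User-agent: *
Disallow: /admin/
"""
print(blocked_ai_bots(sample))  # prints ['GPTBot']
```

Point it at the live contents of yourdomain.com/robots.txt and any non-empty result is your cheapest fix of the quarter.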

Double-dip ratio: Near 100%. Almost no work specific to GEO needed if SEO has done it.

Discipline 02

Schema markup

Structured data tells search engines and AI engines exactly what your pages are about — Organization, Product, Service, FAQPage, HowTo, Article, BreadcrumbList. The marginal cost of adding schema is small. The benefit applies to both Google rich results and AI extractability simultaneously.

The patterns matter slightly differently across the two. SEO schema optimises for SERP features (FAQ accordions, HowTo cards, Product carousels). AI schema optimises for facts being machine-extractable when an engine retrieves the page for a buyer query. The schema markup itself is identical in both cases. The strategic emphasis shifts.

Practical check: Run your domain through Google's Rich Results Test on three priority page types — homepage, product/service page, and any comparison or FAQ page. If you have valid schema for the page types Google rewards, you almost certainly have schema that helps AI engines too.
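For illustration, here's the kind of minimal FAQPage fragment that serves both channels at once — the question and answer copy are invented placeholders, not a template to paste verbatim:

```json
{
  "@context": "https://schema.org",
  "@type": "FAQPage",
  "mainEntity": [
    {
      "@type": "Question",
      "name": "What is the difference between SEO and GEO?",
      "acceptedAnswer": {
        "@type": "Answer",
        "text": "SEO optimises for ranked lists of links; GEO optimises for being retrieved and cited in AI-generated answers. The underlying signals overlap heavily."
      }
    }
  ]
}
```

The same JSON-LD block feeds Google's FAQ rich results and gives an AI engine a clean question-answer pair to extract.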

Double-dip ratio: ~95%. The schema you've already invested in does the second job at near-zero marginal cost.

Discipline 03

Comparison content (/vs/ and /alternatives/ pages)

If you've built comparison pages targeting "X vs Y" or "alternatives to Z" queries, you're serving both channels at once. SEO rewards them because they target specific commercial-intent keywords with reasonable conversion potential. AI engines retrieve them disproportionately when synthesising buyer-decision answers.

This is one of the highest-leverage page types we see in audits. A single well-written comparison page typically:

  • Ranks within 60-90 days for at least one direct comparison query
  • Gets cited by AI engines within weeks for related decision-stage prompts
  • Provides the framing AI uses to describe your brand vs competitors (so writing it on your terms matters)

If your SEO team has built these pages well, the AI visibility benefit is automatic. If they haven't, this is the single most concentrated investment that improves both channels simultaneously.

Double-dip ratio: ~85%. Some structural tweaks help AI extractability beyond what pure SEO needs (more about that in the gap section), but the bulk of the work counts twice.

Discipline 04

Editorial backlinks (Digital PR)

Backlinks from authoritative editorial sources do double duty. For SEO they pass link equity and improve domain authority. For GEO they place mentions of your brand into the corpus AI engines train on and retrieve from. The same Forbes article that lifts your DR by half a point also gets your brand into the next ChatGPT training snapshot, into Perplexity's live retrieval, and into Google's AI Overviews citation pool.

The volume threshold differs slightly. SEO benefits significantly from 10–20 high-authority backlinks per year. AI training-data influence usually requires sustained 30–60+ per year for several years to meaningfully shift category framing. So the SEO programme that produces baseline link equity is partway, but not all the way, to optimal AI visibility.

The good news: scaling Digital PR up from 20/year to 50/year is much cheaper per placement than starting from zero, because you've already built the relationships, the pitch templates, and the editorial calendar.

Double-dip ratio: ~75%. The work counts twice, but the volume requirement for full GEO benefit is higher.

Discipline 05

Content quality and E-E-A-T signals

Google's E-E-A-T framework — Experience, Expertise, Authoritativeness, Trustworthiness — has become central to how Google evaluates content quality over the last five years. AI engines have adopted essentially identical signals as quality filters when deciding which content makes it into training data and which gets cited at retrieval time.

Author bios, citation of sources, factual density, original analysis, named expert quotes, dates of publication and revision — these all signal to Google that content is high-quality. They signal to AI engines that the content is reliable enough to use as a source. The signals are so similar that we frequently see SEO teams wondering whether a given content investment helped GEO; the answer is almost always yes, by the same mechanism.

Where the disciplines diverge slightly: AI engines are noticeably stricter about content that "feels" AI-generated. Google penalises low-quality AI content; AI engines actively downweight content that pattern-matches to model output. So the floor for content quality has risen since 2023, and SEO teams who've been pushing back on lazy AI-generated content have a head start on a problem GEO teams are about to face.

Double-dip ratio: ~80%. Same signals, slightly stricter filtering for AI.

Discipline 06

Internal linking and topic clusters

Topic clusters — a hub page surrounded by related supporting pages, all interlinked — are an SEO play. They concentrate authority into a single hub page on a target topic, with the supporting pages funnelling relevance signals upward.

The same architecture helps AI engines understand your topical authority. When a model retrieves content for a query, it weighs not just the page itself but the topical context around it. A page on "agile project management" sitting in a well-built cluster of 15 supporting pages on related concepts looks much more authoritative than the same page sitting alone. Both Google and the AI engines respond to this difference, in similar ways.

Double-dip ratio: ~90%. Topic clusters built for SEO purposes naturally produce the topical authority signals AI engines weight heavily.

Discipline 07

Site speed and Core Web Vitals

Page speed matters for SEO because it's a ranking factor and a user-experience signal. It matters for AI because slow pages have lower crawl budgets, get re-fetched less often by AI crawlers, and are more likely to time out during retrieval.

The fixes are identical: image optimisation, caching, CDN deployment, eliminating render-blocking resources, reducing JavaScript bundle sizes, improving server response times. Every gain made for Google's Core Web Vitals translates directly to better AI crawl behaviour.

Double-dip ratio: Near 100%. Same improvements, same benefit on both sides.

The 40% gap — where SEO genuinely isn't enough

Now the honest part. There are four areas where SEO investment doesn't substitute for GEO-specific work, no matter how mature the programme.

1. Wikipedia and entity work

SEO has never paid much attention to Wikipedia. Its links are nofollow, so they pass no equity, and most articles don't directly drive traffic. SEO teams routinely deprioritise it.

For AI visibility, Wikipedia is the single highest-leverage move available. Wikipedia is dramatically over-represented in LLM training data. A well-cited, well-maintained Wikipedia article makes your brand a stable, confident reference point in the AI's category knowledge. A missing or stub article makes you a fuzzy entity the AI is uncertain about.

This is the single area where SEO investment doesn't help GEO at all. Wikipedia work is its own discipline, with its own rules (notability standards, conflict-of-interest restrictions on self-editing, citation expectations). Treat it as a separate workstream.

2. Reddit and community presence

SEO has historically ignored Reddit because the links are nofollow and the platform is hard to control. AI engines train heavily on Reddit and weight it as a quality source for category recommendations.

The work is genuinely different from SEO too. It's not about creating content; it's about real participation. Senior employees active in relevant subreddits, answering technical questions about your product, building reputation. Astroturfed Reddit presence is detectable and counterproductive — it backfires when discovered, which it usually is.

Mature SEO programmes that haven't built community presence will need to do this work specifically for GEO. There is no SEO substitute.

3. Sentiment frame management

SEO doesn't have a sentiment dimension — your URL either ranks or it doesn't. AI search does. Your brand can be mentioned negatively or framed unfavourably, and that's worse than not being mentioned at all.

Managing the framing AI engines apply to your brand requires monitoring how they describe you qualitatively, identifying the source content driving any unfavourable framing, and producing enough new content with the correct framing to shift the consensus over time. This discipline is genuinely new — it doesn't map to anything SEO teams traditionally do.

4. Live retrieval optimisation specifics

Some retrieval-specific patterns that SEO doesn't necessarily reward help AI engines extract content cleanly. Specifically: putting factual claims at the start of paragraphs (not buried after setup), using clear "X is Y" definitional sentences, structuring FAQs with the question phrased the way buyers actually ask it, ensuring server-side rendering for content (not just SEO-critical metadata).

An SEO team focused on dwell time and engagement might write differently — leading with hooks, using cliffhanger paragraphs, structuring for read-through. That style works for SEO. It works less well for AI extraction, which prefers content that delivers the answer immediately.

The fix is structural awareness, not new infrastructure. SEO teams aware of this tension can adjust content patterns to serve both — leading with the answer, then expanding for engagement. The conscious effort to do this is the GEO-specific work.
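To make the tension concrete, here's an invented before/after — "Acme PM" and the bracketed claims are placeholders, not real copy:

```
Hook-first (SEO engagement style):
  "Choosing a project management tool in 2025 feels harder than ever.
   With dozens of options and bold claims on every homepage, where do
   you even start? The answer depends on one underrated factor..."

Answer-first (AI-extractable, still readable):
  "Acme PM is a project management tool for distributed teams of
   10 to 200 people. It differs from generic alternatives in three
   ways: [claim 1], [claim 2], [claim 3]. Here's how to weigh those
   differences for your team..."
```

The second version gives a retrieval engine a complete, quotable claim in the opening sentence, then earns the read-through afterwards.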

How to audit your current SEO for the GEO double-dip

If you have a mature SEO programme and want to know how much of it is already producing AI visibility, here's a structured audit you can run in a day:

SEO → GEO audit

Seven checks, one day

  • Check 1 — robots.txt (5 min). Open yourdomain.com/robots.txt. Verify GPTBot, ClaudeBot, PerplexityBot, Google-Extended, and CCBot are NOT blocked. An accidental block here is the cheapest fix on this list.
  • Check 2 — Schema audit (30 min). Test homepage, top 5 product/service pages, and your FAQ page on Google's Rich Results Test. Note where schema is missing or invalid. Same fixes serve both channels.
  • Check 3 — Comparison content inventory (60 min). List your /vs/ and /alternatives/ URLs. If you have fewer than 5, this is your highest-leverage content investment. If you have more than 10, audit them for AI-extractability (clear claims, comparison tables, decision-stage answer at the top).
  • Check 4 — Editorial coverage volume (90 min). Pull last 12 months' high-authority backlinks via your existing backlink tool. Count placements in mainstream press, trade publications, and reputable industry blogs. Threshold: 30+ per year sustained = AI training-data influence likely. Below that, scale up Digital PR.
  • Check 5 — Wikipedia and Wikidata presence (15 min). Search Wikipedia for your brand. No article: this is gap work, not SEO work. Stub article: same. Solid, well-cited article: good.
  • Check 6 — Reddit footprint (60 min). Search "site:reddit.com [your category] recommendations" plus 3-4 variations. Count mentions of your brand vs top 3 competitors. If your share is below 20% of competitor mentions, this is gap work.
  • Check 7 — AI mention spot check (30 min). Run 10 representative buyer queries through ChatGPT, Claude, and Perplexity. Note presence/absence and any framing patterns. This tells you the current outcome of the work above.

Total time: about 4-5 hours. Output: a clear inventory of what your SEO programme has already produced for GEO, where the genuine gaps are, and which ones to prioritise.

The strategic conclusion

Most marketing leaders evaluating GEO investment ask the wrong question. The question isn't "should we do GEO?" — it's "given what we already have from SEO, where are the specific gaps that need GEO-specific work?"

For mature programmes, the answer is usually some subset of: Wikipedia work, Reddit/community presence, sentiment monitoring, and a small amount of content restructuring for live retrieval. Total incremental investment: probably 20-40% of what the SEO programme already costs, focused on the four genuine gaps rather than reinventing what's already working.

For programmes with weak SEO foundations, the picture is different. Those teams should fix SEO first because the technical and content base is what 60% of GEO depends on. Trying to do GEO without that foundation is like trying to build a second floor on a house with weak foundations — it can be done, expensively, but the structural problems will keep showing up.

The best news of all: this is a category where the brands who've been investing seriously in SEO for years have a genuine head start. They've been building the substrate AI engines reward, even when nobody talked about AI engines. The work that compounds over time has been compounding the whole time.

Now you just measure the second result.

Measure both at once

Get a Search Visibility Audit.

We benchmark your SEO authority across four providers (Semrush, Ahrefs, Moz, DataForSEO) and your AI Share of Voice across five engines — in one report. Find out exactly where the SEO work is paying off in AI visibility, and where the genuine gaps are. From $997, in 48 hours.