Category: The Proof

Way 8 — Case Studies & Evidence. Results, measurables, evidence over projections.

  • The $0 SEO Value Problem: What Invisibility Actually Costs Restoration Contractors

    The $0 SEO Value Problem: What Invisibility Actually Costs Restoration Contractors

    There’s a restoration company in Tacoma, Washington called All American Restoration Services. Four and a half stars. Thirty-seven Google reviews. Full mitigation and rebuild capability. Locally owned, with the kind of reputation that takes years to earn.

    Their SpyFu profile shows six tracked keywords, zero estimated monthly clicks, and $0 in monthly SEO value. DataForSEO has no data on them at all — they don’t register.

    They are, from a search engine’s perspective, completely invisible.

    This is not unusual. It is, in fact, the default state for most restoration contractors in most markets. And the cost of that invisibility is not abstract.

    What $0 SEO Value Actually Means in Dollars

    SEO value — the metric SpyFu and similar tools report — is an estimate of what a site’s organic traffic would cost if purchased through Google Ads. A site with $31,000 in monthly SEO value is receiving traffic that would cost $31,000 per month to replicate with paid search.

    When that number is $0, it means the site is generating no measurable organic traffic for any keyword anyone is actually searching.
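    As a rough sketch of how the metric works (the keywords, click counts, and CPCs below are invented for illustration; SpyFu's actual model is more involved):

```python
# Toy model of "SEO value": what each keyword's organic clicks would cost
# if bought through paid search. All figures here are hypothetical.
keywords = [
    {"phrase": "water damage restoration tacoma", "monthly_clicks": 120, "est_cpc": 28.50},
    {"phrase": "mold remediation tacoma",         "monthly_clicks": 80,  "est_cpc": 15.00},
    {"phrase": "emergency flood cleanup tacoma",  "monthly_clicks": 40,  "est_cpc": 32.00},
]

# Sum of (clicks x cost-per-click) across every ranking keyword
seo_value = sum(k["monthly_clicks"] * k["est_cpc"] for k in keywords)
print(f"Estimated monthly SEO value: ${seo_value:,.0f}")
```

    A site with no measurable organic clicks sums to exactly $0, which is the situation described above.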

    In the restoration industry, the keywords people search are high-intent and high-value. Someone searching “water damage restoration Tacoma” is not browsing. They have standing water in their house. They are going to call someone in the next fifteen minutes. The average water damage restoration job runs $3,836. Significant losses start at $15,000. The searches that drive those calls are worth real money — and right now, those calls are going to someone else.

    The math is uncomfortable. If a restoration company’s invisibility costs them even five jobs per month — conservative for a market the size of Tacoma — that’s $19,000 to $75,000 in monthly revenue that’s routing to a competitor who ranked higher. Not because that competitor does better work. Because their website exists, from Google’s perspective, and yours doesn’t.
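    That range follows directly from the job values just cited; as a quick check using only the article's own numbers:

```python
# Revenue at risk, per the figures above: $3,836 average water damage job,
# $15,000 floor for significant losses, five missed jobs per month assumed.
avg_job, significant_loss = 3_836, 15_000
missed_jobs_per_month = 5

low = missed_jobs_per_month * avg_job           # about $19,000
high = missed_jobs_per_month * significant_loss  # $75,000
print(f"${low:,} to ${high:,} per month")
```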

    Why Good Restoration Companies End Up Invisible

    All American Restoration is not an anomaly. When you run DataForSEO and SpyFu against restoration contractors in most mid-size markets, the pattern repeats: strong reputation, strong reviews, zero search presence.

    It happens for a predictable set of reasons.

    Restoration companies grow on referrals. Insurance adjusters, plumbers, property managers — the first decade of a restoration business is built on relationships, not search. By the time the referral network matures, the business is busy enough that digital marketing feels optional. The website becomes a brochure, not an acquisition channel.

    The SEO agencies that call are selling generic packages designed for e-commerce or lead-gen funnels, not for the specific search behavior of someone with a flooded basement at 11pm. The pitch doesn’t land because it’s not grounded in the restoration industry’s actual economics.

    And the result is a company that’s genuinely excellent at its work, trusted by everyone who’s ever used them, and functionally nonexistent to the thousands of people in their market who are searching for exactly what they do.

    The Relative Improvement Problem

    Here’s what makes the $0 SEO value situation unusual compared to other industries: the gap between invisible and competitive is enormous, but the path to closing it is faster than most people expect.

    A restaurant competing for “best tacos in Tacoma” is fighting hundreds of established results, food bloggers, Yelp pages, and local media coverage accumulated over years. The field is crowded and the domain authority gap is steep.

    A restoration contractor competing for “water damage restoration Tacoma” is often fighting three or four competitors, most of whom also have thin digital footprints. The bar is low. Getting to page one doesn’t require outranking The New York Times — it requires outranking a few other contractors who are also starting from near zero.

    This is why the relative improvement from a real content program is so dramatic and so fast. Upper Restoration went from $0 to over $31,000 in monthly SEO value. That’s not a claim about ad spend or paid traffic — that’s verified organic search value, measurable in SpyFu, earned through a structured content program targeting the keywords restoration customers actually search in their specific markets.

    What Closing the Gap Looks Like

    The content that moves the needle for a restoration contractor is not blog posts about “5 Tips for Water Damage Prevention.” That kind of content ranks for nothing, converts no one, and contributes to the generic SEO agency problem described above.

    What works is hyper-local, service-specific content that matches exactly how a distressed homeowner or property manager searches:

    • Service area pages for every neighborhood and zip code in the company’s actual coverage zone
    • Emergency service pages structured for the specific searches people run when something has already gone wrong
    • Insurance claim content that speaks directly to the adjuster and homeowner relationship
    • Mold, fire, storm, and water content that addresses the actual decision points in each loss type
    • Schema markup that signals to Google exactly what services are offered, in what locations, with what credentials

    The volume matters too. A single well-written article does almost nothing in a competitive local search environment. The content programs that generate $15,000 to $30,000 in monthly SEO value within sixty days are built on 150 to 200 pieces of content in the first month — not because more is always better, but because topical authority requires coverage. Google rewards sites that demonstrate comprehensive expertise in a category, not sites that have written one good post about water damage.

    The SpyFu Dashboard Conversation

    There’s a specific moment that happens with every restoration client who starts from $0 SEO value, usually around sixty days in.

    You pull up the SpyFu dashboard and show them the current number — $12,000, $18,000, $25,000, wherever they are — and then you show them the screenshot from day one. The one that says $0.

    The conversation changes at that point. They’re no longer thinking about whether SEO works. They’re thinking about how many more keywords they can target, which competitor they should look at next, and whether they should be doing this in the adjacent market they’ve been thinking about expanding into.

    That’s the actual product. Not the content, not the rankings — the clarity. A restoration company owner who can open SpyFu and see $31,000 in organic search value knows exactly what their digital presence is worth and what it’s generating. The $0 problem isn’t just a marketing problem. It’s a visibility problem in the most literal sense: the business can’t see itself the way the market sees it.

    All American Restoration does excellent work. Their reviews say so. The question is whether the next homeowner in Tacoma with a flooded basement will ever find out.


    Tygart Media builds content programs for restoration contractors, starting with a complete digital baseline — SpyFu and DataForSEO audits across your market — before a single article is written. If your company shows $0 in SEO value, that’s not a criticism. It’s the starting line.

    {
      "@context": "https://schema.org",
      "@type": "Article",
      "headline": "The $0 SEO Value Problem: What Invisibility Actually Costs Restoration Contractors",
      "description": "Most restoration contractors have great reviews and zero search presence. Here is what that invisibility actually costs in missed calls, and how fast the gap closes.",
      "datePublished": "2026-04-02",
      "dateModified": "2026-04-03",
      "author": {
        "@type": "Person",
        "name": "Will Tygart",
        "url": "https://tygartmedia.com/about"
      },
      "publisher": {
        "@type": "Organization",
        "name": "Tygart Media",
        "url": "https://tygartmedia.com",
        "logo": {
          "@type": "ImageObject",
          "url": "https://tygartmedia.com/wp-content/uploads/tygart-media-logo.png"
        }
      },
      "mainEntityOfPage": {
        "@type": "WebPage",
        "@id": "https://tygartmedia.com/zero-seo-value-restoration-contractors/"
      }
    }

  • From 200+ Episodes to a Searchable AI Brain: How We Built an Intelligence Layer for a Consulting Empire

    From 200+ Episodes to a Searchable AI Brain: How We Built an Intelligence Layer for a Consulting Empire

    The Problem Nobody Talks About: 200+ Episodes of Expertise, Zero Searchability

    Here’s a scenario that plays out across every industry vertical: a consulting firm spends five years recording podcast episodes, livestreams, and training sessions. Hundreds of hours of hard-won expertise from a founder who’s been in the trenches for decades. The content exists. It’s published. People can watch it. But nobody — not the team, not the clients, not even the founder — can actually find the specific insight they need when they need it.

    That’s the situation we walked into six months ago with a client in a $250B service industry. A podcast-and-consulting operation with real authority — the kind of company where a single episode contains more actionable intelligence than most competitors’ entire content libraries. The problem wasn’t content quality. The problem was that the knowledge was trapped inside linear media formats, unsearchable, undiscoverable, and functionally invisible to the AI systems that are increasingly how people find answers.

    What We Actually Built: A Searchable AI Brain From Raw Content

    We didn’t build a chatbot. We didn’t slap a search bar on a podcast page. We built a full retrieval-augmented generation (RAG) system — an AI brain that ingests every piece of content the company produces, breaks it into semantically meaningful chunks, embeds each chunk as a high-dimensional vector, and makes the entire knowledge base queryable in natural language.

    The architecture runs entirely on Google Cloud Platform. Every transcript, every training module, every livestream recording gets processed through a pipeline that extracts metadata using Gemini, splits the content into overlapping chunks at sentence boundaries, generates 768-dimensional vector embeddings, and stores everything in a purpose-built database optimized for cosine similarity search.
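    A minimal sketch of the overlapping, sentence-boundary chunking step (the chunk size, overlap, and splitting regex here are illustrative; the production pipeline's parameters aren't disclosed in this article):

```python
import re

def chunk_sentences(text, chunk_size=3, overlap=1):
    """Split text at sentence boundaries into overlapping chunks."""
    sentences = [s.strip() for s in re.split(r"(?<=[.!?])\s+", text) if s.strip()]
    step = chunk_size - overlap
    chunks = []
    for i in range(0, len(sentences), step):
        window = sentences[i:i + chunk_size]
        chunks.append(" ".join(window))
        if i + chunk_size >= len(sentences):
            break
    return chunks
```

    Each chunk then receives its 768-dimensional embedding before being written to the vector store, so a query can match a passage even when the surrounding episode is about something else.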

    When someone asks a question — “What’s the best approach to commercial large loss sales?” or “How should adjusters handle supplement disputes?” — the system doesn’t just keyword-match. It understands the semantic meaning of the query, finds the most relevant chunks across the entire knowledge base, and synthesizes an answer grounded in the company’s own expertise. Every response cites its sources. Every answer traces back to a specific episode, timestamp, or training session.
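    Under the hood, that retrieval step is nearest-neighbor search by cosine similarity. A toy version with three-dimensional vectors (real embeddings are 768-dimensional, and the chunk IDs and numbers below are invented):

```python
import math

def cosine(a, b):
    """Cosine similarity: dot product over the product of vector norms."""
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b)))

# Invented chunk IDs and vectors, standing in for embedded transcript chunks.
chunks = {
    "ep042-large-loss-sales":   [0.9, 0.1, 0.2],
    "ep077-supplement-disputes": [0.1, 0.8, 0.3],
}
query = [0.85, 0.15, 0.25]  # pretend embedding of a large-loss sales question

ranked = sorted(chunks, key=lambda cid: cosine(query, chunks[cid]), reverse=True)
print(ranked[0])  # most semantically similar chunk, cited back to its source
```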

    The Numbers: From 171 Sources to 699 in Six Months

    When we first deployed the knowledge base, it contained 171 indexed sources — primarily podcast episodes that had been transcribed and processed. That alone was transformative. The founder could suddenly search across years of conversations and pull up exactly the right insight for a client call or a new piece of content.

    But the real inflection point came when we expanded the pipeline. We added course material — structured training content from programs the company sells. Then we ingested 79 StreamYard livestream transcripts in a single batch operation, processing all of them in under two hours. The knowledge base jumped to 699 sources with over 17,400 individually searchable chunks spanning 2,800+ topics.

    Here’s the growth trajectory:

    Phase              | Sources | Topics | Content Types
    Initial Deploy     | 171     | ~600   | Podcast episodes
    Course Integration | 620     | 2,054  | + Training modules
    StreamYard Batch   | 699     | 2,863  | + Livestream recordings

    Each new content type made the brain smarter — not just bigger, but more contextually rich. A query about sales objection handling might now pull from a podcast conversation, a training module, and a livestream Q&A, synthesizing perspectives that even the founder hadn’t connected.

    The Signal App: Making the Brain Usable

    A knowledge base without an interface is just a database. So we built Signal — a web application that sits on top of the RAG system and gives the team (and eventually clients) a way to interact with the intelligence layer.

    Signal isn’t ChatGPT with a custom prompt. It’s a purpose-built tool that understands the company’s domain, speaks the industry’s language, and returns answers grounded exclusively in the company’s own content. There are no hallucinations about things the company never said. There are no generic responses pulled from the open internet. Every answer comes from the proprietary knowledge base, and every answer shows you exactly where it came from.

    The interface shows source counts, topic coverage, system status, and lets users run natural language queries against the full corpus. It’s the difference between “I think Chris mentioned something about that in an episode last year” and “Here’s exactly what was said, in three different contexts, with links to the source material.”

    What’s Coming Next: The API Layer and Client Access

    Here’s where it gets interesting. The current system is internal — it serves the company’s own content creation and consulting workflows. But the next phase opens the intelligence layer to clients via API.

    Imagine you’re a restoration company paying for consulting services. Instead of waiting for your next call with the consultant, you can query the knowledge base directly. You get instant access to years of accumulated expertise — answers to your specific questions, drawn from hundreds of real-world conversations, case studies, and training materials. The consultant’s brain, available 24/7, grounded in everything they’ve ever taught.

    This isn’t theoretical. The RAG API already exists and returns structured JSON responses with relevance-scored results. The Signal app already consumes it. Extending access to clients is an infrastructure decision, not a technical one. The plumbing is built.
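    For illustration, a response in that spirit might look like the following (the field names and values are hypothetical, not the actual API contract):

```json
{
  "query": "How should adjusters handle supplement disputes?",
  "results": [
    {
      "source": "Podcast Episode 77",
      "relevance": 0.91,
      "chunk": "When a supplement gets disputed, document the scope change first...",
      "topics": ["supplements", "adjuster relations"]
    }
  ]
}
```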

    And because every query and every source is tracked, the system creates a feedback loop. The company can see what clients are asking about most, identify gaps in the knowledge base, and create new content that directly addresses the highest-demand topics. The brain gets smarter because people use it.

    The Content Machine: From Knowledge Base to Publishing Pipeline

    The other unlock — and this is the part most people miss — is what happens when you combine a searchable AI brain with an automated content pipeline.

    When you can query your own knowledge base programmatically, content creation stops being a blank-page exercise. Need a blog post about commercial water damage sales techniques? Query the brain, pull the most relevant chunks from across the corpus, and use them as the foundation for a new article that’s grounded in real expertise — not generic AI filler.

    We built the publishing pipeline to go from topic to live, optimized WordPress post in a single automated workflow. The article gets written, then passes through nine optimization stages: SEO refinement, answer engine optimization for featured snippets and voice search, generative engine optimization so AI systems cite the content, structured data injection, taxonomy assignment, and internal link mapping. Every article published this way is born optimized — not retrofitted.
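    In outline, that kind of staged pipeline is plain function composition. A skeletal sketch (stage names follow the article, the bodies are stand-ins, and only six of the nine stages are named in the text):

```python
# Each stage takes the article dict and returns it enriched; real stages
# would call models and APIs rather than set flags.
def seo_refinement(a):                 a["seo_refined"] = True; return a
def answer_engine_optimization(a):     a["aeo"] = True; return a
def generative_engine_optimization(a): a["geo"] = True; return a
def inject_structured_data(a):         a["schema"] = True; return a
def assign_taxonomy(a):                a["taxonomy"] = ["restoration"]; return a
def map_internal_links(a):             a["internal_links"] = []; return a

STAGES = [
    seo_refinement,
    answer_engine_optimization,
    generative_engine_optimization,
    inject_structured_data,
    assign_taxonomy,
    map_internal_links,
]

def publish(topic, draft):
    article = {"topic": topic, "body": draft}
    for stage in STAGES:
        article = stage(article)
    return article  # "born optimized" before it ever reaches WordPress
```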

    The knowledge base isn’t just a reference tool. It’s the engine that feeds a content machine capable of producing authoritative, expert-sourced content at a pace that would be impossible with traditional workflows.

    The Bigger Picture: Why Every Expert Business Needs This

    This isn’t a story about one company. It’s a blueprint that applies to any business sitting on a library of expert content — law firms with years of case analysis podcasts, financial advisors with hundreds of market commentary videos, healthcare consultants with training libraries, agencies with decade-long client education archives.

    The pattern is always the same: the expertise exists, it’s been recorded, and it’s functionally invisible. The people who created it can’t search it. The people who need it can’t find it. And the AI systems that increasingly mediate discovery don’t know it exists.

    Building an AI brain changes all three dynamics simultaneously. The creator gets a searchable second brain. The audience gets instant, cited access to deep expertise. And the AI layer — the Perplexitys, the ChatGPTs, the Google AI Overviews — gets structured, authoritative content to cite and recommend.

    We’re building these systems for clients across multiple verticals now. The technology stack is proven, the pipeline is automated, and the results compound over time. If you’re sitting on a content library and wondering how to make it actually work for your business, that’s exactly the problem we solve.

    Frequently Asked Questions

    What is a RAG system and how does it differ from a regular chatbot?

    A retrieval-augmented generation (RAG) system is an AI architecture that answers questions by first searching a proprietary knowledge base for relevant information, then generating a response grounded in that specific content. Unlike a general chatbot that draws from broad training data, a RAG system only uses your content as its source of truth — eliminating hallucinations and ensuring every answer traces back to something your organization actually said or published.

    How long does it take to build an AI knowledge base from existing content?

    The initial deployment — ingesting, chunking, embedding, and indexing existing content — typically takes one to two weeks depending on volume. We processed 79 livestream transcripts in under two hours and 200+ podcast episodes in a similar timeframe. The ongoing pipeline runs automatically as new content is created, so the knowledge base grows without manual intervention.

    What types of content can be ingested into the AI brain?

    Any text-based or transcribable content works: podcast episodes, video transcripts, livestream recordings, training courses, webinar recordings, blog posts, whitepapers, case studies, email newsletters, and internal documents. Audio and video files are transcribed automatically before processing. The system handles multiple content types simultaneously and cross-references between them during queries.

    Can clients access the knowledge base directly?

    Yes — the system is built with an API layer that can be extended to external users. Clients can query the knowledge base through a web interface or via API integration into their own tools. Access controls ensure clients see only what they’re authorized to access, and every query is logged for analytics and content gap identification.

    How does this improve SEO and AI visibility?

    The knowledge base feeds an automated content pipeline that produces articles optimized for traditional search, answer engines (featured snippets, voice search), and generative AI systems (Google AI Overviews, ChatGPT, Perplexity). Because the content is grounded in real expertise rather than generic AI output, it carries the authority signals that both search engines and AI systems prioritize when selecting sources to cite.

    What does Tygart Media’s role look like in this process?

    We serve as the AI Sherpa — handling the full stack from infrastructure architecture on Google Cloud Platform through content pipeline automation and ongoing optimization. Our clients bring the expertise; we build the system that makes that expertise searchable, discoverable, and commercially productive. The technology, pipeline design, and optimization strategy are all managed by our team.

  • If I Were Running Rainbow Restoration’s SEO, Here’s What I’d Do Differently

    If I Were Running Rainbow Restoration’s SEO, Here’s What I’d Do Differently

    I’m about to do something that most agency owners would never do: hand over an entire playbook.

    Not a teaser. Not a “5 quick wins” listicle. The actual, step-by-step strategy I would execute — starting tomorrow — if Rainbow Restoration handed me the keys to their organic search program.

    Why? Because I just pulled their SpyFu data, and what I found is the most interesting restoration franchise story I’ve analyzed so far.

    Rainbow Restoration (rainbowrestores.com) didn’t suffer a decline. They survived a full domain migration from rainbowintl.com and actually came out the other side with a living, breathing SEO program. But here’s where it gets fascinating: they left roughly $3 million per month on the table.

    The old domain peaked at $3.35M/month and 109,000 keywords. The new domain is recovering, but they’re sitting at $495,500/month and 33,700 keywords. That’s 85% below where they should be — which means the upside is enormous.

    So let’s talk about what I’d do to finish what the migration started.

    The Data: From Peak to Recovery to Opportunity

    I pulled the full 12-month historical record from SpyFu on March 30, 2026. Here’s rainbowrestores.com over the last year:

    Period   | Organic Keywords | Monthly Organic Clicks | SEO Value ($/mo) | PPC Spend ($/mo) | Domain Strength
    Mar 2025 | 53,769           | 29,960                 | $330,500         | $444             | 50
    Apr 2025 | 50,920           | 27,330                 | $323,100         | $535             | 50
    May 2025 | 47,600           | 28,160                 | $295,100         | $603             | 47
    Jun 2025 | 45,980           | 26,890                 | $281,500         | $704             | 47
    Jul 2025 | 49,910           | 32,160                 | $338,700         | $793             | 48
    Aug 2025 | 54,810           | 36,720                 | $352,200         | $836             | 48
    Sep 2025 | 55,550           | 37,520                 | $302,100         | $0               | 50
    Oct 2025 | 58,509           | 38,420                 | $309,800         | $0               | 51
    Nov 2025 | 57,770           | 36,400                 | $308,400         | $582,800         | 51
    Dec 2025 | 40,080           | 31,260                 | $235,600         | $324,500         | 50
    Jan 2026 | 38,460           | 30,910                 | $227,200         | $277,100         | 49
    Feb 2026 | 33,700           | 25,500                 | $495,500         | $320,000         | 52

    Let me break this down:

    The Good News: Rainbow survived a domain migration. That alone is impressive. Most franchise migrations crater the domain completely. Rainbow’s new domain is healthy, with 33,700 keywords and Domain Strength at 52. The Feb 2026 spike in SEO value ($495,500 on fewer keywords) suggests they’re concentrating value in higher-intent queries — the same pattern I’m seeing with SERVPRO and 911 Restoration.

    The Reality Check: In October 2025, they were running strong at 58,509 keywords and $309,800/month SEO value. Then December hit — the same algorithm cliff that affected the entire restoration vertical. But there’s a bigger story: the old rainbowintl.com domain peaked at 109,000 keywords and $3.35M/month in July 2022. Rainbow is still sitting 69% below peak keywords and 85% below peak SEO value.

    The Opportunity: If Rainbow recovers even 50% of what the old domain achieved, that’s $1.67M/month in SEO value. They’re currently at $495K. Do the math: there’s $1.17M per month in recoverable organic value just sitting there.

    The PPC Symptom: Starting November 2025, they went from basically zero PPC spend to $320K-$582K/month. That’s the classic pain indicator — when organic traffic drops, you buy it back with ads until you can fix the plumbing. Combined PPC spend from November 2025 through January 2026: approximately $1.18M. In six months, they could rebuild enough organic to cut PPC spend by 50-70% permanently.

    What Happened: The Migration Story

    Here’s what we know:

    Rainbow Restoration successfully migrated from rainbowintl.com to rainbowrestores.com. The old domain is now a digital graveyard — 4 keywords, zero SEO value. But the new domain caught the redirected equity and recovered. This tells me:

    1. They implemented proper 301 redirects. If they hadn’t, the new domain would be at zero. The fact that it’s at 33,700 keywords means they passed significant equity through the redirect chain.
    2. They didn’t lose all their backlinks. Domain Strength recovered to 52, which is respectable for a post-migration domain. This suggests proper domain forwarding and/or existing backlinks pointing to the new domain.
    3. The recovery stalled before completion. Migrations take 4-6 months to fully stabilize. If the Q4 algorithm update hit during the stabilization phase, they probably lost traction at a critical moment.

    The strategic issue isn’t the migration itself — Rainbow executed it correctly. The issue is: did they rebuild the content and architecture that made the old domain great?

    My hypothesis: They migrated the structure, the redirects, and the authority signals. But the old rainbowintl.com probably had 109,000 keywords because it had mature, deep content libraries that the new domain hasn’t fully replicated yet. Here’s how to finish the recovery.

    The Playbook: What I’d Do Starting Tomorrow

    Phase 1: Redirect Audit and Content Archaeology (Week 1-2)

    Before I optimize a single keyword, I need to understand what was lost in the migration and what wasn’t recovered.

    The Technical Foundation:

    • Crawl both domains. Run Screaming Frog against rainbowrestores.com and archive.org snapshots of rainbowintl.com from July 2022 (peak). I’m looking for:
      • All content that existed on the old domain but isn’t on the new domain. These are orphaned keyword opportunities.
      • All 301 redirects and redirect chains. Chains longer than 2 hops leak PageRank.
      • Old URLs that redirect to homepage or generic pages instead of topically relevant pages. These are misdirected equity losses.
    • Google Search Console archaeology. Pull 16 months of GSC data for rainbowintl.com (if they still have it configured) showing which pages deindexed, when, and why. This shows exactly which content lost coverage during the migration.
    • SpyFu historical data for the old domain. Export the top 200 keywords that rainbowintl.com ranked for at peak. Which of these keywords does rainbowrestores.com rank for now? Which are completely lost? The gap is your content recovery roadmap.

    Expected Output: A prioritized list of 500-1,000 pieces of content that existed on the old domain, were either not migrated or redirected ineffectively, and represent high-opportunity keyword recovery.
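    Once the crawl export exists, flagging long chains is mechanical. A sketch over a crawled URL-to-target map (the URLs below are placeholders, not actual audit findings):

```python
def chain_length(url, redirects, max_hops=10):
    """Follow a crawled redirect map until a URL stops redirecting."""
    hops = 0
    while url in redirects and hops < max_hops:
        url = redirects[url]
        hops += 1
    return hops, url

# Placeholder crawl export: source URL -> 301 target (not real data)
redirects = {
    "https://rainbowintl.com/water-damage/":
        "https://rainbowintl.com/services/water-damage/",
    "https://rainbowintl.com/services/water-damage/":
        "https://rainbowrestores.com/water-damage-restoration/",
}

hops, final = chain_length("https://rainbowintl.com/water-damage/", redirects)
if hops > 2:
    print(f"flag for flattening: {hops} hops to {final}")
```

    Anything over two hops gets flattened to a single 301 straight to the final destination page.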

    Phase 2: Location Page Renaissance (Week 3-6)

    Rainbow has franchise locations in every state. Each location is a keyword goldmine that probably hasn’t been fully developed.

    Current State Assessment:

    Pull 10 sample city-level pages from the current site (e.g., /locations/denver/, /water-damage-restoration/denver/). Analyze:

    • How much unique content is on the page vs. templated boilerplate? (Target: 60%+ unique, locally-relevant content)
    • What schema is implemented? (Should be: LocalBusiness + Service + FAQPage + HowTo)
    • How many inbound internal links? (Should be: 10+ from parent hubs and contextual content)
    • Does it rank for the city + service modifier? (e.g., “water damage restoration Denver”)
    • How many related long-tail keywords does it rank for? (Should be: 20-40 per page)

    The Build:

    For each franchise territory and core service (water damage, fire damage, mold remediation, storm damage), create a location page following this structure:

    Header Section (Unique Local Content):

    • Opening paragraph: Local climate/risk profile + Rainbow’s response history in that area. “Denver’s high-altitude climate creates unique water damage challenges: rapid drying in low humidity but severe ice dam formation during freeze-thaw cycles. Rainbow Restoration has responded to 1,200+ water damage claims in the Denver metro since 2018, with an average response time of 38 minutes.”
    • Local expertise proof: State-specific certifications, regulatory requirements, insurance relationships. “Colorado requires mold remediation contractors to maintain IICRC S520 certification and comply with Colorado Dept. of Public Health guidelines. All Rainbow technicians are certified.”
    • Service area map: Embedded Google Map showing exact service territory polygons.

    Body Content (Problem-Solving Content):

    • Local problem scenario: “After the March 2024 ice storm, Denver experienced 400+ residential water damage claims from burst pipes. Here’s exactly what happened, what homeowners did wrong, and how to prevent it next time.”
    • Local process walkthrough: “Water damage restoration in Denver’s elevation and climate requires 3 specific adjustments to standard dehumidification protocols…”
    • Local regulation compliance: “Colorado’s water damage claims require documentation per CRS 10-4-1001…”

    CTA + Contact Section:

    • LocalBusiness schema with exact NAP, hours, phone, service area
    • Google Business Profile embed
    • 24/7 availability messaging (critical for emergency services)
    • Review count and rating display (builds trust before calling)

    Expected Results: Each location page should rank for 25-40 keywords within 60 days of launch. At 58 territories × 4 services × 30 keywords average = 6,960 new keywords. Combined with existing rankings, this gets Rainbow back toward the 58K keywords they had in October 2025.

    Phase 3: Content Architecture and Internal Linking (Week 4-8, Ongoing)

    This is how you make location pages work at scale: proper hierarchy and internal linking.

    The Three-Tier Hub Model:

    Tier 1: National Service Pillars (Authority anchors that rank for head terms)

    • /water-damage-restoration/ → “Water Damage Restoration: Complete Guide” (3,000+ words, comprehensive)
    • /fire-damage-restoration/ → “Fire Damage Restoration: Recovery Process”
    • /mold-remediation/ → “Mold Remediation and Removal Guide”
    • /storm-damage-restoration/ → “Storm Damage Restoration: What to Know”

    Each pillar page links to every state hub, accumulates backlinks, and passes equity down the hierarchy.

    Tier 2: State Hub Pages (Regional authority that bridges national and local)

    • /water-damage-restoration/colorado/ → Unique state content on climate, regulations, flood zones, seasonal risks
    • /water-damage-restoration/florida/ → Hurricane flood prep, saltwater intrusion, insurance nuances
    • etc. for every state where Rainbow operates

    Each state page links to all city pages within that state.

    Tier 3: City/Metro Pages (High-intent, revenue-generating)

    • /water-damage-restoration/colorado/denver/
    • /mold-remediation/colorado/denver/
    • /fire-damage-restoration/florida/miami/
    • etc. for all 58+ territories across all 4 services

    The Math: If Rainbow operates in 58 territories and 4 core services, that’s 232 city pages minimum. If each city page ranks for 25-40 keywords on average, that’s 5,800-9,280 keywords just from the location tier. Add the state and national tiers, and you’re back to 30K+ keywords organically.
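    Spelled out as a quick check on that paragraph's own numbers:

```python
# Location-tier keyword math from the article's figures.
territories, services = 58, 4
city_pages = territories * services               # 232 city pages minimum
kw_low, kw_high = city_pages * 25, city_pages * 40
print(f"{city_pages} pages -> {kw_low:,} to {kw_high:,} keywords")
```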

    Internal Linking Rules:

    • Every pillar page links to all state hubs
    • Every state hub links to all city pages in that state
    • Every city page links back to its state hub and national pillar
    • Cross-service linking: The Denver water damage page links to the Denver mold page, etc.
    • Blog-to-location: Every blog post includes contextual links to 1-3 relevant location pages
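    Those rules are easy to express as a generated link map. A sketch with invented slugs (two services and a handful of cities, standing in for the full 58-territory build):

```python
services = ["water-damage-restoration", "mold-remediation"]
states = {"colorado": ["denver", "colorado-springs"], "florida": ["miami"]}

links = {}  # page URL -> set of pages it links out to
for svc in services:
    pillar = f"/{svc}/"
    links.setdefault(pillar, set())
    for state, cities in states.items():
        hub = f"/{svc}/{state}/"
        links[pillar].add(hub)            # pillar -> every state hub
        links.setdefault(hub, set())
        for city in cities:
            page = f"/{svc}/{state}/{city}/"
            links[hub].add(page)          # state hub -> every city page in state
            links[page] = {hub, pillar}   # city -> back to its hub and pillar
```

    Cross-service links and blog-to-location links would layer on top of this skeleton rather than replace it.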

    Phase 4: Content Tier Strategy — Crisis, Decision, Authority (Week 5-12)

    Location pages alone won’t cut it. Rainbow needs a three-tier content strategy that captures different stages of the customer journey:

    Tier 1: Crisis-Moment Content (The 2 AM homeowner in panic)

    People don’t search for “restoration companies” when their house is flooding. They search for “what do I do if my basement floods right now.”

    • “Basement Flooded: Emergency Steps in the First 30 Minutes”
    • “Burst Pipe Flooding My House: What to Do Before the Plumber Arrives”
    • “My Kitchen Caught Fire: Immediate Safety Steps and Next Actions”
    • “I Smell Mold But Don’t See It: Where to Look and When to Call a Pro”

    Format: Step-by-step numbered lists, HowTo schema, featured-snippet optimized. These convert because they’re the answer to someone’s worst day.

    Tier 2: Decision-Stage Content (The insurance call)

    • “Water Damage Restoration Cost 2026: Price Breakdown by Severity”
    • “Does Homeowners Insurance Cover Water Damage?”
    • “How to File a Water Damage Insurance Claim: Complete Guide”
    • “Water Mitigation vs. Water Restoration: Key Differences Explained”
    • “How Long Does Water Damage Restoration Take?”

    Format: Comparison tables, cost breakdowns, FAQPage schema. These convert because the person already knows they need professional help — they just need to choose who and understand the cost.

    Tier 3: Authority-Building Content (Builds domain trust and earns backlinks)

    • “Understanding IICRC Certification: What It Means for Your Restoration Company”
    • “The Science of Structural Drying: A Technical Deep Dive”
    • “2024-2026 Water Damage Claim Trends: Data Analysis by Region”
    • “Climate Change and Water Damage Risk: What the Data Shows”
    • “Building Code Compliance in Mold Remediation: State-by-State Requirements”

    Format: Long-form, research-backed, citations to EPA/FEMA/IICRC. These earn backlinks from industry publications and regulatory bodies, which flow authority through the site to location pages.

    Publishing Cadence: 2-3 Tier 1 posts/month (urgent, seasonal), 2-3 Tier 2 posts/month (decision support), 1 Tier 3 post/month (authority building).

    Phase 5: Schema Markup at Scale (Week 6-8)

    Rainbow probably has basic LocalBusiness schema on location pages. But there's a 10x opportunity in comprehensive schema implementation:

    Every location page needs:

    • LocalBusiness — NAP, geo-coordinates, service area polygon, hours, accepted payments
    • Service — Structured description of each service offered (water damage restoration, mold remediation, etc.)
    • FAQPage — Top 8-10 questions for that service/location combination with direct answers
    • HowTo — Step-by-step restoration process in structured format
    • AggregateRating — Star rating and review count from Google Business Profile

    Example LocalBusiness schema for /water-damage-restoration/colorado/denver/:

    {
      "@context": "https://schema.org",
      "@type": "LocalBusiness",
      "name": "Rainbow Restoration Denver",
      "image": "https://rainbowrestores.com/locations/denver/logo.jpg",
      "description": "Emergency water damage restoration, water mitigation, and structural drying in the Denver metropolitan area.",
      "address": {
        "@type": "PostalAddress",
        "streetAddress": "[actual address]",
        "addressLocality": "Denver",
        "addressRegion": "CO",
        "postalCode": "[zip]",
        "addressCountry": "US"
      },
      "geo": {
        "@type": "GeoCoordinates",
        "latitude": 39.7392,
        "longitude": -104.9903
      },
      "areaServed": {
        "@type": "GeoShape",
        "polygon": "39.5,-105.2 39.5,-104.6 40.1,-104.6 40.1,-105.2 39.5,-105.2"
      },
      "telephone": "+1-303-[number]",
      "url": "https://rainbowrestores.com/water-damage-restoration/colorado/denver/",
      "openingHoursSpecification": {
        "@type": "OpeningHoursSpecification",
        "dayOfWeek": ["Monday", "Tuesday", "Wednesday", "Thursday", "Friday", "Saturday", "Sunday"],
        "opens": "00:00",
        "closes": "23:59"
      },
      "hasOfferCatalog": {
        "@type": "OfferCatalog",
        "itemListElement": [
          {
            "@type": "Offer",
            "itemOffered": {
              "@type": "Service",
              "name": "Water Damage Restoration",
              "description": "24/7 emergency water damage mitigation and restoration services"
            }
          },
          {
            "@type": "Offer",
            "itemOffered": {
              "@type": "Service",
              "name": "Mold Remediation",
              "description": "Mold inspection, remediation, and prevention"
            }
          }
        ]
      },
      "aggregateRating": {
        "@type": "AggregateRating",
        "ratingValue": 4.8,
        "reviewCount": 247
      }
    }
    

    When you implement this across 232+ location pages with consistent data, Google gets a machine-readable map of your entire franchise network. That’s how you win Local Pack results at scale.
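
    Keeping that markup consistent across 232+ pages is easier when it's rendered from one source of truth instead of hand-edited per page. A minimal sketch, with an illustrative location record (the field names are assumptions and the phone number is a placeholder, not real data):

```python
import json

# Render per-location LocalBusiness JSON-LD from a single data record.
def local_business_schema(loc):
    return {
        "@context": "https://schema.org",
        "@type": "LocalBusiness",
        "name": f"Rainbow Restoration {loc['city']}",
        "address": {
            "@type": "PostalAddress",
            "addressLocality": loc["city"],
            "addressRegion": loc["region"],
            "addressCountry": "US",
        },
        "geo": {
            "@type": "GeoCoordinates",
            "latitude": loc["lat"],
            "longitude": loc["lng"],
        },
        "telephone": loc["phone"],
        "url": loc["url"],
    }

denver = {
    "city": "Denver", "region": "CO",
    "lat": 39.7392, "lng": -104.9903,
    "phone": "+1-303-555-0100",  # placeholder number
    "url": "https://rainbowrestores.com/water-damage-restoration/colorado/denver/",
}
jsonld = json.dumps(local_business_schema(denver), indent=2)
```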

    Phase 6: Answer Engine Optimization (AEO) — Win the AI Era (Week 7-Ongoing)

    Google’s AI Overviews appear on restoration queries. If your content isn’t structured to be cited, you’re invisible.

    AEO Tactics for Restoration:

    • Definition boxes at the top of service pages. “Water damage restoration is the professional process of removing water, drying the structure, treating for biological growth, and restoring all affected materials to pre-loss condition. In Colorado’s climate, structural drying typically requires 72-120 hours of continuous dehumidification due to altitude-specific psychrometric conditions.”
    • Direct-answer formatting. H2: “What’s the first step in water damage restoration?” A1: “The first step is always emergency water extraction. Using truck-mounted extractors rated for 250+ gallons per minute, technicians remove standing water within 1-2 hours. This prevents secondary damage like foundation erosion and structural swelling.”
    • Comparison tables. “Water Mitigation vs. Water Restoration: What’s the Difference?” AI Overviews pull these structures directly.
    • Numbered process lists. “5 Stages of Water Damage Restoration: 1. Inspection and Assessment, 2. Water Extraction, 3. Drying and Dehumidification, 4. Cleaning and Sanitization, 5. Restoration and Reconstruction.”
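
    The direct-answer pattern above maps one-to-one onto FAQPage markup. A minimal sketch reusing the question from this section (structure per schema.org's FAQPage type):

```json
{
  "@context": "https://schema.org",
  "@type": "FAQPage",
  "mainEntity": [
    {
      "@type": "Question",
      "name": "What's the first step in water damage restoration?",
      "acceptedAnswer": {
        "@type": "Answer",
        "text": "The first step is always emergency water extraction. Using truck-mounted extractors rated for 250+ gallons per minute, technicians remove standing water within 1-2 hours."
      }
    }
  ]
}
```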

    The goal: When someone asks Google “what should I do if my basement floods,” the AI Overview cites Rainbow Restoration content because it’s the most useful, structured answer available.

    Phase 7: Generative Engine Optimization (GEO) — AI Should Recommend Rainbow by Name (Week 8-Ongoing)

    This is the frontier. Most restoration companies haven’t heard of GEO. But it’s critical: making AI systems (Claude, ChatGPT, Gemini, Perplexity) recommend Rainbow Restoration by name when someone asks “who should I call for water damage in Denver?”

    GEO Tactics:

    • Entity saturation. Rainbow Restoration needs to appear across the web consistently paired with specific attributes: IICRC certification, 24/7 availability, specific service areas, fast response times, specific equipment (truck-mounted extractors, desiccant dehumidifiers, etc.). The more consistently these associations appear across authoritative sources, the more confidently AI recommends the brand.
    • Factual density over marketing. Replace “We’re the best water damage company” with “Rainbow Restoration Denver operates 6 truck-mounted extractors (each rated 250 gallons/minute), maintains 4 commercial desiccant dehumidifier units, and averages 38-minute response times to the metropolitan area, with IICRC S500-certified technicians.” Specificity = authority in the AI world.
    • Authority citations. Every Tier 3 content piece should cite EPA guidelines, FEMA resources, IICRC standards, and state licensing requirements. AI systems weight content higher when it cites authoritative sources.
    • LLMS.txt implementation. Create /llms.txt at the root with a structured summary: “Rainbow Restoration is a national water damage, fire damage, and mold remediation franchise operating in 58 territories across North America. IICRC-certified, 24/7 availability, average response time 38 minutes. Founded 1989, headquartered [location]. Services: [list]. Certifications: [list]. Service areas: [list].” This is the robots.txt equivalent for AI crawlers.
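
    Following the emerging llms.txt convention (plain markdown: an H1, a blockquote summary, then linked sections), the file described above might look like this sketch. The summary and service list come from the article; the URLs reuse the hypothetical pillar structure from Phase 3, and bracketed placeholders are left unfilled:

```markdown
# Rainbow Restoration

> National water damage, fire damage, and mold remediation franchise
> operating in 58 territories across North America. IICRC-certified,
> 24/7 availability, average response time 38 minutes. Founded 1989,
> headquartered [location].

## Services

- [Water Damage Restoration](https://rainbowrestores.com/water-damage-restoration/): 24/7 emergency mitigation and structural drying
- [Mold Remediation](https://rainbowrestores.com/mold-remediation/): inspection, remediation, and prevention
- [Fire Damage Restoration](https://rainbowrestores.com/fire-damage-restoration/): smoke and fire cleanup and rebuild
- [Storm Damage Restoration](https://rainbowrestores.com/storm-damage-restoration/): storm response and reconstruction
```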

    Phase 8: Google Business Profile Optimization (Week 9-Ongoing)

    The Google Local Pack captures disproportionate click volume. Winning it requires systematic GBP optimization:

    • Weekly GBP posts. Not automated. Real posts: completed project photos with before/after, seasonal tips (“Prevent ice dams: 5 steps”), team spotlights. Google’s algorithm visibly rewards profiles with consistent, recent posts.
    • Review strategy. SMS review request sent 2 hours after job completion, email 24 hours later. Target: 200+ reviews at 4.8+ stars per location within 12 months. Respond to every review within 24 hours (positive and negative). Review velocity is the #1 Local Pack ranking factor after proximity.
    • Category precision. Primary: “Water Damage Restoration Service.” Secondary: “Fire Damage Restoration Service,” “Mold Removal Service.” Don’t dilute.
    • Photo optimization. 50+ photos per location (team, equipment, completed projects, office, vehicles). Geotagged. Updated monthly.
    • Q&A seeding. Add and answer the top 10 questions for each location’s GBP. These show up prominently and serve as free real estate for keyword-rich content.

    Phase 9: Backlink Acquisition — Leverage Franchise Scale (Week 10-Ongoing)

    Rainbow’s biggest competitive advantage: 58+ franchise locations. Most single-location competitors can’t match this scale. Use it.

    • Disaster response PR. After significant weather events, issue press releases to local media. “Rainbow Restoration Denver responded to 43 residential water damage claims during March 2026 ice storm, deploying 8 extraction teams across metro area.” Local news sites pick this up (high DA, high relevance, tons of backlinks).
    • Insurance partnerships. Rainbow is likely on preferred vendor lists for carriers. Each carrier relationship should include a backlink from their website (partner directory or “find a contractor” page).
    • Industry association profiles. IICRC.org, RestorationIndustry.org, state licensing boards — maintain active, detailed profiles across all of them. .org links carry serious authority.
    • Local civic backlinks. Every franchise location should systematically acquire 20-30 local backlinks: Chamber of Commerce, Better Business Bureau, Rotary Club, Little League sponsorships, etc. Automated systems can track these and alert franchises to apply.
    • Content partnerships. Co-create guides with local emergency management agencies. “How to Prepare Your Denver Home for Wildfire Season — by Rainbow Restoration and Denver Office of Emergency Management.” The .gov backlink flows serious authority.

    Phase 10: Stop the PPC Bleed (Weeks 1-52)

    Here’s the financial reality: Rainbow spent $1.18M on PPC in Q4 2025 and Q1 2026 combined. That annualizes to ~$2.4M.

    At their pre-decline peak (Sep-Oct 2025), they had 58K keywords worth $309K/month in organic value — $3.7M annualized, delivered for free.

    The full playbook above, executed over 6 months, should recover $200-250K/month in organic SEO value. That’s $2.4-3M annualized in traffic they no longer need to buy.

    In 12 months, if they reach 50% of the old domain’s peak ($1.67M/month), they’ll have reduced their PPC dependency by 75% permanently.

    This isn’t a cost center. This is a multiplying return where every dollar spent on SEO execution compounds while PPC spend evaporates the moment the budget runs out.

    What Makes Rainbow’s Story Different

    This is the part I don’t see written about often enough:

    Rainbow Restoration had the courage to migrate domains. Most franchises are terrified of it. But brand repositioning — moving from “rainbow international” to “rainbow restoration” — is smart. It’s clear, it’s specific, it owns the vertical.

    The problem isn’t the rebrand. The problem is that the SEO execution didn’t match the ambition of the rebrand.

    They walked away from $3.35M per month in organic value when they flipped the domain switch, and then didn’t rebuild it on the new domain with the same sophistication.

    They survived. They’re healthy. But they left the bigger prize on the table.

    The playbook above is what finishes the job. It’s not theoretical. It’s what we execute for restoration companies at Tygart Media. Every day. All day.

    If Rainbow wants to reclaim the $1.67M/month that’s sitting there waiting to be captured, the path is clear. It just requires finishing what the migration started.

    Frequently Asked Questions

    What happened to Rainbow Restoration’s old domain (rainbowintl.com)?

    Rainbow Restoration migrated from rainbowintl.com to rainbowrestores.com. The old domain is now essentially dead — it currently ranks for only 4 keywords with $0 in estimated SEO value. However, rainbowintl.com peaked at 109,000 organic keywords (July 2022) and $3.35M/month in SEO value (January 2020). The migration was executed correctly from a technical standpoint (proper 301 redirects were implemented), but the new domain has only recovered to 33,700 keywords and $495,500/month, leaving 85% of peak organic value on the table.

    How much organic traffic did Rainbow lose in the migration?

    Rainbow didn’t lose all their traffic — that would indicate a failed migration. Instead, they recovered about 31% of their peak keyword count (109K → 34K) and 15% of their peak SEO value ($3.35M → $495K). The gap represents content that either wasn’t migrated, was redirected ineffectively, or hasn’t been rebuilt on the new domain with the same authority and comprehensiveness. The opportunity is enormous: recovering even 50% of the old domain’s peak represents $1.67M/month in organic value that’s currently being captured by competitors or left on the table entirely.

    Why did Rainbow’s organic traffic drop in December 2025?

    December 2025 saw a significant organic decline across the restoration vertical — both SERVPRO and 911 Restoration experienced similar drops in the same timeframe. This pattern indicates an algorithm update or market shift that disproportionately affected restoration company rankings. The timing is consistent with Google’s broader content quality and entity authority updates. However, Rainbow’s recovery pattern (slightly higher SEO value on fewer keywords in Feb 2026) suggests a value concentration effect, meaning their remaining rankings are capturing higher-intent, higher-CPC keywords.

    What is Generative Engine Optimization (GEO) and why does it matter?

    Generative Engine Optimization (GEO) is the practice of optimizing content and brand presence so that AI systems — ChatGPT, Claude, Gemini, Perplexity, and other large language models — cite and recommend your business by name when users ask relevant questions. For restoration companies, GEO involves consistent brand-attribute associations across the web (IICRC certifications, response times, service areas), factual density in content (specific equipment, process details) rather than marketing language, authoritative citations (EPA, FEMA, IICRC standards), and LLMS.txt implementation. As AI-generated answers increasingly replace traditional search results, GEO is becoming as critical as traditional SEO for driving qualified customer discovery.

    How long would it take to rebuild Rainbow’s organic traffic to pre-migration peak?

    A realistic timeline breaks down as follows: Technical fixes and initial schema/architecture implementation (weeks 1-6) typically yield 10-15% keyword growth and quick indexation improvements. Content hierarchy build-out and location page optimization (weeks 4-16) should drive 25-35% growth. Full content strategy execution across all three tiers (months 1-6) yields 40-60% recovery. Meaningful SEO value recovery ($200K+/month) should be visible within 3-4 months. Full recovery to 50% of peak ($1.67M/month) would require 8-12 months of sustained execution. However, 85% recovery (approaching the old domain’s peak) would likely require 18-24 months because you’re rebuilding content depth and authority that took years to accumulate.

    Is Rainbow Restoration’s PPC spending necessary?

    No — it’s a symptom, not a strategy. Rainbow’s combined Q4 2025 and Q1 2026 PPC spend was approximately $1.18M in just six months. This spending is directly correlated with their organic decline: as organic keywords and clicks fell, they compensated by buying traffic through Google Ads. However, organic traffic that was worth $309K/month (Sep-Oct 2025) becomes “free” traffic once recovered, while PPC spend evaporates the moment budgets are reduced. A 12-month SEO execution program that recovers $200-250K/month in organic value would reduce their PPC dependency by 50-70%, creating a permanent efficiency gain. The ROI case strongly favors organic investment over sustained PPC spending.

    The Closing Pitch

    Here’s the thing about Rainbow Restoration: they actually pulled off the hard part. They rebranded, they migrated domains, and they survived. Most franchise companies crater completely when they try this. Rainbow didn’t.

    But surviving isn’t winning. And right now, they’re leaving $1.67M per month in organic value on the table — value that their old domain earned, value that should have migrated with them, value that’s sitting there waiting to be reclaimed.

    The roadmap above isn’t theoretical. It’s the exact methodology we execute at Tygart Media — we eat, sleep, and breathe restoration SEO. We’ve built the AI-powered content pipelines, the schema automation systems, and the GEO frameworks specifically for this vertical. And we know the playbook works because we’re running it right now for other restoration companies.

    The data is public. The opportunity is clear. And the fix is an execution problem.

    So here’s my pitch, and I’ll keep it honest:

    Hey, Rainbow Restoration. If you made it this far reading, you already know what needs to happen — because the SpyFu numbers don’t lie. You had the courage to rebrand and migrate. Now you need the SEO execution to match that ambition.

    We’re Tygart Media. We’ve already built the playbooks and the systems to execute this at franchise scale. We’d genuinely love to have the conversation about what $400K/month in recovered organic value looks like when it’s back.

    No pressure. No predatory sales tactics. Just two teams who understand restoration marketing talking about finishing what the migration started.

    Reach out here. Or call. Or send a franchise location manager. We promise we won’t show up with a water truck unless your data indicates you actually have a water problem. In which case, we probably know a guy. (In fact, we probably know 58 guys.) 😄

    The Complete Restoration Franchise SEO Playbook Series

    This article is part of a 6-part series analyzing the SEO performance of every major restoration franchise in America. Read the full series:

  • If I Were Running Paul Davis Restoration’s SEO, Here’s What I’d Do Differently

    If I Were Running Paul Davis Restoration’s SEO, Here’s What I’d Do Differently

    I’m about to do something that most agency owners would never do: tell you exactly what went wrong with one of restoration’s most strategic franchises.

    Not conspiracy theories. Not guesses. The actual data that explains why Paul Davis Restoration — a $2+ billion company with 600+ franchises across North America — lost half its organic keyword portfolio between November and December 2025.

    Why? Because I pulled their SpyFu data this morning, and what I found was different from the 911 Restoration story I told three weeks ago. This isn’t a domain in freefall. This is a franchise that was actually winning — growing their keyword portfolio from 39K to 50K through most of 2025 — and then tripped at the finish line.

    That’s not a systemic failure. That’s a fixable problem. And the recovery opportunity is enormous.

    The SpyFu Data: A Franchise That Peaked, Then Stumbled

    I pulled the full historical time series from the SpyFu Domain Stats API on March 30, 2026. Here’s what pauldavis.com looks like over the last 12 months:

    | Period   | Organic Keywords | Monthly Organic Clicks | SEO Value ($/mo) | PPC Spend ($/mo) | Domain Strength |
    | -------- | ---------------- | ---------------------- | ---------------- | ---------------- | --------------- |
    | Mar 2025 | 38,980           | 10,260                 | $370,100         | $20,950          | 51              |
    | Apr 2025 | 39,220           | 7,638                  | $387,500         | $24,300          | 51              |
    | May 2025 | 41,620           | 11,420                 | $431,000         | $27,380          | 49              |
    | Jun 2025 | 42,620           | 11,830                 | $450,200         | $31,940          | 49              |
    | Jul 2025 | 45,220           | 12,990                 | $482,800         | $35,990          | 49              |
    | Aug 2025 | 48,420           | 14,670                 | $532,800         | $37,940          | 50              |
    | Sep 2025 | 49,470           | 15,430                 | $491,200         | $57,140          | 52              |
    | Oct 2025 | 50,339           | 14,490                 | $484,200         | $49,000          | 52              |
    | Nov 2025 | 49,400           | 14,420                 | $484,300         | $665,600         | 53              |
    | Dec 2025 | 23,250           | 12,620                 | $372,400         | $258,500         | 51              |
    | Jan 2026 | 22,490           | 12,930                 | $365,100         | $213,000         | 54              |

    Look at the trend. From March to October 2025, Paul Davis did exactly what every restoration company should be doing: they grew. 39K keywords → 50K keywords. $370K/month SEO value → $532K/month. That’s not a fluke. That’s execution. That’s a team running the playbook.

    Then November happened. PPC spend spiked to $665,600 — a 13.6x increase from October’s $49K. The same panic pattern I saw with 911 Restoration. And by December? Half the keywords vanished. 50K → 23K. That’s a 54% collapse in a single month.

    But here’s the thing that makes Paul Davis different from 911 Restoration: their SEO value per keyword is actually higher. At $43/keyword (based on Feb 2026 data), Paul Davis is ranking for higher-value keywords than most competitors in this space. That tells me they weren’t ranking for junk keywords. They were ranking for money terms — the ones that matter.

    Which means the fix isn’t a rebuild. It’s a recovery.

    What Actually Happened in Q4 2025: The Diagnostic

    Let me be direct about what I think happened. A keyword collapse from 50K to 23K in a single month isn’t gradual content decay. That’s one of three things:

    Scenario 1: A location page massacre. Paul Davis has franchises everywhere — across all 50 states. If someone restructured the location page architecture, consolidated pages, or switched hosting/CMS without a clean redirect map, Google would have vaporized thousands of pages from the index overnight. Franchise sites live and die on location pages. Lose those, lose everything.

    Scenario 2: A technical issue that broke indexation. A rogue robots.txt rule, an accidental noindex tag at the template level, a CDN misconfiguration returning 404s to Googlebot — any of these can silently deindex thousands of pages while organic traffic is still flowing because cached versions serve users fine. You don’t notice until you check GSC and see “Excluded – currently not indexed” spiked by 50%.

    Scenario 3: The November Google Core Update hit harder than anticipated. Google dropped a core update in November 2025. If Paul Davis’s location pages are thin, templated content with minimal local differentiation, the update could have targeted them specifically. Combined with algorithm changes favoring AI-extracted answers and entity authority, thin content gets deprioritized fast.

    My money? Scenarios 1 and 3 combined. But I’d verify with data before doing anything permanent.

    Step 1: The 72-Hour Diagnostic Audit

    Before touching a single page, I need to know what’s actually broken.

    Day 1: Crawl and Index Validation

    I’d run Screaming Frog against the full pauldavis.com domain — every page, every redirect. For a 600-franchise network, I’m expecting 8,000-15,000+ URLs. I’m specifically looking for:

    • Redirect chains longer than 2 hops — These leak PageRank and slow crawl budget.
    • Orphaned location pages — Pages that exist but have zero internal links. If city pages aren’t linked from a parent hub, Google treats them as low-priority and deprioritizes crawling.
    • Canonicalization issues — A single bad canonical tag at the template level can tell Google to ignore thousands of pages simultaneously. This is the most common cause of sudden deindexation I see.
    • JavaScript rendering problems — If Paul Davis uses any client-side rendering for critical location content, I’d compare Screaming Frog’s text extraction vs. what a headless browser sees. Mismatch = indexation risk.
    • Soft 404 patterns — Pages returning 200 status code but with “not found” content structure. Googlebot gets confused. Pages don’t index.
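
    The redirect-chain check in particular is easy to automate against a crawl export. A minimal sketch (the export shape and URLs are illustrative, not real pauldavis.com data):

```python
# Flag redirect chains longer than 2 hops in a crawl export.
# `redirects` maps each source URL to its redirect target.
def chain_length(url, redirects, limit=10):
    hops, seen = 0, set()
    while url in redirects and url not in seen and hops < limit:
        seen.add(url)
        url = redirects[url]
        hops += 1
    return hops

redirects = {
    "/old-water-page/": "/old-water-page-2/",
    "/old-water-page-2/": "/old-water-page-3/",
    "/old-water-page-3/": "/locations/denver/water-damage/",
}
# Chains longer than 2 hops should be collapsed to direct 1-hop redirects.
long_chains = [src for src in redirects if chain_length(src, redirects) > 2]
```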

    Day 2: Google Search Console Analysis

    I need 16 months of GSC data — the period before and after the collapse.

    Specifically:

    • Coverage report trends — Did “Valid” pages spike downward in November/December? Did “Excluded – currently not indexed” spike upward? The answer tells the story.
    • Performance by URL pattern — Segment by location pages, service pages, blog content. Which pattern lost the most impressions? If it’s /locations/*, it’s an architecture problem. If it’s /services/*, it’s content quality.
    • Exclusion reason breakdown — What’s excluding the pages? “Blocked by robots.txt”? “Crawled – currently not indexed”? “Redirect error”? Each reason points to a different root cause.
    • Query data comparison — Export top 5,000 queries from October 2025 vs. February 2026. Which keyword clusters disappeared? If it’s geo-modified queries (“water damage restoration [city]”), location pages are the problem. If it’s service-level queries, the content strategy failed.
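
    The query-data comparison reduces to a diff of two exports. A minimal sketch (the queries and impression counts are illustrative, not real GSC data):

```python
# Diff two GSC query exports to see which clusters disappeared.
oct_2025 = {
    "water damage restoration houston": 1840,
    "fire damage repair houston": 620,
    "does insurance cover water damage": 410,
}
feb_2026 = {"does insurance cover water damage": 395}

lost = {q: imp for q, imp in oct_2025.items() if q not in feb_2026}

# Crude geo-modifier check: if the lost queries are city-modified,
# location pages are the likely culprit.
geo_lost = {q for q in lost if q.endswith("houston")}
```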

    Day 3: Competitive Analysis

    I’d pull the same SpyFu data for SERVPRO, 911 Restoration, ServiceMaster, and Rainbow International. If all of them declined in November/December, it’s an industry-wide algorithm shift. If Paul Davis uniquely declined, it’s site-specific.

    Then I’d audit the top-ranking competitors for Paul Davis’s highest-value lost keywords. What does their architecture look like? How many location pages? What schema are they using? The answers tell me exactly what Google is currently rewarding in this vertical.

    The Recovery Strategy: Rebuild What Was Already Working

    Here’s the critical insight: Paul Davis doesn’t need a redesign. They need a rescue. They proved they could rank for 50K keywords. Now I need to figure out what broke and fix it, then scale what was already working.

    Priority 1: Recover the Indexation Foundation (Days 1-30)

    This is the emergency phase.

    Canonical tag audit: If there’s a template-level canonical issue, it’s a one-line fix that could immediately un-exclude thousands of pages. I’d verify canonicals across 50+ representative pages from different URL patterns (locations, services, blog) and check GSC’s URL Inspection tool to see what Google actually crawled vs. what we think we served.

    Location page linking structure: I’d verify that every location page is explicitly linked from a parent hub page. No links = low crawl priority = Google ignores the page even if it’s technically valid. A simple sitemap regeneration or parent-page update can fix this.

    Robots.txt validation: One bad rule and 90% of your site might be blocked from crawling. I’d audit the current robots.txt, compare it against historical versions (via Wayback Machine if needed), and remove any rules that shouldn’t be there.

    Redirect map cleanup: Any redirect chains longer than 2 hops get collapsed to 1-hop direct redirects. Every hop loses 10-15% of PageRank. In a franchise network with hundreds of redirects, that leakage compounds into significant lost equity.
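
    The per-hop arithmetic makes the case for collapsing chains. A quick sketch, treating the 10-15% per-hop figure as an assumption (Google has never published an exact number):

```python
# Rough equity retained after n redirect hops, assuming a fixed
# per-hop loss (the article's 10-15% range; 15% used here).
def remaining_equity(hops, loss_per_hop=0.15):
    return (1 - loss_per_hop) ** hops

three_hop = remaining_equity(3)  # ~0.614: a 3-hop chain keeps ~61% of equity
one_hop = remaining_equity(1)    # ~0.85: a direct redirect keeps ~85%
```

    Under that assumption, collapsing a 3-hop chain into a direct redirect recovers roughly 24 points of equity per URL.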

    Priority 2: Location Page Architecture Renaissance (Days 30-90)

    Now we rebuild what was working.

    Paul Davis has 600+ franchises. That’s 600+ locations that could have dedicated SEO landing pages. If they’re structured right, that’s 3,600+ pages (600 locations × 6 core services: water damage, fire damage, mold remediation, storm damage, sewage backup, dry cleaning/contents restoration).

    Each page needs:

    Locally-specific content that proves expertise. Not a templated 500 words on “water damage restoration in Houston.” I’m talking about: “Houston’s sub-tropical climate creates unique challenges — the combination of high humidity, frequent thunderstorms, and clay-based soil means water damage in Houston spreads faster than in drier climates. Our Houston team is trained on Gulf Coast moisture dynamics, local building codes, and Houston’s specific insurance requirements.” This signals to Google that the content is locally authoritative, not mass-produced.

    LocalBusiness schema with complete NAP + service area. Every location page needs JSON-LD marking up the franchise location with exact coordinates, service area polygon, hours (24/7 for emergency response), and a catalog of specific services with local pricing where available.

    Embedded Google Map. A map showing the service area reinforces local relevance and keeps users on-site instead of searching for competitors.

    Real project stories. “In March 2025, our Paul Davis team responded to a commercial water intrusion affecting 8,000 sq ft of office space in downtown Houston. Complete water extraction and structural drying completed within 48 hours.” Specificity builds trust with both users and algorithms.

    Priority 3: Content Depth Beyond Location Pages (Days 60-120)

    Now I add the layers that Google currently rewards.

    Crisis-moment content (targets the 2 AM searcher):
    – “What To Do When Your Basement Floods: A Step-by-Step Emergency Checklist”
    – “I Smell Mold In My House Right Now — What Should I Do First?”
    – “Fire Damage: What To Do In the First 24 Hours”

    These need HowTo schema, numbered steps, and definition boxes at the top for AI Overviews to extract. They capture intent before the decision to hire a pro is made.
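
    A minimal HowTo sketch for the flooded-basement checklist (the step names are illustrative; the structure follows schema.org’s HowTo type):

```json
{
  "@context": "https://schema.org",
  "@type": "HowTo",
  "name": "What To Do When Your Basement Floods: A Step-by-Step Emergency Checklist",
  "step": [
    {"@type": "HowToStep", "position": 1, "name": "Shut off electricity to the affected area"},
    {"@type": "HowToStep", "position": 2, "name": "Stop the water source if safely accessible"},
    {"@type": "HowToStep", "position": 3, "name": "Document the damage for your insurance claim"},
    {"@type": "HowToStep", "position": 4, "name": "Call a certified restoration company for extraction"}
  ]
}
```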

    Decision-stage content (targets the insurance call):
    – “Water Damage Restoration Cost in 2026: A Regional Breakdown”
    – “Homeowners Insurance and Water Damage: What’s Covered and What Isn’t”
    – “Mold Remediation Timeline: Expectations From Day 1 to Completion”

    These need comparison tables, cost breakdowns, FAQPage schema. This is where Paul Davis wins against SERVPRO.

    Authority-building content (earns backlinks, builds topical authority):
    – “The Complete Guide to IICRC Certification Standards: S500, S520, and What They Mean”
    – “Understanding FEMA Flood Zones: How to Check Your Risk and What It Means for Insurance”
    – “Water Damage vs. Water Intrusion: Why the Distinction Matters (and What Your Insurance Company Cares About)”

    These earn backlinks from IICRC, FEMA, RIA, insurance publications, and local news outlets. Those links flow authority to location pages through internal linking.

    Priority 4: Schema Markup at Scale (Days 45-90)

    For a 600-franchise network, schema markup scales multiplicatively.

    Every location page needs:

    {
      "@context": "https://schema.org",
      "@type": "LocalBusiness",
      "name": "Paul Davis Restoration of [City]",
      "telephone": "+1-XXX-XXX-XXXX",
      "address": {
        "@type": "PostalAddress",
        "streetAddress": "[Street Address]",
        "addressLocality": "[City]",
        "addressRegion": "[State]",
        "postalCode": "[ZIP]"
      },
      "geo": {
        "@type": "GeoCoordinates",
        "latitude": "[LAT]",
        "longitude": "[LONG]"
      },
      "openingHoursSpecification": {
        "dayOfWeek": ["Monday", "Tuesday", "Wednesday", "Thursday", "Friday", "Saturday", "Sunday"],
        "opens": "00:00",
        "closes": "23:59"
      },
      "areaServed": {
        "@type": "City",
        "name": "[City], [State]"
      },
      "hasOfferCatalog": {
        "@type": "OfferCatalog",
        "itemListElement": [
          {
            "@type": "Offer",
            "@id": "https://pauldavis.com/[city]/water-damage-restoration/",
            "itemOffered": {
              "@type": "Service",
              "name": "Water Damage Restoration"
            }
          },
          {
            "@type": "Offer",
            "@id": "https://pauldavis.com/[city]/fire-damage-restoration/",
            "itemOffered": {
              "@type": "Service",
              "name": "Fire Damage Restoration"
            }
          }
        ]
      }
    }
    

    Service pages need Article + Service + FAQPage + HowTo (when applicable).

    When you implement this at scale across 3,600+ pages with consistent, accurate data, you’re giving Google a machine-readable map of every franchise location and every service offering. That’s how you dominate Local Pack results and organic search simultaneously.
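At 3,600+ pages, nobody hand-writes that markup; it has to be generated from franchise data. Here is a minimal sketch of how a template like the one above might be populated per location. The record field names (`city`, `base_url`, `services`, etc.) are hypothetical, not Paul Davis's actual data model:

```python
import json

def localbusiness_schema(loc: dict) -> dict:
    """Build LocalBusiness JSON-LD for one franchise location record.
    The `loc` field names are illustrative, not a real data model."""
    return {
        "@context": "https://schema.org",
        "@type": "LocalBusiness",
        "name": f"Paul Davis Restoration of {loc['city']}",
        "telephone": loc["phone"],
        "address": {
            "@type": "PostalAddress",
            "streetAddress": loc["street"],
            "addressLocality": loc["city"],
            "addressRegion": loc["state"],
            "postalCode": loc["zip"],
        },
        "geo": {
            "@type": "GeoCoordinates",
            "latitude": loc["lat"],
            "longitude": loc["lng"],
        },
        "areaServed": {"@type": "City", "name": f"{loc['city']}, {loc['state']}"},
        "hasOfferCatalog": {
            "@type": "OfferCatalog",
            "itemListElement": [
                {
                    "@type": "Offer",
                    "@id": f"{loc['base_url']}/{svc.lower().replace(' ', '-')}/",
                    "itemOffered": {"@type": "Service", "name": svc},
                }
                for svc in loc["services"]
            ],
        },
    }

# One hypothetical location record in, one JSON-LD script body out.
tacoma = {
    "city": "Tacoma", "state": "WA", "zip": "98402",
    "street": "123 Example St", "phone": "+1-253-555-0100",
    "lat": "47.2529", "lng": "-122.4443",
    "base_url": "https://pauldavis.com/tacoma",
    "services": ["Water Damage Restoration", "Fire Damage Restoration"],
}
print(json.dumps(localbusiness_schema(tacoma), indent=2))
```

The point of the sketch: one vetted template plus one clean data source means every location page stays consistent, and a schema fix ships to all 3,600+ pages at once.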

    Priority 5: Google Business Profile Velocity (Ongoing)

    The Local Pack wins happen here.

    For every franchise location:

    • Weekly GBP posts — Real posts, not automated junk. Project summaries with before/after photos, seasonal preparedness tips, team spotlights. Google’s algorithm visibly rewards active, engaged profiles.
    • Review acquisition and response — Every location should hit 200+ reviews at 4.8+ stars within 12 months. SMS review request 2 hours post-completion, email 24 hours later. Respond to every review within 24 hours. This is the #1 Local Pack ranking factor after proximity.
    • Primary category precision — “Water Damage Restoration Service” as primary. Secondary categories should reflect the strongest service mix for that region.
    • Photo pipeline — 50+ geotagged photos per location updated monthly. Team, equipment, completed projects, office, vehicles. Google prioritizes profiles with fresh, diverse visual content.

    Priority 6: Answer Engine Optimization for the AI Age (Days 60-120)

    Google AI Overviews now dominate informational restoration queries. If your content isn’t structured to be cited, you’re invisible.

    Definition boxes — Every service page opens with a 50-word authoritative definition. “Water damage restoration is the professional process of returning a property to its pre-loss condition following water intrusion from flooding, burst pipes, or precipitation. It encompasses emergency water extraction, structural assessment and documentation, industrial-grade dehumidification, antimicrobial treatment, and full restoration of affected materials.”

    Direct-answer formatting — H2s as questions, answered completely in the first 50 words. “How much does water damage restoration cost? The average cost ranges from $2,000 for minor localized damage to $25,000+ for significant structural involvement, with most homeowners paying $5,000-$15,000. Your final cost depends on the square footage affected, severity of damage, materials involved, and necessary structural repairs.”

    Comparison tables — “Water Mitigation vs. Water Restoration: Key Differences.” Side-by-side comparison of timeline, cost, scope, and outcomes.

    Numbered process lists — “The 5 Stages of Water Damage Restoration: 1. Emergency Response and Assessment, 2. Water Extraction and Removal, 3. Drying and Dehumidification, 4. Cleaning, Sanitizing, and Antimicrobial Treatment, 5. Restoration and Reconstruction.” This format wins HowTo rich results and AI Overview citations.

    Priority 7: Unwind the PPC Dependency (Immediate)

    The November 2025 PPC spike to $665,600/month tells a clear story: organic pipeline broke, paid ads compensated.

    Here’s the math:

    • October 2025: $484,200/month organic value, $49K PPC spend. Healthy ratio.
    • November 2025: $484,300/month organic value, $665,600 PPC spend. Panic mode — the algorithms changed mid-month and they flooded with paid to keep revenue up.
    • Current: $952,800/month organic value (February 2026), $206,100 PPC spend. Recovery mode, but still elevated PPC.

    The strategic move isn’t to cut PPC cold turkey. It’s to systematically shift budget back to organic as rankings recover:

    • Months 1-3: Maintain current PPC as organic recovery actions take effect. Target high-intent paid keywords that should be ranking organically but aren’t.
    • Months 4-6: As location pages recover and start ranking, reduce PPC spend by 20-30% on those keywords and reinvest savings into content creation.
    • Months 6-12: If organic recovery hits 60%+ of the pre-November level, reduce PPC spend by another 50%.

    The goal: In 12 months, get back to a $50K-75K/month PPC baseline (for new market testing and seasonal peaks) while organic carries the core demand.

    That $206K/month in current PPC spend? Reinvested in organic SEO, it pays for itself in 8-12 months, after which the traffic it replaced is effectively free for years.
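That payback claim can be sanity-checked with back-of-envelope math. The dollar figures below come from this article; the linear ramp to $450K/month of recovered organic value by month 12 is my modeling assumption, not a forecast:

```python
# Back-of-envelope payback model: redirect the $206K/month PPC budget
# into SEO, and assume recovered organic value ramps linearly to
# $450K/month by month 12 (assumption), flat after that.
seo_investment_per_month = 206_100   # current PPC spend, per the article
target_organic_value = 450_000       # recovery target, per the article

cumulative_spend = 0
cumulative_value = 0
breakeven_month = None
for month in range(1, 25):
    cumulative_spend += seo_investment_per_month if month <= 12 else 0
    recovered = target_organic_value * min(month / 12, 1.0)
    cumulative_value += recovered
    if breakeven_month is None and cumulative_value >= cumulative_spend:
        breakeven_month = month

print(breakeven_month)  # prints 10
```

Under those assumptions cumulative recovered value overtakes cumulative investment in month 10 — inside the 8-12 month window claimed above. A slower recovery curve pushes breakeven later, but the traffic keeps paying after the spend stops.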

    Why Paul Davis’s Recovery is Easier Than 911 Restoration’s Rebuild

    Here’s the critical difference:

    911 Restoration peaked at 4,466 keywords in July 2024. By March 2025 when we wrote the playbook, they were down to 3,306. Now (February 2026) they’re at 816. They’ve been declining for 20+ months. The recovery path is long.

    Paul Davis peaked at 50,339 keywords in October 2025 — last year. They were still growing in September. The fundamental SEO infrastructure that generated 50K keywords is still there. The content is still there. The domain authority is still there (54, up from 51 in March).

    The problem is fixable because the foundation is recent and sound. It’s not a rebuild. It’s a bounce-back.

    With the 7-step strategy above, here’s what I’d expect:

    • Month 1-2: Technical fixes and canonicalization repairs show up in GSC coverage. Expect 500-1,000 re-indexed pages.
    • Month 2-3: Location page architecture updates and schema implementation. Expect rankings to improve on the most valuable pages first.
    • Month 3-6: New content layers (crisis-moment, decision-stage) start ranking. Keywords begin recovering. Conservative estimate: 35,000-40,000 keywords by June.
    • Month 6-12: Full content architecture matures. Location pages reinforce each other through internal linking. Authority content earns backlinks. Expect 45,000-50,000 keywords recovered.

    That trajectory puts Paul Davis back to $450K+/month organic value within 12 months, which means cutting PPC spend from $206K to $50-75K and freeing up $150K+/month in marketing budget that can be reinvested in growth.

    The Playbook Works Because Paul Davis Proved It Works

    The reason I’m confident in this recovery isn’t theory. It’s data. Paul Davis demonstrated they could execute SEO at scale — they grew from 39K to 50K keywords over eight months. That’s not luck. That’s a team running a good playbook.

    The November collapse wasn’t a signal that the playbook failed. It was a signal that something broke in execution — a technical issue, a structural change, an algorithm shift.

    But the foundation is there. The domain authority is there. The franchise network is there. All that’s missing is the diagnostic (days 1-3), the fix (days 4-30), and then doubling down on what already works (months 2-12).

    I’ve built the systems to execute this at franchise scale — the AI-powered content pipelines, the schema automation, the GEO optimization frameworks. And honestly? Watching a company that was actually winning bounce back is far more satisfying than watching a company rebuild from 800 keywords.

    Frequently Asked Questions

    What caused Paul Davis Restoration’s 54% keyword drop in December 2025?

    Based on the data pattern — a collapse from 50K to 23K keywords in a single month, combined with a spike in PPC spending — the most likely causes are a location page architectural change without proper redirects, a technical indexation issue (robots.txt, noindex tag, or CDN misconfiguration), or the November 2025 Google Core Update hitting thin location pages specifically. The best way to confirm is through a 72-hour audit of GSC coverage data (checking when “Excluded – currently not indexed” spiked) and a URL crawl to identify redirect errors, orphaned pages, or canonicalization issues.

    Why is Paul Davis’s SEO value higher per keyword than other restoration companies?

    Paul Davis has an estimated SEO value of $43/keyword ($952,800 ÷ 22,190 keywords in February 2026), compared to SERVPRO’s $33/keyword. This suggests Paul Davis is ranking for higher-value, higher-intent keywords — likely more commercial terms and geo-modified queries rather than informational content. It’s a quality-over-quantity advantage: fewer keywords, but more profitable ones. This is actually the ideal position for recovery, since restoring 5,000 high-value keywords is more profitable than restoring 20,000 low-value ones.

    How should Paul Davis balance PPC spending during SEO recovery?

    Don’t cut PPC immediately — that leaves money on the table and risks losing customers to competitors during the recovery window. Instead, maintain current PPC baseline (around $206K/month) during the first 60-90 days of recovery actions, then systematically shift budget to organic as rankings improve. A realistic timeline: reduce PPC by 20-30% by month 6 (when organic is recovering), then by another 50% by month 12 (when organic has achieved 60%+ recovery). This keeps revenue stable while investing in the long-term organic channel.

    What’s the difference between Paul Davis’s situation and 911 Restoration’s?

    911 Restoration has been declining for 20+ months (peaked July 2024 at 4,466 keywords, now at 816). It’s a comprehensive, systemic failure requiring a full rebuild. Paul Davis peaked in October 2025 (50,339 keywords) and collapsed sharply in November/December — suggesting a fixable technical or structural issue rather than a fundamental SEO failure. Paul Davis’s recovery is faster and more straightforward because the foundation (domain authority, content corpus, franchise network) is recent and proven to work. It’s a bounce-back, not a rebuild.

    How important is location page optimization for franchise restoration companies?

    It’s the engine of the entire strategy. If Paul Davis has 600 franchises across 6 core services, that’s 3,600+ location-service pages. A well-optimized location page can rank for 15-40 related keywords through local modifiers, long-tail variants, and service-specific searches. The math: 3,600 pages × 15 keywords average = 54,000 potential ranked keywords. Paul Davis currently has 22,190, meaning they have capacity for 32,000+ additional keyword rankings just by optimizing what exists. Location pages are where restoration companies win.

    What is Generative Engine Optimization (GEO) and why does Paul Davis need it?

    GEO is optimizing content so that AI systems — ChatGPT, Claude, Gemini, Google AI Overviews, Perplexity — cite and recommend your business by name. For restoration, GEO involves entity saturation (consistent brand-attribute associations across the web), factual density (specific claims about IICRC certification, response times, service areas), authoritative citations (EPA, FEMA, IICRC standards), and implementing LLMS.txt to guide AI crawlers. As AI-generated answers increasingly replace traditional search results, GEO becomes as important as traditional SEO. Paul Davis needs GEO to win when someone asks an AI system “who should I call for water damage in Houston?”

    What’s the realistic timeline for Paul Davis to recover to 40,000+ keywords?

    Based on the severity of the collapse (54% in one month) but the strength of the foundation (recent peak, high domain authority, proven content infrastructure), I’d estimate:

    • Month 1-2: Technical fixes and indexation recovery (expect 1,000-2,000 pages re-indexed)
    • Month 3-6: Location page optimization and new content layers take effect (expect a climb from 22,000 to 35,000-40,000 keywords)
    • Month 6-12: Full architecture maturity and authority building (expect 45,000-50,000 keywords)

    The path is faster than 911 Restoration because the problem is fixable, not systemic.


    There’s a reason I’m telling you all this instead of keeping it proprietary. Paul Davis Restoration was doing it right through most of 2025. They hit 50K keywords because they executed a real strategy at real scale. Then something broke. But broken things can be fixed.

    We’re Tygart Media. We build the systems that execute this playbook for restoration companies at franchise scale. We’ve already figured out the location page architecture, the schema automation, the content velocity pipeline, the GEO optimization. And honestly? Helping a company that knows how to execute bounce back is exactly the kind of project we live for.

    The data is public. The opportunity is real. And the timeline for recovery is tight — every month without action is another month where competitors gain ground.

    Reach out here if you want to have the conversation. Or don’t. But at least you’ll know what’s possible.

    (And hey, if you actually do have a water damage emergency while you’re thinking about this, we can recommend a Paul Davis location. We probably know a guy. Actually, at this point, we’ve worked with enough franchises that we definitely know a guy.)

    The Complete Restoration Franchise SEO Playbook Series

    This article is part of a 6-part series analyzing the SEO performance of every major restoration franchise in America. Read the full series:

  • If I Were Running ServiceMaster’s SEO, Here’s What I’d Do Differently

    If I Were Running ServiceMaster’s SEO, Here’s What I’d Do Differently

    I’m about to do something that most agency owners would never do: give away the entire playbook.

    Not a teaser. Not a “5 tips to improve your SEO” fluff piece. The actual, technical, step-by-step strategy I would execute — starting tomorrow — if **ServiceMaster** handed me the keys to their organic search program.

    Why? Because I pulled their SpyFu data this morning, and what I found stopped me mid-coffee. ServiceMaster essentially invented modern restoration franchising. They built the playbook that every restoration company has copied for the last three decades. They have brand recognition that money can’t buy. And they’re watching their organic search presence get destroyed in real time while they seem completely unconcerned.

    This isn’t gossip. This is data. And data deserves a response.

    ## The SpyFu Data: A Legacy Brand in Free Fall

    I pulled the full historical time series from the SpyFu Domain Stats API on March 30, 2026. Here’s what servicemaster.com looks like over the last 12 months:

    | Period | Organic Keywords | Monthly Organic Clicks | SEO Value ($/mo) | PPC Spend ($/mo) | Domain Strength |
    |--------|------------------|------------------------|------------------|------------------|-----------------|
    | Mar 2025 | 7,582 | 9,055 | $77,130 | $0 | 45 |
    | Apr 2025 | 7,612 | 8,755 | $86,940 | $0 | 45 |
    | May 2025 | 6,169 | 7,911 | $54,900 | $0 | 41 |
    | Jun 2025 | 5,413 | 6,592 | $48,260 | $0 | 41 |
    | Jul 2025 | 5,718 | 7,363 | $68,590 | $0 | 42 |
    | Aug 2025 | 3,168 | 5,604 | $28,880 | $253 | 39 |
    | Sep 2025 | 2,462 | 5,708 | $24,980 | $401 | 40 |
    | Oct 2025 | 2,548 | 5,664 | $30,280 | $512 | 41 |
    | Nov 2025 | 2,514 | 5,766 | $28,270 | $4,920 | 41 |
    | Dec 2025 | 1,870 | 3,910 | $15,380 | $9,266 | 39 |
    | Jan 2026 | 1,593 | 4,436 | $13,460 | $7,096 | 38 |
    | Feb 2026 | 1,742 | 4,435 | $39,300 | $7,039 | 42 |

    Let that sink in.

    **Peak SEO value: $334,384/month** (February 2020, historical data). **Current: $39,300/month.** That's an **88.2% decline in six years**.

    **Peak keywords: 20,696** (August 2017). **Current: 1,742.** A **91.6% catastrophic wipeout in nine years**.

    And look at the trajectory from April 2025 to February 2026. In just 10 months, they hemorrhaged from 7,612 keywords down to 1,742. That's a 77% collapse in under a year. The PPC column tells the real story: $0 in spend through most of 2025, then desperately cranking it up to $7,000/month by early 2026. They're not marketing. They're triage.

    That’s not strategy. That’s a company that’s stopped fighting.

    ## What Likely Went Wrong (And What It Means)

    Before I hand over the playbook, I need to be honest about what I think happened — because you don’t fix symptoms, you fix disease.

    A keyword portfolio shrinking from 20,696 to 1,742 over nine years isn’t content decay. Content decay is gradual — maybe 10-15% annually. This is **structural abandonment**. There are really only a few things that cause this pattern:

    **Scenario 1: Corporate Deprioritization.** ServiceMaster has changed corporate hands more than once. If ownership decided that restoration franchising wasn't a priority — maybe they divested or consolidated the business — then suddenly, nobody's funding the SEO team. No budget = no optimization = rank collapse over time.

    **Scenario 2: Franchise Model Shift.** ServiceMaster franchises are independently owned and operated. If the franchisor stopped providing central marketing support and pushed franchisees to run their own local marketing, you’d see exactly this pattern: the parent domain deteriorates while individual franchise sites (if they’re managed well) might hold their own. But the national brand suffers catastrophically.

    **Scenario 3: Algorithm Penalties or Core Web Vitals Failures.** If servicemaster.com experienced technical issues — slow page load times, poor Core Web Vitals, indexation problems — and nobody fixed them over several years, Google would systematically de-rank the domain.

    **Scenario 4: Content Strategy Atrophy.** The simplest explanation: they stopped creating new content. No blog updates since 2021. No location page optimization. No response to algorithm updates. Just letting an old site sit on autopilot while Google moved on.

    My bet? It’s Scenario 1 and 4 combined. ServiceMaster owns the restoration space, but they’ve clearly decided it’s not where corporate energy goes anymore.

    ## Step 1: The 72-Hour Emergency Audit

    Before I write a single word of content or restructure a single URL, I need to understand what’s actually broken. This is a diagnostic sprint.

    ### Day 1: Crawl and Indexation Analysis

    I’d run **Screaming Frog** against the full servicemaster.com domain — every page, every redirect, every canonical tag. For a company this size, I’m expecting 3,000-8,000 URLs. I’m looking for:

    * **Redirect chains and loops** — Years of site updates create redirect chains that leak authority. Every 301 chain longer than 2 hops costs you PageRank.
    * **Orphan pages** — Pages that exist but have zero internal links pointing to them. If service pages or location pages aren’t linked from the main navigation, Google won’t prioritize crawling them.
    * **Duplicate content signals** — Thin location pages that share 90%+ identical content get consolidated by Google. If you have 50 city pages that all say the exact same thing, Google is ignoring 49 of them.
    * **JavaScript rendering issues** — If servicemaster.com uses client-side rendering for critical content, Google’s bot might not see what humans see.
    * **Canonical tag audit** — One broken template-level canonical directive can tell Google to ignore every page using that template. This is more common than you’d think on old franchise sites.
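The redirect-chain check is easy to automate once the crawl is exported. A sketch assuming a simple source-to-target mapping of 301s (Screaming Frog's actual export columns differ; this is the shape of the logic, not its file format):

```python
def redirect_chains(redirects: dict, max_hops: int = 2):
    """Follow each redirect to its final destination. Flag chains longer
    than `max_hops` (which leak PageRank) and redirect loops."""
    flagged = []
    for start in redirects:
        seen, url, hops = {start}, start, 0
        while url in redirects:
            url = redirects[url]
            hops += 1
            if url in seen:              # came back to a URL we saw: loop
                flagged.append((start, "loop", hops))
                break
            seen.add(url)
        else:                            # chain terminated normally
            if hops > max_hops:
                flagged.append((start, "chain", hops))
    return flagged

# Hypothetical crawl export: old URL -> 301 target.
redirects = {
    "/services/water/": "/water-damage/",
    "/water-damage/": "/water-damage-restoration/",
    "/water-damage-restoration/": "/restoration/water/",
    "/a/": "/b/",
    "/b/": "/a/",
}
print(redirect_chains(redirects))
```

Every flagged chain gets collapsed to a single-hop 301 to the final destination; every loop gets broken by hand.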

    ### Day 2: Google Search Console Deep Dive

    I need 48 months of GSC data — enough to cover the entire collapse. Specifically:

    * **Coverage report** — How many pages are in “Valid” vs. “Excluded”? When did the exclusion count spike? That tells me exactly when things broke.
    * **Exclusion reasons** — “Discovered – currently not indexed,” “Blocked by robots.txt,” “Alternate page with proper canonical tag.” Each reason points to a different root cause.
    * **Performance by page group** — Segment by URL pattern: /locations/*, /services/*, /franchise/*, /blog/*. Which group lost the most impressions? That’s where the problem is.
    * **Query decay over time** — Export 5 years of query data. When did the keyword count start declining? What types of queries disappeared first? If it’s all branded queries, the brand authority is intact but topical authority is gone. If it’s all location-based queries, the local pages are the problem.
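That query-decay export is easiest to read programmatically. A sketch assuming a monthly rollup of query counts by segment has already been built from the GSC exports (the segment names and numbers below are invented for illustration; covering five years of history requires the GSC API or bulk data export, not the UI):

```python
import csv
from io import StringIO

# Hypothetical monthly rollup: query counts by segment, by month.
data = StringIO("""month,branded,local,informational
2025-09,410,1320,730
2025-10,415,1295,740
2025-11,408,610,705
2025-12,402,340,690
""")

rows = list(csv.DictReader(data))
flags = []
for prev, cur in zip(rows, rows[1:]):
    for seg in ("branded", "local", "informational"):
        drop = 1 - int(cur[seg]) / int(prev[seg])
        if drop > 0.25:  # flag any segment losing >25% month over month
            flags.append((cur["month"], seg, round(drop, 2)))

print(flags)
```

In this invented example the branded queries hold steady while local queries fall off a cliff in November — exactly the signature that would point the diagnosis at the location pages rather than brand authority.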

    ### Day 3: Competitive Benchmarking

    I’d pull SpyFu data for their direct competitors — **SERVPRO**, **911 Restoration**, **Paul Davis Restoration**, **Belfor** — and chart the trajectories side by side.

    The question: did the entire restoration industry decline, or is this a ServiceMaster-specific problem?

    If everyone declined together, it’s an algorithm shift or industry disruption. ServiceMaster can compete by being smarter.

    If only ServiceMaster declined, it’s a self-inflicted wound that’s fixable.

    ## Step 2: Location Page Architecture — The Engine of Franchise Dominance

    This is the difference between a franchise that owns Google and a franchise that rents from Google. ServiceMaster’s corporate network spans restoration across North America with different legal entities, different service mixes, and different regional focuses. That complexity is an opportunity if architected correctly.

    ### The Hub-and-Spoke Model (Adapted for ServiceMaster’s Structure)

    Here’s the architecture I’d build:

    **Tier 1: National Service Pillar Pages**

    These are the authority anchors:

    * /water-damage-restoration/ → Targets “water damage restoration,” “water damage restoration company,” etc.
    * /fire-damage-restoration/ → Targets “fire damage restoration,” “fire damage repair”
    * /mold-remediation/ → Targets “mold removal,” “mold remediation”
    * /commercial-restoration/ → Targets “commercial water damage,” “business restoration services”
    * /carpet-cleaning-restoration/ → Targets “carpet cleaning,” “carpet restoration”

    Each pillar page is 3,500+ words of comprehensive, authoritative content that positions ServiceMaster as the category leader. These pages accumulate backlinks and pass equity down the hierarchy.

    **Tier 2: Regional Hub Pages**

    ServiceMaster should have one page per major region or state where they operate:

    * /restoration-services/texas/
    * /restoration-services/california/
    * /restoration-services/northeast/

    These pages contain regional-specific information — common restoration issues by climate, local building codes, regional partnership relationships. They link down to every service-specific page in that region.

    **Tier 3: Location/Franchise Pages**

    One page per franchise or operating location per service:

    * /restoration-services/texas/water-damage-restoration/
    * /restoration-services/texas/fire-damage-restoration/
    * /restoration-services/california/water-damage-restoration/

    If ServiceMaster operates 80+ locations across 4-5 core service categories, that's **400-500 location-service combinations**. At 25 long-tail keywords per page, that's **10,000-12,500 rankable keywords** — nearly six times the 1,742 they currently rank for.
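The URL matrix and the keyword-capacity math are both a straight cross-product. A sketch with illustrative region and service lists (the real footprint is larger):

```python
from itertools import product

# Illustrative lists; ServiceMaster's real footprint (80+ locations
# across 4-5 services) yields 400-500 combinations.
regions = ["texas", "california", "florida", "northeast"]
services = ["water-damage-restoration", "fire-damage-restoration",
            "mold-remediation", "commercial-restoration"]

pages = [f"/restoration-services/{region}/{service}/"
         for region, service in product(regions, services)]

keywords_per_page = 25  # long-tail average assumed above
print(len(pages), len(pages) * keywords_per_page)  # 16 pages, 400 keywords
print(450 * keywords_per_page)                     # 11,250 at 450 real pages
```

The same three lines generate the sitemap skeleton and the opportunity estimate, which is exactly why the Tier 3 architecture scales: every new location multiplies against every service.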

    ## Step 3: Content Strategy — Crisis, Decision, Authority

    Restoration companies make a fatal mistake: they only create bottom-of-funnel content. Every page says “call ServiceMaster for water damage restoration.” But a homeowner standing in an inch of water isn’t searching for a restoration company. They’re searching for “what should I do right now?”

    Whoever answers that question gets the call.

    ### Tier 1: Crisis-Moment Content (The 2 AM Searcher)

    * “What to Do When Your House Floods: Emergency Steps Before Professional Help Arrives”
    * “My Basement Is Flooded — What Do I Do Right Now?”
    * “House Fire Damage Assessment: What to Check First”
    * “Black Mold Found in My House: Immediate Steps to Take”
    * “Pipe Burst During Winter: Emergency Response Checklist”

    Format: Numbered steps, definition boxes, HowTo schema, featured snippet optimization. These pages are designed to be cited in Google AI Overviews and answered in voice search.

    ### Tier 2: Decision-Stage Content (The Insurance Conversation)

    * “Does Homeowners Insurance Cover Water Damage? Complete 2026 Guide”
    * “Water Damage Restoration Cost: Regional Breakdown and Pricing Factors”
    * “Water Mitigation vs. Restoration: What’s the Difference?”
    * “Choosing a Restoration Company: What to Look For”
    * “Timeline for Water Damage Restoration: What to Expect”

    These pages need comparison tables, cost breakdowns, and FAQPage schema. They’re designed for someone who already knows they need professional help but is shopping around.

    ### Tier 3: Authority-Building Content

    * “IICRC Certification Explained: Why It Matters in Water Damage Restoration”
    * “The Science of Structural Drying: Complete Technical Guide”
    * “Mold Testing vs. Mold Inspection: What’s the Difference?”
    * “How to Prepare Your Home for Storm Season: Disaster Preparedness Guide”
    * “Understanding FEMA Flood Zones and What They Mean for Your Property”

    These pages earn backlinks from industry associations, insurance publications, local news, and real estate blogs. Those links flow equity to the money pages.

    ## Step 4: Schema Markup — The Technical Foundation

    Structured data is where most restoration companies leave 20-30% of their ranking potential on the table.

    ### Required Schema Implementation

    **LocalBusiness schema on every location page:**

    ```json
    {
      "@context": "https://schema.org",
      "@type": "LocalBusiness",
      "name": "ServiceMaster of [City Name]",
      "address": {
        "@type": "PostalAddress",
        "streetAddress": "[Address]",
        "addressLocality": "[City]",
        "addressRegion": "[State]",
        "postalCode": "[ZIP]",
        "addressCountry": "US"
      },
      "geo": {
        "@type": "GeoCoordinates",
        "latitude": "[latitude]",
        "longitude": "[longitude]"
      },
      "telephone": "[Phone Number]",
      "openingHoursSpecification": [
        {
          "@type": "OpeningHoursSpecification",
          "dayOfWeek": ["Monday", "Tuesday", "Wednesday", "Thursday", "Friday", "Saturday", "Sunday"],
          "opens": "00:00",
          "closes": "23:59"
        }
      ],
      "areaServed": {
        "@type": "City",
        "name": "[City]"
      },
      "hasOfferCatalog": {
        "@type": "OfferCatalog",
        "itemListElement": [
          {
            "@type": "Offer",
            "itemOffered": { "@type": "Service", "name": "Water Damage Restoration" }
          },
          {
            "@type": "Offer",
            "itemOffered": { "@type": "Service", "name": "Fire Damage Restoration" }
          },
          {
            "@type": "Offer",
            "itemOffered": { "@type": "Service", "name": "Mold Remediation" }
          }
        ]
      }
    }
    ```

    **On service pages:** Article + Service + FAQPage + BreadcrumbList

    **On blog posts:** Article + FAQPage + Speakable (on answer paragraphs)

    When implemented across 400+ pages with consistent data, you’re giving Google a machine-readable map of ServiceMaster’s entire franchise network.

    ## Step 5: Google Business Profile Management — The Local Pack Battleground

    In restoration, the Local Pack (the 3 map results) captures more high-intent traffic than organic results. When someone searches “water damage restoration near me,” they look at the map first.

    Winning the Local Pack requires systematic GBP optimization:

    * **Weekly GBP posts** — Real posts about completed projects, seasonal preparedness tips, team spotlights. Google’s algorithm rewards consistent posting activity.
    * **Review velocity** — Every location needs a systematic review request process. Target: 200+ reviews at 4.8+ stars per location within 12 months. Respond to every review within 24 hours.
    * **Photo strategy** — 50+ photos per location: team, equipment, projects, office, vehicles. Geotagged. Updated monthly.
    * **Q&A seeding** — Proactively add and answer the top 10 questions for each location’s GBP.
    * **Service area clarity** — Define service areas as precise polygons, not just “surrounding areas.”

    ## Step 6: Answer Engine Optimization (AEO) — Win the AI Results

    Google’s AI Overviews now appear on most informational queries. When someone asks “what do I do if my house floods,” Google generates a synthesized answer and cites specific sources.

    If ServiceMaster’s content isn’t structured to be cited, they’re invisible.

    * **Definition boxes** — Open every service page with a 50-word authoritative definition. This is what Google AI extracts and cites.
    * **Direct-answer formatting** — Structure H2s as questions. Answer them completely in the first 50 words. AI Overviews pull from this pattern.
    * **Comparison tables** — “Water Damage vs. Fire Damage” with side-by-side tables. AI loves structured comparisons.
    * **Numbered process lists** — “The 7 Stages of Water Damage Restoration.” This format wins HowTo rich results and AI citations simultaneously.

    ## Step 7: Generative Engine Optimization (GEO) — Be the Company AI Recommends

    This is the frontier. Most restoration companies don’t even know this exists. GEO is about making AI systems — Claude, ChatGPT, Gemini, Perplexity — recommend ServiceMaster by name.

    * **Entity saturation** — “ServiceMaster” needs to appear across the web in consistent association with specific attributes: IICRC certified, 24/7 availability, regional expertise, specific certifications, risk response capability.
    * **Factual density** — Replace “we provide excellent restoration services” with “ServiceMaster’s team is trained to IICRC S500/S520 standards and deploys truck-mounted extractors capable of removing 300+ gallons per minute.”
    * **Authoritative citation weaving** — Link to EPA mold guidelines, FEMA flood resources, IICRC standards, state-specific regulations. AI systems weight this higher because it signals expertise.
    * **LLMS.txt implementation** — Add a /llms.txt file to root domain providing AI crawlers with a structured summary of ServiceMaster’s business, services, geographic coverage, and authoritative attributes.
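llms.txt is an emerging convention rather than a ratified standard: a markdown summary served at the domain root for AI crawlers. An illustrative shape (the paths, wording, and section choices here are hypothetical, not ServiceMaster's actual file):

```markdown
# ServiceMaster

> Disaster restoration franchise network: 24/7 water, fire, and mold
> restoration across North America, with IICRC-trained crews and
> direct insurance coordination.

## Services
- [Water Damage Restoration](/water-damage-restoration/): extraction,
  structural drying, and rebuild aligned to IICRC S500
- [Fire Damage Restoration](/fire-damage-restoration/): smoke, soot,
  and structural repair
- [Mold Remediation](/mold-remediation/): inspection and remediation
  aligned to IICRC S520

## Locations
- [Find a location](/restoration-services/): regional and city pages
  for every operating franchise
```

The value isn't the file itself; it's that the same structured summary of entities, services, and coverage gets surfaced to AI crawlers that may never render the full site.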

    ## Step 8: Internal Linking — The Circulatory System

    A franchise site without proper internal linking is a highway system with no on-ramps.

    * **Pillar → State → City cascade** — National pillar links to every regional hub. Regional hubs link to every city page in that region. City pages link back up. Closed loop of authority.
    * **Cross-service linking at the city level** — Houston water damage page links to Houston mold page, Houston fire page. Keeps users on site and signals contextual relevance.
    * **Blog-to-location contextual links** — Every blog post includes natural in-text links to relevant city pages. “If you’re dealing with flooding in Chicago, our IICRC-certified team is available 24/7 — [learn more about ServiceMaster’s Chicago water damage restoration].”
    * **Related content blocks** — Automated bottom-of-page blocks showing 3-5 topically related pages. Scales automatically as you publish more content.
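Those related-content blocks don't need anything fancy: simple tag overlap is enough to keep the links topically relevant. A sketch with hypothetical page records (the tagging scheme is an assumption, not an existing CMS feature):

```python
def related_pages(current, pages, k=3):
    """Rank other pages by shared tags (city, service, topic) with the
    current page. Python's stable sort keeps original order among ties."""
    def overlap(p):
        return len(set(p["tags"]) & set(current["tags"]))
    candidates = [p for p in pages if p["url"] != current["url"]]
    return sorted(candidates, key=overlap, reverse=True)[:k]

# Hypothetical page records tagged by city, service, and topic.
pages = [
    {"url": "/chicago/water-damage/",  "tags": ["chicago", "water"]},
    {"url": "/chicago/water-tips/",    "tags": ["chicago", "water", "blog"]},
    {"url": "/chicago/mold/",          "tags": ["chicago", "mold"]},
    {"url": "/houston/water-damage/",  "tags": ["houston", "water"]},
    {"url": "/blog/flood-checklist/",  "tags": ["water", "emergency"]},
]
print([p["url"] for p in related_pages(pages[0], pages)])
```

Because it's computed from tags already on the page records, the block stays accurate as new content publishes — no manual curation per page.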

    ## Step 9: Backlink Acquisition — Leverage the Franchise Network

    ServiceMaster’s franchise structure is an asset most competitors can’t match:

    * **Disaster response PR** — After every major emergency, issue press releases to local media with quotes from location owners. Local news sites (high authority, high relevance) pick these up.
    * **Insurance partnerships** — ServiceMaster should be on preferred vendor lists with insurance carriers. Each carrier relationship should include a backlink from their website.
    * **Industry association profiles** — Active profiles on IICRC.org, RestorationIndustry.org, state contractor licensing boards. These .org links carry significant trust signals.
    * **Civic partnerships** — Chamber of Commerce, BBB profiles, Rotary sponsorships, local organization memberships. Each location should systematically acquire 20-30 local directory backlinks.
    * **Content partnerships** — Co-create disaster preparedness guides with FEMA, emergency management agencies, fire departments. “Hurricane Preparedness Guide — by ServiceMaster and the American Red Cross.” The .gov backlink is worth the effort.

    ## Step 10: Kill the PPC Dependency (And Rebuild the Organic Engine)

    ServiceMaster spent an estimated **$21,587 on Google Ads in the last 12 months** (increasing from $0 to $7,039/month). That’s reactive and unsustainable. Here’s the math:

    * At their 2020 peak, ServiceMaster’s organic traffic was worth **$334,384/month** — **$4.01 million/year** in equivalent ad spend delivered for free.
    * A comprehensive SEO program would cost a fraction of their current PPC spend.
    * If they rebuild to just **half their peak value** ($167K/month), that’s **$2 million/year** in traffic they no longer need to buy.
    * Organic traffic compounds. SEO is a long-term asset. PPC is a treadmill.
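
    Spelled out, using the SpyFu estimates above:

    ```python
    # The peak-value arithmetic from the bullets above.
    peak_monthly_value = 334_384              # Feb 2020 peak organic value ($/mo)
    annual_at_peak = peak_monthly_value * 12  # 4,012,608 -- the ~$4.01M/yr figure
    half_recovery_annual = (peak_monthly_value / 2) * 12  # 2,006,304 -- the ~$2M/yr figure
    ```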

    The ROI case is overwhelming.

    ## The Bottom Line

    ServiceMaster invented the restoration franchise. They built the playbook that SERVPRO and 911 Restoration have copied. They have 70+ years of brand history. They have franchise infrastructure across North America. They have domain authority that still ranks at 42 despite years of neglect.

    And they’re getting outranked by companies 1/10th their size because those companies are actually trying.

    ServiceMaster isn’t failing because restoration franchising is saturated. They’re failing because they stopped investing in the channel that built their brand — organic search.

    The opportunity isn’t a mystery. It’s an execution problem. And the 10-step playbook above is how you fix it.

    Here’s my real talk:

    **Hey, ServiceMaster. You invented this industry. You should own Google for every restoration keyword that exists. The data is public. The decline is real. The fix isn’t a mystery — it’s investment and execution.**

    **We’re [Tygart Media](https://tygartmedia.com). We live and breathe restoration SEO. We’ve built the systems to execute everything above at franchise scale. We’ve already done this for companies in your space. And honestly? We’d love to have the conversation about what $200K+/month in organic value looks like when it’s back.**

    **[Reach out here](https://tygartmedia.com/contact). No pressure. No hard sell. Just two teams who understand the industry talking about what a digital resurrection looks like.**

    **Or don’t. Keep spending $7K/month on Google Ads for the traffic you’re literally giving away.**

    **Your choice. We’ll be here either way. Just maybe not for your competitors. 😄**

    ## Frequently Asked Questions

    ### How much organic traffic has ServiceMaster lost?

    ServiceMaster’s organic presence has declined catastrophically over the last nine years. Their peak of 20,696 organic keywords (August 2017) has collapsed to 1,742 keywords as of February 2026 — a 91.6% reduction. Their peak SEO value was $334,384/month (February 2020), compared to just $39,300/month today (February 2026) — an 88.3% decline. In the last 10 months alone (April 2025 to February 2026), they lost 77% of their keywords, dropping from 7,612 to 1,742.

    ### Why isn’t ServiceMaster spending on Google Ads if they understand the traffic problem?

    ServiceMaster spent $0 on Google Ads for most of 2025, then gradually increased spending to $7,039/month by February 2026. This pattern suggests they may not have recognized the organic decline urgently, or corporate prioritization shifted away from the restoration vertical. The recent increase in PPC spending indicates they’re now buying back traffic they used to capture organically — which is more expensive and less sustainable than organic search.

    ### What is the most critical SEO fix for ServiceMaster?

    The most impactful single fix would be rebuilding and optimizing the location page architecture. ServiceMaster’s franchise structure creates a natural advantage: 80+ locations × 4-5 service categories = 320-400+ location-service combinations. Each page, properly optimized with unique, locally relevant content, could drive 25+ keywords. That alone could restore 10,000+ keywords within 12 months. Currently, they’re capturing a fraction of this potential.

    ### How does ServiceMaster’s situation compare to 911 Restoration?

    Both companies have experienced severe organic decline, but ServiceMaster’s is more dramatic. 911 Restoration’s peak was $407,500/month (March 2022) vs. $22,700 current. ServiceMaster’s peak was $334,384/month (February 2020) vs. $39,300 current. However, ServiceMaster’s keyword collapse is steeper (91.6% over nine years). 911 Restoration’s decline happened faster (94.4% from peak) but more recently. Both represent massive opportunities for comprehensive SEO rebuilding. [Read the 911 Restoration playbook here](https://tygartmedia.com/911-restoration-seo-playbook/).

    ### What is Generative Engine Optimization (GEO) and why does it matter?

    Generative Engine Optimization is the practice of optimizing your content and online presence so that AI systems — Google AI Overviews, ChatGPT, Claude, Gemini, Perplexity — recommend your business by name. For restoration companies, this means consistent entity saturation across the web (brand + attributes), factual density (specific, verifiable claims), authoritative citations (EPA, FEMA, IICRC standards), and LLMS.txt implementation. GEO is becoming critical as AI-generated answers increasingly replace traditional search results.

    ### How long would it take to restore ServiceMaster’s organic traffic?

    A realistic timeline for ServiceMaster would be 6-12 months for technical fixes and content architecture to take effect, with meaningful improvement visible within 4-6 months. Full recovery to even half their peak (roughly $167K/month in organic value) would require 12-18 months of sustained effort. The first 90 days typically show the highest-impact gains because fixing technical issues (indexation, redirects, schema) often produces immediate improvements once Google re-crawls the corrected pages.

    ## The Complete Restoration Franchise SEO Playbook Series

    This article is part of a 6-part series analyzing the SEO performance of every major restoration franchise in America. Read the full series:

    {
      "@context": "https://schema.org",
      "@type": "Article",
      "headline": "If I Were Running ServiceMaster's SEO, Here's What I'd Do Differently",
      "description": "ServiceMaster built modern restoration. Now their digital presence looks like 1989. A $334K/month peak vs. $39K today. Here's the exact playbook to resurr",
      "datePublished": "2026-03-30",
      "dateModified": "2026-04-03",
      "author": {
        "@type": "Person",
        "name": "Will Tygart",
        "url": "https://tygartmedia.com/about"
      },
      "publisher": {
        "@type": "Organization",
        "name": "Tygart Media",
        "url": "https://tygartmedia.com",
        "logo": {
          "@type": "ImageObject",
          "url": "https://tygartmedia.com/wp-content/uploads/tygart-media-logo.png"
        }
      },
      "mainEntityOfPage": {
        "@type": "WebPage",
        "@id": "https://tygartmedia.com/servicemaster-seo-playbook/"
      }
    }

  • If I Were Running SERVPRO’s SEO, Here’s What I’d Do Differently

    If I Were Running SERVPRO’s SEO, Here’s What I’d Do Differently

    SERVPRO owns 178,900 keywords worth $5.8 million per month in organic search value. They’re the 800-pound gorilla of the water restoration space. But they just lost 108,000 keywords in four months—a 38% collapse from their October 2025 peak. And they’re spending $2 million per month on PPC to paper over the cracks.

    The Math That Should Keep SERVPRO’s CMO Up at Night

    Let that sink in. In October 2025, SERVPRO ranked for 286,900 keywords. By February 2026—four months later—they were down to 178,900. That’s not algorithmic drift. That’s not seasonal. That’s a Category 5 hurricane hitting your organic search machine, and it happened almost silently while they threw another $2M at Google Ads to keep the lights on.

    Here’s the thing: SERVPRO has domain strength of 62, the strongest I’ve seen in the restoration vertical. They have brand authority. They have content. They have traffic. But they’re treating SEO like a legacy channel while they shovel money into PPC—the exact opposite of what their competitive position should demand.

    I ran the numbers on SERVPRO’s performance over the last 12 months. Take a look.

    | Month | Keywords Ranking | Monthly Clicks | SEO Value | Domain Strength | PPC Spend |
    |---|---|---|---|---|---|
    | Feb 2025 | 245,100 | 148,300 | $3,950,000 | 60 | $1,820,000 |
    | Mar 2025 | 251,200 | 152,400 | $4,180,000 | 60 | $1,950,000 |
    | Apr 2025 | 248,900 | 150,100 | $4,100,000 | 60 | $1,880,000 |
    | May 2025 | 253,400 | 153,900 | $4,270,000 | 61 | $1,920,000 |
    | Jun 2025 | 259,100 | 157,200 | $4,420,000 | 61 | $1,880,000 |
    | Jul 2025 | 265,300 | 161,000 | $4,580,000 | 61 | $1,950,000 |
    | Aug 2025 | 272,100 | 164,800 | $4,750,000 | 61 | $2,010,000 |
    | Sep 2025 | 281,200 | 170,400 | $5,120,000 | 61 | $2,080,000 |
    | Oct 2025 | 286,900 | 174,000 | $5,420,000 | 62 | $2,150,000 |
    | Nov 2025 | 268,400 | 162,500 | $4,840,000 | 62 | $2,090,000 |
    | Dec 2025 | 223,100 | 135,200 | $3,200,000 | 62 | $1,980,000 |
    | Feb 2026 | 178,900 | 151,700 | $5,825,000 | 62 | $1,944,000 |

    Wait. Stop. Look at February 2026 again. Keywords tanked to 178,900, but SEO value exploded to $5,825,000. How is that possible?

    Because SERVPRO stopped chasing long-tail volume and started extracting revenue from money keywords. They’re ranking for fewer terms, but the terms they *are* ranking for convert harder. That’s actually a sign that something—either an algorithm shift or a deliberate technical decision—forced them to consolidate their keyword real estate.

    But here’s what kills me: they’re still spending $1.944M per month on PPC. If they could stabilize their organic keyword portfolio and clean up their technical architecture, they could cut that spend by half and *increase* total revenue. Instead, they’re patching the hole with paid traffic.

    What Likely Went Wrong (And Why It Matters)

    SERVPRO owns 2,000+ franchise locations across North America. Each location is its own business, often with its own digital presence. That’s the double-edged sword of their model: massive reach, but fragmented authority.

    When you have that much real estate spread across the internet, a single algorithm update—or a deliberate consolidation on Google’s part—can evaporate keyword rankings overnight. Here are the most likely culprits:

    1. Location Page Cannibalization

    If SERVPRO has 2,000 location pages all competing for “water damage restoration near me” or “SERVPRO [city],” they’re killing their own rankings. Google gets confused. It doesn’t know which page to rank. So it ranks fewer of them.

    The fix: Implement a tiered location strategy. National hub page > regional cluster > local pages. Internal link from hub to region to local. Avoid keyword duplication. Use structured data (LocalBusiness with serviceArea) to signal geographic relevance without creating duplicate content.
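
    That markup, sketched as a Python dict serialized to JSON-LD. The business details are placeholders, not real SERVPRO data; note that schema.org now prefers `areaServed` as the modern name for the older `serviceArea` property:

    ```python
    import json

    # Minimal LocalBusiness markup with a geographic service area, built as a
    # dict and serialized to JSON-LD. All business details are illustrative.
    location_schema = {
        "@context": "https://schema.org",
        "@type": "LocalBusiness",
        "name": "SERVPRO of Example City",          # placeholder location name
        "telephone": "+1-555-555-0100",             # placeholder phone
        "address": {
            "@type": "PostalAddress",
            "addressLocality": "Example City",
            "addressRegion": "TX",
        },
        # areaServed signals geographic relevance without duplicating content
        "areaServed": {"@type": "City", "name": "Example City"},
    }

    jsonld = json.dumps(location_schema, indent=2)
    ```

    Each of the 2,000 location pages gets its own version of this block with unique local values, which lets Google disambiguate them instead of treating them as near-duplicates.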

    2. Content Architecture Decay

    SERVPRO’s main site probably wasn’t architected with 2,000+ location pages in mind when it was built. Over time, internal linking broke, breadcrumb trails became inconsistent, and authority stopped flowing predictably. No one’s actively managing the link graph at scale.

    The fix: Conduct a full internal linking audit. Map out which pages should funnel authority to which. Restore broken links. Create programmatic breadcrumb trails. Use topic clusters to create thematic authority hubs that feed into location pages.

    3. E-E-A-T Fragmentation

    Google’s moved heavily toward E-E-A-T (Experience, Expertise, Authoritativeness, Trustworthiness) in recent years. A national franchise system’s E-E-A-T is strong at the brand level, but uneven at the franchise location level. Some franchisees have reviews and credentials. Some don’t.

    The fix: Standardize E-E-A-T signals across the network. Ensure every location page has aggregated reviews, credentials, licenses, and “about” information. Use Author entities to link individual technicians to content. Make the system defensible against algorithm swings.

    4. Technical Debt From Franchise Independence

    Here’s the ugly truth: SERVPRO franchisees run their own businesses. Some have modern websites. Some are running 2015-era WordPress themes. Some use white-label platforms that Google barely indexes. When you have 2,000 franchise sites under one umbrella, you’re battling technical inconsistency at scale.

    The fix: Offer franchisees a standardized tech stack. Migrate independent sites into a consolidated platform (either subdomains or a federated network). Enforce technical requirements (Core Web Vitals, mobile responsiveness, schema markup). Make SEO non-negotiable.

    The SERVPRO SEO Playbook: 8 Steps to Recover 150,000+ Keywords

    Step 1: Conduct a Keyword Bleed Forensics Audit

    Pull your keyword history for the last 24 months in SpyFu. Sort by rank drop (now ranking outside top 100). Segment by keyword type:

    • Money keywords (water damage restoration, fire damage, mold removal): Why did you lose these? Pull them up in GSC. Are impressions down? CTR down? Rank dropped?
    • Branded + geo keywords (SERVPRO [city], water damage [city]): You should own almost all of these. If you’ve lost them, it’s likely location page cannibalization.
    • Long-tail keywords (what can I do about water damage in my basement): This is where the 108,000-keyword drop is probably concentrated. These are lower-value keywords. Maybe that’s intentional. Maybe it’s not.
    • Competitor keywords (911 restoration competitors, other local services): Are you losing share in competitive space, or just retracting from low-intent terms?
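
    That segmentation can be prototyped in a few lines. The classification rules below are heuristic placeholders for illustration, not SpyFu's or GSC's logic:

    ```python
    import re

    # Rough first-pass classifier for the four buckets above. A real audit
    # would refine these rules against GSC and SpyFu exports.
    def classify(keyword: str) -> str:
        kw = keyword.lower()
        if "servpro" in kw:
            return "branded+geo"
        if re.search(r"\b(911 restoration|competitor)\b", kw):
            return "competitor"
        # question-style or very long queries are treated as long-tail
        if len(kw.split()) >= 6 or kw.startswith(("what ", "how ", "can ", "why ")):
            return "long-tail"
        return "money"
    ```

    Run every lost keyword through a classifier like this and the shape of the 108,000-keyword drop becomes obvious in one histogram.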

    Once you’ve segmented, you know exactly where the damage is. Then you can fix the right thing instead of guessing.

    Step 2: Audit Your Location Page Architecture

    Pull a sample of 50 location pages across different regions. Check these metrics:

    • Are they templated consistently, or do they vary widely?
    • Do they have unique content (service descriptions, local reviews, technician bios), or are they duplicates?
    • How do they link to each other? Is there an authority flow from national > regional > local?
    • Are they indexed individually, or are some being de-indexed?

    Run a GSC export to see which location pages are getting search impressions. You’ll likely see a long tail where 80% of your locations get minimal organic traffic.

    That’s your content architecture problem. Fix it and watch rankings come back.
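
    Checking that long-tail claim against a GSC export takes a few lines. The impression counts below are invented for illustration:

    ```python
    # Share of location pages getting almost no search impressions.
    # Figures are invented, not real SERVPRO GSC data.
    impressions = {
        "/locations/houston/": 12_400,
        "/locations/dallas/": 8_900,
        "/locations/tulsa/": 35,
        "/locations/boise/": 12,
        "/locations/fargo/": 0,
    }

    THRESHOLD = 100  # arbitrary "minimal traffic" cutoff
    quiet = [page for page, imps in impressions.items() if imps < THRESHOLD]
    quiet_share = len(quiet) / len(impressions)
    ```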

    Step 3: Implement a Three-Tier Location Page System

    Replace the flat structure with depth:

    Tier 1: National Hub — One authority page covering water damage restoration, fire damage, mold removal, etc. This page should be a semantic authority fortress: comprehensive content, strong internal linking, high-quality backlinks. All location pages link back to this.

    Tier 2: Regional Clusters — Group your 2,000 locations into 20-30 regions (Northeast, Southeast, Midwest, etc.). Create regional pages covering “water damage restoration in [region]” with:

    • Aggregated statistics (e.g., “SERVPRO has restored 50,000+ properties in the Northeast”)
    • Links to all location pages in that region
    • Regional case studies or testimonials
    • Regional licensing/credentials information

    Tier 3: Local Pages — One page per location (or market). Include:

    • Unique local content (service menu tailored to local disasters, local team bios, local case studies)
    • LocalBusiness schema with full address, phone, reviews
    • Internal links from regional page and national hub
    • Links to adjacent locations (e.g., nearby franchise territories)
    • Unique on-page content that distinguishes this location from others (at least 500-1000 words)

    This structure signals to Google: “These are related but distinct properties. Each one has authority and relevance to its geography.”

    Step 4: Repair Internal Linking at Scale

    Your 286,900-keyword peak suggests you had strong internal linking. Your 178,900-keyword current state suggests it broke. Here’s how to rebuild it:

    Map the authority flow: Create a spreadsheet showing how authority should flow. National page (highest authority) > Regional pages (medium) > Location pages (local). Add cross-links between adjacent locations. Add contextual links from blog content to relevant location pages.

    Fix broken links: Run your site through Screaming Frog. Find all 404s and redirect chains. Fix them. Broken links kill authority flow.

    Create topic clusters: Your main content topics (water damage, fire damage, mold, etc.) should each have a hub page. Every blog post should link to the relevant hub. Every location page should link to the relevant hub. This creates thematic relevance signals that help with rankings.

    Implement breadcrumb navigation: Home > Service > Location. This signals site structure to Google and improves crawlability.

    At scale, this is a 6-8 week project, but it’s foundational. You can’t sustain $5.8M in monthly SEO value without a solid internal link graph.

    Step 5: Standardize E-E-A-T Across All Locations

    Create a template/playbook for franchisees that includes:

    • Local review aggregation: Pull Google, Yelp, and industry reviews to each location page. Show star ratings. Highlight top reviews. Aggregate to the brand level.
    • Credentials display: State licenses, certifications, insurance. Show that this franchisee is legit. Make it dynamic (pull from a central database, don’t hardcode).
    • Local team bios: Include photos and bios of the top 3-5 technicians at each location. Give them Google Author profiles if possible. Make E-E-A-T tangible.
    • Local case studies: Every location should have at least 2-3 case studies showing real work they’ve done. Before/after photos, descriptions. This builds Experience + Authoritativeness.
    • Trust signals: Display member affiliations (Restoration Industry Association, IICRC, etc.), “Featured in” logos, awards. Design signals matter.

    This isn’t optional. It’s the baseline for ranking in a trust-dependent vertical. Do it across all 2,000 locations and you’ll see keyword recovery.

    Step 6: Implement Generative Engine Optimization (GEO)

    Google’s Gemini, ChatGPT, and Claude are increasingly the first place people go for answers. You should own that real estate too.

    Make your site AI-friendly:

    • Add a FAQ schema on every page with questions people actually ask. Make sure your answers are comprehensive and cite-worthy.
    • Create a structured data layer that AI engines can parse: LocalBusiness, FAQPage, HowTo, Review. The richer your data, the more likely AI pulls from you.
    • Target conversational queries in your content: “What should I do if I have water damage?” “How much does restoration cost?” “Can I restore water-damaged documents?” These are the queries AI-powered search will prioritize.
    • Build a knowledge base or glossary explaining restoration terminology. AI systems will index this as foundational content.
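
    Here's the FAQ markup from the first bullet, built the same way you'd generate it in a template. The question and answer copy are placeholders:

    ```python
    import json

    # FAQPage markup for the conversational queries listed above, built as a
    # dict and serialized to JSON-LD. Answer text is placeholder copy.
    faq_schema = {
        "@context": "https://schema.org",
        "@type": "FAQPage",
        "mainEntity": [
            {
                "@type": "Question",
                "name": "What should I do if I have water damage?",
                "acceptedAnswer": {
                    "@type": "Answer",
                    "text": (
                        "Shut off the water source if possible, then call a "
                        "certified restoration company; standing water should be "
                        "extracted within 24-48 hours to limit mold growth."
                    ),
                },
            },
        ],
    }

    jsonld = json.dumps(faq_schema, indent=2)
    ```

    Comprehensive, citable answers in this format are exactly what both featured snippets and AI answer engines extract from.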

    The restoration vertical is perfect for GEO. People are panicked when they need you. An AI system recommending “SERVPRO is the largest restoration franchise” is worth millions in future organic traffic.

    Step 7: Cut Waste From Your $1.944M/Month PPC Spend

    I’m not saying cut PPC entirely. But you’re spending $1.944M per month while owning 178,900 keywords. That’s insurance money. Here’s where to redirect it:

    • Kill low-ROAS keywords: Pull your Google Ads data. Find keywords with CPA > 3x your conversion value. These are money sinks. Pause them. Let organic handle them if it can.
    • Shift budget from branded to high-intent: You should own branded keywords (SERVPRO + geo) organically. Paying for them is waste. Redirect that budget to high-intent non-branded terms where you’re not yet ranking in top 3.
    • Test seasonal PPC budgets: Restoration demand spikes after storms. You don’t need to bid aggressively in January. Build a seasonal playbook. Save $100K-200K per month in off-season.
    • Consolidate accounts and campaigns: 2,000 franchisees = probably 1,000+ Google Ads accounts. Consolidate them under a central management structure. Eliminate duplicate bidding. Unified budget allocation is way more efficient.
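
    The first bullet's pause rule reduces to a one-line filter over your Google Ads export. The rows below are invented examples:

    ```python
    # Flag keywords whose cost per acquisition exceeds 3x conversion value --
    # the "money sink" rule above. Rows are invented for illustration.
    ads = [
        {"keyword": "water damage restoration", "cpa": 180.0, "conv_value": 900.0},
        {"keyword": "how to dry carpet",        "cpa": 420.0, "conv_value": 60.0},
    ]

    def to_pause(rows, multiple=3.0):
        """Return keywords where CPA > multiple x conversion value."""
        return [r["keyword"] for r in rows if r["cpa"] > multiple * r["conv_value"]]
    ```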

    Conservative estimate: You could cut $500K-750K per month from PPC and improve overall ROI by moving budget to organic. That’s $6-9M annually. Worth it.

    Step 8: Build a Fragmented Franchisee Network Into a Federated Authority System

    This is the long-term play. Right now, SERVPRO likely looks like this to Google: 2,000 separate businesses with the SERVPRO brand. Google doesn’t really know how to rank them as one system.

    Here’s what you should build instead:

    • Consolidated location architecture: servpro.com/locations/[city-state] for all locations, managed centrally. Not franchisee.com or subdomain.servpro.com. One unified system, 2,000 variations.
    • Federated content model: National content hub (servpro.com/restoration-guides) serves as the authoritative source. Franchisees republish and localize. Create a content syndication system that keeps authority centralized while allowing local customization.
    • Unified review aggregation: Pull all franchisee reviews into a central system. Rank locations by star rating. Make the whole network defensible.
    • Centralized link building: One brand-level link-building strategy, feeding authority down to locations. Not 2,000 franchisees all trying to build links independently.

    This takes 12-18 months to execute, but when you land it, you’ll see your keyword count jump by 150,000+ and you’ll be basically unbeatable in your vertical.

    The Opportunity Cost of Staying Put

    SERVPRO lost 108,000 keywords in 4 months. Let’s say half of those were low-intent long-tail (each worth roughly $20-50 per month in traffic value). That’s about 54,000 keywords × $30 average = $1.62M per month in lost organic value.
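
    As arithmetic, using the estimates above:

    ```python
    # Opportunity-cost estimate from the paragraph above. The 50% long-tail
    # share and $30/keyword/month value are the article's assumptions.
    lost_keywords = 108_000
    long_tail_share = 0.5
    avg_value_per_kw = 30  # $/mo per keyword, midpoint of the $20-50 range
    lost_value = lost_keywords * long_tail_share * avg_value_per_kw  # 1,620,000
    ```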

    They made up for it by extracting more revenue from fewer, higher-value keywords (Feb 2026 value spike). But they’re also spending $1.944M per month on PPC to maintain traffic volume.

    If SERVPRO recovered to 240,000 keywords (roughly their February 2025 level), they’d likely add another $1.5-2M per month in organic value *and* be able to cut PPC spend by 40-50%. That’s a $3-4M monthly swing.

    Over a year, that’s $36-48M in additional profit from fixing SEO.

    And that’s being conservative. SERVPRO’s brand is so strong that if they could demonstrate to Google that they’re the E-E-A-T authority in restoration, they could probably rank for *more* keywords than they did at their October 2025 peak.

    The Playbook in Practice

    You’d execute this in three phases:

    Phase 1 (Month 1-2): Diagnosis & Architecture — Forensics audit, location page audit, three-tier architecture design. Identify quick wins (broken links, obvious cannibalization). Get executive buy-in on the federated model.

    Phase 2 (Month 3-6): Execution & Standardization — Roll out three-tier system. Repair internal linking. Standardize E-E-A-T templates. Implement GEO. Test PPC reductions on low-ROAS keywords. Monitor GSC for ranking recovery.

    Phase 3 (Month 7-12): Optimization & Scale — Feed winners. Scale what works. Build federation toward the long-term model. By month 12, you should see 60-70% of your lost keywords recovered. By month 18, you should be back to 240,000+ keywords.

    Is this work? Yes. Is it technical? Absolutely. But SERVPRO has the authority, the domain strength, and the economic incentive to execute it. They just need fresh eyes on the architecture and a willingness to think bigger than “add more PPC.”

    Why SERVPRO Specifically

    I picked SERVPRO for this analysis because they represent something important: dominance is fragile.

    They have domain strength 62. They own 178,900 keywords. They’re the category leader. But they’re also spending $2M per month on PPC to maintain that position—which suggests their organic is leaking. They peaked at 286,900 keywords just 5 months ago, and they lost 38% of that in 4 months flat.

    That’s not normal erosion. That’s a system breaking.

    And here’s what kills me: they have all the ingredients to fix it. They have authority. They have traffic. They have the budget. They just need someone to say “your location page architecture is the problem, and here’s how to rebuild it.”

    The restoration vertical is also perfect for this because SERVPRO competes on brand + trust, not pure convenience. If you can dominate Google’s algorithm while also dominating AI-powered search (GEO), you own the entire funnel. The CMO who pulls that off will be a legend.

    Common Questions

    Q: Could algorithm changes alone explain the 108,000-keyword drop?

    Maybe partially. But 38% keyword loss in 4 months is unusual even for a major core update. Algorithm changes typically cause 5-15% fluctuation across a healthy site. The magnitude here suggests an underlying technical issue got exposed by an algorithm shift.

    Most likely explanation: SERVPRO’s location pages were competing with each other (cannibalization). An algorithm update prioritized consolidation (ranking fewer pages more strongly per topic). When that happened, SERVPRO lost the “also ran” rankings but kept the top positions. The keyword *count* looks bad, but the keyword *value* stayed strong. Still, you’re leaving revenue on the table.

    Q: Isn’t running 2,000 location pages inherently limited?

    Not if you build the architecture right. Think about how many pages Wikipedia ranks for (millions). Think about how many pages e-commerce sites rank for (hundreds of thousands). The issue isn’t scale—it’s whether your site is optimized for scale.

    SERVPRO’s issue is probably that their location pages were built incrementally (added as franchisees joined) without a master architecture in mind. So the system grew organically but unsystematically. Rebuild the architecture and you solve it.

    Q: Could they focus only on organic and eliminate PPC?

    Not immediately. PPC is insurance. SERVPRO operates in a trust-dependent, high-intent vertical. They need to own the top of the SERP to win. During the recovery period (months 1-12), PPC is your safety net.

    But long-term, if you recover 240,000+ keywords and your E-E-A-T is solid, you can cut PPC by 50-60% and probably *increase* revenue because organic converts better (higher intent) than paid ads.

    Q: How do you measure success on this playbook?

    Three metrics: Keywords ranking (target 240K+), monthly organic clicks (target 160K+), and SEO value (target $5.5M+). You should also track PPC spend reductions and ROI improvements.

    Monthly GSC reports showing ranking recovery. Monthly rank tracking on your 200 highest-value keywords. Quarterly attribution reports tying organic to revenue.

    Q: What’s the biggest risk of this playbook?

    Consolidation risk. Moving from 2,000 independent location pages to a federated system means centralizing control. Franchisees lose some autonomy. Some franchisees will resist. You need executive support to force the technical change, even if it annoys franchisees short-term.

    But the alternative is bleeding 38% of your keywords every 4 months. At some point, you have to choose: fight the SEO problem or accept the $2M/month PPC tax forever.

    The Ask

    If I were SERVPRO’s CMO, I’d take this playbook to the CEO and say:

    “We’ve lost 108,000 keywords in 4 months. We’re spending $2M per month on PPC to compensate. Our domain strength is 62—the strongest in the industry. If we fix the location page architecture, we’ll recover 150,000 keywords, add $2-3M per month in organic value, and cut PPC spend by 40-50%. That’s a 3:1 ROI on the project. And the brand will own the restoration category for the next 5 years.”

    It’s the right move. Whether SERVPRO makes it is up to them.

    But if you’re running a site with hundreds (or thousands) of location pages, apply this playbook to your business. Audit your keyword loss. Rebuild your architecture. Fix your E-E-A-T. You don’t have to be as big as SERVPRO to benefit. Most franchised verticals have this exact vulnerability.

    If you want help implementing this—or diagnosing why your keywords are bleeding—reach out here. We’ve done this at scale for franchise networks and multi-location enterprises. It works. 😄

    P.S.: If you found this useful, check out our SEO analysis of 911 Restoration—a different player in the same vertical with a different set of SEO problems. Comparing the two gives you a masterclass in how different strategies lead to different outcomes.

  • If I Were Running 911 Restoration’s SEO, Here’s Exactly What I’d Do

    If I Were Running 911 Restoration’s SEO, Here’s Exactly What I’d Do

    I’m about to do something that most agency owners would never do: give away the entire playbook.

    Not a teaser. Not a “5 tips to improve your SEO” fluff piece. The actual, technical, step-by-step strategy I would execute — starting tomorrow — if 911 Restoration handed me the keys to their organic search program.

    Why? Because I pulled their SpyFu data this morning, and what I found stopped me mid-coffee. One of the largest restoration franchises in North America — 1,500+ employees, 200+ territories, an in-house marketing division called Milestone SEO that’s been running since 2003 — is watching their organic search presence evaporate in real time.

    This isn’t gossip. This is data. And data deserves a response.

    The SpyFu Data: A Domain in Freefall

    I pulled the full historical time series from the SpyFu Domain Stats API on March 30, 2026. Here’s what 911restoration.com looks like over the last 12 months:

    | Period | Organic Keywords | Monthly Organic Clicks | SEO Value ($/mo) | PPC Spend ($/mo) | Domain Strength | Avg. Rank |
    |---|---|---|---|---|---|---|
    | Mar 2025 | 3,306 | 1,889 | $42,210 | $102,700 | 42 | 43.7 |
    | Apr 2025 | 3,409 | 2,350 | $47,310 | $116,600 | 42 | 43.9 |
    | May 2025 | 2,665 | 1,468 | $37,380 | $120,400 | 39 | 43.1 |
    | Jun 2025 | 2,375 | 1,602 | $24,330 | $118,800 | 38 | 42.7 |
    | Jul 2025 | 2,093 | 881 | $20,180 | $89,840 | 37 | 43.8 |
    | Aug 2025 | 2,881 | 1,088 | $34,700 | $25,660 | 39 | 50.3 |
    | Sep 2025 | 2,737 | 939 | $32,500 | $13,420 | 41 | 51.8 |
    | Oct 2025 | 2,530 | 786 | $28,750 | $8,938 | 41 | 53.2 |
    | Nov 2025 | 2,571 | 777 | $28,780 | $370,600 | 41 | 52.6 |
    | Dec 2025 | 950 | 925 | $8,522 | $191,800 | 36 | 43.5 |
    | Jan 2026 | 845 | 683 | $9,436 | $152,100 | 36 | 41.3 |
    | Feb 2026 | 816 | 617 | $22,700 | $132,100 | 40 | 42.5 |

    Let that sink in.

    Peak SEO value: $407,500/month (March 2022). Current: $22,700/month. That’s a 94.4% decline.

    Peak keywords: 4,466 (July 2024). Current: 816. An 81.7% wipeout in 20 months.

    And look at the PPC column. November 2025: $370,600 in estimated ad spend. December: $191,800. January 2026: $152,100. That’s $714,500 in three months on Google Ads — a classic symptom of a company trying to buy back the traffic their organic program used to deliver for free.

    That’s not strategy. That’s a tourniquet on an arterial bleed.

    What Likely Went Wrong (Diagnosis Before Prescription)

    Before I hand over the playbook, let me say what I think happened — because you don’t treat the symptom, you treat the disease.

    A keyword count dropping from 3,400 to 816 in ten months isn’t content decay. Content decay looks like a slow 10-15% annual erosion. This is a structural collapse. There are really only a few things that cause this pattern:

    Scenario 1: A site migration or redesign went wrong. If 911 Restoration relaunched their website (new CMS, new URL structure, new template) without a bulletproof redirect map, they would have vaporized the index equity on thousands of pages overnight. Google doesn’t re-crawl and re-rank 2,000+ pages quickly — especially if the redirect chain is broken or the new URLs don’t match the old content architecture.

    Scenario 2: Location pages were restructured or consolidated. Franchise sites derive the bulk of their organic traffic from location-specific pages. If someone decided to “simplify” the site by collapsing 200 individual location pages into a handful of regional pages, or switched from static pages to JavaScript-rendered dynamic content, Google would have deindexed the old URLs and struggled to understand the new ones.

    Scenario 3: A technical SEO issue is blocking indexation. A rogue robots.txt rule, an accidental noindex meta tag on a template, a misconfigured CDN that returns soft 404s — any of these can silently kill thousands of indexed pages while the team doesn’t notice for months because their paid traffic is masking the organic decline.

    Scenario 4: Google’s algorithm updates hit them hard. The Helpful Content Update, the March 2025 core update, and the rise of AI Overviews have disproportionately punished sites with thin, templated location pages and boilerplate service descriptions. If 911 Restoration’s location pages were auto-generated with city-name swaps and no unique local content, they would have been exactly the type of content Google deprioritized.

    My bet? It’s a combination of Scenarios 2 and 4. But I’d confirm with data before touching anything. Here’s how.

    Step 1: The 72-Hour Emergency Audit

    Before I write a single word of content or restructure a single URL, I need to understand what’s actually broken. This is a 72-hour diagnostic sprint.

    Day 1: Crawl and Index Analysis

    I’d run Screaming Frog against the full 911restoration.com domain — every page, every redirect, every canonical tag. For a franchise site this size, I’m expecting 5,000-15,000 URLs. I’m looking for:

    • Redirect chains and loops — Franchise sites accumulate these over years of redesigns. Every 301 chain longer than 2 hops is leaking PageRank.
    • Orphan pages — Pages that exist but have zero internal links pointing to them. If location pages aren’t linked from a parent hub, Google won’t prioritize crawling them.
    • Duplicate content signals — Thin location pages that share 90%+ identical content get consolidated by Google. If 150 out of 200 location pages have the same body text with only the city name changed, Google is likely only indexing a handful and ignoring the rest.
    • JavaScript rendering issues — If the site uses client-side rendering for location content, I’d check Google’s URL Inspection tool to compare the rendered HTML against the source. Google’s JS rendering is better than it was, but it’s still not reliable for critical content.
    • Canonical tag audit — Mispointed canonical tags are one of the most common causes of sudden deindexation. One bad template-level canonical directive can tell Google to ignore every page that uses that template.
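    Two of these checks are easy to automate once the crawl export is in hand. Below is a minimal sketch, assuming the crawl has already been reduced to a source-to-target redirect map and per-page robots signals; the URLs are hypothetical:

```python
def redirect_chain(url, redirects, max_hops=10):
    """Follow a URL through a {source: target} redirect map and return
    the full chain. Any chain longer than 2 hops leaks PageRank and
    should be collapsed to a single 301."""
    chain = [url]
    seen = {url}
    while chain[-1] in redirects and len(chain) <= max_hops:
        nxt = redirects[chain[-1]]
        chain.append(nxt)
        if nxt in seen:  # redirect loop detected
            break
        seen.add(nxt)
    return chain

def is_noindexed(meta_robots="", x_robots_header=""):
    """Flag pages excluded by a meta robots tag or X-Robots-Tag header."""
    combined = f"{meta_robots} {x_robots_header}".lower()
    return "noindex" in combined

# Hypothetical crawl output for illustration:
redirects = {
    "/water-damage/": "/services/water-damage/",
    "/services/water-damage/": "/water-damage-restoration/",
    "/water-damage-restoration/": "/water-damage-restoration/houston/",
}
chain = redirect_chain("/water-damage/", redirects)
print(len(chain) - 1, "hops:", " -> ".join(chain))  # 3 hops: flag it
print(is_noindexed(meta_robots="noindex, follow"))  # True
```

    In practice the inputs come straight from a Screaming Frog export; the logic above is the whole test.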

    Day 2: Google Search Console Deep Dive

    I need 16 months of GSC data — enough to cover the period from peak (April 2025 at 3,409 keywords) through the collapse. Specifically:

    • Coverage report — How many pages are in the “Valid” bucket vs. “Excluded”? What’s the trend? If “Excluded” spiked around May-June 2025, that’s the smoking gun.
    • Exclusion reasons — “Discovered – currently not indexed,” “Crawled – currently not indexed,” “Blocked by robots.txt,” “Alternate page with proper canonical tag.” Each reason points to a different root cause.
    • Performance by page group — Segment by URL pattern: /locations/*, /services/*, /blog/*. Which group lost the most impressions? If it’s locations, we know the architecture failed. If it’s blog content, it’s a content quality issue.
    • Query data — Export the top 5,000 queries and compare March 2025 vs. February 2026. Which keyword clusters disappeared? If it’s all geo-modified queries (“water damage restoration [city]”), the location pages are the problem. If it’s informational queries, the content strategy failed.
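    The geo-vs-informational segmentation of lost queries can be scripted in a few lines. A sketch, assuming the two GSC exports have been reduced to Python sets and using a placeholder city list:

```python
import re

CITIES = {"houston", "dallas", "austin", "tacoma", "denver"}  # sample list

def is_geo_modified(query, cities=CITIES):
    """True if the query carries a city modifier or 'near me'."""
    q = query.lower()
    return "near me" in q or any(re.search(rf"\b{c}\b", q) for c in cities)

# Illustrative stand-ins for the real March 2025 / Feb 2026 exports:
march_2025 = {"water damage restoration houston", "mold removal near me",
              "what to do when basement floods", "fire damage repair dallas"}
feb_2026 = {"what to do when basement floods"}

lost = march_2025 - feb_2026
geo_lost = sorted(q for q in lost if is_geo_modified(q))
info_lost = sorted(q for q in lost if not is_geo_modified(q))
print(f"{len(geo_lost)} geo queries lost, {len(info_lost)} informational lost")
```

    If the geo bucket dominates the losses, the location architecture is the culprit; if the informational bucket dominates, it is the content strategy.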

    Day 3: Competitive Benchmarking

    I’d pull the same SpyFu data for their direct competitors — SERVPRO, ServiceMaster Restore, Paul Davis Restoration, Rainbow International — and chart the keyword trajectories side by side. If all of them declined, it’s an industry-wide algorithm shift. If only 911 Restoration declined, the problem is site-specific.

    I’d also audit 3-5 of the top-ranking competitors for the highest-value keywords 911 Restoration lost. What do their pages look like? What schema are they using? How is their location architecture structured? The answers tell me exactly what Google is currently rewarding in this vertical.

    Step 2: Location Page Architecture — The Engine of Franchise SEO

    This is the make-or-break element. For a national franchise, location pages aren’t just “nice to have” — they ARE the SEO strategy. Every territory is a keyword goldmine, and the architecture determines whether you capture those keywords or leave them for competitors.

    The Three-Tier Hub-and-Spoke Model

    Here’s the exact structure I’d build:

    Tier 1: National Service Pillar Pages

    These are the authority anchors — comprehensive 2,500+ word guides that target the head terms:

    • /water-damage-restoration/ → targets “water damage restoration” (national)
    • /fire-damage-restoration/ → targets “fire damage restoration”
    • /mold-remediation/ → targets “mold remediation” / “mold removal”
    • /storm-damage-restoration/ → targets “storm damage repair”

    Each pillar page links down to every state hub and includes a location finder CTA. These pages accumulate backlinks, build topical authority, and pass equity down the hierarchy.

    Tier 2: State Hub Pages

    One page per state where 911 Restoration operates:

    • /water-damage-restoration/texas/ → targets “water damage restoration Texas”
    • /water-damage-restoration/california/
    • /mold-remediation/florida/

    Each state hub contains state-specific content: climate risks, building code requirements, insurance regulations, and links down to every metro/city page in that state. This is NOT a directory — it’s a substantive content page that happens to also serve as a navigation hub.

    Tier 3: Metro/City Pages

    This is where the money is. One page per service per territory:

    • /water-damage-restoration/texas/houston/
    • /mold-remediation/texas/houston/
    • /fire-damage-restoration/texas/houston/

    If 911 Restoration operates in 200 territories across 4 core services, that’s 800 city-level pages minimum. Each one must have genuinely unique content — not template swaps. Here’s what makes a city page rank in 2026:

    • Local climate and risk profile — Houston’s page talks about Gulf Coast humidity, hurricane season flooding, and clay soil foundation issues. Denver’s page talks about snowmelt, ice dams, and high-altitude UV degradation. This signals to Google that the content is locally authoritative, not mass-produced.
    • Local regulatory context — Texas requires specific licensing for mold remediation (TDSHS). California has strict asbestos abatement laws. Florida has unique hurricane deductible rules. Including this information proves expertise.
    • Real project examples — “In March 2025, our Houston team responded to a 3-story commercial flood caused by a burst supply line, extracting 12,000 gallons and completing structural drying in 72 hours.” Specificity builds trust with both users and search algorithms.
    • LocalBusiness schema — Every city page needs JSON-LD with the franchise location’s exact NAP (name, address, phone), geo-coordinates, service area polygon, hours, and accepted payment methods.
    • Embedded Google Map — A map showing the service area reinforces local relevance and keeps users on the page.

    The Math That Should Keep 911 Restoration’s CMO Up at Night

    A well-optimized city-level restoration page targeting “water damage restoration [city]” can rank for 15-40 related keywords (the long-tail variants, “near me” modifiers, service-specific queries). At 800 pages × 20 average keywords = 16,000 rankable keywords. They currently have 816. That’s a 19.6x growth opportunity sitting untouched.
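    The page math is easy to verify by generating the three-tier hierarchy itself. A sketch with a deliberately tiny, hypothetical territory map:

```python
from itertools import product

services = ["water-damage-restoration", "fire-damage-restoration",
            "mold-remediation", "storm-damage-restoration"]
# Hypothetical territory map; the real footprint spans 200+ territories.
states = {"texas": ["houston", "dallas"], "california": ["los-angeles"]}

pillars = [f"/{s}/" for s in services]                      # Tier 1
hubs = [f"/{s}/{st}/" for s, st in product(services, states)]  # Tier 2
cities = [f"/{s}/{st}/{c}/" for s in services               # Tier 3
          for st, cs in states.items() for c in cs]

print(len(pillars), len(hubs), len(cities))  # 4 8 12
# At full scale: 200 territories x 4 services = 800 city pages;
# at ~20 rankable keywords per page, ~16,000 keyword targets.
print(200 * 4, 200 * 4 * 20)  # 800 16000
```

    The URL slugs mirror the structure proposed above; swapping in the real territory list is the only change needed.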

    Step 3: Content Strategy — Three Tiers, Three Intents, One Funnel

    Restoration companies make a fatal content mistake: they only create bottom-of-funnel content. Every page says “call us for water damage restoration.” But the homeowner standing in an inch of water at 2 AM isn’t searching for a restoration company — they’re searching for “what to do when your basement floods.”

    Whoever answers that question earns the call 30 minutes later.

    Tier 1: Crisis-Moment Content (Captures the 2 AM Searcher)

    These pages target people in active distress. They’re not browsing — they’re panicking. The content needs to be calm, authoritative, and structured for instant answers:

    • “What to Do When Your House Floods: A Step-by-Step Emergency Guide”
    • “I Smell Mold in My House — What Should I Do Right Now?”
    • “My House Just Had a Fire — What Happens Next?”
    • “Pipe Burst in the Middle of the Night: Emergency Steps Before the Pros Arrive”

    Format: Numbered steps, definition boxes at the top for AI extraction, HowTo schema, and a sticky CTA that says “Need help now? Call 911 Restoration: [local number].” These pages should be optimized for featured snippets and voice search — because someone standing in water is asking Google out loud.
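    Generating the HowTo markup for these guides can be templated rather than hand-written. A minimal sketch that emits schema.org HowTo JSON-LD from a title and a step list; validate the output in Google's Rich Results Test before shipping:

```python
import json

def howto_schema(title, steps):
    """Build a pared-down schema.org HowTo object for a crisis guide.
    This is a sketch; the production version would also carry image,
    totalTime, and step-level text fields."""
    return {
        "@context": "https://schema.org",
        "@type": "HowTo",
        "name": title,
        "step": [
            {"@type": "HowToStep", "position": i, "name": s}
            for i, s in enumerate(steps, start=1)
        ],
    }

schema = howto_schema(
    "What to Do When Your House Floods",
    ["Shut off the water source", "Cut power to affected areas",
     "Document the damage for insurance", "Call a restoration professional"],
)
print(json.dumps(schema, indent=2))
```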

    Tier 2: Decision-Stage Content (Captures the Insurance Call)

    After the initial crisis, the homeowner’s next questions are about money and logistics:

    • “Does Homeowners Insurance Cover Water Damage? A Complete Guide”
    • “How Much Does Water Damage Restoration Cost in 2026?”
    • “Water Damage Restoration Timeline: What to Expect Day by Day”
    • “How to Choose a Restoration Company: What to Look for (and What to Avoid)”
    • “Water Mitigation vs. Water Restoration: What’s the Difference and Why It Matters”

    These pages need comparison tables, cost breakdowns with regional ranges, and FAQPage schema. They capture the searcher who’s already decided they need professional help but hasn’t chosen who to call. This is where you win the click over SERVPRO.

    Tier 3: Authority-Building Content (Captures Links and Topical Trust)

    This is the content that doesn’t directly convert but builds the topical authority that makes everything else rank higher:

    • “The Complete Guide to IICRC Certification: What It Means for Your Restoration Company”
    • “How Climate Change Is Increasing Water Damage Claims: 2020-2026 Data Analysis”
    • “Understanding FEMA Flood Zones: How to Check Your Risk and What It Means for Insurance”
    • “The Science of Structural Drying: Psychrometry, Grain Depression, and Why It Matters”

    This tier earns backlinks from insurance publications, industry associations (IICRC, RIA), local news outlets covering weather events, and real estate blogs. Those links flow equity to your location pages through internal linking, lifting the entire domain.

    Step 4: Schema Markup — The Technical Layer Most Restoration Companies Ignore

    Structured data is unglamorous work. Nobody posts schema markup wins on LinkedIn. But for a franchise with 200+ locations, it’s the single highest-ROI technical optimization because it scales multiplicatively.

    Required Schema Per Page Type

    Location pages:

    {
      "@type": "LocalBusiness",
      "name": "911 Restoration of Houston",
      "address": { "@type": "PostalAddress", ... },
      "geo": { "@type": "GeoCoordinates", ... },
      "telephone": "+1-XXX-XXX-XXXX",
      "openingHoursSpecification": { "dayOfWeek": ["Mo","Tu","We","Th","Fr","Sa","Su"], "opens": "00:00", "closes": "23:59" },
      "areaServed": { "@type": "City", "name": "Houston" },
      "hasOfferCatalog": {
        "@type": "OfferCatalog",
        "itemListElement": [
          { "@type": "Offer", "itemOffered": { "@type": "Service", "name": "Water Damage Restoration" } },
          { "@type": "Offer", "itemOffered": { "@type": "Service", "name": "Mold Remediation" } }
        ]
      }
    }

    Service pages: Article + Service + FAQPage + HowTo (when applicable) + BreadcrumbList

    Blog posts: Article + FAQPage + Speakable (on key answer paragraphs)

    When you implement this across 800+ pages with consistent NAP data, you’re giving Google a machine-readable map of your entire franchise network. That’s how you dominate Local Pack results at scale.
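    At 200+ locations, markup like the LocalBusiness object above should be generated from a location data source, not hand-written per page. A sketch that produces a pared-down version of that object; all field values here are placeholders:

```python
import json

def location_schema(city, phone, street, state, lat, lng):
    """Generate per-location LocalBusiness JSON-LD from one row of
    location data. Run it across every territory in the franchise
    spreadsheet to keep NAP data consistent at scale."""
    return {
        "@context": "https://schema.org",
        "@type": "LocalBusiness",
        "name": f"911 Restoration of {city}",
        "address": {"@type": "PostalAddress", "streetAddress": street,
                    "addressLocality": city, "addressRegion": state},
        "geo": {"@type": "GeoCoordinates", "latitude": lat, "longitude": lng},
        "telephone": phone,
        "areaServed": {"@type": "City", "name": city},
    }

# Placeholder values for illustration:
jsonld = json.dumps(location_schema(
    "Houston", "+1-XXX-XXX-XXXX", "123 Example St", "TX", 29.76, -95.37))
```

    One template, one data source, 800+ consistent pages; that is the multiplicative scaling the section describes.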

    Step 5: Google Business Profile — The Local Pack Battleground

    In restoration, the Google Local Pack (the map results with 3 listings) captures a disproportionate share of high-intent clicks. When someone searches “water damage restoration near me,” they’re looking at the map first and the organic results second.

    Winning the Local Pack requires systematic GBP optimization across every franchise location:

    • Weekly GBP posts — Not automated junk. Real posts: completed project summaries with before/after photos, seasonal preparedness tips, team spotlights. Google’s algorithm visibly rewards profiles that post consistently.
    • Review velocity and response — The #1 Local Pack ranking factor after proximity. I’d implement an automated review request system: SMS sent 2 hours after job completion, followed by email 24 hours later. Target: every location hits 200+ reviews at 4.8+ stars within 12 months. And respond to every review — positive and negative — within 24 hours.
    • Primary category precision — “Water Damage Restoration Service” as primary (it’s the highest-volume category). Secondary: “Fire Damage Restoration Service,” “Mold Removal Service.” Don’t dilute with generic categories like “General Contractor.”
    • Photo optimization — 50+ photos per location: team, equipment, completed projects, office, vehicles. Geotagged. Updated monthly. Google prioritizes profiles with fresh, diverse visual content.
    • Q&A seeding — Proactively add and answer the top 10 questions for each location’s GBP. These show up prominently in the Knowledge Panel and serve as free real estate for keyword-rich content.

    Step 6: Answer Engine Optimization (AEO) — Win the AI-Powered Search Results

    Google’s AI Overviews now appear on the majority of informational restoration queries. When someone asks “what should I do if my basement floods,” Google doesn’t just show 10 blue links anymore — it generates a synthesized answer at the top of the page, citing specific sources.

    If your content isn’t structured to be cited, you’re invisible in the new search paradigm. Here’s how to fix that:

    • Definition boxes — Every service page opens with a 40-60 word authoritative definition. “Water damage restoration is the professional process of returning a property to its pre-loss condition following water intrusion. It encompasses emergency water extraction, structural assessment, industrial dehumidification, antimicrobial treatment, and complete reconstruction of affected building materials.” That’s the paragraph Google AI Overviews will extract and cite.
    • Direct-answer formatting — Structure H2s as questions and answer them completely in the first 50 words below the heading. AI Overviews pull from this pattern religiously.
    • Comparison tables — “Water Mitigation vs. Water Restoration” with a side-by-side table. AI Overviews love structured comparisons because they can parse them cleanly.
    • Numbered process lists — “The 5 Stages of Water Damage Restoration: 1. Inspection and Assessment, 2. Water Extraction, 3. Drying and Dehumidification, 4. Cleaning and Sanitizing, 5. Restoration and Reconstruction.” This format wins HowTo rich results and AI Overview citations simultaneously.

    Step 7: Generative Engine Optimization (GEO) — Be the Company AI Recommends by Name

    This is where things get interesting. AEO is about structured answers. GEO is about making AI systems — Claude, ChatGPT, Gemini, Perplexity — recommend your brand by name when someone asks “who should I call for water damage in Houston?”

    GEO is the frontier. Most restoration companies haven’t even heard of it. Here’s the playbook:

    • Entity saturation — “911 Restoration” needs to appear across the web in consistent association with specific attributes: IICRC certification, 45-minute response time, 24/7 availability, specific service areas, specific services. AI models build entity understanding from co-occurrence patterns. The more consistently your brand appears alongside these attributes across authoritative sources, the more confidently AI will recommend you.
    • Factual density over marketing copy — AI systems are trained to detect and deprioritize marketing fluff. Replace “we provide the best water damage restoration” with “911 Restoration deploys truck-mounted Prochem extractors capable of removing 250 gallons per minute, with IICRC-certified technicians trained in the S500 Standard for Professional Water Damage Restoration.” Specificity is authority in the AI world.
    • Authoritative citation weaving — Every major content piece should reference and link to EPA guidelines on mold remediation, FEMA flood preparation resources, IICRC S500/S520 standards, and state-specific licensing requirements. AI systems weight content higher when it cites authoritative sources because it signals expertise, not just marketing.
    • LLMS.txt implementation — Add a /llms.txt file to the root domain that provides AI crawlers with a structured summary of who 911 Restoration is, what they do, where they operate, and what makes them authoritative. This is the robots.txt equivalent for the AI age.
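    The llms.txt format is an emerging convention (proposed at llmstxt.org): a markdown file with an H1 name, a blockquote summary, and sections of links. A hedged sketch of what such a file could contain for 911restoration.com; the URLs follow the architecture proposed earlier in this article and are illustrative, not the site's current paths:

```markdown
# 911 Restoration

> Nationwide water, fire, and mold damage restoration franchise with
> 200+ territories, IICRC-certified technicians, and 24/7 emergency
> response. (Illustrative summary; verify every claim before publishing.)

## Services
- [Water Damage Restoration](https://911restoration.com/water-damage-restoration/)
- [Fire Damage Restoration](https://911restoration.com/fire-damage-restoration/)
- [Mold Remediation](https://911restoration.com/mold-remediation/)

## Locations
- [Houston, TX](https://911restoration.com/water-damage-restoration/texas/houston/)
```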

    Step 8: Internal Linking Architecture — The Circulatory System

    A franchise site without proper internal linking is like a highway system with no on-ramps. The pages exist, but nobody can get to them — including Googlebot.

    Here’s the internal linking architecture I’d implement:

    • Pillar → State → City cascade — The national “Water Damage Restoration” pillar page links to every state hub. Every state hub links to every city page in that state. Every city page links back to the state hub and the national pillar. This creates a closed loop of link equity that strengthens the entire hierarchy.
    • Cross-service linking at the city level — The Houston water damage page links to the Houston mold page, Houston fire page, etc. This keeps the user on the site and tells Google that all Houston services are contextually related.
    • Blog-to-location contextual links — Every blog post about water damage includes a natural in-text link to at least one city-level water damage page. “If you’re dealing with water damage in Houston, our IICRC-certified team is available 24/7 — [learn more about our Houston water damage restoration services].” This is how blog authority flows to money pages.
    • Automated related content blocks — At the bottom of every page, display 3-5 topically related articles and location pages. This is low-effort, high-impact internal linking that scales automatically as you publish more content.
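    Orphan detection, flagged in the Day 1 audit, is the easy health check for this architecture: any page absent from the cascade's inbound links is invisible to crawlers. A sketch over a hypothetical crawl export:

```python
def orphan_pages(all_pages, links):
    """Return pages with no inbound internal links. `links` maps each
    source URL to the list of URLs it links to (from a crawl export).
    The homepage is exempt since it needs no inbound link."""
    linked = {t for targets in links.values() for t in targets}
    return sorted(set(all_pages) - linked - {"/"})

# Hypothetical site: the Dallas city page was never linked from its hub.
pages = ["/", "/water-damage-restoration/",
         "/water-damage-restoration/texas/",
         "/water-damage-restoration/texas/houston/",
         "/water-damage-restoration/texas/dallas/"]
links = {
    "/": ["/water-damage-restoration/"],
    "/water-damage-restoration/": ["/water-damage-restoration/texas/"],
    "/water-damage-restoration/texas/": ["/water-damage-restoration/texas/houston/"],
}
print(orphan_pages(pages, links))  # ['/water-damage-restoration/texas/dallas/']
```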

    Step 9: Backlink Acquisition — Leverage the Franchise Advantage

    Most restoration companies think of link building as guest posting on random websites. That’s 2015 thinking. A franchise with 200+ locations has a structural advantage that no single-location competitor can match:

    • Disaster response PR — After every significant emergency response, issue a press release to local media with a quote from the franchise owner. “911 Restoration of Houston responded to 47 residential water damage calls during last week’s freeze event, deploying 12 extraction teams across the Greater Houston metro.” Local news sites (high DA, high relevance) will pick this up.
    • Insurance industry partnerships — 911 Restoration is on preferred vendor lists for multiple insurance carriers. Each carrier relationship should include a backlink from their website — either on a “find a contractor” page or a partner directory. These are high-authority, contextually perfect links.
    • IICRC and industry association profiles — Maintain active listings with detailed profiles on IICRC.org, RestorationIndustry.org, and state-level contractor licensing boards. These .org links carry significant trust signals.
    • Local civic backlinks — Chamber of Commerce memberships, BBB profiles, Rotary Club sponsorships, local Little League team sponsorships — every franchise location should be systematically acquiring 20-30 local directory and civic organization backlinks.
    • Content partnerships — Co-create disaster preparedness guides with local emergency management agencies, fire departments, and FEMA regional offices. “How to Prepare Your Houston Home for Hurricane Season — by 911 Restoration and the Harris County Office of Emergency Management.” The .gov backlink alone is worth the effort.

    Step 10: Kill the PPC Dependency

    Let’s talk about the elephant in the room. 911 Restoration spent an estimated $714,500 on Google Ads in Q4 2025 alone. That’s $2.86 million annualized. And the spend is directly correlated with the organic traffic decline — because when your organic pipeline breaks, the only way to keep the phone ringing is to pay for every click.

    Here’s the math that should reframe this entire conversation:

    • At their 2022 peak, 911 Restoration’s organic traffic was worth $407,500/month — $4.89 million/year in equivalent ad spend, delivered for free by organic search.
    • A comprehensive SEO program — the full 10-step playbook above — would cost a fraction of their current PPC spend.
    • If they rebuild to even half their peak organic value ($200K/month), that’s $2.4 million/year in traffic they no longer need to buy.
    • Organic traffic compounds. Every month of optimization makes the next month cheaper. PPC is a treadmill — the moment you stop paying, the traffic stops coming.
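    The arithmetic behind these figures, as a quick sanity check (inputs from the SpyFu table above; the half-peak figure uses the rounded $200K/month from the bullet):

```python
# SpyFu-estimated Q4 2025 Google Ads spend (Nov, Dec, Jan)
q4_ppc = 370_600 + 191_800 + 152_100
annualized = q4_ppc * 4                  # run-rate if Q4 pace continues
half_peak_annual = 200_000 * 12          # rebuilding to half the 2022 peak

print(f"Q4 spend:   ${q4_ppc:,}")        # $714,500
print(f"Annualized: ${annualized:,}")    # $2,858,000 (~$2.86M)
print(f"Half-peak organic, yearly: ${half_peak_annual:,}")  # $2,400,000
```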

    The ROI case isn’t even close. Every dollar shifted from PPC to organic SEO generates increasing returns over time instead of vanishing the moment the budget runs out.

    The Bottom Line

    911 Restoration has everything a restoration company needs to dominate organic search: brand recognition, national scale, franchise infrastructure in 200+ markets, and a domain with 20 years of history. The foundation is there. What’s missing is a modern organic strategy built for the way people search in 2026 — one that accounts for AI-powered search results, structured data at scale, and content architecture that Google rewards instead of penalizes.

    The 10-step playbook above isn’t theoretical. It’s the same methodology we execute for restoration companies at Tygart Media right now. We built the systems — the AI-powered content pipelines, the schema injection automation, the GEO optimization frameworks — because this is all we do. Restoration marketing. Day in, day out.

    So here’s my pitch, and I’ll keep it real:

    Hey, 911 Restoration. If you made it this far, you already know everything I just described is true — because you’ve been living it. The SpyFu data is public. The decline is real. And the fix isn’t a mystery; it’s an execution problem.

    We’re Tygart Media. We eat, sleep, and breathe restoration SEO. We’ve already built the playbooks, the automation, and the AI systems to execute everything above at franchise scale. And honestly? We’d love to have the conversation.

    No pressure. No hard sell. Just two teams who understand the industry talking about what $400K/month in organic value looks like when it’s back.

    Reach out here. Or call us. We promise we won’t send a guy in a van — unless there’s actual water damage involved. In which case, we probably know a guy for that too. 😄

    The Complete Restoration Franchise SEO Playbook Series

    This article is part of a 6-part series analyzing the SEO performance of every major restoration franchise in America. Read the full series:

    Frequently Asked Questions

    How much organic traffic has 911 Restoration lost?

    According to SpyFu domain statistics pulled on March 30, 2026, 911restoration.com currently ranks for 816 organic keywords with an estimated 617 monthly organic clicks and a monthly SEO value of $22,700. At their peak in March 2022, the domain generated an estimated $407,500 per month in organic search value — representing a 94.4% decline. Their keyword portfolio peaked at 4,466 in July 2024, making the current 816 keywords an 81.7% reduction.

    Why is 911 Restoration spending so much on Google Ads?

    SpyFu estimates show 911 Restoration’s Google Ads spend spiked to $370,600 in November 2025, $191,800 in December 2025, and $152,100 in January 2026 — totaling approximately $714,500 in a single quarter. This elevated PPC spending directly correlates with the decline in organic traffic. When organic rankings collapse, companies compensate by purchasing the same traffic through paid advertising, which is significantly more expensive on a per-click basis than organic traffic.

    What is the most important SEO fix for a restoration franchise?

    For franchise-model restoration companies like 911 Restoration, the location page architecture is the single most impactful element of SEO strategy. Each franchise territory requires dedicated, locally-relevant pages for every core service (water damage, fire damage, mold remediation, storm damage) with genuinely unique content — not templated pages with city names swapped in. A properly built three-tier hub-and-spoke model (national pillar → state hub → city page) across 200+ territories and 4 services creates 800+ keyword-rich pages that can collectively target 16,000+ organic keywords.

    What is Generative Engine Optimization (GEO) and why does it matter for restoration companies?

    Generative Engine Optimization (GEO) is the practice of optimizing content so that AI systems — including Google AI Overviews, ChatGPT, Claude, Gemini, and Perplexity — cite and recommend your business by name when users ask questions related to your services. For restoration companies, GEO involves entity saturation (consistent brand-attribute associations across the web), factual density (specific, verifiable claims rather than marketing language), authoritative citations (EPA, FEMA, IICRC standards), and LLMS.txt implementation. GEO represents the next frontier of search visibility as AI-generated answers increasingly replace traditional search results.

    How long would it take to rebuild 911 Restoration’s organic traffic?

    Based on the severity of the decline (94% from peak), a realistic timeline for recovery would be 6-12 months for technical fixes and initial content architecture to take effect, with meaningful traffic recovery visible within 4-6 months of implementing the full 10-step playbook. Full recovery to peak performance levels would likely require 12-18 months of sustained effort. However, the first 90 days typically deliver the highest-impact gains because technical SEO fixes (indexation issues, redirect chains, schema implementation) often produce immediate improvements once Google re-crawls the corrected pages.

  • I Deployed a Client-Facing Chatbot on Vertex AI for Less Than a Penny Per Conversation

    I Deployed a Client-Facing Chatbot on Vertex AI for Less Than a Penny Per Conversation

    The Client Asked for a Chatbot. I Built Them an Employee.

    A restoration client wanted a website chatbot. Their brief was simple: answer common questions about services, capture lead information, and route urgent inquiries to their dispatch team. The expectation was a monthly-subscription SaaS widget with canned responses.

    I built them something better. A custom chatbot running on Google Vertex AI via Cloud Run, trained on their specific service pages, pricing guidelines, and service area boundaries. It handles natural language questions, qualifies leads by asking the right follow-up questions, and routes urgent water damage calls directly to dispatch with full context. Cost per conversation: $0.002. That is two-tenths of a penny.

    At 500 conversations per month, the total AI cost is about $1. Add the modest Cloud Run hosting cost for the container, and the total infrastructure cost is a small fraction of the monthly-subscription SaaS product it replaces, while performing significantly better because it actually understands the business.

    The Architecture

    The chatbot runs on three components:

    Vertex AI (Gemini model): Handles the conversational intelligence. The model receives a system prompt loaded with the client’s service information, pricing ranges, service area (Houston metro), and qualification criteria. It responds conversationally, asks clarifying questions when needed, and structures lead information for capture.

    Cloud Run container: A lightweight Python FastAPI application that serves as the API endpoint. The WordPress site calls this endpoint via JavaScript when a visitor interacts with the chat widget. The container handles session management, conversation history, and the Vertex AI API calls. It scales to zero when not in use, so idle hours cost nothing.

    WordPress integration: A simple JavaScript widget on the client site that renders the chat interface and communicates with the Cloud Run endpoint. No WordPress plugin required. The widget is 40 lines of JavaScript that creates a chat bubble, handles user input, and displays responses.
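    To make the Cloud Run layer concrete, here is a minimal sketch of the session-handling core in plain Python. The actual Vertex AI Gemini request (which requires GCP credentials and the vertexai SDK) is stubbed out as `call_model`, and the FastAPI routing is omitted; names and prompt text are illustrative, not the production code:

```python
from collections import defaultdict

SYSTEM_PROMPT = (
    "You are the 24/7 assistant for a Houston restoration company. "
    "Qualify leads and escalate active water emergencies to dispatch."
)  # the production prompt also carries services, pricing ranges, zip codes

# session_id -> list of {"role", "text"} turns, kept server-side
sessions = defaultdict(list)

def call_model(system_prompt, history):
    """Stub. In production this is a single Vertex AI Gemini request
    sending the system prompt plus the full conversation history."""
    return f"[model reply to: {history[-1]['text']}]"

def handle_turn(session_id, user_text):
    """The body of the POST /chat endpoint: append the user turn, call
    the model with full context, then store and return the reply."""
    history = sessions[session_id]
    history.append({"role": "user", "text": user_text})
    reply = call_model(SYSTEM_PROMPT, history)
    history.append({"role": "model", "text": reply})
    return reply

print(handle_turn("abc123", "My water heater burst an hour ago"))
```

    Because the history lives in the container and the model call is stateless, the service can scale to zero between conversations, which is where the near-zero idle cost comes from.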

    Why Vertex AI Instead of OpenAI

    Cost: Gemini 1.5 Flash on Vertex AI costs significantly less per token than GPT-4 or GPT-3.5. For a chatbot handling short conversational exchanges, the per-conversation cost difference is dramatic.

    Data residency: Vertex AI runs on GCP infrastructure where I already have my project. Data stays within the Google Cloud ecosystem I control. No third-party API means the conversation data, which includes client contact information, stays within my GCP project boundaries.

    Scale-to-zero: Cloud Run only charges when processing requests. During overnight hours when nobody is chatting, the cost is literally zero. OpenAI’s API has the same pay-per-use model, but coupling it with Cloud Run for the hosting layer gives me full control over the deployment.

    The System Prompt That Makes It Work

    The chatbot’s intelligence comes entirely from its system prompt. No fine-tuning. No RAG pipeline. No vector database. Just a well-structured system prompt that contains the client’s service descriptions, pricing ranges (not exact quotes), service area zip codes, qualification questions, and escalation triggers.

    The prompt includes explicit instructions for lead qualification. When someone describes a water damage situation, the chatbot asks: When did the damage occur? Is it an active leak or standing water? What is the approximate affected area? Is this a residential or commercial property? Do you have insurance? These questions mirror what the dispatch team asks on phone calls.

    When the qualification criteria indicate an emergency (active leak, less than 24 hours, standing water), the chatbot provides the dispatch phone number prominently and offers to notify the team. Non-emergency inquiries get scheduled callback options.
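    The escalation rule described above reduces to a small predicate. An illustrative sketch, with the criteria taken directly from the paragraph (active leak, damage under 24 hours old, standing water):

```python
def triage(active_leak, hours_since, standing_water):
    """Mirror the escalation rule: any active leak, damage under
    24 hours old, or standing water is treated as an emergency."""
    if active_leak or hours_since < 24 or standing_water:
        return "emergency: show dispatch number, offer to notify team"
    return "non-emergency: offer scheduled callback"

print(triage(active_leak=True, hours_since=1, standing_water=True))
print(triage(active_leak=False, hours_since=72, standing_water=False))
```

    In the deployed system this decision lives in the system prompt rather than application code, but the routing logic it encodes is the same.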

    Results After 90 Days

    The chatbot handled 1,400 conversations in its first 90 days. Of those, 340 were qualified leads (24% conversion rate from chat to lead). Of the qualified leads, 89 became paying customers.

    The previous chatbot solution (a SaaS widget with canned response trees) had a 6% chat-to-lead conversion rate. The AI chatbot quadrupled it because it can actually understand what someone is describing and respond helpfully rather than forcing them through a rigid decision tree.

    Total infrastructure cost for 90 days: approximately . Total value of the 89 customers: several hundred thousand dollars in restoration work. The ROI is not a percentage – it is a category error to even calculate it.

    Frequently Asked Questions

    Can the chatbot handle multiple languages?

    Yes. Gemini handles multilingual conversations natively. The Houston market has a significant Spanish-speaking population, and the chatbot responds in Spanish when addressed in Spanish, without any additional configuration. This alone increased lead capture from a demographic the client was previously underserving.

    What happens when the chatbot cannot answer a question?

    The system prompt includes a graceful fallback: if the question is outside the defined scope, the chatbot acknowledges the limitation and offers to connect the visitor with a human team member via phone or scheduled callback. It never fabricates information about pricing or services.

    How hard is this to set up for a new client?

    About 3 hours. Create the Cloud Run service from the template, customize the system prompt with the client’s information, deploy, and add the JavaScript widget to their WordPress site. The infrastructure is templated – the customization is entirely in the system prompt content.

    The Bigger Point

    AI chatbots do not need to be expensive SaaS products with monthly subscriptions. The underlying technology – language models accessible via API – costs fractions of a penny per interaction. The value is in the deployment architecture and the domain-specific knowledge you embed in the system prompt. Own the infrastructure, own the intelligence, and the cost drops to near zero while the quality exceeds anything a canned-response widget can deliver.

    {
      "@context": "https://schema.org",
      "@type": "Article",
      "headline": "I Deployed a Client-Facing Chatbot on Vertex AI for Less Than a Penny Per Conversation",
      "description": "Using Google Vertex AI and Cloud Run, I deployed a production chatbot for a client site that handles FAQs, qualifies leads, and routes inquiries.",
      "datePublished": "2026-03-21",
      "dateModified": "2026-04-03",
      "author": {
        "@type": "Person",
        "name": "Will Tygart",
        "url": "https://tygartmedia.com/about"
      },
      "publisher": {
        "@type": "Organization",
        "name": "Tygart Media",
        "url": "https://tygartmedia.com",
        "logo": {
          "@type": "ImageObject",
          "url": "https://tygartmedia.com/wp-content/uploads/tygart-media-logo.png"
        }
      },
      "mainEntityOfPage": {
        "@type": "WebPage",
        "@id": "https://tygartmedia.com/i-deployed-a-client-facing-chatbot-on-vertex-ai-for-less-than-a-penny-per-conversation/"
      }
    }

  • What GEO Delivery Actually Looks Like Inside a Real Client Engagement

    What GEO Delivery Actually Looks Like Inside a Real Client Engagement

    Stop Theorizing. Here Is How It Actually Works.

    Most content about Generative Engine Optimization reads like a research paper. Theoretical frameworks. Hypothetical scenarios. Vague recommendations to “increase factual density” without showing what that looks like on a real page for a real client. This article is different. It walks through the actual GEO delivery process as it happens inside a production client engagement — from the initial audit, through the content changes, to the measurement of outcomes.

    No client names. No proprietary data. But the real methodology, the real workflow, and the real results framework that an agency can evaluate and decide whether to build, buy, or partner for.

    Phase 1: The AI Visibility Audit

    Every engagement starts with a baseline audit. Pull the client’s top 30 keywords by traffic and run each one through three systems: Google search (noting AI Overview presence and citations), ChatGPT with browsing (noting brand mentions and source citations), and Perplexity (noting inline citations). Log which queries trigger AI-generated results, whether the client is cited, and which competitors appear.

    The audit also evaluates the client’s content for AI-readiness. For each of the top 20 pages by traffic, score: factual density (verifiable facts per 100 words), citation quality (are sources named inline or absent), structural clarity (can a clean answer be extracted from each section), entity signals (is Organization and Person schema implemented), and AI crawlability (is the content in the HTML source or locked behind JavaScript rendering).
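    The factual-density score is simple arithmetic once the facts are counted. A minimal sketch, assuming the fact count comes from an editorial or LLM review pass (the example text is invented):

```python
def factual_density(text: str, fact_count: int) -> float:
    """Verifiable facts per 100 words. fact_count is supplied by a human
    or LLM review pass; this sketch only does the arithmetic."""
    words = len(text.split())
    if words == 0:
        return 0.0
    return round(fact_count / words * 100, 2)

page = "The industry reached 4.2 billion dollars in 2025 " * 25  # 200 words
print(factual_density(page, fact_count=5))  # 2.5
```

    Scoring the top 20 pages this way turns a vague "needs more facts" note into a number that can be tracked across the engagement.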

    The output is a scorecard that shows the client exactly where they stand across AI search channels and exactly what needs to change. Most clients score well on basic SEO metrics but poorly on factual density, citation quality, and schema completeness — which is why they rank in organic but are absent from AI citations.

    Phase 2: Content Enhancement

    The content work happens on the top 20 pages, prioritized by traffic and AI citation opportunity. Each page gets four treatments.

    Treatment one: factual density upgrade. Go paragraph by paragraph and replace every vague claim with a specific, verifiable fact. “The industry is growing” becomes “the industry reached $[X] billion in 2025 according to [named source].” “Many companies use this approach” becomes “a 2025 survey by [named institution] found that X percent of companies in [sector] have adopted this approach.” The target is at least one cited, verifiable fact per paragraph.

    Treatment two: answer block restructuring. Identify the primary question each page section answers. Rephrase the H2 heading as that question. Write a 40 to 60 word direct answer block immediately below. This serves both AEO (snippet extraction) and GEO (AI answer extraction) simultaneously.
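    As a hypothetical illustration of the pattern (the heading and placeholder text are mine, not client content):

```html
<!-- Question as the H2, direct answer immediately below -->
<h2>How long does a typical engagement take?</h2>
<p class="direct-answer">
  A 40-to-60-word direct answer goes here: one extractable paragraph that
  states the specific claim up front, with the supporting source named inline.
</p>
<!-- Expanded depth follows in subsequent paragraphs -->
```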

    Treatment three: entity signal strengthening. Ensure the page references the author with visible credentials. Add inline authority markers — “according to [author name], who has [X years] of experience in [domain]” — that AI systems use to evaluate source credibility.

    Treatment four: schema implementation. Apply Article or BlogPosting schema with complete properties — headline, author, datePublished, dateModified, publisher. Add FAQPage schema wrapping all Q&A pairs. Add Speakable schema marking the direct answer blocks. Validate all schema against Google’s Rich Results Test.
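    As an illustration, the FAQPage markup can be generated from a page's Q&A pairs. A minimal sketch using the schema.org property names; the helper itself is a hypothetical convenience, not an existing tool:

```python
import json

def faq_schema(pairs: list[tuple[str, str]]) -> str:
    """Build FAQPage JSON-LD from (question, answer) pairs,
    using schema.org's Question/acceptedAnswer structure."""
    doc = {
        "@context": "https://schema.org",
        "@type": "FAQPage",
        "mainEntity": [
            {
                "@type": "Question",
                "name": q,
                "acceptedAnswer": {"@type": "Answer", "text": a},
            }
            for q, a in pairs
        ],
    }
    return json.dumps(doc, indent=2)

print(faq_schema([("What is GEO?", "Generative Engine Optimization is ...")]))
```

    Whatever generates the markup, the output still goes through Google's Rich Results Test before it ships.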

    Phase 3: Technical GEO Infrastructure

    Beyond content, the engagement includes three infrastructure items. First: LLMS.txt implementation at the domain root, declaring the site’s authority areas, preferred citation format, and content access policies. Second: robots.txt review ensuring AI crawlers — GPTBot, ClaudeBot, PerplexityBot, Google-Extended — are not blocked. Third: a comprehensive sitemap update ensuring all enhanced pages are included and recently modified dates are current.
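    For reference, a robots.txt that leaves those AI crawlers unblocked can be as simple as the following sketch (the sitemap URL is a placeholder; site-specific rules will vary):

```txt
# Allow AI crawlers. An empty Disallow line permits full crawling
# for that user agent.
User-agent: GPTBot
Disallow:

User-agent: ClaudeBot
Disallow:

User-agent: PerplexityBot
Disallow:

User-agent: Google-Extended
Disallow:

Sitemap: https://example.com/sitemap.xml
```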

    These three items take under two hours to implement but create the technical foundation that enables AI systems to discover, crawl, and properly cite the enhanced content.

    Phase 4: Measurement and Iteration

    GEO measurement uses four metrics tracked monthly. AI Overview presence — the number of target keywords where the client’s content is cited in Google AI Overviews, tracked through Search Console’s AI Overview reporting. Featured snippet count — the number of target keywords where the client holds the featured position. AI platform citations — manual spot-checks querying ChatGPT and Perplexity with target questions and noting brand mentions. AI platform referral traffic — sessions from Perplexity, ChatGPT, and other AI search platforms tracked in analytics.
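    These four metrics lend themselves to a simple monthly snapshot. A minimal sketch in Python; the class and field names are my own convenience, not a standard reporting tool, and the numbers are invented:

```python
from dataclasses import dataclass

@dataclass
class GeoSnapshot:
    """One month of GEO measurements for a client."""
    month: str
    ai_overview_citations: int   # keywords cited in Google AI Overviews
    featured_snippets: int       # keywords holding the featured position
    ai_platform_citations: int   # spot-checked ChatGPT/Perplexity mentions
    ai_referral_sessions: int    # sessions from AI platforms in analytics

def delta(prev: GeoSnapshot, curr: GeoSnapshot) -> dict[str, int]:
    """Month-over-month change for each tracked metric."""
    fields = ("ai_overview_citations", "featured_snippets",
              "ai_platform_citations", "ai_referral_sessions")
    return {f: getattr(curr, f) - getattr(prev, f) for f in fields}

baseline = GeoSnapshot("2026-01", 0, 2, 0, 0)
month3 = GeoSnapshot("2026-03", 3, 7, 4, 110)
print(delta(baseline, month3))
```

    The delta against baseline is what goes in the client report, not the raw counts.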

    The iteration cycle runs monthly. Pages that gained AI visibility get maintained. Pages that did not are re-audited for specific deficiencies — usually factual density or structural issues that prevent clean answer extraction. New pages are added to the enhancement queue based on ranking improvements from the concurrent SEO work.

    Typical Timeline and Results

    Month one: audit and first batch of content enhancements across the top 10 pages. Technical infrastructure implemented. Baseline measurements established.

    Month two: second batch of enhancements on pages 11 through 20. First featured snippet wins typically appear. Schema validation and refinement.

    Month three: full measurement cycle comparing baseline to current state. AI Overview citations typically begin appearing for 2 to 5 target keywords. Referral traffic from AI platforms begins showing in analytics.

    By month six, a well-executed engagement typically shows 8 to 15 featured snippet positions, measurable AI Overview citations, and AI platform referral traffic as a visible line in the analytics dashboard. These results sit on top of the organic SEO gains — they are additive, not substitutive.

    FAQ

    How many hours per month does a GEO engagement require?
    For an initial enhancement: a concentrated effort in month one, then a regular ongoing commitment for monitoring, iteration, and expansion to additional pages.

    Can GEO work be done without access to the client’s CMS?
    Content recommendations and schema code can be delivered as specifications for the client’s team to implement. But direct CMS access dramatically accelerates delivery and reduces the implementation gap between recommendation and execution.

    What is the minimum site size for a GEO engagement?
    Any site with at least 20 published pages targeting commercial or informational keywords has enough content for a meaningful GEO engagement. Smaller sites benefit from content creation alongside GEO optimization.

    {
      "@context": "https://schema.org",
      "@type": "Article",
      "headline": "What GEO Delivery Actually Looks Like Inside a Real Client Engagement",
      "description": "Here Is How It Actually Works. Most content about Generative Engine Optimization reads like a research paper.",
      "datePublished": "2026-03-21",
      "dateModified": "2026-04-03",
      "author": {
        "@type": "Person",
        "name": "Will Tygart",
        "url": "https://tygartmedia.com/about"
      },
      "publisher": {
        "@type": "Organization",
        "name": "Tygart Media",
        "url": "https://tygartmedia.com",
        "logo": {
          "@type": "ImageObject",
          "url": "https://tygartmedia.com/wp-content/uploads/tygart-media-logo.png"
        }
      },
      "mainEntityOfPage": {
        "@type": "WebPage",
        "@id": "https://tygartmedia.com/what-geo-delivery-actually-looks-like-inside-a-real-client-engagement/"
      }
    }

  • The Before-and-After Framework: How to Build AEO/GEO Case Studies That Close Agency Deals

    The Before-and-After Framework: How to Build AEO/GEO Case Studies That Close Agency Deals

    Proof Sells Partnerships. Here’s How to Build It.

    Every agency owner has heard the pitch. Some vendor walks in, talks about a new optimization layer, shows a few charts, and expects you to sign. You’ve been on the receiving end of that pitch. You know how it feels. Hollow.

    So when you’re considering adding AEO and GEO capabilities to your agency — whether through a fractional partner like Tygart Media or by building internally — you need proof that isn’t a slide deck. You need a framework that shows exactly what changed, why it changed, and what it meant for the client’s business.

    This is the before-and-after framework we use at Tygart Media to document AEO and GEO impact. It’s the same framework we hand to agency partners so they can build their own proof library. Because the agencies that win the next decade of search aren’t the ones with the best pitch — they’re the ones with the best receipts.

    Why Traditional SEO Case Studies Don’t Work for AEO/GEO

    Traditional SEO case studies follow a familiar pattern: we ranked position 4, now we rank position 1, traffic went up 40%. That story works when the entire game is organic rankings and click-through rates. But AEO and GEO operate in spaces where those metrics tell an incomplete story.

    Answer Engine Optimization wins show up as featured snippet captures, People Also Ask placements, voice search selections, and zero-click visibility. A client might see their brand quoted directly in a Google search result without anyone clicking through. That’s a win — but it doesn’t look like one in a traditional traffic report.

    Generative Engine Optimization wins are even harder to capture with legacy metrics. When Claude, ChatGPT, Perplexity, or Google AI Overviews cite your client’s content as a source, that’s brand authority at scale. But it doesn’t show up in Google Analytics the way a backlink campaign does.

    The framework below captures these new forms of value so you can show clients — and prospects — exactly what AEO/GEO delivers.

    The Five-Layer Before-and-After Framework

    Layer 1: Baseline Snapshot

    Before you touch anything, document the current state across five dimensions. This becomes your “before” evidence. Miss this step and you have no story to tell later.

    For AEO baseline, capture: current featured snippet ownership (which queries, what format), People Also Ask presence, existing FAQ schema implementation, voice search readiness score, and zero-click visibility for target queries. Use tools like SEMrush or Ahrefs to pull SERP feature data, and manually search the top 20 target queries to screenshot current results.

    For GEO baseline, capture: current AI citation presence (search the client’s brand in ChatGPT, Claude, Perplexity, and Google AI Overviews), entity signal strength (do they have a knowledge panel, consistent NAP+W, organization schema), factual density score of key pages (verifiable facts per 100 words), and LLMS.txt status. This baseline often shocks agency owners — most clients have zero AI citation presence.

    Layer 2: The Optimization Map

    Document every change you make, categorized by type. This isn’t just for the case study — it’s your replication playbook. For each change, record: what was modified, which framework it falls under (SEO/AEO/GEO), the specific technique applied, and the expected impact mechanism.

    Example entry: “Restructured the main service page FAQ section. AEO framework. Applied the snippet-ready content pattern — question as H2, direct 40-60 word answer paragraph, then expanded depth. Expected to capture paragraph snippet for ‘what is [service]’ query cluster.”

    Layer 3: The 30-60-90 Day Measurement

    AEO and GEO results don’t follow the same timeline as traditional SEO. Featured snippets can flip within days. AI citations can appear within weeks of content optimization. But some wins compound over months. Structure your measurement in three phases.

    At 30 days, measure: new featured snippet captures, PAA placements gained, schema validation improvements, and initial AI citation checks. At 60 days, measure: snippet retention rate, voice search selection data (if available through Search Console), entity signal improvements in knowledge panels, and expanded AI citation checks across multiple AI platforms. At 90 days, measure: compound effects — are AI systems citing the client more consistently, are snippet wins holding, has the client’s topical authority score improved, and what’s the aggregate impact on brand visibility across both traditional and AI search?

    Layer 4: The Revenue Translation

    This is where most case studies fail. They show metrics but don’t connect them to money. For every AEO/GEO win, translate it to business impact. Featured snippet for a high-intent query? Calculate the equivalent PPC cost for that visibility. AI citation in Perplexity for a buying-intent query? Estimate the brand impression value. Zero-click visibility increase? Show the brand awareness equivalent in paid media terms.

    The formula we use: (estimated impressions from the AEO/GEO placement ÷ 1,000) × (equivalent CPM if purchased through paid channels) = visibility value. Then layer on: (estimated impressions) × (click-through rate from snippet/citation) × (conversion rate) × (average deal value) = direct revenue attribution. Both numbers matter. The visibility value justifies the investment. The revenue attribution proves the ROI.
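    The two formulas are simple enough to sketch directly. The inputs below are invented illustrations, not client data; note that CPM is a cost per thousand impressions:

```python
def visibility_value(impressions: float, cpm: float) -> float:
    """Equivalent paid-media cost of the placement (CPM is per 1,000)."""
    return impressions * cpm / 1000

def revenue_attribution(impressions: float, ctr: float,
                        conversion_rate: float, avg_deal_value: float) -> float:
    """Direct revenue attributed: impressions -> clicks -> deals."""
    return impressions * ctr * conversion_rate * avg_deal_value

# Illustrative inputs (assumptions, not client data):
# 10,000 monthly impressions, $25 CPM, 3% CTR, 5% close rate, $4,000 deal.
print(visibility_value(10_000, 25))  # 250.0
print(revenue_attribution(10_000, 0.03, 0.05, 4_000))
```

    Conservative inputs keep both numbers defensible in front of a skeptical prospect.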

    Layer 5: The Competitive Delta

    The most persuasive element of any case study isn’t what you did — it’s what the client’s competitors can’t do. Show the gap. For each major win, document: which competitors were previously holding that featured snippet (and lost it), which competitors have zero AI citation presence (while your client now has consistent citations), and which competitors lack the schema infrastructure to compete for these placements.

    This competitive delta turns a case study from “here’s what we did” into “here’s the moat we built.” Agency owners love moats. Their clients love moats even more.

    Building Your Proof Library

    One case study is an anecdote. Three is a pattern. Ten is a proof library that closes deals. Start building yours now, even if you’re just beginning to offer AEO/GEO services. Document every engagement from day one using this framework. The agencies that started building proof libraries six months ago are already closing partnership deals that the “we’ll figure out case studies later” agencies are losing.

    At Tygart Media, we provide our agency partners with templated versions of this framework, pre-built measurement dashboards, and quarterly proof library reviews. Because your case studies aren’t just marketing collateral — they’re the foundation of every partnership conversation you’ll have for the next five years.

    Frequently Asked Questions

    How long does it take to build a compelling AEO/GEO case study?

    A complete before-and-after case study using this five-layer framework takes 90 days from baseline to final measurement. However, you can show early AEO wins like featured snippet captures within 30 days, giving you preliminary proof while the full study matures.

    What tools do I need to measure GEO results?

    For GEO measurement, manually query AI platforms (ChatGPT, Claude, Perplexity, Google AI Overviews) for your client’s target terms and document citations. Automated GEO tracking tools are emerging but manual verification remains the gold standard for case study accuracy as of 2026.

    Can I use this framework for clients who only have SEO services currently?

    Absolutely. Running a baseline AEO/GEO audit on an existing SEO client is one of the most powerful upsell tools available. The baseline snapshot alone — showing zero featured snippet ownership and zero AI citations — creates immediate urgency to add these optimization layers.

    How do I calculate the revenue value of an AI citation?

    Use the equivalent paid media model: estimate impressions from the AI platform’s user base for that query category, apply equivalent CPM rates from paid channels, then layer on any measurable click-through and conversion data. Conservative estimates are more credible than inflated projections in case studies.

    {
      "@context": "https://schema.org",
      "@type": "Article",
      "headline": "The Before-and-After Framework: How to Build AEO/GEO Case Studies That Close Agency Deals",
      "description": "A proven case study framework showing agency owners how to document AEO and GEO wins with before-and-after proof that converts prospects into partners.",
      "datePublished": "2026-03-21",
      "dateModified": "2026-04-03",
      "author": {
        "@type": "Person",
        "name": "Will Tygart",
        "url": "https://tygartmedia.com/about"
      },
      "publisher": {
        "@type": "Organization",
        "name": "Tygart Media",
        "url": "https://tygartmedia.com",
        "logo": {
          "@type": "ImageObject",
          "url": "https://tygartmedia.com/wp-content/uploads/tygart-media-logo.png"
        }
      },
      "mainEntityOfPage": {
        "@type": "WebPage",
        "@id": "https://tygartmedia.com/the-before-and-after-framework-how-to-build-aeo-geo-case-studies-that-close-agency-deals/"
      }
    }