Tag: Featured Snippets

  • Continuous Radon Monitors vs. Passive Test Kits: Complete Comparison

    The Distillery — Brew № 1 · Radon Mitigation

    Two fundamentally different approaches to measuring radon exist: passive test kits that absorb or record radon over a fixed period and are analyzed by a lab, and continuous electronic monitors that measure radon concentration in real time and display running averages. Each has specific use cases, limitations, and accuracy profiles. Choosing the wrong tool for your situation produces either a false sense of security or unnecessary alarm.

    Passive Test Kits: The Lab-Certified Standard

    Charcoal Canisters (Short-Term)

    Activated charcoal canisters are the most common residential radon test device. Charcoal adsorbs radon gas from ambient air during the 48–96 hour exposure period. The canister is sealed and mailed to a lab, where gamma spectroscopy measures radon decay products accumulated in the charcoal and calculates average concentration over the test period.

    • Accuracy: ±10–15% under controlled conditions when conducted properly
    • Cost: $15–$30 including lab analysis
    • Turnaround: Results in 3–7 business days after mailing
    • Certification: Accepted for real estate transactions and regulatory purposes when conducted by NRPP/NRSB-certified professionals
    • Limitation: Single snapshot — captures conditions only during the 48–96 hour window, which may not represent the home’s annual average
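
    One practical consequence of the mail-in step: radon-222 decays with a half-life of about 3.82 days, so the activity captured in the canister falls measurably between the end of the test and lab counting, and labs must correct for the transit delay. A sketch of the arithmetic (function name illustrative; real lab calibrations also account for humidity and the exposure profile):

```python
# Illustrative decay correction for a charcoal canister result.
# Radon-222 has a half-life of ~3.82 days, so activity counted at the
# lab understates activity at the end of the test by a factor of
# 2^(t / half_life), where t is the transit time in days.

RN222_HALF_LIFE_DAYS = 3.82

def decay_correction_factor(days_in_transit: float) -> float:
    """Factor the lab multiplies by to recover end-of-test activity."""
    return 2 ** (days_in_transit / RN222_HALF_LIFE_DAYS)

# A canister that spends 3 days in the mail loses about 42% of its
# radon activity before counting:
factor = decay_correction_factor(3.0)
print(round(factor, 2))          # 1.72
print(round(1 - 1 / factor, 2))  # 0.42 (fraction decayed in transit)
```

    This is also why kits instruct you to mail the canister promptly: the longer the transit, the larger (and noisier) the correction.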

    Alpha Track Detectors (Long-Term)

    Alpha track detectors contain a small piece of plastic film (typically CR-39 or LR-115) that records microscopic damage tracks from alpha particles emitted during radon decay. The cumulative track count over the 90-day to 1-year exposure period is proportional to average radon concentration. The lab etches the film and counts the tracks under a microscope.

    • Accuracy: ±8–12% for properly conducted 90-day+ tests — the most accurate passive measurement available for annual average determination
    • Cost: $25–$45 including lab analysis
    • Turnaround: Minimum 90 days in home; lab results within 1–2 weeks after return
    • Certification: Accepted for annual average determination and regulatory purposes
    • Advantage: Averages out all seasonal, pressure, and weather variability — the closest proxy to true annual average exposure

    Electret Ion Chambers (Short- or Long-Term)

    Electret ion chambers use an electrostatically charged disk (an electret) inside an ionization chamber. Radon entering the chamber ionizes the air, gradually discharging the electret. The voltage drop is measured at the end of the test and converted to radon concentration. Electrets are more expensive than charcoal or alpha track devices, but they can be reused multiple times and generate same-day results in the field when a professional reads the electret on-site.

    • Cost: $50–$200 per test (professional use) or $150–$400 for consumer-grade reusable kits
    • Turnaround: Immediate (field-read) or lab-read
    • Use: Most common in professional measurement contexts, not typical for DIY homeowner use

    Continuous Electronic Radon Monitors

    Continuous radon monitors use electronic sensors — typically pulse ionization chambers or solid-state alpha detectors — to measure radon concentration continuously and display results in real time or as running averages. Consumer-grade models are widely available; professional-grade units are used by certified measurement professionals for real estate and compliance testing.

    Consumer-Grade Continuous Monitors

    Popular models: Airthings Wave Plus (~$230), Airthings Wave Radon (~$200), Corentium Home (~$150), RadonEye RD200 (~$130), Safety Siren Pro3 (~$130).

    • Accuracy: ±10–20% at radon levels near 4.0 pCi/L; accuracy typically degrades at lower concentrations (<1.0 pCi/L)
    • Display: Real-time readings (hourly or faster), 24-hour average, 7-day average, long-term average
    • Cost: $130–$230 (no ongoing lab fees)
    • Certification: Not accepted for real estate transactions or regulatory compliance in most states — consumer monitors are monitoring tools, not certified measurement devices
    • Advantage: Real-time visibility into radon fluctuations; immediate feedback when conditions change; ongoing monitoring without repeated lab costs

    Professional-Grade Continuous Monitors

    Professional instruments (Sun Nuclear 1028, Femto-TECH CRM 510, RadStar Alpha Series) are calibrated devices used by certified measurement professionals. They record hourly radon data, generate tamper-evident data logs, and produce certified reports accepted for real estate and regulatory purposes.

    • Accuracy: ±5–10% with proper calibration
    • Cost: $800–$2,500 per unit (professional purchase); $150–$400 per test when hired professionally
    • Certification: Accepted for real estate, regulatory, and legal purposes

    Side-by-Side Comparison

    Feature              | Charcoal Canister                  | Alpha Track                  | Consumer Monitor        | Pro Monitor
    Duration             | 48–96 hrs                          | 90 days–1 year               | Continuous              | 48–96 hrs (typical)
    Accuracy             | ±10–15%                            | ±8–12%                       | ±10–20%                 | ±5–10%
    Cost per test        | $15–$30                            | $25–$45                      | $130–$230 (one-time)    | $150–$400
    Real estate accepted | Yes (certified)                    | Yes (certified)              | No                      | Yes
    Results speed        | Days after mail                    | Weeks after mail             | Real-time               | Days after test
    Best for             | Initial screening, post-mitigation | Annual average, confirmation | Ongoing home monitoring | Real estate, compliance

    Which Should You Use?

    • First-time screening of your home: Start with a charcoal canister ($15–$30). If elevated, follow up with a long-term alpha track test.
    • Buying or selling a home: Hire a certified professional using a professional-grade continuous monitor or charcoal canister — consumer monitors are not accepted.
    • Ongoing monitoring after mitigation: A consumer monitor ($130–$230) provides real-time peace of mind between formal 2-year retests.
    • Most accurate annual average for a confirmed radon home: A 90-day to 1-year alpha track detector.
    • Post-mitigation confirmation: A 48-hour charcoal canister placed at least 24 hours after system activation.

    Frequently Asked Questions

    Are Airthings monitors accurate enough to replace a radon test kit?

    For personal monitoring purposes, consumer monitors like Airthings Wave provide useful ongoing visibility into radon fluctuations. They are not accepted replacements for lab-certified tests in real estate transactions, regulatory contexts, or official post-mitigation verification. For those purposes, a charcoal canister or professional monitor is required.

    Why do continuous monitors and charcoal tests sometimes show different results for the same home?

    Radon levels fluctuate significantly — sometimes by 30–50% — over 24–48 hour periods due to barometric pressure, temperature, and wind changes. A charcoal test captures a specific 48–96 hour window; a continuous monitor’s 7-day or 30-day average includes multiple high and low periods. Additionally, consumer monitors have higher measurement uncertainty at low concentrations. Minor discrepancies are expected; large discrepancies (more than 40%) warrant investigation of device placement or closed-house conditions.
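
    The "more than 40%" rule of thumb above reduces to a simple relative-difference check. In this sketch the discrepancy is measured against the mean of the two results, which is one reasonable reading of the rule; the function names and sample numbers are illustrative:

```python
def relative_discrepancy(result_a: float, result_b: float) -> float:
    """Relative difference between two radon results, as a fraction
    of their mean (symmetric, so argument order does not matter)."""
    mean = (result_a + result_b) / 2
    return abs(result_a - result_b) / mean

def needs_investigation(charcoal: float, monitor_avg: float,
                        threshold: float = 0.40) -> bool:
    """Flag result pairs whose discrepancy exceeds the ~40% rule of
    thumb for checking placement or closed-house conditions."""
    return relative_discrepancy(charcoal, monitor_avg) > threshold

print(needs_investigation(4.1, 3.2))  # False (ordinary fluctuation)
print(needs_investigation(6.0, 2.5))  # True (investigate the setup)
```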

    How long does a continuous monitor need to run to give a reliable radon reading?

    Consumer continuous monitors typically need at least 7 days of operation to stabilize their running averages. At 30 days, the average becomes reasonably representative of prevailing conditions. At 90+ days, the long-term average approximates the kind of seasonal averaging achieved by alpha track detectors. Do not make mitigation decisions based on readings from the first 24–72 hours of monitor operation.
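
    The running averages described above are plain trailing means over a window of hourly readings. This is a generic sketch of the idea, not any vendor's actual firmware, and the sample readings are made up:

```python
from collections import deque

def running_average(readings, window_hours):
    """Trailing average over the last `window_hours` hourly readings,
    emitted only once the window has filled."""
    window = deque(maxlen=window_hours)
    averages = []
    for r in readings:
        window.append(r)
        if len(window) == window_hours:
            averages.append(sum(window) / window_hours)
    return averages

# Hypothetical hourly readings (pCi/L) swinging between 5.0 and 3.0:
hourly = [5.0, 3.0] * 12             # 24 hours of data
print(running_average(hourly, 24))   # [4.0] -- the 24-hr avg smooths the swings
```

    The same mechanism explains why the early readings are unreliable: until the window fills, each new hour shifts the average substantially.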

  • Radon Testing for Home Sales: Buyer and Seller Guide

    Radon testing has become a near-universal component of home inspection contingencies in high-risk states — and increasingly, a standard expectation in real estate transactions nationwide. Buyers, sellers, and agents who understand how testing works in a transaction context avoid delays, failed deals, and post-closing disputes.

    When Radon Testing Happens in a Real Estate Transaction

    Radon testing in real estate is typically conducted during the home inspection contingency period — usually 7–15 days after an offer is accepted. The buyer orders and pays for the test as part of due diligence, either as part of a general home inspection package or as a standalone radon test. Testing can be conducted simultaneously with other inspections.

    The test device is placed in the lowest livable level of the home under closed-house conditions and collected after the minimum 48-hour exposure period. Results typically return within 24–72 hours of the lab receiving the device; real estate tests are commonly expedited.

    What Test Type Is Used in Real Estate Transactions

    Real estate radon tests are almost exclusively short-term charcoal canister tests, typically 48–96 hours. Long-term tests (90+ days) are incompatible with transaction timelines. This creates an inherent limitation: a single 48-hour test during a specific weather window may not accurately represent the home’s annual average radon level.

    Most radon measurement professionals conducting real estate tests follow EPA protocols and the ANSI/AARST measurement standard for homes (MAH). Key requirements under ANSI/AARST MAH:

    • Closed-house conditions maintained during the 12 hours before and throughout the test
    • Device placed in the lowest livable level
    • Device placement in accordance with EPA placement protocol (breathing zone, away from drafts)
    • Chain-of-custody documentation
    • Results certified by an NRPP- or NRSB-certified measurement professional

    Who Can Conduct the Real Estate Radon Test

    Many states require that real estate radon tests be conducted by a certified radon measurement professional — not by the buyer, seller, or real estate agent. Even in states without this requirement, buyers and lenders often prefer certified professional testing to ensure compliance with ANSI/AARST measurement protocols and to have a defensible measurement if disputes arise.

    Certification is granted by the National Radon Proficiency Program (NRPP) or the National Radon Safety Board (NRSB). Both maintain searchable professional directories. Verify a professional’s credentials before engaging them for a certified real estate measurement.

    Negotiating After an Elevated Radon Test Result

    When the real estate radon test returns at or above the EPA action level of 4.0 pCi/L, buyers have several options:

    Option 1: Seller Installs Mitigation Before Closing

    The most common outcome. The seller agrees to install a radon mitigation system, with post-mitigation testing confirming results below 4.0 pCi/L before the transaction closes. Buyers should specify in writing that the seller engages a certified mitigator (NRPP or NRSB) and that post-mitigation testing is conducted by a certified professional — not by the mitigating contractor alone.

    Option 2: Seller Credit Toward Buyer Mitigation

    The seller provides a credit (typically $800–$2,000 depending on the market) and the buyer handles mitigation after closing. This is simpler for both parties when installation timing creates logistical challenges. Buyers should be aware that “seller credit for radon mitigation” does not obligate the buyer to actually use the funds for mitigation — but it does shift responsibility.

    Option 3: Price Reduction

    Less common than a credit or seller-installed system, but sometimes used in negotiations where the buyer wants to control the mitigation process independently.

    Option 4: Walk Away

    Buyers who include a radon contingency in their offer can exit the transaction without penalty if radon levels are at or above the specified threshold (typically 4.0 pCi/L) and the seller declines to remediate. Well-drafted real estate contracts specify what constitutes an elevated result and what remedies the buyer is entitled to.

    Seller Strategy: Test Before Listing

    Sellers who test before listing gain significant advantages:

    • Control over timing and contractor selection: You choose the mitigator, schedule the work on your timeline, and select the post-mitigation test timing — none of which are in your control when the buyer discovers the issue during inspection
    • Avoid renegotiation: A pre-listing mitigation system eliminates radon from the negotiation entirely — buyers see a documented mitigated home
    • Avoid deal delays: Mitigation installation and post-mitigation testing can take 1–2 weeks; if discovered during the inspection period, this creates timeline pressure
    • Documentation for disclosure: Pre-listing testing and mitigation provides complete documentation — pre-mitigation level, system installation records, post-mitigation level — which satisfies disclosure requirements in states that mandate them

    State Radon Disclosure Laws

    Radon disclosure requirements vary significantly by state. As of 2026:

    • States with mandatory radon disclosure: Illinois, Florida, Maine, Virginia, and others require sellers to disclose known radon test results or the presence of a mitigation system
    • States with no specific radon disclosure law: Sellers may still have general duty to disclose known material defects — and elevated radon likely qualifies as a material defect in most jurisdictions even without a specific radon statute
    • Federally subsidized housing: EPA guidelines apply to FHA, VA, and HUD-insured properties, which may have radon testing requirements in high-risk zones

    Sellers should consult their state’s real estate commission guidance and a licensed real estate attorney for jurisdiction-specific disclosure obligations. Failure to disclose a known elevated radon level has resulted in post-closing litigation in multiple states.

    Frequently Asked Questions

    Who pays for the radon test when buying a house?

    The buyer typically pays for the initial radon test as part of due diligence, similar to other inspection costs. If the test reveals elevated levels and the seller agrees to mitigate, the seller bears the mitigation cost. Post-mitigation testing is sometimes split or included in the mitigator’s quote.

    Can a seller refuse to test for radon?

    In most states, sellers cannot prevent a buyer from conducting a radon test during an inspection contingency period — the seller must provide reasonable access. However, sellers are not generally required to test their own home proactively unless mandated by state law or specific transaction conditions.

    What radon level will fail a home inspection?

    There is no pass/fail standard for home inspections — radon is a risk factor, not a code violation. However, results at or above the EPA action level of 4.0 pCi/L trigger the buyer’s right to negotiate remediation under most real estate contracts that include a radon contingency. Some buyers set lower thresholds (2.0 pCi/L) in their contracts.

    My home already has a radon mitigation system — do I still need to test?

    Yes. Real estate buyers routinely request a current radon test even in homes with existing mitigation systems, because: the system may have been installed years ago, fan performance degrades over time, and new entry pathways can develop from foundation settling. Sellers with existing systems should have the most recent post-mitigation test results available.

  • Radon Test Results: What Your pCi/L Number Actually Means

    Your radon test came back with a number. Now you need to know what that number means — not just whether it is above or below an arbitrary threshold, but what the actual health risk is at that concentration, what the EPA recommends at each level, and what your realistic options are. This guide translates pCi/L into plain language.

    What Is pCi/L?

    Picocuries per liter (pCi/L) is the standard U.S. measurement unit for radon concentration in air. One picocurie represents approximately 2.2 radioactive disintegrations per minute in one liter of air. The measurement reflects how much radon decay activity is occurring in the air you breathe.
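
    The unit arithmetic is worth seeing once: 1 pCi = 0.037 Bq (one becquerel is one disintegration per second), and 1 L = 0.001 m³, so 1 pCi/L = 37 Bq/m³, the SI-based unit used in WHO guidance and outside the U.S. A short worked conversion:

```python
# Unit arithmetic behind pCi/L.
BQ_PER_PCI = 0.037      # becquerels per picocurie (disintegrations/sec)
LITERS_PER_M3 = 1000.0

def pci_l_to_bq_m3(pci_l: float) -> float:
    """Convert pCi/L to the SI-based Bq/m^3 (1 pCi/L = 37 Bq/m^3)."""
    return pci_l * BQ_PER_PCI * LITERS_PER_M3

def disintegrations_per_minute(pci_l: float) -> float:
    """Radioactive decays per minute in one liter of air."""
    return pci_l * BQ_PER_PCI * 60

print(round(pci_l_to_bq_m3(4.0)))                  # 148 -- EPA action level
print(round(pci_l_to_bq_m3(2.7)))                  # 100 -- WHO reference level
print(round(disintegrations_per_minute(1.0), 2))   # 2.22
```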

    For context: the average outdoor radon level in the U.S. is approximately 0.4 pCi/L. The average indoor level is 1.3 pCi/L — already elevated above outdoor air simply because buildings concentrate radon that enters from the soil. EPA considers 4.0 pCi/L the action level at which mitigation is recommended.

    The EPA Action Level: 4.0 pCi/L

    The EPA’s 4.0 pCi/L action level is not a bright line between “safe” and “dangerous.” It is a practical threshold chosen to balance risk reduction with the cost and feasibility of mitigation. EPA has also established a 2.0 pCi/L “consider mitigating” level — acknowledging that even at concentrations between 2.0 and 4.0 pCi/L, radon exposure contributes meaningfully to lifetime lung cancer risk.

    The World Health Organization (WHO) uses a lower reference level of 2.7 pCi/L (100 Bq/m³), reflecting evidence that significant risk exists below EPA’s 4.0 threshold. Many European countries use the WHO reference level or lower values in their national radon programs.

    Health Risk at Each Concentration Level

    EPA publishes risk estimates for radon exposure using lifetime lung cancer risk per 1,000 people exposed continuously at each concentration level. These estimates apply to never-smokers — smokers face dramatically compounded risk because radon decay products and tobacco smoke synergistically damage lung tissue.

    Radon Level (pCi/L)   | Est. Lung Cancer Deaths per 1,000 Never-Smokers | EPA Recommendation
    0.4 (outdoor average) | ~0.4                                            | Baseline — outdoor air
    1.3 (indoor average)  | ~1.0                                            | National average
    2.0                   | ~1.5                                            | Consider mitigating
    4.0                   | ~2.9                                            | Mitigate
    8.0                   | ~5.8                                            | Mitigate without waiting for confirmatory test
    20.0                  | ~14.7                                           | Mitigate immediately

    For comparison: according to EPA risk comparisons, radon at 4.0 pCi/L carries roughly the same lifetime lung cancer risk as receiving 200 chest X-rays per year, or smoking approximately 8 cigarettes per day. At 20 pCi/L, the risk approaches that of smoking two packs per day.
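
    For levels between the table's rows, a piecewise-linear interpolation gives a rough estimate. This is purely illustrative: the numbers come from the risk table in this guide, and EPA's underlying risk model is not literally linear:

```python
# Linear interpolation over the never-smoker risk table above
# (deaths per 1,000 at lifetime exposure). Illustrative only.

RISK_TABLE = [  # (pCi/L, estimated deaths per 1,000 never-smokers)
    (0.4, 0.4), (1.3, 1.0), (2.0, 1.5),
    (4.0, 2.9), (8.0, 5.8), (20.0, 14.7),
]

def estimated_risk(pci_l: float) -> float:
    """Piecewise-linear interpolation between table rows,
    clamped at the top of the table's range."""
    lo_x, lo_y = RISK_TABLE[0]
    for hi_x, hi_y in RISK_TABLE[1:]:
        if pci_l <= hi_x:
            t = (pci_l - lo_x) / (hi_x - lo_x)
            return lo_y + t * (hi_y - lo_y)
        lo_x, lo_y = hi_x, hi_y
    return lo_y  # above 20 pCi/L: clamp to the last row

print(round(estimated_risk(6.0), 2))   # 4.35 -- between the 4.0 and 8.0 rows
```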

    What to Do at Each Level

    Below 2.0 pCi/L

    No action required. Retest in 2 years, or after any significant renovations that affect the foundation or HVAC system. If your result is below 1.3 pCi/L, your home is below the national indoor average.

    2.0–3.9 pCi/L

    EPA recommends considering mitigation. This is not a mandate — mitigation at this level is a personal risk decision. Factors that strengthen the case for mitigation even below 4.0 pCi/L:

    • Smokers in the household (radon and tobacco risk multiply, not add)
    • Young children who will spend decades in the home
    • Plans to finish a basement or spend more time in the lower level
    • Result was from a short-term test in favorable conditions — actual annual average may be higher

    Mitigation in this range typically costs the same as mitigation at 10 pCi/L — the system is the same. The only question is whether the risk reduction justifies the investment at your specific level.

    4.0–7.9 pCi/L

    At or above the EPA action level. EPA recommends mitigation. If the result was from a short-term test, conduct a confirmatory long-term test or second short-term test before proceeding — unless you want to mitigate without waiting, which is always safe to do. If confirmed above 4.0 pCi/L, install an active radon mitigation system.

    8.0 pCi/L or Higher

    Mitigate without waiting for a confirmatory test. At this concentration, the cumulative risk from continued exposure while conducting additional testing is not justified by the modest additional certainty a second test provides. Contact a certified radon mitigator and schedule installation.

    Post-Mitigation Results: What to Expect

    A properly installed active sub-slab depressurization (ASD) system typically reduces radon levels by 85–99%. Common post-mitigation results:

    • A home at 12 pCi/L before mitigation commonly achieves 0.5–1.5 pCi/L after a single-point ASD installation with good aggregate conditions
    • A home at 4.5 pCi/L commonly achieves 0.3–0.8 pCi/L
    • Post-mitigation results above 4.0 pCi/L indicate insufficient suction coverage, unsealed entry pathways, or an undersized fan — and warrant a contractor callback

    EPA recommends post-mitigation testing no sooner than 24 hours after system activation: either start a continuous monitor at that point or place a short-term test and run it for a minimum of 48 hours. The target is below 4.0 pCi/L; most installations achieve below 2.0 pCi/L.

    Frequently Asked Questions

    Is 3.9 pCi/L safe?

    It is below the EPA action level of 4.0 pCi/L, so EPA does not mandate mitigation. However, the risk difference between 3.9 and 4.0 pCi/L is negligible — they represent essentially the same health risk. EPA recommends “considering mitigation” at 2.0 pCi/L, so at 3.9 pCi/L you are in the range where mitigation is a reasonable personal risk decision even if not required.

    What is a safe radon level?

    There is no radon level that carries zero risk — even outdoor radon (0.4 pCi/L) contributes some cumulative exposure. The EPA action level of 4.0 pCi/L represents a pragmatic threshold for mandatory action, not a definition of “safe.” Many health organizations, including the WHO, recommend action at 2.7 pCi/L or lower. Reducing radon levels as low as reasonably achievable is always the goal.

    My test result is in WL, not pCi/L. How do I convert?

    Working level (WL) is an older measurement unit still used in some occupational and commercial radon standards. To convert: 1 WL equals approximately 200 pCi/L of radon in equilibrium. EPA’s 4.0 pCi/L action level corresponds to approximately 0.02 WL. Most modern residential tests report in pCi/L.
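
    The conversion stated above is a single multiplication, sketched here with the approximate equilibrium factor the answer gives (1 WL ≈ 200 pCi/L):

```python
# Conversion between working level (WL) and pCi/L, using the
# approximate equilibrium relationship 1 WL ~ 200 pCi/L of radon.

PCI_L_PER_WL = 200.0

def wl_to_pci_l(wl: float) -> float:
    return wl * PCI_L_PER_WL

def pci_l_to_wl(pci_l: float) -> float:
    return pci_l / PCI_L_PER_WL

print(wl_to_pci_l(0.02))   # 4.0 -- the EPA action level expressed in WL
print(pci_l_to_wl(4.0))    # 0.02
```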

    My result is 2.5 pCi/L — should I mitigate?

    EPA recommends considering mitigation at this level. The decision is yours. Key factors: whether you have smokers in the home (dramatically compounded risk), whether you are planning to spend significantly more time in the lower level (finishing a basement), the age of occupants, and your personal risk tolerance. Mitigation at 2.5 pCi/L will typically cost the same as mitigation at 8.0 pCi/L and will reduce levels to 0.3–0.8 pCi/L.


  • Short-Term Radon Test vs. Long-Term: Which Do You Need?

    The difference between a short-term and long-term radon test is not just duration — it is what each result actually tells you. A 48-hour test gives you a snapshot of radon during specific conditions. A 90-day test gives you a seasonal average. A year-long test gives you the most accurate picture of your true annual exposure. Understanding when each applies prevents both under-reaction to real risk and over-reaction to a weather-influenced spike.

    Short-Term Tests: The Screening Tool

    Short-term radon tests run from a minimum of 48 hours up to 90 days. The most common residential short-term test is the activated charcoal canister, run for 48–96 hours under closed-house conditions.

    How Charcoal Canister Tests Work

    An activated charcoal canister adsorbs radon gas from the surrounding air during the exposure period. At the end of the test, you seal the canister and mail it to a laboratory. The lab measures gamma radiation emitted by radon decay products that have accumulated in the charcoal, calculates the average radon concentration over the test period, and reports the result in picocuries per liter (pCi/L).

    Short-Term Test Accuracy and Limitations

    Short-term results are inherently variable because radon levels fluctuate by 30–50% day to day in many homes, driven by:

    • Barometric pressure: Low pressure pulls more soil gas into the home; high pressure suppresses it
    • Temperature differential: Greater indoor-outdoor temperature difference strengthens stack effect and increases radon draw
    • Wind: Wind pressure against the house affects sub-slab pressure dynamics
    • Precipitation: Rain saturates soil, reducing gas permeability and temporarily suppressing radon entry
    • HVAC operation: Forced-air systems can both dilute and redistribute radon within the home

    A single 48-hour test during an unusually high-pressure, warm, dry period may significantly underestimate actual levels. The same home tested during a cold snap with falling barometric pressure may read 30–50% higher than average. This variability is why EPA guidance does not recommend making final mitigation decisions solely on a single short-term result in the 4.0–8.0 pCi/L range.

    When Short-Term Tests Are the Right Choice

    • Initial screening: If you have never tested your home, a short-term test is the fastest way to identify whether a problem may exist
    • Real estate transactions: When time constraints (contract deadlines) prevent long-term testing, short-term tests are universally accepted with appropriate disclosure
    • Post-mitigation verification: After installing a radon system, a 48-hour charcoal test started at least 24 hours after system activation verifies the system is working
    • Initial high-result screening: If the initial test returns 8.0 pCi/L or higher, EPA recommends proceeding to mitigation without waiting for a confirmatory long-term test — the risk is sufficient

    Long-Term Tests: The Accurate Baseline

    Long-term tests run for a minimum of 90 days; one-year tests are the gold standard. The standard device is an alpha track detector — a small card with a clear plastic film (CR-39 or similar) that records microscopic damage tracks from alpha particles emitted by radon decay products over the exposure period. At the end of the test, the lab chemically etches the film and counts the tracks under a microscope, calculating average radon concentration.

    Why Long-Term Tests Are More Accurate

    By averaging radon levels across multiple seasons — or ideally a full year — long-term tests smooth out the barometric, temperature, and weather-driven variability that makes short-term results uncertain. A 90-day winter test captures the highest-radon season and provides a reasonably conservative estimate of annual average. A full-year test captures all seasonal patterns.

    Studies comparing matched short-term and long-term measurements in the same homes consistently show that short-term tests, when compared to annual averages, overestimate the annual average in about half of cases and underestimate it in the other half — with individual test variance of ±40–50% common. Long-term tests reduce this uncertainty substantially.
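
    The averaging effect can be shown with a deterministic toy model. Here daily levels swing ±40% around a true mean of 4.0 pCi/L on a weekly "weather" cycle; 2-day snapshots scatter widely while the 90-day mean lands on the true value. All numbers are synthetic and for illustration only:

```python
import math

# Synthetic daily radon levels: true mean 4.0 pCi/L, +/-40% swing
# on a 7-day cycle (a stand-in for weather-driven variability).
TRUE_MEAN = 4.0
daily = [TRUE_MEAN * (1 + 0.4 * math.sin(2 * math.pi * d / 7))
         for d in range(90)]

# Non-overlapping 2-day "short tests" vs. the 90-day average:
snapshots = [sum(daily[i:i + 2]) / 2 for i in range(0, 90, 2)]
print(round(min(snapshots), 1), round(max(snapshots), 1))  # 2.6 5.4
print(round(sum(daily) / len(daily), 1))                   # 4.0
```

    Depending on which two days the short test happens to sample, it reads anywhere from 2.6 to 5.4 pCi/L, while the long test recovers the 4.0 average.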

    When Long-Term Tests Are the Right Choice

    • Confirming a short-term result in the 4.0–8.0 pCi/L range: Before investing $1,000–$2,500 in mitigation, a long-term confirmation test establishes that elevated levels are chronic rather than a test-period anomaly
    • Establishing a baseline in a new home: A one-year test after moving in provides the most accurate picture of actual exposure
    • Routine monitoring in a mitigated home: An annual alpha track detector run year-round provides ongoing confirmation of system performance
    • Research or legal purposes: Situations requiring the highest-accuracy radon measurements

    EPA Decision Protocol: Which Test When

    Situation                             | Recommended Test                              | Action if Elevated
    First-time testing, no rush           | Long-term (90+ days)                          | Mitigate if annual avg ≥ 4.0 pCi/L
    First-time testing, want quick answer | Short-term (48–96 hrs)                        | Follow up with long-term if 4.0–8.0 pCi/L
    Short-term result ≥ 8.0 pCi/L         | Mitigate immediately                          | No confirmatory test needed
    Short-term result 4.0–8.0 pCi/L       | Second short-term or long-term                | Mitigate if confirmed ≥ 4.0 pCi/L
    Real estate transaction               | Short-term (48–96 hrs)                        | Negotiate mitigation in contract
    Post-mitigation verification          | Short-term (48–96 hrs), 24+ hrs after install | Retest or callback if still ≥ 4.0 pCi/L
    Ongoing monitoring (mitigated home)   | Long-term (annual alpha track)                | Schedule callback if ≥ 4.0 pCi/L
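
    The first-time-testing rows above can be expressed as a small decision function. The thresholds are from the EPA guidance quoted in this guide; the wording of the recommendations is paraphrased:

```python
def next_step(short_term_pci_l: float) -> str:
    """Recommendation after an initial short-term radon result,
    following the EPA thresholds described in this guide."""
    if short_term_pci_l >= 8.0:
        return "mitigate now - no confirmatory test needed"
    if short_term_pci_l >= 4.0:
        return "confirm with a second short-term or a long-term test"
    if short_term_pci_l >= 2.0:
        return "consider mitigating; a long-term test refines the picture"
    return "no action; retest in 2 years"

print(next_step(9.2))   # mitigate now - no confirmatory test needed
print(next_step(4.6))   # confirm with a second short-term or a long-term test
print(next_step(1.1))   # no action; retest in 2 years
```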

    Continuous Radon Monitors: The Third Option

    Continuous electronic radon monitors (Airthings Wave, Corentium, RadonEye) provide real-time radon readings and running averages. They do not replace lab-analyzed test kits for official measurements but offer ongoing visibility into radon fluctuations that neither charcoal canisters nor alpha track detectors can provide.

    Continuous monitors are most valuable for:

    • Monitoring a mitigated home between formal retests
    • Understanding diurnal and seasonal radon patterns in your home
    • Detecting rapid changes that indicate fan failure or new entry pathways
    • Confirming that closed-house conditions during a short-term test are being maintained

    Consumer-grade continuous monitors have measurement uncertainty of ±10–20% at low radon levels and are not accepted as certified measurements for real estate transactions or regulatory compliance. They are monitoring tools, not certification tools.

    Frequently Asked Questions

    Which radon test is more accurate — short-term or long-term?

    Long-term tests are more accurate representations of actual annual average radon exposure because they average out the weather- and pressure-driven fluctuations that make short-term results variable. A 90-day or one-year alpha track test provides a more reliable basis for mitigation decisions than a single 48-hour charcoal test.

    Can I use a short-term test to decide whether to mitigate?

    Yes, with caveats. If your short-term result is 8.0 pCi/L or higher, EPA recommends mitigation without a confirmatory test. If it is between 4.0 and 8.0 pCi/L, a follow-up long-term or second short-term test is advisable before investing in mitigation, to confirm the result is not an anomalous spike.

    How long should I run a radon test?

    Minimum 48 hours for a charcoal short-term test under closed-house conditions. For the most accurate annual average, run an alpha track detector for 90 days to one year under normal living conditions. Longer is more accurate.

    Do I need closed-house conditions for a long-term radon test?

    No. Long-term tests (alpha track detectors, 90+ days) are designed to run under normal living conditions — windows open in summer, closed in winter, normal HVAC operation. The extended duration averages out all of these variations. Closed-house conditions are required only for short-term charcoal tests (48–96 hours).

  • How to Test for Radon in Your Home: Complete Guide

    Radon testing is the only way to know whether your home has elevated radon levels. You cannot smell it, see it, or detect it with any sense, and elevated radon does not correlate reliably with a home’s geography, age, or construction style. The EPA estimates that 1 in 15 U.S. homes has elevated radon. Testing takes as little as 48 hours and costs $15–$30 for a DIY kit.

    Why You Need to Test

    Radon is the second leading cause of lung cancer in the United States after cigarette smoking, responsible for approximately 21,000 deaths annually according to the EPA. The risk is cumulative — it is the product of concentration and time. A home at 4.0 pCi/L poses roughly the same lifetime lung cancer risk as smoking half a pack of cigarettes per day. A home at 20 pCi/L — not uncommon in high-radon zones — roughly equals smoking two packs per day.

    The only way to know your home’s radon level is to test it. No map, no neighborhood average, and no visual inspection can substitute for a measurement in your specific home.

    Short-Term vs. Long-Term Radon Tests

    Short-Term Tests (2–90 Days)

    Short-term tests are the most commonly used initial screening method. The standard residential short-term test is a charcoal canister test run for 48–96 hours. Results are available within 3–7 business days after mailing the device to a lab.

    • Duration: 48 hours minimum (EPA); 48–96 hours typical for charcoal devices
    • Device type: Activated charcoal canister or electret ion chamber
    • Conditions required: Closed-house conditions (see below)
    • Best for: Initial screening, pre-purchase testing, post-mitigation verification
    • Limitation: A single short-term test captures a snapshot — radon levels fluctuate with barometric pressure, temperature, and season. A short-term result may be higher or lower than the home’s true annual average.

    Long-Term Tests (90+ Days)

    Long-term tests provide a more accurate picture of the home’s actual annual average radon exposure. The standard device is an alpha track detector — a small card with a special plastic film that records radon decay particle tracks over time.

    • Duration: 90 days to 1 year (one year is ideal)
    • Device type: Alpha track detector
    • Conditions required: Normal living conditions (no closed-house protocol)
    • Best for: Confirming short-term results, annual monitoring, determining true annual average
    • Advantage: Averages out seasonal and pressure fluctuations — provides the most accurate basis for mitigation decisions

    EPA guidance: if a short-term test shows between 4.0 and 8.0 pCi/L, conduct a follow-up long-term test or a second short-term test before deciding on mitigation. If the initial short-term test shows 8.0 pCi/L or higher, proceed to mitigation without waiting for a confirmatory test — the risk is sufficient to act immediately.

    Where to Place the Radon Test Device

    Placement determines whether your result is meaningful. The EPA’s placement protocol:

    • Level: Test in the lowest level of the home that is currently used or could be used as living space — even if you do not currently occupy it. If you have an unfinished basement you plan to finish, test there.
    • Location within the room: Place the device in the breathing zone — at least 20 inches above the floor and at least 12 inches from any wall
    • Away from drafts: Do not place near windows, doors, HVAC vents, or exterior walls where air movement can dilute results
    • Away from humidity sources: Do not place near sump pits, laundry areas, or bathrooms — excessive humidity can affect charcoal canister performance
    • Accessible but undisturbed: The device should be able to sit undisturbed for the full test duration — not in a high-traffic area where it might be moved
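    The placement checklist above can be codified as a quick self-check before starting a test. This is a minimal sketch with hypothetical function and parameter names; the 20-inch and 12-inch thresholds are the EPA figures cited above:

```python
# Hypothetical helper codifying the EPA placement rules listed above.
# Illustrative sketch only — not an official EPA tool.

def placement_issues(height_in, wall_dist_in, near_draft=False, near_humidity=False):
    """Return a list of placement problems for a radon test device."""
    issues = []
    if height_in < 20:
        issues.append("device must sit at least 20 inches above the floor")
    if wall_dist_in < 12:
        issues.append("device must sit at least 12 inches from any wall")
    if near_draft:
        issues.append("keep away from windows, doors, and HVAC vents")
    if near_humidity:
        issues.append("keep away from sump pits, laundry areas, and bathrooms")
    return issues

# A device on a basement shelf, 30 in up and 18 in from the wall, passes:
print(placement_issues(30, 18))  # -> []
```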

    Closed-House Conditions

    Short-term tests require closed-house conditions during the test and for 12 hours before the test begins. Closed-house means:

    • All windows and exterior doors closed except for brief normal entry/exit
    • No whole-house fans or attic fans running
    • Normal HVAC operation is permitted (heating and cooling systems can run — they recirculate interior air)
    • Ceiling fans are permitted
    • Fireplace dampers closed (if not in use)

    Closed-house conditions prevent outdoor air from diluting indoor radon to artificially low levels during the test. When conditions are not maintained, short-term results systematically underestimate actual radon levels — exactly the wrong direction for a safety measurement.

    Interpreting Your Results

    • Below 2.0 pCi/L: At or near the EPA’s estimated national indoor average of 1.3 pCi/L. No action required; retest in 2 years.
    • 2.0–3.9 pCi/L: Above the national average but below the EPA action level. Consider a long-term test to confirm. Some homeowners choose to mitigate at this level regardless, particularly if they have young children or smokers in the home.
    • 4.0–7.9 pCi/L: At or above EPA action level. EPA recommends mitigation. Conduct a confirmatory long-term or second short-term test if time allows, then mitigate.
    • 8.0 pCi/L or higher: Mitigate without waiting for confirmatory testing. At this level the health risk warrants immediate action.
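    These tiers reduce to a simple threshold function. A sketch with hypothetical names, using the cut-offs listed above:

```python
# Sketch of the EPA result tiers described above; function and label
# wording are illustrative.

def radon_action(pci_l):
    """Map a radon reading (pCi/L) to the EPA-recommended next step."""
    if pci_l < 2.0:
        return "no action; retest in 2 years"
    if pci_l < 4.0:
        return "consider a long-term confirmatory test"
    if pci_l < 8.0:
        return "confirm with a follow-up test, then mitigate"
    return "mitigate without waiting for confirmatory testing"

print(radon_action(4.2))  # -> confirm with a follow-up test, then mitigate
```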

    DIY vs. Professional Testing

    DIY test kits (charcoal canisters or alpha track detectors) purchased from hardware stores or online labs are the most cost-effective option for initial and ongoing screening. Cost: $15–$30 including lab analysis. Most state radon programs recommend purchasing from a lab certified by the National Radon Proficiency Program (NRPP) or National Radon Safety Board (NRSB).

    Professional testing uses the same device types but is conducted and placed by a certified radon measurement professional. Professional testing is required or preferred in specific situations:

    • Real estate transactions where the buyer requires a certified measurement
    • Post-mitigation verification where the mitigator or a warranty requires professional confirmation
    • Rental properties in states where landlord testing requirements specify professional measurement
    • Situations involving litigation or insurance where certified chain-of-custody testing is required

    How Often to Test

    • Initial test: If you have never tested, test now — regardless of when you moved in or how long you have lived there
    • After mitigation: Test within 24 hours of system installation (if using a continuous monitor) or place a short-term test 24+ hours post-installation; run for 48 hours minimum
    • Routine retesting: EPA recommends retesting every 2 years even in mitigated homes — to confirm continued performance and catch new entry pathways from foundation settling or renovation
    • After renovations: Any work that involves the foundation, basement, or significant changes to the HVAC system warrants a new test
    • When buying a home: Always test — or require a recent test result — before closing

    Frequently Asked Questions

    How accurate are DIY radon test kits?

    DIY charcoal canister kits analyzed by NRPP- or NRSB-certified labs are accurate to within ±10–15% under controlled conditions. This is sufficient precision for screening decisions. The larger source of variation is not the device itself but testing conditions — an improperly placed device or violated closed-house conditions introduce more error than the device’s inherent measurement uncertainty.

    What time of year is best to test for radon?

    Winter typically produces higher radon readings than summer — windows are kept closed, stack effect is stronger, and atmospheric pressure patterns tend to draw more soil gas into the home. Testing in winter gives a closer approximation of worst-case conditions. However, because any result at or above 4.0 pCi/L warrants mitigation regardless of season, the best time to test is simply now — not after waiting for an optimal season.

    Can I test for radon myself or do I need a professional?

    DIY testing is appropriate and recommended for the vast majority of homeowners. Purchase a certified short-term or long-term kit, follow the placement and closed-house instructions, and mail to the lab. Professional testing is required only for real estate transactions in some states, post-litigation measurements, or situations where certified chain-of-custody documentation is needed.

    My neighbor’s home tested low — does that mean mine will too?

    No. Radon levels vary dramatically between adjacent homes — sometimes between rooms in the same home. Differences in sub-slab aggregate, foundation type, construction methods, HVAC configuration, and soil permeability can produce completely different radon levels in homes built side by side. Your home must be tested independently.


    Related Radon Resources

  • The State of Restoration Franchise SEO in 2026: Who’s Winning, Who’s Losing, and Why

    The State of Restoration Franchise SEO in 2026: Who’s Winning, Who’s Losing, and Why

    The Machine Room · Under the Hood

    I wrote five articles in one day. Here’s why.

    On March 28, 2026, I sat down with SpyFu data pulled that morning and realized something most of the restoration industry hasn’t seen yet: they’re all experiencing the same catastrophic decline at the same time. This isn’t a case of individual franchise websites being poorly optimized. This is an industry-wide pattern that reveals everything about where restoration franchise SEO is headed.

    I spent that day analyzing SERVPRO, Paul Davis, Rainbow Restores, ServiceMaster, and 911 Restoration across every dimension of competitive SEO intelligence we track. The result was five separate playbooks—one for each franchise. But those five articles tell one much bigger story.

    This is that story.

    ## The Competitive Landscape: Five Franchises, One Reality Check

    Let me start with where they all stand right now, as of March 30, 2026:

    | Company | Domain | Keywords | Monthly Clicks | SEO Value | Peak Value | Peak Keywords | Domain Strength | Monthly PPC |
    |---|---|---|---|---|---|---|---|---|
    | SERVPRO | servpro.com | 178,900 | 151,700 | $5,825,000 | $7,684,585 | 286,900 | 62 | $1,944,000 |
    | Paul Davis | pauldavis.com | 22,190 | 13,590 | $952,800 | $4,525,425 | 97,480 | 54 | $206,100 |
    | Rainbow Restores | rainbowrestores.com | 33,700 | 25,500 | $495,500 | $3,354,009 | 109,000 | 52 | $320,000 |
    | 911 Restoration | 911restoration.com | 816 | 617 | $22,700 | $407,500 | 4,466 | 40 | $132,100 |
    | ServiceMaster | servicemaster.com | 1,742 | 4,435 | $39,300 | $334,384 | 20,696 | 42 | $7,039 |

    This table is deceptively simple. It contains the entire story of what went wrong in restoration franchise SEO in the last six months.

    ## The Q4 2025 Cliff: What Actually Happened

    Here’s what should terrify every restoration brand right now:

    – **SERVPRO**: Lost 108,000 keywords between October 2025 and March 2026. Their peak was 286,900 keywords in October. Today they’re at 178,900. That’s a 38% decline in five months.
    – **Paul Davis**: Fell from 49,500 keywords in October to 22,190 today. A 55% crater.
    – **Rainbow Restores**: Dropped from 57,700 to 33,700. Still significant, but the recovery trajectory is different.
    – **911 Restoration**: Lost another 1,600 keywords, bringing them to 816 total. They’ve lost 94% of their peak SEO value.
    – **ServiceMaster**: Continued its decade-long irrelevance with minimal movement.
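    The percentages above follow directly from the table’s peak and current keyword counts; a quick check of the arithmetic (figures as reported from SpyFu):

```python
# Peak vs. current organic keyword counts from the comparison table.
peaks_vs_current = {
    "SERVPRO": (286_900, 178_900),
    "Paul Davis": (49_500, 22_190),
    "Rainbow Restores": (57_700, 33_700),
}

for name, (peak, current) in peaks_vs_current.items():
    decline = (peak - current) / peak * 100
    print(f"{name}: lost {peak - current:,} keywords ({decline:.0f}% decline)")
```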

    This didn’t happen because these companies suddenly made bad SEO decisions. This happened because Google changed something fundamental in how it ranks restoration and emergency services content between October and December 2025.

    The data points to one of several possibilities:

    1. **Algorithm Update (Most Likely)**: Google released changes to E-E-A-T validation, location signals, or trust factors that disproportionately hit franchise networks. The Oct-Dec window included at least two confirmed updates.

    2. **Search Generative Experience (SGE) Impact**: As SGE matures, Google is directly synthesizing answers that bypass clicks to individual sites. Franchises with dispersed content across local pages (rather than consolidated authority) are getting worse SGE treatment.

    3. **Authority Consolidation**: The algorithm may have shifted toward favoring domain-level authority over page-level authority, punishing franchises that rely on local service pages when the parent domain isn’t sufficiently strong.

    4. **Review Signal Reweighting**: With Google tightening review validity checks, franchises with weak or manipulated review signals (common in franchise networks) took hits.

    The real answer is probably all four working together. But here’s the critical insight: **every restoration franchise except the already-dead ServiceMaster lost visibility at the same time.** That’s not a coincidence. That’s a market signal.

    ## The Tier System: Who’s Actually Winning

    What emerges from the data is a clear three-tier system:

    ### Tier 1: Untouchable Dominance

    **SERVPRO remains the category king**, but here’s the thing—they’re bleeding. Despite losing 108,000 keywords, they still own 178,900. They still command $5.8M in monthly SEO value. They still capture 151,700 monthly clicks organically.

    The gap between SERVPRO and everyone else is absurd. Paul Davis—the clear #2 player—captures only 22,190 keywords to SERVPRO’s 178,900. That’s an 8:1 ratio.

    But dominance can hide decline. SERVPRO was at $7.68M monthly value just six years ago. If they continue this trajectory (losing ~27K keywords per month), they’ll be down to Tier 2 keyword counts within months, not years.

    ### Tier 2: The Competitive Battleground

    **Paul Davis and Rainbow Restores** live in a completely different world from SERVPRO, but they’re actively competing with each other.

    Paul Davis has **22,190 keywords and $952,800 monthly SEO value**. They were growing through 2025 and then hit the cliff hard with everyone else. But here’s their advantage: they rank for extremely high-value terms. Their value-per-keyword is $42.94—the highest of any competitor in this space.

    Rainbow Restores has **33,700 keywords and $495,500 monthly SEO value**. They’re a domain migration success story. They moved from their original domain (which had 109,000 keywords and $3.35M value) and have rebuilt to 33,700 keywords on the new domain. They’re approaching their current domain’s natural peak, which suggests room for growth.

    Between these two, the opportunity is real. Paul Davis has momentum and authority but lost it in Q4. Rainbow has growth trajectory and recent migration advantages. The winner in 2026 between these two will be whoever invests in modern SEO first.

    ### Tier 3: Starting Over or Walking Away

    **911 Restoration and ServiceMaster** are fundamentally different problems.

    ServiceMaster is a legacy brand in complete digital collapse. They rank for 1,742 keywords, generate 4,435 monthly clicks, and command only $39,300 in SEO value. Their domain strength is 42. They peaked at $334K monthly value in February 2020—six years ago. This isn’t a recovery situation. This is a brand that’s digitally abandoned its restoration line.

    911 Restoration is worse because they’re still trying. They spend $132,100/month on PPC while holding only 816 keywords and $22,700 in SEO value. They’re in the worst position of any competitor: visible enough to know they’re broken, not successful enough to stop hemorrhaging money.

    ## The Value-Per-Keyword Insight: Why High Value Doesn’t Mean Winning

    Here’s where competitive analysis gets interesting. Let me calculate value per keyword for each franchise:

    – **Paul Davis: $42.94/keyword**
    – **SERVPRO: $32.56/keyword**
    – **911 Restoration: $27.82/keyword**
    – **ServiceMaster: $22.56/keyword**
    – **Rainbow Restores: $14.70/keyword**
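    These figures are simply monthly SEO value divided by keyword count, using the numbers from the comparison table:

```python
# (monthly SEO value, organic keywords) from the comparison table.
franchises = {
    "Paul Davis": (952_800, 22_190),
    "SERVPRO": (5_825_000, 178_900),
    "911 Restoration": (22_700, 816),
    "ServiceMaster": (39_300, 1_742),
    "Rainbow Restores": (495_500, 33_700),
}

# Print value per keyword, highest first.
for name, (value, keywords) in sorted(
    franchises.items(), key=lambda kv: kv[1][0] / kv[1][1], reverse=True
):
    print(f"{name}: ${value / keywords:.2f}/keyword")
```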

    Paul Davis wins this metric by a massive margin. They’re ranking for restoration terms that are worth significantly more than competitors. This suggests better content targeting, local authority, and possibly a geographic mix that includes higher-value markets.

    SERVPRO is close behind at $32.56/keyword, which makes sense—they dominate the market and rank for premium terms.

    But here’s the catch: **high value per keyword doesn’t predict growth.** Rainbow Restores has the lowest value per keyword ($14.70), but they’re the recovery story here. They survived a domain migration and are building back. Paul Davis has the highest value per keyword but lost 55% of their visibility in Q4.

    This is the fundamental lesson: **keyword count and value are backward-looking metrics.** They tell you what the market awarded you historically, not what you’re capturing going forward.

    ## The $31M PPC Problem: The Real Story of Organic Failure

    Now for the genuinely damning number: **these five franchises are spending $2.609M per month on Google Ads.**

    That’s $31.3 million per year on paid search.

    Let me break down the monthly PPC spend:
    – SERVPRO: $1,944,000
    – Paul Davis: $206,100
    – Rainbow Restores: $320,000
    – 911 Restoration: $132,100
    – ServiceMaster: $7,039
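    Summing the listed figures confirms the totals:

```python
# Monthly Google Ads spend per franchise, as listed above.
monthly_ppc = {
    "SERVPRO": 1_944_000,
    "Paul Davis": 206_100,
    "Rainbow Restores": 320_000,
    "911 Restoration": 132_100,
    "ServiceMaster": 7_039,
}

monthly_total = sum(monthly_ppc.values())
print(f"${monthly_total:,}/month -> ${monthly_total * 12:,}/year")
# -> $2,609,239/month -> $31,310,868/year
```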

    What’s fascinating is the timing. In October 2025, as organic keywords started tanking, **Paul Davis, Rainbow Restores, and 911 Restoration all spiked their PPC spending simultaneously.** This wasn’t random budget allocation. This was panic.

    November 2025 PPC spend for these three franchises:
    – Paul Davis hit $665K (peak spend)
    – Rainbow Restores hit $583K
    – 911 Restoration hit $370K

    They knew organic was failing before it was obvious in the data. And they responded with paid spend increases that ranged from 45% to 180% above baseline.

    SERVPRO, sitting at $2M+ monthly PPC, clearly made a different decision: lean further into paid. They have the cash to do it. The smaller competitors didn’t, which is why you see their current PPC at more moderate levels.

    The obvious question: **If they’re spending $31M/year on paid search, why wouldn’t they invest 10% of that ($3.1M/year) in fixing organic?**

    The answer is structural. Franchises are fundamentally decentralized. Local franchisees see the top-line organic collapse (because it’s syndicated across their local pages), panic about visibility, and demand quick fixes. PPC delivers immediate impressions. Organic takes three to six months.

    In a downturn, panic money flows to the short-term solution, not the right solution.

    ## What Actually Changed: The Diagnosis

    I analyzed these five franchises in-depth because I needed to understand what Q4 2025 actually broke. Here’s what the individual playbooks revealed:

    **SERVPRO** relies on a massive network of individual location pages with weak local authority. When Google tightened its E-E-A-T validation for local services, those pages took hits. The parent domain is strong (62 domain strength), but not strong enough to carry 280+ local variations without architectural improvements.

    **Paul Davis** had brilliant local SEO strategy—strong local authority pages, good schema implementation, solid review signals. But their strategy was vulnerable to any shift in how Google weights parent domain authority vs. local page authority. When the Q4 update hit, their advantage disappeared.

    **Rainbow Restores** suffered the domain migration legacy—they lost all ranking momentum when they moved domains, and they’re still rebuilding authority. The newer domain is growing, but it’s a long climb.

    **911 Restoration** has fundamental domain authority problems. 816 keywords on a domain with only 40 authority points is catastrophic. They can’t rank for anything meaningful because the domain itself isn’t trusted.

    **ServiceMaster** is eight years into a slow-motion bankruptcy of their digital presence. There’s nothing to analyze—they’ve simply abandoned digital.

    ## What Modern Restoration SEO Looks Like in 2026

    If I were running SEO for any of these franchises right now, here’s what I’d do:

    **1. Domain Architecture Overhaul**
    Stop treating location pages as disposable. Build local authority that actually compounds. Use canonicals strategically. Consolidate authority signals to fewer, stronger pages rather than spreading authority across hundreds of weak pages.

    **2. AI-Augmented Content Strategy**
    Restoration keywords are incredibly specific. “Water damage restoration Alexandria VA” is different from “water damage restoration Phoenix AZ” in intent, local competition, and required expertise. Use AI to generate actually useful, locally-relevant content at scale without the SEO-spam quality.

    **3. Structured Data Mastery**
    Service schema, FAQ schema, Organization schema—implement these at the parent domain level, not just at local pages. When Google looks at your domain, it should understand instantly what you do, where you operate, and why you’re trustworthy.
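    As a concrete illustration, a parent-domain Organization schema block might look like the following. This is a hedged sketch with placeholder names and URLs — mirroring the JSON-LD conventions this site already uses, not a copy of any franchise’s actual markup:

```json
{
  "@context": "https://schema.org",
  "@type": "Organization",
  "name": "Example Restoration Franchise",
  "url": "https://example.com",
  "logo": "https://example.com/logo.png",
  "areaServed": [
    { "@type": "City", "name": "Alexandria" },
    { "@type": "City", "name": "Phoenix" }
  ],
  "knowsAbout": ["water damage restoration", "fire damage restoration"]
}
```

    Implemented once at the domain level, this tells Google what the organization does and where it operates, instead of leaving each local page to assert those facts independently.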

    **4. Geographic Expansion Through Intent**
    Paul Davis’s high value-per-keyword suggests they’re better at geo-targeting high-value markets. Intentionally target expensive geographic markets first. Use Google Ads data to identify which markets have the highest customer acquisition cost, then dominate organic in those markets.

    **5. Review Signal Validity**
    Google’s tightening review checks. Stop chasing review volume. Build processes that generate genuine reviews from actual customers. This takes longer, but it’s the only strategy that survives algorithm updates.

    **6. E-E-A-T at Scale**
    For franchises, E-E-A-T is particularly challenging because you need to demonstrate expertise across hundreds of locations. Create a parent domain authority system where franchisees contribute verified expertise, local results, case studies, and certifications that roll up to a central authority hub.

    ## What This Series Actually Demonstrates

    I wrote five separate playbooks because each franchise has a different problem:

    – **SERVPRO**: Scale is your asset and your liability. You need architectural fixes that only the largest franchises can implement.
    – **Paul Davis**: You had the right strategy for 2024-2025. You need to evolve faster than the algorithm changes.
    – **Rainbow Restores**: You’re the comeback story. Your new domain is building momentum. Don’t waste it.
    – **911 Restoration**: You’re fighting domain authority problems that will take 18 months minimum to fix. Start now.
    – **ServiceMaster**: You’re in liquidation mode for your digital presence. Different problem.

    But there’s a meta-lesson in having this data and this analysis available to franchises: **the restoration industry SEO landscape is wider open in March 2026 than it’s been in six years.**

    SERVPRO is losing keywords. Paul Davis lost momentum. Rainbow is rebuilding. 911 and ServiceMaster aren’t real competitors anymore.

    Any restoration franchise that invests in modern SEO infrastructure right now—real content strategy, proper domain architecture, AI-augmented scale, and rigorous E-E-A-T—will capture market share that was SERVPRO’s last year.

    This is the historic window. It closes when one of the Tier 2 players figures out what actually changed in Q4 2025 and executes a real recovery.

    ## The Individual Playbooks

    Each of these five franchises gets its own deep-dive analysis:

    – **[SERVPRO SEO Playbook](/servpro-seo-playbook/)** – Scale, authority dilution, and how to fix an 800,000+ page domain.
    – **[Paul Davis SEO Playbook](/paul-davis-seo-playbook/)** – Local authority strategy, value maximization, and adapting to algorithm shifts.
    – **[Rainbow Restores SEO Playbook](/rainbow-restoration-seo-playbook/)** – Domain migration recovery, rebuilding authority, and growth strategy.
    – **[911 Restoration SEO Playbook](/911-restoration-seo-playbook/)** – Foundation building, domain authority recovery, and realistic timelines.
    – **[ServiceMaster SEO Playbook](/servicemaster-seo-playbook/)** – Legacy strategy, digital retreat, and whether recovery is possible.

    Read the one that applies to your franchise. Or read all five. The comparative analysis is where the real insight lives.

    ## The Data-Driven Difference

    This entire series—five detailed playbooks plus this comparative analysis—was built in one day because it’s what we do at Tygart Media.

    We pull data from multiple sources (SpyFu, Google, internal analysis frameworks). We synthesize patterns that competitors miss because they’re looking at their own domain instead of the entire category. We translate technical SEO findings into business strategy.

    We build AI-augmented content systems that let franchises operate at scale without sacrificing quality. We implement the structural improvements that survive algorithm updates. We turn data into competitive advantage.

    If you’re a restoration franchise and you’re reading this, you already know your organic visibility took a hit in Q4 2025. You probably already know your PPC costs are climbing. You might not know why, or what to do about it.

    We’ve mapped both. And we know how to fix it.

    ## FAQ: What This Data Really Means

    **Q: Did Google definitely change something in Q4 2025?**
    A: The simultaneous keyword loss across five major competitors in the same niche is statistically improbable without a triggering event. Confirmed algorithm updates in that window make this nearly certain. The question isn’t whether Google changed something—it’s what specifically changed, and that varies by domain architecture and content strategy.

    **Q: Is SERVPRO actually in trouble?**
    A: SERVPRO is losing market share relative to their peak, but they’re still dominant. However, if the trend continues, they’ll be in serious trouble within two years. For now, they’re managing decline with increased PPC spend. Long-term, that strategy gets expensive.

    **Q: Can Paul Davis recover to their 2024 performance levels?**
    A: Possibly, but only if they correctly identify what the Q4 update hit and adapt their strategy accordingly. Their high value-per-keyword suggests they’re targeting the right terms. The issue is domain authority and architecture, not keyword selection.

    **Q: How long will it take 911 Restoration to recover?**
    A: Domain authority recovery is slow. At their current trajectory, rebuilding to 5,000 keywords would take 3-4 years of sustained, correct optimization. The real timeline depends on their willingness to invest and whether they fix the fundamental architecture problems.

    **Q: Why spend $31M on PPC instead of fixing organic?**
    A: Because franchises operate with local franchisee decision-making, and local franchisees want immediate results. Organic takes time. But the math is clear: if you’re spending $31M on paid, you should be investing $3-5M on fixing organic. ROI on organic is higher long-term, but executives get fired for short-term failures.

    ## What Happens Next

    In six months, we’ll pull this data again. One of three things will have happened:

    1. **Recovery**: One of the Tier 2 players (Paul Davis or Rainbow) will have figured out the Q4 update and recovered visibility. They’ll start capturing SERVPRO’s market share.

    2. **Consolidation**: SERVPRO will have stabilized their decline through increased paid spend and minor organic improvements. They’ll remain dominant but more vulnerable.

    3. **Fragmentation**: The market stays dispersed. No single competitor dominates enough to own the category. Franchises with better marketing budgets than SEO strategies (like the status quo) keep winning.

    I’m betting on #1. The market is too opportunity-rich for it to stay broken this long.

    ## Conclusion

    The restoration franchise SEO landscape is broken. That’s actually the good news, because broken systems create opportunity.

    SERVPRO is bleeding keywords. Paul Davis lost momentum. Rainbow is rebuilding. 911 is struggling. ServiceMaster is irrelevant.

    For any franchise willing to invest in real SEO infrastructure—the technical foundation, content strategy, AI-augmented scale, and data-driven execution—this is the moment to attack.

    The window doesn’t stay open long.

    Read the individual playbooks. Pick your category. Start executing. The data will tell you whether you’re moving in the right direction.

    We built this analysis in a day. If you want help building the execution strategy, let’s talk.

    Will Tygart
    Tygart Media

    The Complete Restoration Franchise SEO Playbook Series

    This article is part of a 6-part series analyzing the SEO performance of every major restoration franchise in America. Read the full series:

    {
      "@context": "https://schema.org",
      "@type": "Article",
      "headline": "The State of Restoration Franchise SEO in 2026: Who's Winning, Who's Losing, and Why",
      "description": "Five franchises. One algorithm update. A $31M/year PPC spend that tells the real story. Here's what the data reveals about restoration SEO in 2026.",
      "datePublished": "2026-03-30",
      "dateModified": "2026-04-03",
      "author": {
        "@type": "Person",
        "name": "Will Tygart",
        "url": "https://tygartmedia.com/about"
      },
      "publisher": {
        "@type": "Organization",
        "name": "Tygart Media",
        "url": "https://tygartmedia.com",
        "logo": {
          "@type": "ImageObject",
          "url": "https://tygartmedia.com/wp-content/uploads/tygart-media-logo.png"
        }
      },
      "mainEntityOfPage": {
        "@type": "WebPage",
        "@id": "https://tygartmedia.com/state-of-restoration-franchise-seo-2026/"
      }
    }

  • If I Were Running Rainbow Restoration’s SEO, Here’s What I’d Do Differently

    If I Were Running Rainbow Restoration’s SEO, Here’s What I’d Do Differently

    The Machine Room · Under the Hood

    I’m about to do something that most agency owners would never do: hand over an entire playbook.

    Not a teaser. Not a “5 quick wins” listicle. The actual, step-by-step strategy I would execute — starting tomorrow — if Rainbow Restoration handed me the keys to their organic search program.

    Why? Because I just pulled their SpyFu data, and what I found is the most interesting restoration franchise story I’ve analyzed so far.

    Rainbow Restoration (rainbowrestores.com) didn’t suffer a decline. They survived a full domain migration from rainbowintl.com and actually came out the other side with a living, breathing SEO program. But here’s where it gets fascinating: they left roughly $3 million per month on the table.

    The old domain peaked at $3.35M/month and 109,000 keywords. The new domain is recovering, but they’re sitting at $495,500/month and 33,700 keywords. That’s 85% below where they should be — which means the upside is enormous.

    So let’s talk about what I’d do to finish what the migration started.

    The Data: From Peak to Recovery to Opportunity

    I pulled the full 12-month historical record from SpyFu on March 30, 2026. Here’s rainbowrestores.com over the last year:

    | Period | Organic Keywords | Monthly Organic Clicks | SEO Value ($/mo) | PPC Spend ($/mo) | Domain Strength |
    |---|---|---|---|---|---|
    | Mar 2025 | 53,769 | 29,960 | $330,500 | $444 | 50 |
    | Apr 2025 | 50,920 | 27,330 | $323,100 | $535 | 50 |
    | May 2025 | 47,600 | 28,160 | $295,100 | $603 | 47 |
    | Jun 2025 | 45,980 | 26,890 | $281,500 | $704 | 47 |
    | Jul 2025 | 49,910 | 32,160 | $338,700 | $793 | 48 |
    | Aug 2025 | 54,810 | 36,720 | $352,200 | $836 | 48 |
    | Sep 2025 | 55,550 | 37,520 | $302,100 | $0 | 50 |
    | Oct 2025 | 58,509 | 38,420 | $309,800 | $0 | 51 |
    | Nov 2025 | 57,770 | 36,400 | $308,400 | $582,800 | 51 |
    | Dec 2025 | 40,080 | 31,260 | $235,600 | $324,500 | 50 |
    | Jan 2026 | 38,460 | 30,910 | $227,200 | $277,100 | 49 |
    | Feb 2026 | 33,700 | 25,500 | $495,500 | $320,000 | 52 |

    Let me break this down:

    The Good News: Rainbow survived a domain migration. That alone is impressive. Most franchise migrations crater the domain completely. Rainbow’s new domain is healthy, with 33,700 keywords and Domain Strength at 52. The Feb 2026 spike in SEO value ($495,500 on fewer keywords) suggests they’re concentrating value in higher-intent queries — the same pattern I’m seeing with SERVPRO and 911 Restoration.

    The Reality Check: In October 2025, they peaked at 58,509 keywords and $309,800/month SEO value. Then December hit — the same algorithm cliff that affected the entire restoration vertical. But there’s a bigger story: the old rainbowintl.com domain peaked at 109,000 keywords and $3.35M/month in July 2022. Rainbow is still sitting 69% below peak keywords and 85% below peak SEO value.

    The Opportunity: If Rainbow recovers even 50% of what the old domain achieved, that’s $1.67M/month in SEO value. They’re currently at $495K. Do the math: there’s roughly $1.18M per month in recoverable organic value just sitting there.
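    The gap math, using the figures above:

```python
# Back-of-envelope recovery math: half the old domain's peak SEO value
# versus the new domain's current value.
OLD_PEAK = 3_350_000   # rainbowintl.com peak SEO value, $/month (Jul 2022)
CURRENT = 495_500      # rainbowrestores.com, Feb 2026, $/month

target = OLD_PEAK * 0.5
recoverable = target - CURRENT
print(f"50% recovery target: ${target:,.0f}/mo; gap: ${recoverable:,.0f}/mo")
# -> 50% recovery target: $1,675,000/mo; gap: $1,179,500/mo
```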

    The PPC Symptom: Starting November 2025, they went from basically zero PPC spend to $320K-$582K/month. That’s the classic pain indicator — when organic traffic drops, you buy it back with ads until you can fix the plumbing. Combined Q4/Q1 PPC spend: approximately $1.18M. In six months, they could rebuild enough organic to cut PPC spend by 50-70% permanently.

    What Happened: The Migration Story

    Here’s what we know:

    Rainbow Restoration successfully migrated from rainbowintl.com to rainbowrestores.com. The old domain is now a digital graveyard — 4 keywords, zero SEO value. But the new domain caught the redirected equity and recovered. This tells me:

    1. They implemented proper 301 redirects. If they hadn’t, the new domain would be at zero. The fact that it’s at 33,700 keywords means they passed significant equity through the redirect chain.
    2. They didn’t lose all their backlinks. Domain Strength recovered to 52, which is respectable for a post-migration domain. This suggests proper domain forwarding and/or existing backlinks pointing to the new domain.
    3. The recovery stalled before completion. Migrations take 4-6 months to fully stabilize. If the Q4 algorithm update hit during the stabilization phase, they probably lost traction at a critical moment.

    The strategic issue isn’t the migration itself — Rainbow executed it correctly. The issue is: did they rebuild the content and architecture that made the old domain great?

    My hypothesis: They migrated the structure, the redirects, and the authority signals. But the old rainbowintl.com probably had 109,000 keywords because it had mature, deep content libraries that the new domain hasn’t fully replicated yet. Here’s how to finish the recovery.

    The Playbook: What I’d Do Starting Tomorrow

    Phase 1: Redirect Audit and Content Archaeology (Week 1-2)

    Before I optimize a single keyword, I need to understand what was lost in the migration and what wasn’t recovered.

    The Technical Foundation:

    • Crawl both domains. Run Screaming Frog against rainbowrestores.com and archive.org snapshots of rainbowintl.com from July 2022 (peak). I’m looking for:
      • All content that existed on the old domain but isn’t on the new domain. These are orphaned keyword opportunities.
      • All 301 redirects and redirect chains. Chains longer than 2 hops leak PageRank.
      • Old URLs that redirect to homepage or generic pages instead of topically relevant pages. These are misdirected equity losses.
    • Google Search Console archaeology. Pull 16 months of GSC data for rainbowintl.com (if they still have it configured) showing which pages deindexed, when, and why. This shows exactly which content lost coverage during the migration.
    • SpyFu historical data for the old domain. Export the top 200 keywords that rainbowintl.com ranked for at peak. Which of these keywords does rainbowrestores.com rank for now? Which are completely lost? The gap is your content recovery roadmap.

    Expected Output: A prioritized list of 500-1,000 pieces of content that existed on the old domain, were either not migrated or redirected ineffectively, and represent high-opportunity keyword recovery.
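    The redirect checks above are scriptable. Here’s a minimal sketch of the classification logic; the hop counts and final URLs would come from a crawler such as Screaming Frog or a `requests` loop, and the homepage heuristic is deliberately simple:

```python
from urllib.parse import urlparse

def classify_redirect(hops, final_url, max_hops=2):
    """Label one resolved legacy-URL redirect as a problem, or None if healthy."""
    if hops > max_hops:
        return f"chain of {hops} hops (leaks PageRank)"
    if urlparse(final_url).path in ("", "/"):
        return "redirects to homepage instead of a topically relevant page"
    return None

# Resolving hops with requests (network call, shown as a comment only):
#   r = requests.get(old_url, allow_redirects=True, timeout=10)
#   issue = classify_redirect(len(r.history), r.url)
```

Run over the old domain’s URL list, this yields exactly the two problem buckets described above: chains longer than 2 hops and misdirected-equity redirects.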

    Phase 2: Location Page Renaissance (Week 3-6)

    Rainbow has franchise locations in every state. Each location is a keyword goldmine that probably hasn’t been fully developed.

    Current State Assessment:

    Pull 10 sample city-level pages from the current site (e.g., /locations/denver/, /water-damage-restoration/denver/). Analyze:

    • How much unique content is on the page vs. templated boilerplate? (Target: 60%+ unique, locally-relevant content)
    • What schema is implemented? (Should be: LocalBusiness + Service + FAQPage + HowTo)
    • How many inbound internal links? (Should be: 10+ from parent hubs and contextual content)
    • Does it rank for the city + service modifier? (e.g., “water damage restoration Denver”)
    • How many related long-tail keywords does it rank for? (Should be: 20-40 per page)
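    The 60%-unique check is easy to approximate once you’ve extracted visible text from each sampled page (extraction itself via your crawler of choice). This sketch treats any sentence that also appears on another sampled page as templated boilerplate:

```python
def _sentences(text):
    """Naive sentence split; good enough for a boilerplate estimate."""
    return {s.strip() for s in text.split(".") if s.strip()}

def unique_content_ratio(page_text, other_pages):
    """Fraction of this page's sentences that appear on no other sampled page."""
    own = _sentences(page_text)
    shared = set()
    for other in other_pages:
        shared |= _sentences(other)
    return len(own - shared) / len(own) if own else 0.0
```

Pages scoring under 0.6 against the rest of the sample are the templated ones to rewrite first.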

    The Build:

    For each franchise territory and core service (water damage, fire damage, mold remediation, storm damage), create a location page following this structure:

    Header Section (Unique Local Content):

    • Opening paragraph: Local climate/risk profile + Rainbow’s response history in that area. “Denver’s high-altitude climate creates unique water damage challenges: rapid drying in low humidity but severe ice dam formation during freeze-thaw cycles. Rainbow Restoration has responded to 1,200+ water damage claims in the Denver metro since 2018, with an average response time of 38 minutes.”
    • Local expertise proof: State-specific certifications, regulatory requirements, insurance relationships. “Colorado requires mold remediation contractors to maintain IICRC S520 certification and comply with Colorado Dept. of Public Health guidelines. All Rainbow technicians are certified.”
    • Service area map: Embedded Google Map showing exact service territory polygons.

    Body Content (Problem-Solving Content):

    • Local problem scenario: “After the March 2024 ice storm, Denver experienced 400+ residential water damage claims from burst pipes. Here’s exactly what happened, what homeowners did wrong, and how to prevent it next time.”
    • Local process walkthrough: “Water damage restoration in Denver’s elevation and climate requires 3 specific adjustments to standard dehumidification protocols…”
    • Local regulation compliance: “Colorado’s water damage claims require documentation per CRS 10-4-1001…”

    CTA + Contact Section:

    • LocalBusiness schema with exact NAP, hours, phone, service area
    • Google Business Profile embed
    • 24/7 availability messaging (critical for emergency services)
    • Review count and rating display (builds trust before calling)

    Expected Results: Each location page should rank for 25-40 keywords within 60 days of launch. At 58 territories × 4 services × 30 keywords average = 6,960 new keywords. Combined with existing rankings, this gets Rainbow back toward the 58K keywords they had in October 2025.

    Phase 3: Content Architecture and Internal Linking (Week 4-8, Ongoing)

    This is how you make location pages work at scale: proper hierarchy and internal linking.

    The Three-Tier Hub Model:

    Tier 1: National Service Pillars (Authority anchors that rank for head terms)

    • /water-damage-restoration/ → “Water Damage Restoration: Complete Guide” (3,000+ words, comprehensive)
    • /fire-damage-restoration/ → “Fire Damage Restoration: Recovery Process”
    • /mold-remediation/ → “Mold Remediation and Removal Guide”
    • /storm-damage-restoration/ → “Storm Damage Restoration: What to Know”

    Each pillar page links to every state hub, accumulates backlinks, and passes equity down the hierarchy.

    Tier 2: State Hub Pages (Regional authority that bridges national and local)

    • /water-damage-restoration/colorado/ → Unique state content on climate, regulations, flood zones, seasonal risks
    • /water-damage-restoration/florida/ → Hurricane flood prep, saltwater intrusion, insurance nuances
    • etc. for every state where Rainbow operates

    Each state page links to all city pages within that state.

    Tier 3: City/Metro Pages (High-intent, revenue-generating)

    • /water-damage-restoration/colorado/denver/
    • /mold-remediation/colorado/denver/
    • /fire-damage-restoration/florida/miami/
    • etc. for all 58+ territories across all 4 services

    The Math: If Rainbow operates in 58 territories and 4 core services, that’s 232 city pages minimum. If each city page ranks for 25-40 keywords on average, that’s 5,800-9,280 keywords just from the location tier. Add the state and national tiers, and you’re back to 30K+ keywords organically.
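    That math in code form (the territory count and per-page keyword averages are the article’s planning assumptions, not measured values):

```python
territories, services = 58, 4
city_pages = territories * services             # 232 city pages minimum
kw_low, kw_high = city_pages * 25, city_pages * 40
print(city_pages, kw_low, kw_high)              # prints: 232 5800 9280
```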

    Internal Linking Rules:

    • Every pillar page links to all state hubs
    • Every state hub links to all city pages in that state
    • Every city page links back to its state hub and national pillar
    • Cross-service linking: The Denver water damage page links to the Denver mold page, etc.
    • Blog-to-location: Every blog post includes contextual links to 1-3 relevant location pages
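    The hierarchy rules above reduce to a deterministic link map you can generate rather than maintain by hand. A minimal sketch (slugs are illustrative; cross-service and blog-to-location links omitted for brevity):

```python
def build_link_map(services, states):
    """states maps a state slug to its city slugs; returns (from, to) URL pairs."""
    links = []
    for svc in services:
        pillar = f"/{svc}/"
        for state, cities in states.items():
            hub = f"{pillar}{state}/"
            links.append((pillar, hub))       # pillar -> every state hub
            for city in cities:
                page = f"{hub}{city}/"
                links.append((hub, page))     # state hub -> city page
                links.append((page, hub))     # city page -> state hub
                links.append((page, pillar))  # city page -> national pillar
    return links
```

Feeding the output into the CMS as required internal links keeps the three-tier structure intact as territories are added.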

    Phase 4: Content Tier Strategy — Crisis, Decision, Authority (Week 5-12)

    Location pages alone won’t cut it. Rainbow needs a three-tier content strategy that captures different stages of the customer journey:

    Tier 1: Crisis-Moment Content (The 2 AM homeowner in panic)

    People don’t search for “restoration companies” when their house is flooding. They search for “what do I do if my basement floods right now.”

    • “Basement Flooded: Emergency Steps in the First 30 Minutes”
    • “Burst Pipe Flooding My House: What to Do Before the Plumber Arrives”
    • “My Kitchen Caught Fire: Immediate Safety Steps and Next Actions”
    • “I Smell Mold But Don’t See It: Where to Look and When to Call a Pro”

    Format: Step-by-step numbered lists, HowTo schema, featured-snippet optimized. These convert because they’re the answer to someone’s worst day.

    Tier 2: Decision-Stage Content (The insurance call)

    • “Water Damage Restoration Cost 2026: Price Breakdown by Severity”
    • “Does Homeowners Insurance Cover Water Damage?”
    • “How to File a Water Damage Insurance Claim: Complete Guide”
    • “Water Mitigation vs. Water Restoration: Key Differences Explained”
    • “How Long Does Water Damage Restoration Take?”

    Format: Comparison tables, cost breakdowns, FAQPage schema. These convert because the person already knows they need professional help — they just need to choose who and understand the cost.

    Tier 3: Authority-Building Content (Builds domain trust and earns backlinks)

    • “Understanding IICRC Certification: What It Means for Your Restoration Company”
    • “The Science of Structural Drying: A Technical Deep Dive”
    • “2024-2026 Water Damage Claim Trends: Data Analysis by Region”
    • “Climate Change and Water Damage Risk: What the Data Shows”
    • “Building Code Compliance in Mold Remediation: State-by-State Requirements”

    Format: Long-form, research-backed, citations to EPA/FEMA/IICRC. These earn backlinks from industry publications and regulatory bodies, which flow authority through the site to location pages.

    Publishing Cadence: 2-3 Tier 1 posts/month (urgent, seasonal), 2-3 Tier 2 posts/month (decision support), 1 Tier 3 post/month (authority building).

    Phase 5: Schema Markup at Scale (Week 6-8)

    Rainbow probably has basic LocalBusiness schema on location pages. But there’s 10x opportunity in comprehensive schema implementation:

    Every location page needs:

    • LocalBusiness — NAP, geo-coordinates, service area polygon, hours, accepted payments
    • Service — Structured description of each service offered (water damage restoration, mold remediation, etc.)
    • FAQPage — Top 8-10 questions for that service/location combination with direct answers
    • HowTo — Step-by-step restoration process in structured format
    • AggregateRating — Star rating and review count from Google Business Profile

    Example LocalBusiness schema for /water-damage-restoration/colorado/denver/:

    {
      "@context": "https://schema.org",
      "@type": "LocalBusiness",
      "name": "Rainbow Restoration Denver",
      "image": "https://rainbowrestores.com/locations/denver/logo.jpg",
      "description": "Emergency water damage restoration, water mitigation, and structural drying in the Denver metropolitan area.",
      "address": {
        "@type": "PostalAddress",
        "streetAddress": "[actual address]",
        "addressLocality": "Denver",
        "addressRegion": "CO",
        "postalCode": "[zip]",
        "addressCountry": "US"
      },
      "geo": {
        "@type": "GeoCoordinates",
        "latitude": 39.7392,
        "longitude": -104.9903
      },
      "areaServed": {
        "@type": "GeoShape",
        "polygon": "39.5,-105.2 39.5,-104.6 40.1,-104.6 40.1,-105.2 39.5,-105.2"
      },
      "telephone": "+1-303-[number]",
      "url": "https://rainbowrestores.com/water-damage-restoration/colorado/denver/",
      "openingHoursSpecification": {
        "@type": "OpeningHoursSpecification",
        "dayOfWeek": ["Monday", "Tuesday", "Wednesday", "Thursday", "Friday", "Saturday", "Sunday"],
        "opens": "00:00",
        "closes": "23:59"
      },
      "hasOfferCatalog": {
        "@type": "OfferCatalog",
        "itemListElement": [
          {
            "@type": "Offer",
            "itemOffered": {
              "@type": "Service",
              "name": "Water Damage Restoration",
              "description": "24/7 emergency water damage mitigation and restoration services"
            }
          },
          {
            "@type": "Offer",
            "itemOffered": {
              "@type": "Service",
              "name": "Mold Remediation",
              "description": "Mold inspection, remediation, and prevention"
            }
          }
        ]
      },
      "aggregateRating": {
        "@type": "AggregateRating",
        "ratingValue": 4.8,
        "reviewCount": 247
      }
    }
    

    When you implement this across 232+ location pages with consistent data, Google gets a machine-readable map of your entire franchise network. That’s how you win Local Pack results at scale.
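    At 232+ pages, JSON-LD like the block above should be generated from a per-location record rather than hand-edited. A minimal sketch; the field names and the sample record are illustrative, not Rainbow’s actual data:

```python
import json

def local_business_schema(loc):
    """Render a trimmed LocalBusiness block from one location record."""
    return {
        "@context": "https://schema.org",
        "@type": "LocalBusiness",
        "name": f"Rainbow Restoration {loc['city']}",
        "address": {
            "@type": "PostalAddress",
            "addressLocality": loc["city"],
            "addressRegion": loc["region"],
            "addressCountry": "US",
        },
        "geo": {"@type": "GeoCoordinates",
                "latitude": loc["lat"], "longitude": loc["lng"]},
        "telephone": loc["phone"],
        "url": loc["url"],
    }

# Hypothetical record; real values come from the franchise location database.
denver = {"city": "Denver", "region": "CO", "lat": 39.7392, "lng": -104.9903,
          "phone": "+1-303-555-0100",
          "url": "https://rainbowrestores.com/water-damage-restoration/colorado/denver/"}
jsonld = json.dumps(local_business_schema(denver), indent=2)
```

One template, one data source, 232+ consistent pages — which is exactly what makes the machine-readable franchise map possible.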

    Phase 6: Answer Engine Optimization (AEO) — Win the AI Era (Week 7-Ongoing)

    Google’s AI Overviews appear on restoration queries. If your content isn’t structured to be cited, you’re invisible.

    AEO Tactics for Restoration:

    • Definition boxes at the top of service pages. “Water damage restoration is the professional process of removing water, drying the structure, treating for biological growth, and restoring all affected materials to pre-loss condition. In Colorado’s climate, structural drying typically requires 72-120 hours of continuous dehumidification due to altitude-specific psychrometric conditions.”
    • Direct-answer formatting. H2: “What’s the first step in water damage restoration?” A1: “The first step is always emergency water extraction. Using truck-mounted extractors rated for 250+ gallons per minute, technicians remove standing water within 1-2 hours. This prevents secondary damage like foundation erosion and structural swelling.”
    • Comparison tables. “Water Mitigation vs. Water Restoration: What’s the Difference?” AI Overviews pull these structures directly.
    • Numbered process lists. “5 Stages of Water Damage Restoration: 1. Inspection and Assessment, 2. Water Extraction, 3. Drying and Dehumidification, 4. Cleaning and Sanitization, 5. Restoration and Reconstruction.”

    The goal: When someone asks Google “what should I do if my basement floods,” the AI Overview cites Rainbow Restoration content because it’s the most useful, structured answer available.

    Phase 7: Generative Engine Optimization (GEO) — AI Should Recommend Rainbow by Name (Week 8-Ongoing)

    This is the frontier. Most restoration companies haven’t heard of GEO. But it’s critical: making AI systems (Claude, ChatGPT, Gemini, Perplexity) recommend Rainbow Restoration by name when someone asks “who should I call for water damage in Denver?”

    GEO Tactics:

    • Entity saturation. Rainbow Restoration needs to appear across the web consistently paired with specific attributes: IICRC certification, 24/7 availability, specific service areas, fast response times, specific equipment (truck-mounted extractors, desiccant dehumidifiers, etc.). The more consistently these associations appear across authoritative sources, the more confidently AI recommends the brand.
    • Factual density over marketing. Replace “We’re the best water damage company” with “Rainbow Restoration Denver operates 6 truck-mounted extractors (each rated 250 gallons/minute), maintains 4 commercial desiccant dehumidifier units, and averages 38-minute response times to the metropolitan area, with IICRC S500-certified technicians.” Specificity = authority in the AI world.
    • Authority citations. Every Tier 3 content piece should cite EPA guidelines, FEMA resources, IICRC standards, and state licensing requirements. AI systems weight content higher when it cites authoritative sources.
    • LLMS.txt implementation. Create /llms.txt at the root with a structured summary: “Rainbow Restoration is a national water damage, fire damage, and mold remediation franchise operating in 58 territories across North America. IICRC-certified, 24/7 availability, average response time 38 minutes. Founded 1989, headquartered [location]. Services: [list]. Certifications: [list]. Service areas: [list].” This is the robots.txt equivalent for AI crawlers.

    Phase 8: Google Business Profile Optimization (Week 9-Ongoing)

    The Google Local Pack captures disproportionate click volume. Winning it requires systematic GBP optimization:

    • Weekly GBP posts. Not automated. Real posts: completed project photos with before/after, seasonal tips (“Prevent ice dams: 5 steps”), team spotlights. Google’s algorithm visibly rewards profiles with consistent, recent posts.
    • Review strategy. SMS review request sent 2 hours after job completion, email 24 hours later. Target: 200+ reviews at 4.8+ stars per location within 12 months. Respond to every review within 24 hours (positive and negative). Review velocity is the #1 Local Pack ranking factor after proximity.
    • Category precision. Primary: “Water Damage Restoration Service.” Secondary: “Fire Damage Restoration Service,” “Mold Removal Service.” Don’t dilute.
    • Photo optimization. 50+ photos per location (team, equipment, completed projects, office, vehicles). Geotagged. Updated monthly.
    • Q&A seeding. Add and answer the top 10 questions for each location’s GBP. These show up prominently and serve as free real estate for keyword-rich content.

    Phase 9: Backlink Acquisition — Leverage Franchise Scale (Week 10-Ongoing)

    Rainbow’s biggest competitive advantage: 58+ franchise locations. Most single-location competitors can’t match this scale. Use it.

    • Disaster response PR. After significant weather events, issue press releases to local media. “Rainbow Restoration Denver responded to 43 residential water damage claims during March 2026 ice storm, deploying 8 extraction teams across metro area.” Local news sites pick this up (high DA, high relevance, tons of backlinks).
    • Insurance partnerships. Rainbow is likely on preferred vendor lists for carriers. Each carrier relationship should include a backlink from their website (partner directory or “find a contractor” page).
    • Industry association profiles. IICRC.org, RestorationIndustry.org, state licensing boards — maintain active, detailed profiles across all of them. .org links carry serious authority.
    • Local civic backlinks. Every franchise location should systematically acquire 20-30 local backlinks: Chamber of Commerce, Better Business Bureau, Rotary Club, Little League sponsorships, etc. Automated systems can track these and alert franchises to apply.
    • Content partnerships. Co-create guides with local emergency management agencies. “How to Prepare Your Denver Home for Wildfire Season — by Rainbow Restoration and Denver Office of Emergency Management.” The .gov backlink flows serious authority.

    Phase 10: Stop the PPC Bleed (Weeks 1-52)

    Here’s the financial reality: Rainbow spent approximately $1.18M on PPC from November 2025 through January 2026. At that run rate, annualized spend is ~$4.7M.

    At their pre-decline peak (October 2025), they had 58K keywords worth $309K/month in organic value — $3.7M annualized, delivered for free.

    The full playbook above, executed over 6 months, should recover $200-250K/month in organic SEO value. That’s $2.4-3M annualized in traffic they no longer need to buy.

    In 12 months, if they reach 50% of the old domain’s peak ($1.67M/month), they’ve reduced their PPC dependency by 75% permanently.
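    The savings arithmetic, using the PPC column from the SpyFu table above and the recovery target stated in this section:

```python
# PPC spend from the SpyFu table, Nov 2025 - Jan 2026 ($/mo).
ppc = [582_800, 324_500, 277_100]
spent = sum(ppc)                  # $1,184,400 over three months
annualized = spent // 3 * 12      # $4,737,600/yr at that run rate (~$4.7M)

recovered = 225_000               # midpoint of the $200-250K/mo recovery target
annual_savings = recovered * 12   # $2.7M/yr of traffic no longer bought via ads
```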

    This isn’t a cost center. This is a multiplying return where every dollar spent on SEO execution compounds while PPC spend evaporates the moment the budget runs out.

    What Makes Rainbow’s Story Different

    This is the part I don’t see written about often enough:

    Rainbow Restoration had the courage to migrate domains. Most franchises are terrified of it. But brand repositioning — moving from “rainbow international” to “rainbow restoration” — is smart. It’s clear, it’s specific, it owns the vertical.

    The problem isn’t the rebrand. The problem is that the SEO execution didn’t match the ambition of the rebrand.

    They walked away from $3.35M/month in peak organic value when they flipped the domain switch, and then didn’t rebuild it on the new domain with the same sophistication.

    They survived. They’re healthy. But they left the bigger prize on the table.

    The playbook above is what finishes the job. It’s not theoretical. It’s what we execute for restoration companies at Tygart Media. Every day. All day.

    If Rainbow wants to reclaim the $1.67M/month that’s sitting there waiting to be captured, the path is clear. It just requires finishing what the migration started.

    Frequently Asked Questions

    What happened to Rainbow Restoration’s old domain (rainbowintl.com)?

    Rainbow Restoration migrated from rainbowintl.com to rainbowrestores.com. The old domain is now essentially dead — it currently ranks for only 4 keywords with $0 in estimated SEO value. However, rainbowintl.com peaked at 109,000 organic keywords and $3.35M/month in SEO value (in July 2022 and January 2020, respectively). The migration was executed correctly from a technical standpoint (proper 301 redirects were implemented), but the new domain has only recovered to 33,700 keywords and $495,500/month, leaving 85% of peak organic value on the table.

    How much organic traffic did Rainbow lose in the migration?

    Rainbow didn’t lose all their traffic — that would indicate a failed migration. Instead, they recovered about 31% of their peak keyword count (109K → 34K) and 15% of their peak SEO value ($3.35M → $495K). The gap represents content that either wasn’t migrated, was redirected ineffectively, or hasn’t been rebuilt on the new domain with the same authority and comprehensiveness. The opportunity is enormous: recovering even 50% of the old domain’s peak represents $1.67M/month in organic value that’s currently being captured by competitors or left on the table entirely.

    Why did Rainbow’s organic traffic drop in December 2025?

    December 2025 saw a significant organic decline across the restoration vertical — both SERVPRO and 911 Restoration experienced similar drops in the same timeframe. This pattern indicates an algorithm update or market shift that disproportionately affected restoration company rankings. The timing is consistent with Google’s broader content quality and entity authority updates. However, Rainbow’s recovery pattern (slightly higher SEO value on fewer keywords in Feb 2026) suggests a value concentration effect, meaning their remaining rankings are capturing higher-intent, higher-CPC keywords.

    What is Generative Engine Optimization (GEO) and why does it matter?

    Generative Engine Optimization (GEO) is the practice of optimizing content and brand presence so that AI systems — ChatGPT, Claude, Gemini, Perplexity, and other large language models — cite and recommend your business by name when users ask relevant questions. For restoration companies, GEO involves consistent brand-attribute associations across the web (IICRC certifications, response times, service areas), factual density in content (specific equipment, process details) rather than marketing language, authoritative citations (EPA, FEMA, IICRC standards), and LLMS.txt implementation. As AI-generated answers increasingly replace traditional search results, GEO is becoming as critical as traditional SEO for driving qualified customer discovery.

    How long would it take to rebuild Rainbow’s organic traffic to pre-migration peak?

    A realistic timeline breaks down as follows: Technical fixes and initial schema/architecture implementation (weeks 1-6) typically yield 10-15% keyword growth and quick indexation improvements. Content hierarchy build-out and location page optimization (weeks 4-16) should drive 25-35% growth. Full content strategy execution across all three tiers (months 1-6) yields 40-60% recovery. Meaningful SEO value recovery ($200K+/month) should be visible within 3-4 months. Full recovery to 50% of peak ($1.67M/month) would require 8-12 months of sustained execution. However, 85% recovery (approaching the old domain’s peak) would likely require 18-24 months because you’re rebuilding content depth and authority that took years to accumulate.

    Is Rainbow Restoration’s PPC spending necessary?

    No — it’s a symptom, not a strategy. Rainbow’s PPC spend from November 2025 through January 2026 was approximately $1.18M in just three months. This spending is directly correlated with their organic decline: as organic keywords and clicks fell, they compensated by buying traffic through Google Ads. However, organic traffic that was worth $309K/month (October 2025) becomes “free” traffic once recovered, while PPC spend evaporates the moment budgets are reduced. A 12-month SEO execution program that recovers $200-250K/month in organic value would reduce their PPC dependency by 50-70%, creating a permanent efficiency gain. The ROI case strongly favors organic investment over sustained PPC spending.

    The Closing Pitch

    Here’s the thing about Rainbow Restoration: they actually pulled off the hard part. They rebranded, they migrated domains, and they survived. Most franchise companies crater completely when they try this. Rainbow didn’t.

    But surviving isn’t winning. And right now, they’re leaving $1.67M per month in organic value on the table — value that their old domain earned, value that should have migrated with them, value that’s sitting there waiting to be reclaimed.

    The roadmap above isn’t theoretical. It’s the exact methodology we execute at Tygart Media — we eat, sleep, and breathe restoration SEO. We’ve built the AI-powered content pipelines, the schema automation systems, and the GEO frameworks specifically for this vertical. And we know the playbook works because we’re running it right now for other restoration companies.

    The data is public. The opportunity is clear. And the fix is an execution problem.

    So here’s my pitch, and I’ll keep it honest:

    Hey, Rainbow Restoration. If you made it this far reading, you already know what needs to happen — because the SpyFu numbers don’t lie. You had the courage to rebrand and migrate. Now you need the SEO execution to match that ambition.

    We’re Tygart Media. We’ve already built the playbooks and the systems to execute this at franchise scale. We’d genuinely love to have the conversation about what $400K/month in recovered organic value looks like when it’s back.

    No pressure. No predatory sales tactics. Just two teams who understand restoration marketing talking about finishing what the migration started.

    Reach out here. Or call. Or send a franchise location manager. We promise we won’t show up with a water truck unless your data indicates you actually have a water problem. In which case, we probably know a guy. (In fact, we probably know 58 guys.) 😄

    The Complete Restoration Franchise SEO Playbook Series

    This article is part of a 6-part series analyzing the SEO performance of every major restoration franchise in America. Read the full series:

  • If I Were Running Paul Davis Restoration’s SEO, Here’s What I’d Do Differently

    If I Were Running Paul Davis Restoration’s SEO, Here’s What I’d Do Differently

    The Machine Room · Under the Hood

    I’m about to do something that most agency owners would never do: tell you exactly what went wrong with one of restoration’s most strategic franchises.

    Not conspiracy theories. Not guesses. The actual data that explains why Paul Davis Restoration — a $2+ billion company with 600+ franchises across North America — lost half its organic keyword portfolio between November and December 2025.

    Why? Because I pulled their SpyFu data this morning, and what I found was different from the 911 Restoration story I told three weeks ago. This isn’t a domain in freefall. This is a franchise that was actually winning — growing their keyword portfolio from 39K to 50K through most of 2025 — and then tripped on the finish line.

    That’s not a systemic failure. That’s a fixable problem. And the recovery opportunity is enormous.

    The SpyFu Data: A Franchise That Peaked, Then Stumbled

    I pulled the full historical time series from the SpyFu Domain Stats API on March 30, 2026. Here’s what pauldavis.com looks like over the last 12 months:

    | Period | Organic Keywords | Monthly Organic Clicks | SEO Value ($/mo) | PPC Spend ($/mo) | Domain Strength |
    |---|---|---|---|---|---|
    | Mar 2025 | 38,980 | 10,260 | $370,100 | $20,950 | 51 |
    | Apr 2025 | 39,220 | 7,638 | $387,500 | $24,300 | 51 |
    | May 2025 | 41,620 | 11,420 | $431,000 | $27,380 | 49 |
    | Jun 2025 | 42,620 | 11,830 | $450,200 | $31,940 | 49 |
    | Jul 2025 | 45,220 | 12,990 | $482,800 | $35,990 | 49 |
    | Aug 2025 | 48,420 | 14,670 | $532,800 | $37,940 | 50 |
    | Sep 2025 | 49,470 | 15,430 | $491,200 | $57,140 | 52 |
    | Oct 2025 | 50,339 | 14,490 | $484,200 | $49,000 | 52 |
    | Nov 2025 | 49,400 | 14,420 | $484,300 | $665,600 | 53 |
    | Dec 2025 | 23,250 | 12,620 | $372,400 | $258,500 | 51 |
    | Jan 2026 | 22,490 | 12,930 | $365,100 | $213,000 | 51 |
    | Feb 2026 | 22,190 | 13,590 | $952,800 | $206,100 | 54 |

    Look at the trend. From March to October 2025, Paul Davis did exactly what every restoration company should be doing: they grew. 39K keywords → 50K keywords. $370K/month SEO value → $532K/month. That’s not a fluke. That’s execution. That’s a team running the playbook.

    Then November happened. PPC spend spiked to $665,600 — a 13.6x increase from October’s $49K. The same panic pattern I saw with 911 Restoration. And by December? Half the keywords vanished. 50K → 23K. That’s a 54% collapse in a single month.

    But here’s the thing that makes Paul Davis different from 911 Restoration: their SEO value per keyword is actually higher. At $43/keyword (based on Feb 2026 data), Paul Davis is ranking for higher-value keywords than most competitors in this space. That tells me they weren’t ranking for junk keywords. They were ranking for money terms — the ones that matter.

    Which means the fix isn’t a rebuild. It’s a recovery.

    What Actually Happened in Q4 2025: The Diagnostic

    Let me be direct about what I think happened. A keyword collapse from 50K to 23K in a single month isn’t gradual content decay. That’s one of three things:

    Scenario 1: A location page massacre. Paul Davis has franchises everywhere — across all 50 states. If someone restructured the location page architecture, consolidated pages, or switched hosting/CMS without a clean redirect map, Google would have vaporized thousands of pages from the index overnight. Franchise sites live and die on location pages. Lose those, lose everything.

    Scenario 2: A technical issue that broke indexation. A rogue robots.txt rule, an accidental noindex tag at the template level, a CDN misconfiguration returning 404s to Googlebot — any of these can silently deindex thousands of pages while organic traffic is still flowing because cached versions serve users fine. You don’t notice until you check GSC and see “Excluded – currently not indexed” spiked by 50%.

    Scenario 3: The November Google Core Update hit harder than anticipated. Google dropped a core update in November 2025. If Paul Davis’s location pages are thin, templated content with minimal local differentiation, the update could have targeted them specifically. Combined with algorithm changes favoring AI-extracted answers and entity authority, thin content gets deprioritized fast.

    My money? Scenarios 1 and 3 combined. But I’d verify with data before doing anything permanent.

    Step 1: The 72-Hour Diagnostic Audit

    Before touching a single page, I need to know what’s actually broken.

    Day 1: Crawl and Index Validation

    I’d run Screaming Frog against the full pauldavis.com domain — every page, every redirect. For a 600-franchise network, I’m expecting 8,000-15,000+ URLs. I’m specifically looking for:

    • Redirect chains longer than 2 hops — These leak PageRank and slow crawl budget.
    • Orphaned location pages — Pages that exist but have zero internal links. If city pages aren’t linked from a parent hub, Google treats them as low-priority and deprioritizes crawling.
    • Canonicalization issues — A single bad canonical tag at the template level can tell Google to ignore thousands of pages simultaneously. This is the most common cause of sudden deindexation I see.
    • JavaScript rendering problems — If Paul Davis uses any client-side rendering for critical location content, I’d compare Screaming Frog’s text extraction vs. what a headless browser sees. Mismatch = indexation risk.
    • Soft 404 patterns — Pages returning 200 status code but with “not found” content structure. Googlebot gets confused. Pages don’t index.
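    As a sketch of how I’d flag those redirect chains programmatically (the `redirects` mapping below is hypothetical sample data standing in for a Screaming Frog redirect export, not Paul Davis’s actual URLs):

```python
# Sketch: flag redirect chains longer than 2 hops in a crawl export.
# The `redirects` mapping (source URL -> redirect target) is
# hypothetical sample data.

def chain_hops(redirects: dict[str, str], url: str, limit: int = 10) -> int:
    """Count how many redirects fire before reaching a final URL."""
    hops = 0
    seen = set()
    while url in redirects and url not in seen:
        seen.add(url)  # guard against redirect loops
        url = redirects[url]
        hops += 1
        if hops >= limit:
            break
    return hops

redirects = {
    "/tx/houston-old/": "/locations/houston/",
    "/locations/houston/": "/locations/houston-tx/",
    "/locations/houston-tx/": "/texas/houston/",
}

flagged = {u: chain_hops(redirects, u)
           for u in redirects if chain_hops(redirects, u) > 2}
print(flagged)  # chains of 3+ hops leak PageRank and crawl budget
```

    The `seen` set matters: legacy franchise sites often contain redirect loops, and without it the walk never terminates.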

    Day 2: Google Search Console Analysis

    I need 16 months of GSC data — the period before and after the collapse.

    Specifically:

    • Coverage report trends — Did “Valid” pages spike downward in November/December? Did “Excluded – currently not indexed” spike upward? The answer tells the story.
    • Performance by URL pattern — Segment by location pages, service pages, blog content. Which pattern lost the most impressions? If it’s /locations/*, it’s an architecture problem. If it’s /services/*, it’s content quality.
    • Exclusion reason breakdown — What’s excluding the pages? “Blocked by robots.txt”? “Crawled – currently not indexed”? “Redirect error”? Each reason points to a different root cause.
    • Query data comparison — Export top 5,000 queries from October 2025 vs. February 2026. Which keyword clusters disappeared? If it’s geo-modified queries (“water damage restoration [city]”), location pages are the problem. If it’s service-level queries, the content strategy failed.
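    A minimal sketch of that query comparison, assuming the two GSC exports have been loaded as query-to-impressions mappings (the queries and figures shown are illustrative, not real data):

```python
# Sketch: diff two GSC query exports to find disappeared keyword
# clusters. Queries and impression counts are illustrative.

oct_2025 = {"water damage restoration houston": 12400,
            "fire damage repair dallas": 3100,
            "water damage restoration cost": 8900}
feb_2026 = {"water damage restoration cost": 7400}

lost = {q: imp for q, imp in oct_2025.items() if q not in feb_2026}
geo_lost = {q: imp for q, imp in lost.items()
            if any(city in q for city in ("houston", "dallas", "austin"))}

print(f"lost queries: {len(lost)}, geo-modified: {len(geo_lost)}")
# A high geo-modified share points at location pages, not content strategy.
```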

    Day 3: Competitive Analysis

    I’d pull the same SpyFu data for SERVPRO, 911 Restoration, ServiceMaster, and Rainbow International. If all of them declined in November/December, it’s an industry-wide algorithm shift. If Paul Davis uniquely declined, it’s site-specific.

    Then I’d audit the top-ranking competitors for Paul Davis’s highest-value lost keywords. What does their architecture look like? How many location pages? What schema are they using? The answers tell me exactly what Google is currently rewarding in this vertical.

    The Recovery Strategy: Rebuild What Was Already Working

    Here’s the critical insight: Paul Davis doesn’t need a redesign. They need a rescue. They proved they could rank for 50K keywords. Now I need to figure out what broke and fix it, then scale what was already working.

    Priority 1: Recover the Indexation Foundation (Days 1-30)

    This is the emergency phase.

    Canonical tag audit: If there’s a template-level canonical issue, it’s a one-line fix that could immediately un-exclude thousands of pages. I’d verify canonicals across 50+ representative pages from different URL patterns (locations, services, blog) and check GSC’s URL Inspection tool to see what Google actually crawled vs. what we think we served.

    Location page linking structure: I’d verify that every location page is explicitly linked from a parent hub page. No links = low crawl priority = Google ignores the page even if it’s technically valid. A simple sitemap regeneration or parent page update can fix this.
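    For reference, the regenerated sitemap just needs clean, canonical location URLs submitted in the standard protocol; a minimal sketch (URLs and dates illustrative):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://pauldavis.com/locations/houston/</loc>
    <lastmod>2026-02-15</lastmod>
  </url>
  <url>
    <loc>https://pauldavis.com/locations/dallas/</loc>
    <lastmod>2026-02-15</lastmod>
  </url>
</urlset>
```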

    Robots.txt validation: One bad rule and 90% of your site might be blocked from crawling. I’d audit the current robots.txt, compare it against historical versions (via Wayback Machine if needed), and remove any rules that shouldn’t be there.
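    To illustrate how little it takes, a single directive like the one below would block every location page from crawling (a hypothetical example, not Paul Davis’s actual file):

```text
User-agent: *
# One errant line like the next is enough to eventually deindex
# every location page on the site:
Disallow: /locations/

Sitemap: https://pauldavis.com/sitemap.xml
```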

    Redirect map cleanup: Any redirect chains longer than 2 hops get collapsed to single-hop direct redirects. Every hop loses 10-15% of PageRank. In a franchise network with hundreds of redirects, that adds up to thousands of dollars in lost link equity.

    Priority 2: Location Page Architecture Renaissance (Days 30-90)

    Now we rebuild what was working.

    Paul Davis has 600+ franchises. That’s 600+ locations that could have dedicated SEO landing pages. If they’re structured right, that’s 3,600+ pages (600 locations × 6 core services: water damage, fire damage, mold remediation, storm damage, sewage backup, dry cleaning/contents restoration).

    Each page needs:

    Locally specific content that proves expertise. Not 500 words of templated “water damage restoration in Houston” copy. I’m talking about: “Houston’s sub-tropical climate creates unique challenges — the combination of high humidity, frequent thunderstorms, and clay-based soil means water damage in Houston spreads faster than in drier climates. Our Houston team is trained on Gulf Coast moisture dynamics, local building codes, and Houston’s specific insurance requirements.” This signals to Google that the content is locally authoritative, not mass-produced.

    LocalBusiness schema with complete NAP + service area. Every location page needs JSON-LD marking up the franchise location with exact coordinates, service area polygon, hours (24/7 for emergency response), and a catalog of specific services with local pricing where available.

    Embedded Google Map. A map showing the service area reinforces local relevance and keeps users on-site instead of searching for competitors.

    Real project stories. “In March 2025, our Paul Davis team responded to a commercial water intrusion affecting 8,000 sq ft of office space in downtown Houston. Complete water extraction and structural drying completed within 48 hours.” Specificity builds trust with both users and algorithms.

    Priority 3: Content Depth Beyond Location Pages (Days 60-120)

    Now I add the layers that Google currently rewards.

    Crisis-moment content (targets the 2 AM searcher):
    – “What To Do When Your Basement Floods: A Step-by-Step Emergency Checklist”
    – “I Smell Mold In My House Right Now — What Should I Do First?”
    – “Fire Damage: What To Do In the First 24 Hours”

    These need HowTo schema, numbered steps, and definition boxes at the top for AI Overviews to extract. They capture intent before the decision to hire a pro is made.
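    A minimal sketch of the HowTo markup for the basement-flood checklist (step names and text are illustrative, not final copy):

```json
{
  "@context": "https://schema.org",
  "@type": "HowTo",
  "name": "What To Do When Your Basement Floods",
  "step": [
    {
      "@type": "HowToStep",
      "position": 1,
      "name": "Cut the power",
      "text": "Shut off electricity to the affected area at the breaker before entering standing water."
    },
    {
      "@type": "HowToStep",
      "position": 2,
      "name": "Stop the water source",
      "text": "Close the main water shut-off valve if the flooding comes from a burst pipe."
    },
    {
      "@type": "HowToStep",
      "position": 3,
      "name": "Document the damage",
      "text": "Photograph standing water and affected materials before extraction begins, for the insurance claim."
    }
  ]
}
```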

    Decision-stage content (targets the insurance call):
    – “Water Damage Restoration Cost in 2026: A Regional Breakdown”
    – “Homeowners Insurance and Water Damage: What’s Covered and What Isn’t”
    – “Mold Remediation Timeline: Expectations From Day 1 to Completion”

    These need comparison tables, cost breakdowns, FAQPage schema. This is where Paul Davis wins against SERVPRO.

    Authority-building content (earns backlinks, builds topical authority):
    – “The Complete Guide to IICRC Certification Standards: S500, S520, and What They Mean”
    – “Understanding FEMA Flood Zones: How to Check Your Risk and What It Means for Insurance”
    – “Water Damage vs. Water Intrusion: Why the Distinction Matters (and What Your Insurance Company Cares About)”

    These earn backlinks from IICRC, FEMA, RIA, insurance publications, and local news outlets. Those links flow authority to location pages through internal linking.

    Priority 4: Schema Markup at Scale (Days 45-90)

    For a 600-franchise network, schema markup scales multiplicatively.

    Every location page needs:

    {
      "@context": "https://schema.org",
      "@type": "LocalBusiness",
      "name": "Paul Davis Restoration of [City]",
      "telephone": "+1-XXX-XXX-XXXX",
      "address": {
        "@type": "PostalAddress",
        "streetAddress": "[Street Address]",
        "addressLocality": "[City]",
        "addressRegion": "[State]",
        "postalCode": "[ZIP]"
      },
      "geo": {
        "@type": "GeoCoordinates",
        "latitude": "[LAT]",
        "longitude": "[LONG]"
      },
      "openingHoursSpecification": {
        "@type": "OpeningHoursSpecification",
        "dayOfWeek": ["Monday", "Tuesday", "Wednesday", "Thursday", "Friday", "Saturday", "Sunday"],
        "opens": "00:00",
        "closes": "23:59"
      },
      "areaServed": {
        "@type": "City",
        "name": "[City], [State]"
      },
      "hasOfferCatalog": {
        "@type": "OfferCatalog",
        "itemListElement": [
          {
            "@type": "Offer",
            "@id": "https://pauldavis.com/[city]/water-damage-restoration/",
            "itemOffered": {
              "@type": "Service",
              "name": "Water Damage Restoration"
            }
          },
          {
            "@type": "Offer",
            "@id": "https://pauldavis.com/[city]/fire-damage-restoration/",
            "itemOffered": {
              "@type": "Service",
              "name": "Fire Damage Restoration"
            }
          }
        ]
      }
    }
    

    Service pages need Article + Service + FAQPage + HowTo (when applicable).
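    A minimal FAQPage sketch for a service page (question and answer text illustrative):

```json
{
  "@context": "https://schema.org",
  "@type": "FAQPage",
  "mainEntity": [
    {
      "@type": "Question",
      "name": "How long does water damage restoration take?",
      "acceptedAnswer": {
        "@type": "Answer",
        "text": "Most residential water damage restorations take 3 to 5 days of drying plus 1 to 2 weeks of repairs, depending on the extent of structural damage."
      }
    }
  ]
}
```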

    When you implement this at scale across 3,600+ pages with consistent, accurate data, you’re giving Google a machine-readable map of every franchise location and every service offering. That’s how you dominate Local Pack results and organic search simultaneously.

    Priority 5: Google Business Profile Velocity (Ongoing)

    The Local Pack wins happen here.

    For every franchise location:

    • Weekly GBP posts — Real posts, not automated junk. Project summaries with before/after photos, seasonal preparedness tips, team spotlights. Google’s algorithm visibly rewards active, engaged profiles.
    • Review acquisition and response — Every location should hit 200+ reviews at 4.8+ stars within 12 months. SMS review request 2 hours post-completion, email 24 hours later. Respond to every review within 24 hours. This is the #1 Local Pack ranking factor after proximity.
    • Primary category precision — “Water Damage Restoration Service” as primary. Secondary categories should reflect the strongest service mix for that region.
    • Photo pipeline — 50+ geotagged photos per location updated monthly. Team, equipment, completed projects, office, vehicles. Google prioritizes profiles with fresh, diverse visual content.

    Priority 6: Answer Engine Optimization for the AI Age (Days 60-120)

    Google AI Overviews now dominate informational restoration queries. If your content isn’t structured to be cited, you’re invisible.

    Definition boxes — Every service page opens with a 50-word authoritative definition. “Water damage restoration is the professional process of returning a property to its pre-loss condition following water intrusion from flooding, burst pipes, or precipitation. It encompasses emergency water extraction, structural assessment and documentation, industrial-grade dehumidification, antimicrobial treatment, and full restoration of affected materials.”

    Direct-answer formatting — H2s as questions, answered completely in the first 50 words. “How much does water damage restoration cost? The average cost ranges from $2,000 for minor localized damage to $25,000+ for significant structural involvement, with most homeowners paying $5,000-$15,000. Your final cost depends on the square footage affected, severity of damage, materials involved, and necessary structural repairs.”

    Comparison tables — “Water Mitigation vs. Water Restoration: Key Differences.” Side-by-side comparison of timeline, cost, scope, and outcomes.

    Numbered process lists — “The 5 Stages of Water Damage Restoration: 1. Emergency Response and Assessment, 2. Water Extraction and Removal, 3. Drying and Dehumidification, 4. Cleaning, Sanitizing, and Antimicrobial Treatment, 5. Restoration and Reconstruction.” This format wins HowTo rich results and AI Overview citations.

    Priority 7: From the $665K PPC Spike Back to Baseline (Immediate)

    The November 2025 PPC spike to $665,600/month tells a clear story: organic pipeline broke, paid ads compensated.

    Here’s the math:

    • October 2025: $484,200/month organic value, $49K PPC spend. Healthy ratio.
    • November 2025: $484,300/month organic value, $665,600 PPC spend. Panic mode: rankings broke mid-month and they flooded the gap with paid spend to keep revenue up.
    • Current: $952,800/month organic value (February 2026), $206,100 PPC spend. Recovery mode, but still elevated PPC.

    The strategic move isn’t to cut PPC cold turkey. It’s to systematically shift budget back to organic as rankings recover:

    • Months 1-3: Maintain current PPC as organic recovery actions take effect. Target high-intent paid keywords that should be ranking organically but aren’t.
    • Months 4-6: As location pages recover and start ranking, reduce PPC spend by 20-30% on those keywords and reinvest savings into content creation.
    • Months 6-12: If organic recovery hits 60%+ of the pre-November level, reduce PPC spend by another 50%.

    The goal: In 12 months, get back to a $50K-75K/month PPC baseline (for new market testing and seasonal peaks) while organic carries the core demand.

    That $206K/month in current PPC spend? Reinvested in organic SEO, it pays back within 8-12 months, at which point that traffic is effectively free for the next 5 years.
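    The glide path above can be checked with the article’s own figures (the 25% mid-range reduction is my assumption within the stated 20-30% band):

```python
# Sketch: the phased PPC reduction described above, using the
# article's figure of $206,100/month in current spend.

ppc = 206_100
ppc_month6 = ppc * (1 - 0.25)           # months 4-6: cut 20-30% (using 25%)
ppc_month12 = ppc_month6 * (1 - 0.50)   # months 6-12: cut another 50%

print(f"month 6:  ${ppc_month6:,.0f}/mo")
print(f"month 12: ${ppc_month12:,.0f}/mo")  # just above the $50K-75K target baseline
```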

    Why Paul Davis’s Recovery is Easier Than 911 Restoration’s Rebuild

    Here’s the critical difference:

    911 Restoration peaked at 4,466 keywords in July 2024. By March 2025 when we wrote the playbook, they were down to 3,306. Now (February 2026) they’re at 816. They’ve been declining for 20+ months. The recovery path is long.

    Paul Davis peaked at 50,339 keywords in October 2025 — last year. They were still growing in September. The fundamental SEO infrastructure that generated 50K keywords is still there. The content is still there. The domain authority is still there (54, up from 51 in March).

    The problem is fixable because the foundation is recent and sound. It’s not a rebuild. It’s a bounce-back.

    With the 7-step strategy above, here’s what I’d expect:

    • Month 1-2: Technical fixes and canonicalization repairs show up in GSC coverage. Expect 500-1,000 re-indexed pages.
    • Month 2-3: Location page architecture updates and schema implementation. Expect rankings to improve on the most valuable pages first.
    • Month 3-6: New content layers (crisis-moment, decision-stage) start ranking. Keywords begin recovering. Conservative estimate: 35,000-40,000 keywords by June.
    • Month 6-12: Full content architecture matures. Location pages reinforce each other through internal linking. Authority content earns backlinks. Expect 45,000-50,000 keywords recovered.

    That trajectory puts Paul Davis back to $450K+/month organic value within 12 months, which means cutting PPC spend from $206K to $50-75K and freeing up $150K+/month in marketing budget that can be reinvested in growth.

    The Playbook Works Because Paul Davis Proved It Works

    The reason I’m confident in this recovery isn’t theory. It’s data. Paul Davis demonstrated they could execute SEO at scale — they grew from 39K to 50K keywords over eight months. That’s not luck. That’s a team running a good playbook.

    The November collapse wasn’t a signal that the playbook failed. It was a signal that something broke in execution — a technical issue, a structural change, an algorithm shift.

    But the foundation is there. The domain authority is there. The franchise network is there. All that’s missing is the diagnostic (days 1-3), the fix (days 4-30), and then doubling down on what already works (months 2-12).

    I’ve built the systems to execute this at franchise scale — the AI-powered content pipelines, the schema automation, the GEO optimization frameworks. And honestly? Watching a company that was actually winning bounce back is far more satisfying than watching a company rebuild from 800 keywords.

    Frequently Asked Questions

    What caused Paul Davis Restoration’s 54% keyword drop in December 2025?

    Based on the data pattern — a collapse from 50K to 23K keywords in a single month, combined with a spike in PPC spending — the most likely causes are a location page architectural change without proper redirects, a technical indexation issue (robots.txt, noindex tag, or CDN misconfiguration), or the November 2025 Google Core Update hitting thin location pages specifically. The best way to confirm is through a 72-hour audit of GSC coverage data (checking when “Excluded – currently not indexed” spiked) and a URL crawl to identify redirect errors, orphaned pages, or canonicalization issues.

    Why is Paul Davis’s SEO value higher per keyword than other restoration companies?

    Paul Davis has an estimated SEO value of $43/keyword ($952,800 ÷ 22,190 keywords in February 2026), compared to SERVPRO’s $33/keyword. This suggests Paul Davis is ranking for higher-value, higher-intent keywords — likely more commercial terms and geo-modified queries rather than informational content. It’s a quality-over-quantity advantage: fewer keywords, but more profitable ones. This is actually the ideal position for recovery, since restoring 5,000 high-value keywords is more profitable than restoring 20,000 low-value ones.

    How should Paul Davis balance PPC spending during SEO recovery?

    Don’t cut PPC immediately — that leaves money on the table and risks losing customers to competitors during the recovery window. Instead, maintain current PPC baseline (around $206K/month) during the first 60-90 days of recovery actions, then systematically shift budget to organic as rankings improve. A realistic timeline: reduce PPC by 20-30% by month 6 (when organic is recovering), then by another 50% by month 12 (when organic has achieved 60%+ recovery). This keeps revenue stable while investing in the long-term organic channel.

    What’s the difference between Paul Davis’s situation and 911 Restoration’s?

    911 Restoration has been declining for 20+ months (peaked July 2024 at 4,466 keywords, now at 816). It’s a comprehensive, systemic failure requiring a full rebuild. Paul Davis peaked in October 2025 (50,339 keywords) and collapsed sharply in November/December — suggesting a fixable technical or structural issue rather than a fundamental SEO failure. Paul Davis’s recovery is faster and more straightforward because the foundation (domain authority, content corpus, franchise network) is recent and proven to work. It’s a bounce-back, not a rebuild.

    How important is location page optimization for franchise restoration companies?

    It’s the engine of the entire strategy. If Paul Davis has 600 franchises across 6 core services, that’s 3,600+ location-service pages. A well-optimized location page can rank for 15-40 related keywords through local modifiers, long-tail variants, and service-specific searches. The math: 3,600 pages × 15 keywords average = 54,000 potential ranked keywords. Paul Davis currently has 22,190, meaning they have capacity for 32,000+ additional keyword rankings just by optimizing what exists. Location pages are where restoration companies win.

    What is Generative Engine Optimization (GEO) and why does Paul Davis need it?

    GEO is optimizing content so that AI systems — ChatGPT, Claude, Gemini, Google AI Overviews, Perplexity — cite and recommend your business by name. For restoration, GEO involves entity saturation (consistent brand-attribute associations across the web), factual density (specific claims about IICRC certification, response times, service areas), authoritative citations (EPA, FEMA, IICRC standards), and implementing LLMS.txt to guide AI crawlers. As AI-generated answers increasingly replace traditional search results, GEO becomes as important as traditional SEO. Paul Davis needs GEO to win when someone asks an AI system “who should I call for water damage in Houston?”

    What’s the realistic timeline for Paul Davis to recover to 40,000+ keywords?

    Based on the severity of the collapse (54% in one month) but the strength of the foundation (recent peak, high domain authority, proven content infrastructure), I’d estimate:

    • Month 1-2: Technical fixes and indexation recovery (expect 1,000-2,000 page re-indexing)
    • Month 3-6: Location page optimization and new content layers take effect (expect a climb from 22,000 to 35,000-40,000 keywords)
    • Month 6-12: Full architecture maturity and authority building (expect 45,000-50,000 keywords)

    The path is faster than 911 Restoration because the problem is fixable, not systemic.


    There’s a reason I’m telling you all this instead of keeping it proprietary. Paul Davis Restoration was doing it right through most of 2025. They hit 50K keywords because they executed a real strategy at real scale. Then something broke. But broken things can be fixed.

    We’re Tygart Media. We build the systems that execute this playbook for restoration companies at franchise scale. We’ve already figured out the location page architecture, the schema automation, the content velocity pipeline, the GEO optimization. And honestly? Helping a company that knows how to execute bounce back is exactly the kind of project we live for.

    The data is public. The opportunity is real. And the timeline for recovery is tight — every month without action is another month where competitors gain ground.

    Reach out here if you want to have the conversation. Or don’t. But at least you’ll know what’s possible.

    (And hey, if you actually do have a water damage emergency while you’re thinking about this, we can recommend a Paul Davis location. We probably know a guy. Actually, at this point, we’ve worked with enough franchises that we definitely know a guy.)

    The Complete Restoration Franchise SEO Playbook Series

    This article is part of a 6-part series analyzing the SEO performance of every major restoration franchise in America. Read the full series:

  • If I Were Running ServiceMaster’s SEO, Here’s What I’d Do Differently

    If I Were Running ServiceMaster’s SEO, Here’s What I’d Do Differently

    The Machine Room · Under the Hood

    I’m about to do something that most agency owners would never do: give away the entire playbook.

    Not a teaser. Not a “5 tips to improve your SEO” fluff piece. The actual, technical, step-by-step strategy I would execute — starting tomorrow — if **ServiceMaster** handed me the keys to their organic search program.

    Why? Because I pulled their SpyFu data this morning, and what I found stopped me mid-coffee. ServiceMaster essentially invented modern restoration franchising. They built the playbook that every restoration company has copied for the last three decades. They have brand recognition that money can’t buy. And they’re watching their organic search presence get destroyed in real time while they seem completely unconcerned.

    This isn’t gossip. This is data. And data deserves a response.

    ## The SpyFu Data: A Legacy Brand in Free Fall

    I pulled the full historical time series from the SpyFu Domain Stats API on March 30, 2026. Here’s what servicemaster.com looks like over the last 12 months:

    | Period | Organic Keywords | Monthly Organic Clicks | SEO Value ($/mo) | PPC Spend ($/mo) | Domain Strength |
    |--------|------------------|------------------------|------------------|------------------|-----------------|
    | Mar 2025 | 7,582 | 9,055 | $77,130 | $0 | 45 |
    | Apr 2025 | 7,612 | 8,755 | $86,940 | $0 | 45 |
    | May 2025 | 6,169 | 7,911 | $54,900 | $0 | 41 |
    | Jun 2025 | 5,413 | 6,592 | $48,260 | $0 | 41 |
    | Jul 2025 | 5,718 | 7,363 | $68,590 | $0 | 42 |
    | Aug 2025 | 3,168 | 5,604 | $28,880 | $253 | 39 |
    | Sep 2025 | 2,462 | 5,708 | $24,980 | $401 | 40 |
    | Oct 2025 | 2,548 | 5,664 | $30,280 | $512 | 41 |
    | Nov 2025 | 2,514 | 5,766 | $28,270 | $4,920 | 41 |
    | Dec 2025 | 1,870 | 3,910 | $15,380 | $9,266 | 39 |
    | Jan 2026 | 1,593 | 4,436 | $13,460 | $7,096 | 38 |
    | Feb 2026 | 1,742 | 4,435 | $39,300 | $7,039 | 42 |

    Let that sink in.

    **Peak SEO value: $334,384/month** (February 2020, historical data). **Current: $39,300/month.** That’s an **88.3% decline in six years**.

    **Peak keywords: 20,696** (August 2017). **Current: 1,742.** A **91.6% catastrophic wipeout in nine years**.

    And look at the trajectory from April 2025 to February 2026. In just 10 months, they hemorrhaged from 7,612 keywords down to 1,742. That’s a 77% collapse in under a year. The PPC column tells the real story: $0 in spend through most of 2025, then desperately cranking it up to $7,000/month by early 2026. They’re not marketing. They’re in triage.

    That’s not strategy. That’s a company that’s stopped fighting.

    ## What Likely Went Wrong (And What It Means)

    Before I hand over the playbook, I need to be honest about what I think happened — because you don’t fix symptoms, you fix disease.

    A keyword portfolio shrinking from 20,696 to 1,742 over nine years isn’t content decay. Content decay is gradual — maybe 10-15% annually. This is **structural abandonment**. There are really only a few things that cause this pattern:

    **Scenario 1: Corporate Deprioritization.** ServiceMaster’s brands have changed corporate hands more than once. If ownership decided that restoration franchising wasn’t a priority — say, after a divestiture or consolidation — then suddenly nobody’s funding the SEO team. No budget = no optimization = rank collapse over time.

    **Scenario 2: Franchise Model Shift.** ServiceMaster franchises are independently owned and operated. If the franchisor stopped providing central marketing support and pushed franchisees to run their own local marketing, you’d see exactly this pattern: the parent domain deteriorates while individual franchise sites (if they’re managed well) might hold their own. But the national brand suffers catastrophically.

    **Scenario 3: Algorithm Penalties or Core Web Vitals Failures.** If servicemaster.com experienced technical issues — slow page load times, poor Core Web Vitals, indexation problems — and nobody fixed them over several years, Google would systematically de-rank the domain.

    **Scenario 4: Content Strategy Atrophy.** The simplest explanation: they stopped creating new content. No blog updates since 2021. No location page optimization. No response to algorithm updates. Just letting an old site sit on autopilot while Google moved on.

    My bet? It’s Scenario 1 and 4 combined. ServiceMaster owns the restoration space, but they’ve clearly decided it’s not where corporate energy goes anymore.

    ## Step 1: The 72-Hour Emergency Audit

    Before I write a single word of content or restructure a single URL, I need to understand what’s actually broken. This is a diagnostic sprint.

    ### Day 1: Crawl and Indexation Analysis

    I’d run **Screaming Frog** against the full servicemaster.com domain — every page, every redirect, every canonical tag. For a company this size, I’m expecting 3,000-8,000 URLs. I’m looking for:

    * **Redirect chains and loops** — Years of site updates create redirect chains that leak authority. Every 301 chain longer than 2 hops costs you PageRank.
    * **Orphan pages** — Pages that exist but have zero internal links pointing to them. If service pages or location pages aren’t linked from the main navigation, Google won’t prioritize crawling them.
    * **Duplicate content signals** — Thin location pages that share 90%+ identical content get consolidated by Google. If you have 50 city pages that all say the exact same thing, Google is ignoring 49 of them.
    * **JavaScript rendering issues** — If servicemaster.com uses client-side rendering for critical content, Google’s bot might not see what humans see.
    * **Canonical tag audit** — One broken template-level canonical directive can tell Google to ignore every page using that template. This is more common than you’d think on old franchise sites.
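    A sketch of how that canonical audit surfaces template-level bugs (the `pages` mapping is hypothetical sample data standing in for the crawl’s canonical report):

```python
# Sketch: spot template-level canonical problems in a crawl export.
# The `pages` mapping (page URL -> canonical tag value) is
# hypothetical sample data.
from collections import Counter

pages = {
    "https://servicemaster.com/services/water-damage/":
        "https://servicemaster.com/services/water-damage/",
    "https://servicemaster.com/locations/dallas/":
        "https://servicemaster.com/locations/",
    "https://servicemaster.com/locations/austin/":
        "https://servicemaster.com/locations/",
}

# Pages whose canonical points somewhere else: Google may drop them
# as "Alternate page with proper canonical tag".
mismatches = {url: canon for url, canon in pages.items() if url != canon}

by_target = Counter(mismatches.values())
print(by_target.most_common(1))
# Many pages canonicalizing to a single URL is a template-level bug,
# not a page-level choice.
```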

    ### Day 2: Google Search Console Deep Dive

    I need 48 months of GSC data — enough to cover the entire collapse. Specifically:

    * **Coverage report** — How many pages are in “Valid” vs. “Excluded”? When did the exclusion count spike? That tells me exactly when things broke.
    * **Exclusion reasons** — “Discovered – currently not indexed,” “Blocked by robots.txt,” “Alternate page with proper canonical tag.” Each reason points to a different root cause.
    * **Performance by page group** — Segment by URL pattern: /locations/*, /services/*, /franchise/*, /blog/*. Which group lost the most impressions? That’s where the problem is.
    * **Query decay over time** — Export 5 years of query data. When did the keyword count start declining? What types of queries disappeared first? If it’s all branded queries, the brand authority is intact but topical authority is gone. If it’s all location-based queries, the local pages are the problem.

    ### Day 3: Competitive Benchmarking

    I’d pull SpyFu data for their direct competitors — **SERVPRO**, **911 Restoration**, **Paul Davis Restoration**, **Belfor** — and chart the trajectories side by side.

    The question: did the entire restoration industry decline, or is this a ServiceMaster-specific problem?

    If everyone declined together, it’s an algorithm shift or industry disruption. ServiceMaster can compete by being smarter.

    If only ServiceMaster declined, it’s a self-inflicted wound that’s fixable.

    ## Step 2: Location Page Architecture — The Engine of Franchise Dominance

    This is the difference between a franchise that owns Google and a franchise that rents from Google. ServiceMaster’s corporate network spans restoration across North America with different legal entities, different service mixes, and different regional focuses. That complexity is an opportunity if architected correctly.

    ### The Hub-and-Spoke Model (Adapted for ServiceMaster’s Structure)

    Here’s the architecture I’d build:

    **Tier 1: National Service Pillar Pages**

    These are the authority anchors:

    * /water-damage-restoration/ → Targets “water damage restoration,” “water damage restoration company,” etc.
    * /fire-damage-restoration/ → Targets “fire damage restoration,” “fire damage repair”
    * /mold-remediation/ → Targets “mold removal,” “mold remediation”
    * /commercial-restoration/ → Targets “commercial water damage,” “business restoration services”
    * /carpet-cleaning-restoration/ → Targets “carpet cleaning,” “carpet restoration”

    Each pillar page is 3,500+ words of comprehensive, authoritative content that positions ServiceMaster as the category leader. These pages accumulate backlinks and pass equity down the hierarchy.

    **Tier 2: Regional Hub Pages**

    ServiceMaster should have one page per major region or state where they operate:

    * /restoration-services/texas/
    * /restoration-services/california/
    * /restoration-services/northeast/

    These pages contain regional-specific information — common restoration issues by climate, local building codes, regional partnership relationships. They link down to every service-specific page in that region.

    **Tier 3: Location/Franchise Pages**

    One page per franchise or operating location per service:

    * /restoration-services/texas/water-damage-restoration/
    * /restoration-services/texas/fire-damage-restoration/
    * /restoration-services/california/water-damage-restoration/

    If ServiceMaster operates 80+ locations across 4-5 core service categories, that’s **400-500 location-service combinations**. At 25 long-tail keywords per page, that’s **10,000-12,500 rankable keywords** — which is more than the 1,742 they currently have.

    ## Step 3: Content Strategy — Crisis, Decision, Authority

    Restoration companies make a fatal mistake: they only create bottom-of-funnel content. Every page says “call ServiceMaster for water damage restoration.” But a homeowner standing in an inch of water isn’t searching for a restoration company. They’re searching for “what should I do right now?”

    Whoever answers that question gets the call.

    ### Tier 1: Crisis-Moment Content (The 2 AM Searcher)

    * “What to Do When Your House Floods: Emergency Steps Before Professional Help Arrives”
    * “My Basement Is Flooded — What Do I Do Right Now?”
    * “House Fire Damage Assessment: What to Check First”
    * “Black Mold Found in My House: Immediate Steps to Take”
    * “Pipe Burst During Winter: Emergency Response Checklist”

    Format: Numbered steps, definition boxes, HowTo schema, featured snippet optimization. These pages are designed to be cited in Google AI Overviews and answered in voice search.

    ### Tier 2: Decision-Stage Content (The Insurance Conversation)

    * “Does Homeowners Insurance Cover Water Damage? Complete 2026 Guide”
    * “Water Damage Restoration Cost: Regional Breakdown and Pricing Factors”
    * “Water Mitigation vs. Restoration: What’s the Difference?”
    * “Choosing a Restoration Company: What to Look For”
    * “Timeline for Water Damage Restoration: What to Expect”

    These pages need comparison tables, cost breakdowns, and FAQPage schema. They’re designed for someone who already knows they need professional help but is shopping around.

    ### Tier 3: Authority-Building Content

    * “IICRC Certification Explained: Why It Matters in Water Damage Restoration”
    * “The Science of Structural Drying: Complete Technical Guide”
    * “Mold Testing vs. Mold Inspection: What’s the Difference?”
    * “How to Prepare Your Home for Storm Season: Disaster Preparedness Guide”
    * “Understanding FEMA Flood Zones and What They Mean for Your Property”

    These pages earn backlinks from industry associations, insurance publications, local news, and real estate blogs. Those links flow equity to the money pages.

    ## Step 4: Schema Markup — The Technical Foundation

    Structured data is where most restoration companies leave 20-30% of their ranking potential on the table.

    ### Required Schema Implementation

    **LocalBusiness schema on every location page:**

    ```json
    {
      "@context": "https://schema.org",
      "@type": "LocalBusiness",
      "name": "ServiceMaster of [City Name]",
      "address": {
        "@type": "PostalAddress",
        "streetAddress": "[Address]",
        "addressLocality": "[City]",
        "addressRegion": "[State]",
        "postalCode": "[ZIP]",
        "addressCountry": "US"
      },
      "geo": {
        "@type": "GeoCoordinates",
        "latitude": "[latitude]",
        "longitude": "[longitude]"
      },
      "telephone": "[Phone Number]",
      "openingHoursSpecification": [
        {
          "@type": "OpeningHoursSpecification",
          "dayOfWeek": ["Monday", "Tuesday", "Wednesday", "Thursday", "Friday", "Saturday", "Sunday"],
          "opens": "00:00",
          "closes": "23:59"
        }
      ],
      "areaServed": {
        "@type": "City",
        "name": "[City]"
      },
      "hasOfferCatalog": {
        "@type": "OfferCatalog",
        "itemListElement": [
          { "@type": "Offer", "itemOffered": { "@type": "Service", "name": "Water Damage Restoration" } },
          { "@type": "Offer", "itemOffered": { "@type": "Service", "name": "Fire Damage Restoration" } },
          { "@type": "Offer", "itemOffered": { "@type": "Service", "name": "Mold Remediation" } }
        ]
      }
    }
    ```

    **On service pages:** Article + Service + FAQPage + BreadcrumbList schema

    **On blog posts:** Article + FAQPage + Speakable (on answer paragraphs)

    When implemented across 400+ pages with consistent data, you’re giving Google a machine-readable map of ServiceMaster’s entire franchise network.
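    Generating that map programmatically is how you keep 400+ pages consistent. A minimal sketch that builds BreadcrumbList JSON-LD from a page’s URL path — the example.com URLs are placeholders, not ServiceMaster’s actual structure:

    ```python
    import json

    def breadcrumb_schema(crumbs):
        """Build BreadcrumbList JSON-LD from an ordered list of (name, url) pairs."""
        return {
            "@context": "https://schema.org",
            "@type": "BreadcrumbList",
            "itemListElement": [
                {"@type": "ListItem", "position": i + 1, "name": name, "item": url}
                for i, (name, url) in enumerate(crumbs)
            ],
        }

    # Placeholder hierarchy mirroring the pillar > state > service structure.
    crumbs = [
        ("Restoration Services", "https://example.com/restoration-services/"),
        ("Texas", "https://example.com/restoration-services/texas/"),
        ("Water Damage Restoration",
         "https://example.com/restoration-services/texas/water-damage-restoration/"),
    ]
    print(json.dumps(breadcrumb_schema(crumbs), indent=2))
    ```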

    ## Step 5: Google Business Profile Management — The Local Pack Battleground

    In restoration, the Local Pack (the 3 map results) captures more high-intent traffic than organic results. When someone searches “water damage restoration near me,” they look at the map first.

    Winning the Local Pack requires systematic GBP optimization:

    * **Weekly GBP posts** — Real posts about completed projects, seasonal preparedness tips, team spotlights. Google’s algorithm rewards consistent posting activity.
    * **Review velocity** — Every location needs a systematic review request process. Target: 200+ reviews at 4.8+ stars per location within 12 months. Respond to every review within 24 hours.
    * **Photo strategy** — 50+ photos per location: team, equipment, projects, office, vehicles. Geotagged. Updated monthly.
    * **Q&A seeding** — Proactively add and answer the top 10 questions for each location’s GBP.
    * **Service area clarity** — Define service areas as precise polygons, not just “surrounding areas.”

    ## Step 6: Answer Engine Optimization (AEO) — Win the AI Results

    Google’s AI Overviews now appear on most informational queries. When someone asks “what do I do if my house floods,” Google generates a synthesized answer and cites specific sources.

    If ServiceMaster’s content isn’t structured to be cited, they’re invisible.

    * **Definition boxes** — Open every service page with a 50-word authoritative definition. This is what Google AI extracts and cites.
    * **Direct-answer formatting** — Structure H2s as questions. Answer them completely in the first 50 words. AI Overviews pull from this pattern.
    * **Comparison tables** — “Water Damage vs. Fire Damage” with side-by-side tables. AI loves structured comparisons.
    * **Numbered process lists** — “The 7 Stages of Water Damage Restoration.” This format wins HowTo rich results and AI citations simultaneously.

    ## Step 7: Generative Engine Optimization (GEO) — Be the Company AI Recommends

    This is the frontier. Most restoration companies don’t even know this exists. GEO is about making AI systems — Claude, ChatGPT, Gemini, Perplexity — recommend ServiceMaster by name.

    * **Entity saturation** — “ServiceMaster” needs to appear across the web in consistent association with specific attributes: IICRC certified, 24/7 availability, regional expertise, specific certifications, risk response capability.
    * **Factual density** — Replace “we provide excellent restoration services” with “ServiceMaster’s team is trained to IICRC S500/S520 standards and deploys truck-mounted extractors capable of removing 300+ gallons per minute.”
    * **Authoritative citation weaving** — Link to EPA mold guidelines, FEMA flood resources, IICRC standards, state-specific regulations. AI systems weight this higher because it signals expertise.
    * **llms.txt implementation** — Add a /llms.txt file to the root domain providing AI crawlers with a structured summary of ServiceMaster’s business, services, geographic coverage, and authoritative attributes.
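    Note that llms.txt is an emerging, informal convention — a plain-markdown file served at /llms.txt — rather than a formal standard. A minimal hypothetical sketch for a restoration franchise (every URL and detail below is a placeholder):

    ```markdown
    # ServiceMaster Restore

    > IICRC-certified disaster restoration franchise network providing 24/7
    > water, fire, and mold restoration across North America.

    ## Services
    - [Water Damage Restoration](https://example.com/water-damage-restoration/): emergency extraction and structural drying
    - [Fire Damage Restoration](https://example.com/fire-damage-restoration/): smoke, soot, and odor remediation
    - [Mold Remediation](https://example.com/mold-remediation/): containment and removal to S520 standards

    ## Locations
    - [Find a location](https://example.com/locations/): 80+ franchises across the US and Canada
    ```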

    ## Step 8: Internal Linking — The Circulatory System

    A franchise site without proper internal linking is a highway system with no on-ramps.

    * **Pillar → State → City cascade** — National pillar links to every regional hub. Regional hubs link to every city page in that region. City pages link back up. Closed loop of authority.
    * **Cross-service linking at the city level** — Houston water damage page links to Houston mold page, Houston fire page. Keeps users on site and signals contextual relevance.
    * **Blog-to-location contextual links** — Every blog post includes natural in-text links to relevant city pages. “If you’re dealing with flooding in Chicago, our IICRC-certified team is available 24/7 — [learn more about ServiceMaster’s Chicago water damage restoration].”
    * **Related content blocks** — Automated bottom-of-page blocks showing 3-5 topically related pages. Scales automatically as you publish more content.
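    One way to keep the cascade honest is to generate the full set of expected link pairs and diff it against a crawl. A minimal sketch, assuming a pillar page with per-region service pages (URLs are illustrative):

    ```python
    def link_cascade(pillar_url, regions):
        """Expected (from_url, to_url) internal-link pairs for a
        pillar -> regional hub -> location/service page hierarchy,
        including the links back up the chain."""
        links = []
        for region, services in regions.items():
            region_url = f"{pillar_url}{region}/"
            links.append((pillar_url, region_url))    # pillar links down to hub
            links.append((region_url, pillar_url))    # hub links back up
            for service in services:
                page_url = f"{region_url}{service}/"
                links.append((region_url, page_url))  # hub links down to page
                links.append((page_url, region_url))  # page links back up
        return links

    expected = link_cascade(
        "https://example.com/restoration-services/",
        {"texas": ["water-damage-restoration", "fire-damage-restoration"]},
    )
    ```

    Missing pairs in the crawl are broken on-ramps; extra pairs are candidates for cannibalization review.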

    ## Step 9: Backlink Acquisition — Leverage the Franchise Network

    ServiceMaster’s franchise structure is an asset most competitors can’t match:

    * **Disaster response PR** — After every major emergency, issue press releases to local media with quotes from location owners. Local news sites (high authority, high relevance) pick these up.
    * **Insurance partnerships** — ServiceMaster should be on preferred vendor lists with insurance carriers. Each carrier relationship should include a backlink from their website.
    * **Industry association profiles** — Active profiles on IICRC.org, RestorationIndustry.org, state contractor licensing boards. These .org links carry significant trust signals.
    * **Civic partnerships** — Chamber of Commerce, BBB profiles, Rotary sponsorships, local organization memberships. Each location should systematically acquire 20-30 local directory backlinks.
    * **Content partnerships** — Co-create disaster preparedness guides with FEMA, emergency management agencies, fire departments. “Hurricane Preparedness Guide — by ServiceMaster and the American Red Cross.” The .gov backlink is worth the effort.

    ## Step 10: Kill the PPC Dependency (And Rebuild the Organic Engine)

    ServiceMaster spent an estimated **$21,587 on Google Ads in the last 12 months** (increasing from $0 to $7,039/month). That’s reactive and unsustainable. Here’s the math:

    * At their 2020 peak, ServiceMaster’s organic traffic was worth **$334,384/month** — **$4.01 million/year** in equivalent ad spend delivered for free.
    * A comprehensive SEO program would cost a fraction of their current PPC spend.
    * If they rebuild to just **half their peak value** ($167K/month), that’s **$2 million/year** in traffic they no longer need to buy.
    * Organic traffic compounds. SEO is a long-term asset. PPC is a treadmill.
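    The arithmetic behind these bullets, as a quick sanity check (all figures are the estimates quoted above, not audited numbers):

    ```python
    peak_value_monthly = 334_384               # Feb 2020 peak organic value, $/month
    half_peak_monthly = peak_value_monthly / 2

    peak_annual = peak_value_monthly * 12      # ~$4.01M/year at peak
    half_peak_annual = half_peak_monthly * 12  # ~$2.0M/year at half recovery
    print(f"Peak: ${peak_annual:,.0f}/yr, half-peak: ${half_peak_annual:,.0f}/yr")
    ```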

    The ROI case is overwhelming.

    ## The Bottom Line

    ServiceMaster invented the restoration franchise. They built the playbook that SERVPRO and 911 Restoration have copied. They have 70+ years of brand history. They have franchise infrastructure across North America. They have domain authority that still ranks at 42 despite years of neglect.

    And they’re getting outranked by companies 1/10th their size because those companies are actually trying.

    ServiceMaster didn’t fail because restoration franchising is saturated. They’re failing because they stopped investing in the channel that built their brand — organic search.

    The opportunity isn’t a mystery. It’s an execution problem. And the 10-step playbook above is how you fix it.

    Here’s my real talk:

    **Hey, ServiceMaster. You invented this industry. You should own Google for every restoration keyword that exists. The data is public. The decline is real. The fix isn’t a mystery — it’s investment and execution.**

    **We’re [Tygart Media](https://tygartmedia.com). We live and breathe restoration SEO. We’ve built the systems to execute everything above at franchise scale. We’ve already done this for companies in your space. And honestly? We’d love to have the conversation about what $200K+/month in organic value looks like when it’s back.**

    **[Reach out here](https://tygartmedia.com/contact). No pressure. No hard sell. Just two teams who understand the industry talking about what a digital resurrection looks like.**

    **Or don’t. Keep spending $7K/month on Google Ads for the traffic you’re literally giving away.**

    **Your choice. We’ll be here either way. Just maybe not for your competitors. 😄**

    ## Frequently Asked Questions

    ### How much organic traffic has ServiceMaster lost?

    ServiceMaster’s organic presence has declined catastrophically over the last nine years. Their peak of 20,696 organic keywords (August 2017) has collapsed to 1,742 keywords as of February 2026 — a 91.6% reduction. Their peak SEO value was $334,384/month (February 2020), compared to just $39,300/month today (February 2026) — an 88.3% decline. In the last 10 months alone (April 2025 to February 2026), they lost 77% of their keywords, dropping from 7,612 to 1,742.

    ### Why isn’t ServiceMaster spending on Google Ads if they understand the traffic problem?

    ServiceMaster spent $0 on Google Ads for most of 2025, then gradually increased spending to $7,039/month by February 2026. This pattern suggests they may not have recognized the organic decline urgently, or corporate prioritization shifted away from the restoration vertical. The recent increase in PPC spending indicates they’re now buying back traffic they used to capture organically — which is more expensive and less sustainable than organic search.

    ### What is the most critical SEO fix for ServiceMaster?

    The most impactful single fix would be rebuilding and optimizing the location page architecture. ServiceMaster’s franchise structure creates a natural advantage: 80+ locations × 4-5 service categories = 400-500 location-service combinations. Each properly optimized page targeting unique, locally-relevant content could drive 25+ keywords. That alone could restore 10,000+ keywords within 12 months. Currently, they’re capturing a fraction of this potential.

    ### How does ServiceMaster’s situation compare to 911 Restoration?

    Both companies have experienced severe organic decline, but ServiceMaster’s is more dramatic. 911 Restoration’s peak was $407,500/month (March 2022) vs. $22,700 current. ServiceMaster’s peak was $334,384/month (February 2020) vs. $39,300 current. However, ServiceMaster’s keyword collapse is steeper (91.6% over nine years). 911 Restoration’s decline happened faster (94.4% from peak) but more recently. Both represent massive opportunities for comprehensive SEO rebuilding. [Read the 911 Restoration playbook here](https://tygartmedia.com/911-restoration-seo-playbook/).

    ### What is Generative Engine Optimization (GEO) and why does it matter?

    Generative Engine Optimization is the practice of optimizing your content and online presence so that AI systems — Google AI Overviews, ChatGPT, Claude, Gemini, Perplexity — recommend your business by name. For restoration companies, this means consistent entity saturation across the web (brand + attributes), factual density (specific, verifiable claims), authoritative citations (EPA, FEMA, IICRC standards), and LLMS.txt implementation. GEO is becoming critical as AI-generated answers increasingly replace traditional search results.

    ### How long would it take to restore ServiceMaster’s organic traffic?

    A realistic timeline for ServiceMaster would be 6-12 months for technical fixes and content architecture to take effect, with meaningful improvement visible within 4-6 months. Full recovery to even half their peak value (roughly $167K/month in organic value) would require 12-18 months of sustained effort. The first 90 days typically show the highest-impact gains because fixing technical issues (indexation, redirects, schema) often produces immediate improvements once Google re-crawls the corrected pages.

    The Complete Restoration Franchise SEO Playbook Series

    This article is part of a 6-part series analyzing the SEO performance of every major restoration franchise in America. Read the full series:

    ```json
    {
      "@context": "https://schema.org",
      "@type": "Article",
      "headline": "If I Were Running ServiceMaster's SEO, Here's What I'd Do Differently",
      "description": "ServiceMaster built modern restoration. Now their digital presence looks like 1989. A $334K/month peak vs. $39K today. Here's the exact playbook to resurr",
      "datePublished": "2026-03-30",
      "dateModified": "2026-04-03",
      "author": {
        "@type": "Person",
        "name": "Will Tygart",
        "url": "https://tygartmedia.com/about"
      },
      "publisher": {
        "@type": "Organization",
        "name": "Tygart Media",
        "url": "https://tygartmedia.com",
        "logo": {
          "@type": "ImageObject",
          "url": "https://tygartmedia.com/wp-content/uploads/tygart-media-logo.png"
        }
      },
      "mainEntityOfPage": {
        "@type": "WebPage",
        "@id": "https://tygartmedia.com/servicemaster-seo-playbook/"
      }
    }
    ```

  • If I Were Running SERVPRO’s SEO, Here’s What I’d Do Differently

    If I Were Running SERVPRO’s SEO, Here’s What I’d Do Differently

    The Machine Room · Under the Hood

    SERVPRO owns 178,900 keywords worth $5.8 million per month in organic search value. They’re the 800-pound gorilla of the water restoration space. But they just lost 108,000 keywords in four months—a 38% collapse from their October 2025 peak. And they’re spending $2 million per month on PPC to paper over the cracks.

    The Math That Should Keep SERVPRO’s CMO Up at Night

    Let that sink in. In October 2025, SERVPRO ranked for 286,900 keywords. By February 2026—four months later—they were down to 178,900. That’s not algorithmic drift. That’s not seasonal. That’s a Category 5 hurricane hitting your organic search machine, and it happened almost silently while they threw another $2M at Google Ads to keep the lights on.

    Here’s the thing: SERVPRO has domain strength of 62, the strongest I’ve seen in the restoration vertical. They have brand authority. They have content. They have traffic. But they’re treating SEO like a legacy channel while they shovel money into PPC—the exact opposite of what their competitive position should demand.

    I ran the numbers on SERVPRO’s performance over the last 12 months. Take a look.

    | Month | Keywords Ranking | Monthly Clicks | SEO Value | Domain Strength | PPC Spend |
    |---|---|---|---|---|---|
    | Feb 2025 | 245,100 | 148,300 | $3,950,000 | 60 | $1,820,000 |
    | Mar 2025 | 251,200 | 152,400 | $4,180,000 | 60 | $1,950,000 |
    | Apr 2025 | 248,900 | 150,100 | $4,100,000 | 60 | $1,880,000 |
    | May 2025 | 253,400 | 153,900 | $4,270,000 | 61 | $1,920,000 |
    | Jun 2025 | 259,100 | 157,200 | $4,420,000 | 61 | $1,880,000 |
    | Jul 2025 | 265,300 | 161,000 | $4,580,000 | 61 | $1,950,000 |
    | Aug 2025 | 272,100 | 164,800 | $4,750,000 | 61 | $2,010,000 |
    | Sep 2025 | 281,200 | 170,400 | $5,120,000 | 61 | $2,080,000 |
    | Oct 2025 | 286,900 | 174,000 | $5,420,000 | 62 | $2,150,000 |
    | Nov 2025 | 268,400 | 162,500 | $4,840,000 | 62 | $2,090,000 |
    | Dec 2025 | 223,100 | 135,200 | $3,200,000 | 62 | $1,980,000 |
    | Feb 2026 | 178,900 | 151,700 | $5,825,000 | 62 | $1,944,000 |

    Wait. Stop. Look at February 2026 again. Keywords tanked to 178,900, but SEO value exploded to $5,825,000. How is that possible?

    Because SERVPRO stopped chasing long-tail volume and started extracting revenue from money keywords. They’re ranking for fewer terms, but the terms they *are* ranking for convert harder. That’s actually a sign that something—either an algorithm shift or a deliberate technical decision—forced them to consolidate their keyword real estate.

    But here’s what kills me: they’re still spending $1.944M per month on PPC. If they could stabilize their organic keyword portfolio and clean up their technical architecture, they could cut that spend by half and *increase* total revenue. Instead, they’re patching the hole with paid traffic.

    What Likely Went Wrong (And Why It Matters)

    SERVPRO owns 2,000+ franchise locations across North America. Each location is its own business, often with its own digital presence. That’s the double-edged sword of their model: massive reach, but fragmented authority.

    When you have that much real estate spread across the internet, a single algorithm update—or a deliberate consolidation on Google’s part—can evaporate keyword rankings overnight. Here are the most likely culprits:

    1. Location Page Cannibalization

    If SERVPRO has 2,000 location pages all competing for “water damage restoration near me” or “SERVPRO [city],” they’re killing their own rankings. Google gets confused. It doesn’t know which page to rank. So it ranks fewer of them.

    The fix: Implement a tiered location strategy. National hub page > regional cluster > local pages. Internal link from hub to region to local. Avoid keyword duplication. Use structured data (LocalBusiness with serviceArea) to signal geographic relevance without creating duplicate content.

    2. Content Architecture Decay

    SERVPRO’s main site probably wasn’t architected with 2,000+ location pages in mind when it was built. Over time, internal linking broke, breadcrumb trails became inconsistent, and authority stopped flowing predictably. No one’s actively managing the link graph at scale.

    The fix: Conduct a full internal linking audit. Map out which pages should funnel authority to which. Restore broken links. Create programmatic breadcrumb trails. Use topic clusters to create thematic authority hubs that feed into location pages.

    3. E-E-A-T Fragmentation

    Google’s moved heavily toward E-E-A-T (Experience, Expertise, Authoritativeness, Trustworthiness) in recent years. A national franchise system’s E-E-A-T is strong at the brand level, but uneven at the franchise location level. Some franchisees have reviews and credentials. Some don’t.

    The fix: Standardize E-E-A-T signals across the network. Ensure every location page has aggregated reviews, credentials, licenses, and “about” information. Use Author entities to link individual technicians to content. Make the system defensible against algorithm swings.

    4. Technical Debt From Franchise Independence

    Here’s the ugly truth: SERVPRO franchisees run their own businesses. Some have modern websites. Some are running 2015-era WordPress themes. Some use white-label platforms that Google barely indexes. When you have 2,000 franchise sites under one umbrella, you’re battling technical inconsistency at scale.

    The fix: Offer franchisees a standardized tech stack. Migrate independent sites into a consolidated platform (either subdomains or a federated network). Enforce technical requirements (Core Web Vitals, mobile responsiveness, schema markup). Make SEO non-negotiable.

    The SERVPRO SEO Playbook: 8 Steps to Recover 150,000+ Keywords

    Step 1: Conduct a Keyword Bleed Forensics Audit

    Pull your keyword history for the last 24 months in SpyFu. Sort by rank drop (now ranking outside top 100). Segment by keyword type:

    • Money keywords (water damage restoration, fire damage, mold removal): Why did you lose these? Pull them up in GSC. Are impressions down? CTR down? Rank dropped?
    • Branded + geo keywords (SERVPRO [city], water damage [city]): You should own almost all of these. If you’ve lost them, it’s likely location page cannibalization.
    • Long-tail keywords (what can I do about water damage in my basement): This is where the 108,000-keyword drop is probably concentrated. These are lower-value keywords. Maybe that’s intentional. Maybe it’s not.
    • Competitor keywords (911 restoration competitors, other local services): Are you losing share in competitive space, or just retracting from low-intent terms?

    Once you’ve segmented, you know exactly where the damage is. Then you can fix the right thing instead of guessing.
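    The segmentation itself is easy to script once you have the export. A rough sketch — the regexes are illustrative heuristics, and bucket order matters (question-style long-tail is checked before money terms so “what can I do about water damage…” doesn’t land in the money bucket):

    ```python
    import re

    # Illustrative bucket rules for a keyword export; tune against real data.
    BUCKETS = [
        ("branded_geo", re.compile(r"\bservpro\b")),
        ("long_tail", re.compile(r"^(what|how|why|can|should|is|does)\b")),
        ("money", re.compile(r"\b(water damage|fire damage|mold (removal|remediation))\b")),
    ]

    def bucket(keyword: str) -> str:
        """Assign a keyword to the first matching segment."""
        kw = keyword.lower().strip()
        for name, pattern in BUCKETS:
            if pattern.search(kw):
                return name
        return "other"
    ```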

    Step 2: Audit Your Location Page Architecture

    Pull a sample of 50 location pages across different regions. Check these metrics:

    • Are they templated consistently, or do they vary widely?
    • Do they have unique content (service descriptions, local reviews, technician bios), or are they duplicates?
    • How do they link to each other? Is there an authority flow from national > regional > local?
    • Are they indexed individually, or are some being de-indexed?

    Run a GSC export to see which location pages are getting search impressions. You’ll likely see a long tail where 80% of your locations get minimal organic traffic.

    That’s your content architecture problem. Fix it and watch rankings come back.
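    Quantifying that long tail from a GSC export takes a few lines. A sketch, using made-up click counts:

    ```python
    def top_share_of_clicks(clicks, top_fraction=0.2):
        """Fraction of total clicks captured by the top `top_fraction`
        of pages -- a quick concentration check on a GSC export."""
        ranked = sorted(clicks, reverse=True)
        top_n = max(1, int(len(ranked) * top_fraction))
        return sum(ranked[:top_n]) / sum(ranked)

    # Synthetic example: 10 location pages, two of which get nearly everything.
    clicks = [1000, 900, 10, 5, 5, 3, 2, 1, 1, 1]
    share = top_share_of_clicks(clicks)  # top 20% of pages take ~98% of clicks
    ```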

    Step 3: Implement a Three-Tier Location Page System

    Replace the flat structure with depth:

    Tier 1: National Hub — One authority page covering water damage restoration, fire damage, mold removal, etc. This page should be a semantic authority fortress: comprehensive content, strong internal linking, high-quality backlinks. All location pages link back to this.

    Tier 2: Regional Clusters — Group your 2,000 locations into 20-30 regions (Northeast, Southeast, Midwest, etc.). Create regional pages covering “water damage restoration in [region]” with:

    • Aggregated statistics (e.g., “SERVPRO has restored 50,000+ properties in the Northeast”)
    • Links to all location pages in that region
    • Regional case studies or testimonials
    • Regional licensing/credentials information

    Tier 3: Local Pages — One page per location (or market). Include:

    • Unique local content (service menu tailored to local disasters, local team bios, local case studies)
    • LocalBusiness schema with full address, phone, reviews
    • Internal links from regional page and national hub
    • Links to adjacent locations (e.g., nearby franchise territories)
    • Unique on-page content that distinguishes this location from others (at least 500-1000 words)

    This structure signals to Google: “These are related but distinct properties. Each one has authority and relevance to its geography.”
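    The three tiers map cleanly onto a URL scheme, which makes them straightforward to generate and audit at 2,000-location scale. A sketch with placeholder domain and slugs:

    ```python
    def three_tier_urls(domain, regions):
        """Build national hub, regional cluster, and local page URLs
        for a three-tier location system. Slugs are placeholders."""
        national = f"{domain}/restoration-services/"
        regional = {r: f"{national}{r}/" for r in regions}
        local = {
            (r, loc): f"{regional[r]}{loc}/"
            for r, locs in regions.items()
            for loc in locs
        }
        return national, regional, local

    national, regional, local = three_tier_urls(
        "https://example.com", {"northeast": ["boston-ma", "hartford-ct"]}
    )
    ```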

    Step 4: Repair Internal Linking at Scale

    Your 286,900-keyword peak suggests you had strong internal linking. Your 178,900-keyword current state suggests it broke. Here’s how to rebuild it:

    Map the authority flow: Create a spreadsheet showing how authority should flow. National page (highest authority) > Regional pages (medium) > Location pages (local). Add cross-links between adjacent locations. Add contextual links from blog content to relevant location pages.

    Fix broken links: Run your site through Screaming Frog. Find all 404s and redirect chains. Fix them. Broken links kill authority flow.

    Create topic clusters: Your main content topics (water damage, fire damage, mold, etc.) should each have a hub page. Every blog post should link to the relevant hub. Every location page should link to the relevant hub. This creates thematic relevance signals that help with rankings.

    Implement breadcrumb navigation: Home > Service > Location. This signals site structure to Google and improves crawlability.

    At scale, this is a 6-8 week project, but it’s foundational. You can’t have 5.8M in monthly SEO value without a solid internal link graph.

    Step 5: Standardize E-E-A-T Across All Locations

    Create a template/playbook for franchisees that includes:

    • Local review aggregation: Pull Google, Yelp, and industry reviews to each location page. Show star ratings. Highlight top reviews. Aggregate to the brand level.
    • Credentials display: State licenses, certifications, insurance. Show that this franchisee is legit. Make it dynamic (pull from a central database, don’t hardcode).
    • Local team bios: Include photos and bios of the top 3-5 technicians at each location. Give them Google Author profiles if possible. Make E-E-A-T tangible.
    • Local case studies: Every location should have at least 2-3 case studies showing real work they’ve done. Before/after photos, descriptions. This builds Experience + Authoritativeness.
    • Trust signals: Display member affiliations (DRIstoration Network, IICRC, etc.), “Featured in” logos, awards. Design signals matter.

    This isn’t optional. It’s the baseline for ranking in a trust-dependent vertical. Do it across all 2,000 locations and you’ll see keyword recovery.

    Step 6: Implement Generative Engine Optimization (GEO)

    Google’s Gemini, ChatGPT, and Claude are increasingly the first place people go for answers. You should own that real estate too.

    Make your site AI-friendly:

    • Add a FAQ schema on every page with questions people actually ask. Make sure your answers are comprehensive and cite-worthy.
    • Create a structured data layer that AI engines can parse: LocalBusiness, FAQPage, HowTo, Review. The richer your data, the more likely AI pulls from you.
    • Target conversational queries in your content: “What should I do if I have water damage?” “How much does restoration cost?” “Can I restore water-damaged documents?” These are the queries AI-powered search will prioritize.
    • Build a knowledge base or glossary explaining restoration terminology. AI systems will index this as foundational content.
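    The FAQ schema bullet above is straightforward to automate. A minimal sketch that emits FAQPage JSON-LD from question/answer pairs (the Q&A content is illustrative):

    ```python
    import json

    def faq_schema(pairs):
        """Build FAQPage JSON-LD from (question, answer) pairs."""
        return {
            "@context": "https://schema.org",
            "@type": "FAQPage",
            "mainEntity": [
                {
                    "@type": "Question",
                    "name": question,
                    "acceptedAnswer": {"@type": "Answer", "text": answer},
                }
                for question, answer in pairs
            ],
        }

    markup = faq_schema([
        ("What should I do if I have water damage?",
         "Stop the water source if you safely can, then call a certified restoration team."),
    ])
    print(json.dumps(markup, indent=2))
    ```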

    The restoration vertical is perfect for GEO. People are panicked when they need you. An AI system recommending “SERVPRO is the largest restoration franchise” is worth millions in future organic traffic.

    Step 7: Cut Waste From Your $1.944M/Month PPC Spend

    I’m not saying cut PPC entirely. But you’re spending $1.944M per month while owning 178,900 keywords. That’s insurance money. Here’s where to redirect it:

    • Kill low-ROAS keywords: Pull your Google Ads data. Find keywords with CPA > 3x your conversion value. These are money sinks. Pause them. Let organic handle them if it can.
    • Shift budget from branded to high-intent: You should own branded keywords (SERVPRO + geo) organically. Paying for them is waste. Redirect that budget to high-intent non-branded terms where you’re not yet ranking in top 3.
    • Test seasonal PPC budgets: Restoration demand spikes after storms. You don’t need to bid aggressively in January. Build a seasonal playbook. Save $100K-200K per month in off-season.
    • Consolidate accounts and campaigns: 2,000 franchisees = probably 1,000+ Google Ads accounts. Consolidate them under a central management structure. Eliminate duplicate bidding. Unified budget allocation is way more efficient.

    Conservative estimate: You could cut $500K-750K per month from PPC and improve overall ROI by moving budget to organic. That’s $6-9M annually. Worth it.
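    The “CPA > 3x conversion value” rule from the first bullet is simple to apply to an ads export. A sketch — the row format is a hypothetical export, not Google Ads’ actual schema:

    ```python
    def low_roas_keywords(rows, cpa_multiple=3.0):
        """Flag keywords whose cost per acquisition exceeds
        cpa_multiple x the per-conversion value, or that never convert."""
        flagged = []
        for row in rows:
            if row["conversions"] == 0:
                flagged.append(row["keyword"])
                continue
            cpa = row["spend"] / row["conversions"]
            if cpa > cpa_multiple * row["conv_value"]:
                flagged.append(row["keyword"])
        return flagged

    # Hypothetical export rows (illustrative numbers).
    rows = [
        {"keyword": "water damage repair", "spend": 9000, "conversions": 10, "conv_value": 500},
        {"keyword": "servpro houston", "spend": 4000, "conversions": 2, "conv_value": 500},
        {"keyword": "flood cleanup", "spend": 100, "conversions": 0, "conv_value": 500},
    ]
    ```

    Note that the branded term gets flagged here — consistent with the point above that branded + geo queries should be owned organically, not bought.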

    Step 8: Build a Fragmented Franchisee Network Into a Federated Authority System

    This is the long-term play. Right now, SERVPRO likely looks like this to Google: 2,000 separate businesses with the SERVPRO brand. Google doesn’t really know how to rank them as one system.

    Here’s what you should build instead:

    • Consolidated location architecture: servpro.com/locations/[city-state] for all locations, managed centrally. Not franchisee.com or subdomain.servpro.com. One unified system, 2,000 variations.
    • Federated content model: National content hub (servpro.com/restoration-guides) serves as the authoritative source. Franchisees republish and localize. Create a content syndication system that keeps authority centralized while allowing local customization.
    • Unified review aggregation: Pull all franchisee reviews into a central system. Rank locations by star rating. Make the whole network defensible.
    • Centralized link building: One brand-level link-building strategy, feeding authority down to locations. Not 2,000 franchisees all trying to build links independently.

    This takes 12-18 months to execute, but when you land it, you’ll see your keyword count jump by 150,000+ and you’ll be basically unbeatable in your vertical.

    The Opportunity Cost of Staying Put

    SERVPRO lost 108,000 keywords in 4 months. Let’s say half of those were low-intent long-tail, each worth $20-50 per month in traffic value. That’s about 54,000 keywords × $30 average = $1.62M per month in lost organic value.

    They made up for it by extracting more revenue from fewer, higher-value keywords (Feb 2026 value spike). But they’re also spending $1.944M per month on PPC to maintain traffic volume.

    If SERVPRO recovered to 240,000 keywords (their level in August 2025), they’d likely add another $1.5-2M per month in organic value *and* be able to cut PPC spend by 40-50%. That’s a $3-4M monthly swing.

    Over a year, that’s $36-48M in additional profit from fixing SEO.
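    The back-of-envelope math above, spelled out (every figure is an estimate from this analysis, not reported revenue):

    ```python
    lost_keywords = 108_000       # keyword drop, Oct 2025 -> Feb 2026
    long_tail_share = 0.5         # assumed half were low-intent long-tail
    value_per_kw_month = 30       # assumed $/keyword/month midpoint of $20-50

    lost_value_monthly = lost_keywords * long_tail_share * value_per_kw_month

    annual_swing_low = 3_000_000 * 12    # low end of the $3-4M monthly swing
    annual_swing_high = 4_000_000 * 12   # high end
    ```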

    And that’s being conservative. SERVPRO’s brand is so strong that if they could demonstrate to Google that they’re the E-E-A-T authority in restoration, they could probably rank for *more* keywords than they did at their October 2025 peak.

    The Playbook in Practice

    You’d execute this in three phases:

    Phase 1 (Month 1-2): Diagnosis & Architecture — Forensics audit, location page audit, three-tier architecture design. Identify quick wins (broken links, obvious cannibalization). Get executive buy-in on the federated model.

    Phase 2 (Month 3-6): Execution & Standardization — Roll out three-tier system. Repair internal linking. Standardize E-E-A-T templates. Implement GEO. Test PPC reductions on low-ROAS keywords. Monitor GSC for ranking recovery.

    Phase 3 (Month 7-12): Optimization & Scale — Feed winners. Scale what works. Build federation toward the long-term model. By month 12, you should see 60-70% of your lost keywords recovered. By month 18, you should be back to 240,000+ keywords.

    Is this work? Yes. Is it technical? Absolutely. But SERVPRO has the authority, the domain strength, and the economic incentive to execute it. They just need fresh eyes on the architecture and a willingness to think bigger than “add more PPC.”

    Why SERVPRO Specifically

    I picked SERVPRO for this analysis because they represent something important: dominance is fragile.

    They have domain strength 62. They own 178,900 keywords. They’re the category leader. But they’re also spending $2M per month on PPC to maintain that position—which suggests their organic is leaking. They peaked at 286,900 keywords just 5 months ago, and they lost 38% of that in 4 months flat.

    That’s not normal erosion. That’s a system breaking.

    And here’s what kills me: they have all the ingredients to fix it. They have authority. They have traffic. They have the budget. They just need someone to say “your location page architecture is the problem, and here’s how to rebuild it.”

    The restoration vertical is also perfect for this because SERVPRO competes on brand + trust, not pure convenience. If you can dominate Google’s algorithm while also dominating AI-powered search (GEO), you own the entire funnel. The CMO who pulls that off will be a legend.

    The Complete Restoration Franchise SEO Playbook Series

    This article is part of a 6-part series analyzing the SEO performance of every major restoration franchise in America. Read the full series for the complete picture.

    Common Questions

    Q: Could algorithm changes alone explain the 108,000-keyword drop?

    Maybe partially. But 38% keyword loss in 4 months is unusual even for a major core update. Algorithm changes typically cause 5-15% fluctuation across a healthy site. The magnitude here suggests an underlying technical issue got exposed by an algorithm shift.

    Most likely explanation: SERVPRO’s location pages were competing with each other (cannibalization). An algorithm update prioritized consolidation (ranking fewer pages more strongly per topic). When that happened, SERVPRO lost the “also ran” rankings but kept the top positions. The keyword *count* looks bad, but the keyword *value* stayed strong. Still, you’re leaving revenue on the table.
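    Under the cannibalization hypothesis, the first diagnostic is counting how many URLs rank per query. A minimal sketch against a Search Console-style export; the column names (“query”, “page”) mirror a typical GSC CSV and are assumptions, so adjust to your export:

```python
# Hedged sketch: flag likely cannibalization in a Search Console export.
# Column names ("query", "page") are assumed; adjust to your data.

from collections import defaultdict

def cannibalized_queries(rows, min_pages=2):
    """Return queries where multiple URLs rank -- a crude proxy for the
    'also ran' pages a consolidation-style update would demote."""
    pages_per_query = defaultdict(set)
    for row in rows:
        pages_per_query[row["query"]].add(row["page"])
    return {q: pages for q, pages in pages_per_query.items()
            if len(pages) >= min_pages}

# toy rows standing in for a real export
sample = [
    {"query": "water damage dallas", "page": "/locations/dallas-north"},
    {"query": "water damage dallas", "page": "/locations/dallas-south"},
    {"query": "fire damage austin",  "page": "/locations/austin"},
]
print(cannibalized_queries(sample))
```

    Run this over 16 months of GSC data and the queries with the most competing URLs are where the consolidation hit hardest.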

    Q: Isn’t running 2,000 location pages inherently limited?

    Not at all, provided you build the architecture right. Wikipedia ranks for millions of pages; large e-commerce sites rank for hundreds of thousands. The issue isn’t scale—it’s whether your site is optimized for scale.

    SERVPRO’s issue is probably that their location pages were built incrementally (added as franchisees joined) without a master architecture in mind. So the system grew organically but unsystematically. Rebuild the architecture and you solve it.

    Q: Could they focus only on organic and eliminate PPC?

    Not immediately. PPC is insurance. SERVPRO operates in a trust-dependent, high-intent vertical. They need to own the top of the SERP to win. During the recovery period (months 1-12), PPC is your safety net.

    But long-term, if you recover 240,000+ keywords and your E-E-A-T is solid, you can cut PPC by 50-60% and probably *increase* revenue because organic converts better (higher intent) than paid ads.

    Q: How do you measure success on this playbook?

    Three metrics: Keywords ranking (target 240K+), monthly organic clicks (target 160K+), and SEO value (target $5.5M+). You should also track PPC spend reductions and ROI improvements.

    Operationally, that means monthly GSC reports showing ranking recovery, monthly rank tracking on your 200 highest-value keywords, and quarterly attribution reports tying organic traffic to revenue.
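    A minimal monthly scorecard for those three metrics might look like this. The targets come from the playbook; the “actuals” below are placeholders, except the keyword count, which is SERVPRO’s current 178,900:

```python
# The three success metrics as a monthly scorecard. Targets are the
# article's; actuals are placeholders except the 178,900 keyword count.

TARGETS = {"keywords": 240_000, "organic_clicks": 160_000, "seo_value": 5_500_000}

def scorecard(actuals):
    """Compare a month's actuals against the recovery targets."""
    return {metric: {"actual": actuals[metric], "target": target,
                     "hit": actuals[metric] >= target}
            for metric, target in TARGETS.items()}

month = {"keywords": 178_900, "organic_clicks": 140_000, "seo_value": 4_600_000}
for metric, row in scorecard(month).items():
    status = "on target" if row["hit"] else "behind"
    print(f"{metric}: {row['actual']:,} vs {row['target']:,} ({status})")
```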

    Q: What’s the biggest risk of this playbook?

    Consolidation risk. Moving from 2,000 independent location pages to a federated system means centralizing control. Franchisees lose some autonomy. Some franchisees will resist. You need executive support to force the technical change, even if it annoys franchisees short-term.

    But the alternative is risking another 38% keyword slide. At some point, you have to choose: fix the SEO problem or accept the $2M/month PPC tax forever.

    The Ask

    If I were SERVPRO’s CMO, I’d take this playbook to the CEO and say:

    “We’ve lost 108,000 keywords in 4 months. We’re spending $2M per month on PPC to compensate. Our domain strength is 62—the strongest in the industry. If we fix the location page architecture, we’ll recover 150,000 keywords, add $2-3M per month in organic value, and cut PPC spend by 40-50%. That’s a 3:1 ROI on the project. And the brand will own the restoration category for the next 5 years.”

    It’s the right move. Whether SERVPRO makes it is up to them.

    But if you’re running a site with hundreds (or thousands) of location pages, apply this playbook to your business. Audit your keyword loss. Rebuild your architecture. Fix your E-E-A-T. You don’t have to be as big as SERVPRO to benefit. Most franchised verticals have this exact vulnerability.

    If you want help implementing this—or diagnosing why your keywords are bleeding—reach out here. We’ve done this at scale for franchise networks and multi-location enterprises. It works. 😄

    P.S.: If you found this useful, check out our SEO analysis of 911 Restoration—a different player in the same vertical with a different set of SEO problems. Comparing the two gives you a masterclass in how different strategies lead to different outcomes.