robots.txt: The Hidden GMC Killer Most Stores Miss

Unblockr Team · 7 min read
technical · robots.txt · policies · GMC

You've written perfect policy pages. Your shipping policy is detailed. Your return policy is clear. Your privacy policy is CCPA-compliant. You submit to Google Merchant Center and... rejected.

The reason? Your robots.txt file is telling Google NOT to read those policy pages. And you probably don't even know it.

The Problem: Shopify's Default robots.txt

Shopify automatically generates a robots.txt file for every store. By default, it includes these two lines:

Disallow: /policies/
Disallow: /*/policies/

This tells every search engine crawler — including Google Merchant Center's crawler — to skip your entire /policies/ directory. Your shipping policy, refund policy, and privacy policy become invisible to Google.
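The effect of those two lines can be demonstrated with Python's standard-library robots.txt parser. A minimal sketch, with `example.com` standing in for your store's domain (note: the stdlib parser does plain prefix matching and doesn't expand `*` wildcards the way Google's crawler does, so the first rule is what does the blocking here):

```python
from urllib.robotparser import RobotFileParser

# Parse the two default Shopify rules, plus the user-agent line they apply to.
rp = RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /policies/",
    "Disallow: /*/policies/",  # covers locale-prefixed paths like /fr/policies/
])

# Any crawler that obeys robots.txt is told to skip the policy pages...
print(rp.can_fetch("Googlebot", "https://example.com/policies/shipping-policy"))  # False
# ...while product pages remain crawlable.
print(rp.can_fetch("Googlebot", "https://example.com/products/some-product"))     # True
```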

Why This Matters for GMC

Google Merchant Center requires three things from every store:

  1. A crawlable shipping policy
  2. A crawlable return/refund policy
  3. A crawlable privacy policy

If GMC's crawler tries to access /policies/shipping-policy and gets blocked by robots.txt, it can't verify your policies exist. Result: automatic rejection or suspension.

How to Check

Visit yourdomain.com/robots.txt in your browser. Search for "policies" on the page. If you see any Disallow line containing "/policies/", you have the problem.
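That eyeball check is easy to script, too. A sketch of the same check as a function — pass in the robots.txt text however you fetch it (the helper name and sample text are made up for illustration):

```python
def has_policies_block(robots_txt: str) -> bool:
    """Return True if any Disallow line in robots_txt covers /policies/."""
    for line in robots_txt.splitlines():
        directive, _, value = line.partition(":")
        if directive.strip().lower() == "disallow" and "/policies/" in value:
            return True
    return False

# Shopify's default rules trip the check; a permissive file does not.
sample = """User-agent: *
Disallow: /policies/
Disallow: /*/policies/
"""
print(has_policies_block(sample))                     # True
print(has_policies_block("User-agent: *\nAllow: /"))  # False
```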

How to Fix It

  1. In Shopify, go to Online Store → Themes
  2. Click the three dots (...) on your active theme
  3. Click Edit Code
  4. Find robots.txt.liquid under Templates in the file list
  5. Remove the lines containing /policies/
  6. Save

If robots.txt.liquid doesn't exist, create it as a custom template (use Add a new template in the Templates folder); the custom template overrides Shopify's default behavior.
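If you do create a custom robots.txt.liquid, Shopify's documented pattern is to loop over the default rule groups and filter out the rules you don't want, rather than hand-writing the whole file. A sketch of that approach (check it against Shopify's current robots.txt.liquid documentation before relying on it):

```liquid
{%- comment -%} Emit Shopify's default robots.txt, minus the /policies/ disallows. {%- endcomment -%}
{% for group in robots.default_groups %}
  {{- group.user_agent }}
  {%- for rule in group.rules -%}
    {%- unless rule.directive == 'Disallow' and rule.value contains '/policies/' -%}
      {{ rule }}
    {%- endunless -%}
  {%- endfor -%}
  {%- if group.sitemap != blank %}
    {{ group.sitemap }}
  {%- endif -%}
{% endfor %}
```

Filtering the default groups this way means any other rules Shopify adds or changes later still flow through automatically.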

Verify the Fix

After saving, visit yourdomain.com/robots.txt again. The /policies/ disallow lines should be gone. Then try accessing your policy pages directly:

  • yourdomain.com/policies/shipping-policy
  • yourdomain.com/policies/refund-policy
  • yourdomain.com/policies/privacy-policy

All three should load in your browser. If they do, you're clear.

A Real Example

We audited a mid-century furniture store using ChatSEO. Everything looked perfect — optimized titles, complete descriptions, consistent business identity. But the ChatSEO audit flagged that robots.txt was blocking /policies/. The store owner had no idea. This single issue would have caused their GMC submission to fail.

It took 30 seconds to fix. But if you didn't know to look for it, you'd never find it.

What Else to Check in robots.txt

While you're in there, make sure these paths are NOT blocked:

  • /pages/ — your info pages (about, contact, FAQ)
  • /products/ — your product pages
  • /collections/ — your collection pages

Shopify's default robots.txt is generally fine for these. The /policies/ block is the main issue.
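The whole list — policies included — can be audited in one pass with the same standard-library parser. A sketch, with `example.com` and the sample rules standing in for your store's actual robots.txt (fetch the real file however you prefer):

```python
from urllib.robotparser import RobotFileParser

def blocked_paths(robots_txt: str, paths: list[str]) -> list[str]:
    """Return the subset of paths a robots.txt-obeying crawler would skip."""
    rp = RobotFileParser()
    rp.parse(robots_txt.splitlines())
    return [p for p in paths if not rp.can_fetch("*", f"https://example.com{p}")]

# Paths GMC needs to be able to reach.
must_be_crawlable = [
    "/policies/shipping-policy",
    "/policies/refund-policy",
    "/policies/privacy-policy",
    "/pages/contact",
    "/products/sample-product",
    "/collections/all",
]

sample = "User-agent: *\nDisallow: /policies/\n"
print(blocked_paths(sample, must_be_crawlable))
# ['/policies/shipping-policy', '/policies/refund-policy', '/policies/privacy-policy']
```

An empty result means everything GMC needs is crawlable.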

Check your store's compliance for free

Scan your Shopify store against 146+ Google Merchant Center rules in minutes.

Start Free Scan