Block Search Engine Crawlers: How to Hide Your Site from Search Results
Web crawlers are automated bots that scan websites to index their content for search engines. By default, these crawlers can access and index all public web content, but you can control how your site appears in search results.
To hide your entire site from search engines:
- Navigate to Settings > Crawlers
- Check "Block Search engine crawlers"
- This adds a rule to your site's robots.txt file telling search engine crawlers not to crawl your site
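For reference, a sitewide block in robots.txt generally looks like the following (the exact rule your platform writes may vary):

User-agent: *
Disallow: /

You can confirm the rule is in place by visiting https://yoursite.com/robots.txt in a browser, substituting your own domain.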
To hide specific pages from search results, you have two options:
Using Page Settings (Recommended)
- Open the Pages panel
- Click the gear icon next to the page
- Go to the SEO tab
- Toggle "Hide page from search results"
Benefits:
- Available on all plans
- No coding required
- Removes page from sitemap (see the sketch after this list)
- Automatically enabled on demo pages
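To illustrate what "Removes page from sitemap" means: a sitemap.xml file lists each indexable page in a <url> entry, and a hidden page's entry is dropped. A minimal sketch of one such entry, using example.com as a placeholder domain:

<url>
  <loc>https://example.com/hidden-page/</loc>
</url>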
Using Code Injection
- Add this code to the page header:
<meta name="robots" content="noindex">
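If you're adding the tag manually, it belongs inside the page's <head> element. A minimal sketch, with the page's other head content omitted:

<head>
  ...
  <meta name="robots" content="noindex">
</head>

A stricter variant, content="noindex, nofollow", also asks crawlers not to follow any links on the page.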
Benefits:
- Works on homepages
- Keeps page in sitemap
Requirements:
- Business plan or higher
Important Notes:
- Hiding a collection page also hides all items within it (products, blog posts, etc.)
- Individual collection items can't be hidden separately
- Hiding an index page doesn't hide its sub-pages automatically
- These methods only affect external search engines, not internal site search
- Changes may take time to reflect in search results as search engines need to re-crawl your site
For additional protection against AI crawlers, you can set up specific exclusion rules in your site settings to prevent AI models from scanning your content.
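For example, many sites opt out of AI crawling with user-agent-specific robots.txt rules. A sketch using a few widely published crawler tokens (GPTBot is OpenAI's crawler, CCBot is Common Crawl's, and Google-Extended controls Google's AI training; check each vendor's documentation for current tokens):

User-agent: GPTBot
Disallow: /

User-agent: CCBot
Disallow: /

User-agent: Google-Extended
Disallow: /

Like all robots.txt rules, these are advisory and depend on the crawler choosing to honor them.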