Hide Your Website From Search Engines: A Complete Guide to Blocking Crawlers
A web crawler is a bot used by search engines to scan and index website content for search results. Here's how to control your site's visibility in search engines:
Block All Search Engines
To prevent all search engines from indexing your site:
- Go to Settings
- Click Crawlers
- Enable the option that blocks search engine crawlers (note that "Block known AI crawlers" is a separate setting that only affects AI crawlers, covered below)
This adds a rule to your site's robots.txt file asking search engines not to crawl your pages. Keep in mind that robots.txt is advisory: reputable crawlers honor it, but it doesn't guarantee removal from search results.
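A site-wide robots.txt block generally looks like the following. This is an illustrative example of the standard rule; the exact file your platform generates may include additional entries:

```
User-agent: *
Disallow: /
```

`User-agent: *` addresses every crawler, and `Disallow: /` asks them not to fetch any path on the site.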
Hide Specific Pages
You can exclude individual pages using two methods:
Page Settings Method:
- Open Pages panel
- Click settings (gear icon) on desired page
- Go to SEO tab
- Enable "Hide page from search results"
Benefits:
- Available on all plans
- No coding required
- Removes page from sitemap
- Automatically works on demo pages
Code Injection Method:
- Add this code to the page header:
<meta name="robots" content="noindex" />
Benefits:
- Works on homepage
- Available on all pages
- Maintains sitemap listing
- Offers more control
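To confirm the noindex tag is actually being served, you can parse a page's HTML and look for the robots directive. Here is a minimal sketch using only Python's standard library; the sample HTML is a stand-in for your page's real source:

```python
from html.parser import HTMLParser

class RobotsMetaFinder(HTMLParser):
    """Collects the content of any <meta name="robots"> tags."""
    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        # handle_starttag also fires for self-closing tags like <meta ... />
        if tag == "meta":
            attrs = dict(attrs)
            if attrs.get("name", "").lower() == "robots":
                self.directives.append(attrs.get("content", ""))

# Stand-in for the HTML your page serves after code injection.
page_html = '<html><head><meta name="robots" content="noindex" /></head><body></body></html>'

finder = RobotsMetaFinder()
finder.feed(page_html)
print("noindex" in " ".join(finder.directives))  # True
```

In practice you would fetch the live page source (e.g. via your browser's "View Source") and feed that to the parser instead of the hard-coded string.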
Important Notes:
- Hiding a collection page also hides all items within it
- Main page settings don't affect subpages
- Hidden pages still appear in your site's built-in search
- You must hide each subpage individually
- These settings only affect external search engines
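The robots.txt rule described earlier can also be checked programmatically: Python's standard library ships a parser that interprets these rules the same way a well-behaved crawler does. A minimal sketch, using an inline copy of the standard block-all rule (the URL is a placeholder):

```python
from urllib.robotparser import RobotFileParser

# Illustrative robots.txt content blocking all crawlers site-wide.
rules = """\
User-agent: *
Disallow: /
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# A compliant crawler checks can_fetch() before requesting a page.
print(parser.can_fetch("Googlebot", "https://example.com/about"))  # False
```

To check your live site instead, you could point `RobotFileParser` at the real file with `set_url("https://yoursite.example/robots.txt")` followed by `read()`.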
For blocking AI crawlers specifically, use the dedicated AI crawler exclusion settings. Remember that custom code modifications aren't covered by standard support, but you can:
- Follow custom code best practices
- Consult the Squarespace Forum
- Hire a Squarespace Expert for assistance
These settings help maintain your desired level of visibility while keeping your site accessible to visitors.