Customize robots.txt in Shopify

Shopify generates a default robots.txt file that works for most stores. However, if you want to change the default file, then you can add the robots.txt.liquid template to your theme and make the following customizations:

Add a new rule to an existing group
Remove a rule from an existing group
Add custom rules

The robots.txt.liquid template supports only the following Liquid objects (the sketch after this list shows how they fit together):

robots
group
rule
user_agent
sitemap
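
As a point of reference, the following sketch reproduces the default robots.txt output using only these objects. The customizations below all start from this structure:

{% for group in robots.default_groups %}
  {{- group.user_agent }}
  {%- for rule in group.rules -%}
    {{ rule }}
  {%- endfor -%}
  {%- if group.sitemap != blank -%}
    {{ group.sitemap }}
  {%- endif -%}
{% endfor %}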


Add a new rule to an existing group

The following example adds a rule to the default group for all crawlers (*) that blocks access to pages with the URL parameter ?q=:

{% for group in robots.default_groups %}
  {{- group.user_agent }}
  {%- for rule in group.rules -%}
    {{ rule }}
  {%- endfor -%}
  {%- comment -%} Append the new rule only to the group that applies to all crawlers {%- endcomment -%}
  {%- if group.user_agent.value == '*' -%}
    {{ 'Disallow: /*?q=*' }}
  {%- endif -%}
  {%- if group.sitemap != blank -%}
    {{ group.sitemap }}
  {%- endif -%}
{% endfor %}
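
When rendered, the new rule appears at the end of the group for all crawlers. An abridged, illustrative output (the default rules and the sitemap URL vary by store, and example.com is a placeholder):

User-agent: *
Disallow: /search
Disallow: /cart
Disallow: /*?q=*
Sitemap: https://example.com/sitemap.xml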

Remove a default rule from an existing group

The following example removes the default rule that blocks crawlers from accessing the /policies/ page:

{% for group in robots.default_groups %}
  {{- group.user_agent }}
  {%- for rule in group.rules -%}
    {%- comment -%} Skip the default rule that blocks the /policies/ page {%- endcomment -%}
    {%- unless rule.directive == 'Disallow' and rule.value == '/policies/' -%}
      {{ rule }}
    {%- endunless -%}
  {%- endfor -%}
  {%- if group.sitemap != blank -%}
    {{ group.sitemap }}
  {%- endif -%}
{% endfor %}
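
To remove several default rules at once, one option is to collect the rule values in an array and filter with contains. A minimal sketch, assuming you also want to drop a hypothetical Disallow rule for /search:

{%- assign removed_values = '/policies/,/search' | split: ',' -%}
{% for group in robots.default_groups %}
  {{- group.user_agent }}
  {%- for rule in group.rules -%}
    {%- comment -%} Keep every rule that isn't on the removal list {%- endcomment -%}
    {%- unless rule.directive == 'Disallow' and removed_values contains rule.value -%}
      {{ rule }}
    {%- endunless -%}
  {%- endfor -%}
  {%- if group.sitemap != blank -%}
    {{ group.sitemap }}
  {%- endif -%}
{% endfor %}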

Add custom rules

To add a rule that doesn't apply to any of the default groups, enter the rule manually outside of the Liquid that outputs the default rules, as in the following examples.

Block certain crawlers

To block a crawler that isn't covered by the default rule set, add a disallow rule for it after the default Liquid. For example, the following input blocks the discobot crawler:

<!-- Liquid for default rules -->

User-agent: discobot
Disallow: /
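
In these examples, <!-- Liquid for default rules --> stands in for the default-rules loop shown earlier; any plain text in the template is output verbatim. Written out in full, the template for this example would look like the following sketch, and the same substitution applies to the allow and sitemap examples below:

{% for group in robots.default_groups %}
  {{- group.user_agent }}
  {%- for rule in group.rules -%}
    {{ rule }}
  {%- endfor -%}
  {%- if group.sitemap != blank -%}
    {{ group.sitemap }}
  {%- endif -%}
{% endfor %}

User-agent: discobot
Disallow: /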

Allow certain crawlers

To explicitly allow a crawler, add an allow rule for it after the default Liquid. For example, the following input allows the discobot crawler:

<!-- Liquid for default rules -->

User-agent: discobot
Allow: /

Add extra sitemap URLs

To reference an additional sitemap, add a Sitemap directive after the default Liquid, where [sitemap-url] is the URL of the extra sitemap:

<!-- Liquid for default rules -->

Sitemap: [sitemap-url]
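
For example, with a hypothetical supplemental sitemap at https://example.com/sitemap-extra.xml, the rendered robots.txt would end with:

Sitemap: https://example.com/sitemap-extra.xml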