Custom instructions example

Allows Full Access

User-agent: *
Disallow:

Disallows Access to All Folders

User-agent: *
Disallow: /
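The difference between these two rule sets can be verified with Python's standard-library robots.txt parser. This is a quick sanity-check sketch, not part of the Commerce product; `example.com` is a placeholder domain:

```python
from urllib import robotparser

def allowed(rules_text: str, url: str, agent: str = "*") -> bool:
    """Parse a robots.txt body and report whether agent may fetch url."""
    rp = robotparser.RobotFileParser()
    rp.parse(rules_text.splitlines())
    return rp.can_fetch(agent, url)

# "Allows Full Access": an empty Disallow value permits everything.
full_access = "User-agent: *\nDisallow:\n"

# "Disallows Access to All Folders": Disallow: / blocks everything.
no_access = "User-agent: *\nDisallow: /\n"

print(allowed(full_access, "https://example.com/any/page"))  # True
print(allowed(no_access, "https://example.com/any/page"))    # False
```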

Default Instructions

User-agent: *
Disallow: /index.php/
Disallow: /*?
Disallow: /checkout/
Disallow: /app/
Disallow: /lib/
Disallow: /*.php$
Disallow: /pkginfo/
Disallow: /report/
Disallow: /var/
Disallow: /catalog/
Disallow: /customer/
Disallow: /sendfriend/
Disallow: /review/
Disallow: /*SID=
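The plain-prefix rules above can be checked the same way. Note that Python's `urllib.robotparser` does not interpret wildcard patterns such as `/*?` or `/*.php$`, so this sketch exercises only a subset of the literal-prefix rules; `example.com` and the sample paths are placeholders:

```python
from urllib import robotparser

# A subset of the default Commerce instructions (literal prefixes only).
DEFAULT_RULES = """\
User-agent: *
Disallow: /index.php/
Disallow: /checkout/
Disallow: /customer/
"""

rp = robotparser.RobotFileParser()
rp.parse(DEFAULT_RULES.splitlines())

# Cart and account pages are blocked for all crawlers...
print(rp.can_fetch("*", "https://example.com/checkout/cart"))            # False
print(rp.can_fetch("*", "https://example.com/customer/account/login"))   # False

# ...while ordinary storefront pages remain crawlable.
print(rp.can_fetch("*", "https://example.com/some-category/some-product.html"))  # True
```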

Configure robots.txt

  1. On the Admin sidebar, go to Content > Design > Configuration.

  2. Find the Global configuration in the first row of the grid and click Edit.

    Global design configuration

  3. Scroll down, expand the Search Engine Robots section, and do the following:

    Design configuration - search engine robots

    • Set Default Robots to one of the following:

      INDEX, FOLLOW: Instructs web crawlers to index the site and to check back later for changes.
      NOINDEX, FOLLOW: Instructs web crawlers to avoid indexing the site, but to check back later for changes.
      INDEX, NOFOLLOW: Instructs web crawlers to index the site once, but to not check back later for changes.
      NOINDEX, NOFOLLOW: Instructs web crawlers to avoid indexing the site, and to not check back later for changes.
    • If needed, enter custom instructions into the Edit Custom instruction of robots.txt file box. For example, while a site is in development, you might want to disallow access to all folders.

    • To restore the default instructions, click Reset to Default.

  4. When complete, click Save Configuration.
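For reference, the Default Robots value is typically emitted in the storefront page head as a robots meta tag. Assuming NOINDEX, NOFOLLOW is selected, the rendered tag looks roughly like this (a sketch of the expected output, not a verbatim excerpt):

```html
<meta name="robots" content="NOINDEX,NOFOLLOW"/>
```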
