Custom instructions examples
Allows Full Access
User-agent: *
Disallow:
Disallows Access to All Folders
User-agent: *
Disallow: /
Default Instructions
User-agent: *
Disallow: /index.php/
Disallow: /*?
Disallow: /checkout/
Disallow: /app/
Disallow: /lib/
Disallow: /*.php$
Disallow: /pkginfo/
Disallow: /report/
Disallow: /var/
Disallow: /catalog/
Disallow: /customer/
Disallow: /sendfriend/
Disallow: /review/
Disallow: /*SID=
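The instructions above can be sanity-checked with a standard robots.txt parser before they are published. The sketch below is not part of Adobe Commerce; it uses Python's urllib.robotparser, assumes the Default Instructions shown above are saved in a local file named robots.txt, and tests a few illustrative paths. Note that urllib.robotparser follows the original robots.txt specification and may not interpret wildcard patterns such as /*? or /*.php$ the way major search engines do.

# Minimal sketch: parse the instructions above and report which example paths
# a crawler matching "User-agent: *" would be allowed to fetch.
# "robots.txt" is an assumed local filename; the paths are illustrative only.
from urllib.robotparser import RobotFileParser

parser = RobotFileParser()
with open("robots.txt") as f:
    parser.parse(f.read().splitlines())

for path in ["/checkout/cart/", "/customer/account/login/", "/women/tops.html"]:
    allowed = parser.can_fetch("*", path)
    print(f"{path}: {'allowed' if allowed else 'disallowed'}")

With the Default Instructions above, the first two paths should report as disallowed (they fall under /checkout/ and /customer/) and the last as allowed.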
Configure robots.txt
- On the Admin sidebar, go to Content > Design > Configuration.
- Find the Global configuration in the first row of the grid and click Edit.
- Scroll down and expand the Search Engine Robots section.
- Set Default Robots to one of the following (a way to verify this setting from the storefront is sketched after these steps):
  - INDEX, FOLLOW - Instructs web crawlers to index the site and to check back later for changes.
  - NOINDEX, FOLLOW - Instructs web crawlers to avoid indexing the site, but to check back later for changes.
  - INDEX, NOFOLLOW - Instructs web crawlers to index the site once, but to not check back later for changes.
  - NOINDEX, NOFOLLOW - Instructs web crawlers to avoid indexing the site, and to not check back later for changes.
- If needed, enter custom instructions into the Edit Custom instruction of robots.txt file box. For example, while a site is in development, you might want to disallow access to all folders.
- To restore the default instructions, click Reset to Default.
- When complete, click Save Configuration.
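After saving the configuration, you can spot-check the result from the storefront. The sketch below is a generic check rather than an Adobe Commerce tool: it assumes https://example.com is a placeholder for your storefront base URL, fetches the generated robots.txt, and looks for the robots meta tag on the home page, which should reflect the Default Robots setting. The exact markup can vary by theme, so the regular expression is only a rough match.

# Minimal verification sketch. BASE is a placeholder; replace it with the
# actual storefront URL. Not an official Adobe Commerce utility.
import re
import urllib.request

BASE = "https://example.com"

# Confirm the saved instructions are served at /robots.txt.
robots_txt = urllib.request.urlopen(BASE + "/robots.txt").read().decode("utf-8", "replace")
print(robots_txt)

# Look for the robots meta tag on the home page; it should reflect the
# Default Robots setting (markup and attribute order can vary by theme).
html = urllib.request.urlopen(BASE + "/").read().decode("utf-8", "replace")
match = re.search(r'<meta\s+name="robots"\s+content="([^"]+)"', html, re.IGNORECASE)
print("robots meta tag:", match.group(1) if match else "not found")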