Resolve robots.txt not updating or displaying default settings

This article provides a solution for cases where the robots.txt file in Adobe Commerce is configured correctly, but still displays default settings or fails to update. To resolve this issue, ensure that indexing by search engines is enabled.

Description

Environment

Adobe Commerce on cloud infrastructure 2.4.x

Issue

Unable to change the default robots.txt settings in Adobe Commerce.

Steps to reproduce:

  1. Access the Admin panel.
  2. Navigate to Content > Design > Configuration, edit the desired store view, and expand the Search Engine Robots section to edit the custom instruction of robots.txt field.
  3. Add custom content (for example, the text “hello”) and save the changes.
  4. Visit the robots.txt URL.

Expected result:
The robots.txt file displays the saved custom content.

Actual result:
The robots.txt file does not change and continues to show the default content.
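The comparison above can be scripted. A minimal sketch follows; the body assigned to robots_body stands in for the output of fetching https://&lt;your-domain&gt;/robots.txt with curl, and "hello" is the custom marker from the steps above:

```shell
# Sketch: test whether the served robots.txt contains the custom content.
# In practice, fetch the body first, for example:
#   robots_body=$(curl -s https://<your-domain>/robots.txt)
robots_body='User-agent: *
Disallow:'

if printf '%s\n' "$robots_body" | grep -q 'hello'; then
  echo 'custom content served'
else
  echo 'still default'   # matches the Actual result above
fi
```

With the default body shown, this prints "still default"; once the fix below is applied and the custom content is served, the same check prints "custom content served".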

Cause

Indexing by search engines is disabled for the environment, so the platform serves its own default robots.txt and the custom content saved in the Admin panel is never applied.

Resolution

Method 1: In the Adobe Commerce Cloud Console, enable indexing by search engines as described in our developer documentation.

Method 2: Using the magento-cloud CLI, set the environment's restrict_robots property to false. Replace <cluster> with your project ID; production is the target environment:

magento-cloud environment:info -p <cluster> -e production restrict_robots false

For more information, see Add site map and search engine robots in the Adobe Commerce developer documentation.
