404 error on accessing robots.txt in Adobe Commerce on cloud infrastructure
If you get a 404 error when accessing the robots.txt file in Adobe Commerce on cloud infrastructure, disable the Nginx rule that redirects /robots.txt requests to /media/robots.txt.
Description
Environment
Adobe Commerce on cloud infrastructure (all versions)
Issue
The robots.txt file is not accessible and Nginx returns an error. The file is generated dynamically "on the fly," but requests to the /robots.txt URL never reach it because an Nginx rewrite rule forcibly redirects all /robots.txt requests to the /media/robots.txt file, which does not exist.
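The rewrite rule described above takes roughly the following form. This is an illustrative sketch, not the exact directive from your environment; the location and rewrite syntax shown are assumptions for demonstration purposes:

```nginx
# Illustrative Nginx configuration (assumed form).
# Every request for /robots.txt is rewritten to /media/robots.txt.
# Because /media/robots.txt does not exist, Nginx responds with 404
# instead of letting the application serve the dynamically generated file.
location = /robots.txt {
    rewrite ^ /media/robots.txt;
}
```

Removing this rule lets the request pass through to the application, which generates robots.txt on the fly.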
Cause
The Nginx configuration contains a rewrite rule that redirects /robots.txt requests to /media/robots.txt, a static file that does not exist on the server.
Resolution
To resolve the issue, disable the Nginx rule that redirects /robots.txt requests to the /media/robots.txt file.
- If self-service is not enabled (or if you're unsure whether it is), submit an Adobe Commerce Support ticket requesting the removal of the Nginx redirect rule from /robots.txt to /media/robots.txt.
- If self-service is enabled, upgrade ECE-Tools to version 2002.0.12 or later. Then, remove the Nginx redirect rule from your .magento.app.yaml file.
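For the self-service path, the redirect rule typically lives under the web.locations section of .magento.app.yaml. The excerpt below is a hedged sketch: the rule key, indentation, and surrounding properties are assumptions, and your file may differ. Remove (or comment out) the rule that maps /robots.txt to /media/robots.txt:

```yaml
# Excerpt from .magento.app.yaml (illustrative; verify against your own file).
web:
    locations:
        "/":
            passthru: "/index.php"
            rules:
                # Delete or comment out this rule so /robots.txt requests
                # fall through to the application, which generates the
                # file dynamically:
                # ^/robots\.txt$:
                #     passthru: "/media/robots.txt"
```

After editing, commit and push the change so it is deployed to your cloud environment.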
For detailed guidance, refer to Add site map and search engine robots in the Adobe Commerce developer documentation.
Related reading
- Block malicious traffic for Magento Commerce Cloud on Fastly level in our support knowledge base
- Search Engine Robots in our user guide