robots.txt not found nginx
I keep getting a message from my SEO dashboard that robots.txt is returning an error (the server reports nginx/1.24.0).
When I search for the file from the Linode Glish console, it says no such directory exists. When I try to use a plugin on our WordPress site to create a robots.txt file, the site or server crashes.
When I request /?robots=1, it works and returns the output below. What do I do now? This is nginx on Ubuntu.
User-agent: *
Disallow: /wp-content/uploads/wpo-plugins-tables-list.json

# Start Yoast Block
# ------
User-agent: *
Disallow:

Sitemap: https://horizonpointconsulting.com/sitemap_index.xml
# ------
# End Yoast Block
By default, WordPress dynamically generates its own response to robots.txt requests with PHP rather than relying on a physical robots.txt file on the server.
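For what it's worth, plugins hook into that virtual file through WordPress's robots_txt filter, which is why nothing needs to exist on disk. A minimal sketch of the mechanism, assuming a standard install (the path in the Disallow line is hypothetical, just to show the shape):

<?php
// Minimal sketch: WordPress builds the virtual robots.txt in PHP and
// runs it through the 'robots_txt' filter before responding, which is
// how Yoast and WP-Optimize inject the rules you pasted above.
add_filter( 'robots_txt', function ( $output, $public ) {
    // $public is '1' unless "Discourage search engines" is enabled.
    $output .= "Disallow: /hypothetical-path/\n"; // illustrative rule only
    return $output;
}, 10, 2 );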
My best guess is that something in your nginx configuration is failing to pass the request to PHP so WordPress can generate the file. The solution near the bottom of this page might work for you:
https://medium.com/@oktay.acikalin/wordpress-nginx-virtual-robots-txt-and-404-bd5cc082725d
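In case that link ever goes stale: the usual fix takes the shape of a location block that serves robots.txt from disk if it exists and otherwise falls back to WordPress's front controller. A sketch, assuming the standard /index.php entry point in your server block (adjust to match your setup):

# Sketch, not a drop-in config: serve robots.txt from disk when present,
# otherwise hand the request to WordPress so PHP can generate it.
location = /robots.txt {
    try_files $uri /index.php?$args;
    access_log off;      # optional: keep robots.txt hits out of the logs
    log_not_found off;   # optional: don't log the missing file
}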