How do I fix my robots.txt?
My SEO tools are saying that my robots.txt is not crawlable. How do I check if it's a server problem in nginx?
I'm not too familiar with the SEO tools you're using or what configuration you have set up, but if you're using AIOSEO (All in One SEO), it no longer generates its own robots.txt as a dynamic page. Instead, it uses the robots_txt
filter in WordPress to modify the default robots.txt that WordPress generates. It would be helpful to have the exact error code you're running into, but this forum post goes into detail about what could potentially cause that error and what solutions you could try to resolve it.
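To make that concrete, here's a minimal sketch of how the robots_txt filter works (the same hook AIOSEO taps into). The /private/ path is purely illustrative, not anything AIOSEO itself adds:

```php
// Append a rule to the virtual robots.txt that WordPress serves.
// The "/private/" path is a hypothetical example.
add_filter( 'robots_txt', function ( $output, $public ) {
    // $public reflects the "Search engine visibility" setting;
    // only append rules when the site allows indexing.
    if ( $public ) {
        $output .= "Disallow: /private/\n";
    }
    return $output;
}, 10, 2 );
```

One implication: since the file is generated on the fly, there is no physical robots.txt on disk, which matters for how NGINX has to handle the request.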
Additionally, if you're trying to check how robots.txt is handled in your NGINX configuration, I was able to find a few resources that could point you in the right direction (a minimal example follows the list):
- What is robots.txt? How to add robots.txt on Nginx? | NameOcean
- How to set robots.txt globally in nginx for all virtual hosts - Server Fault
- SEOPress - Nginx Configuration Rules | GridPane
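Since WordPress generates robots.txt dynamically, one common pattern is to let the request fall through to index.php when no physical file exists. This is a sketch to adapt to your existing server block, not a drop-in fix:

```nginx
# Serve a physical robots.txt if one exists on disk; otherwise hand
# the request to WordPress so the robots_txt filter can generate it.
location = /robots.txt {
    try_files $uri /index.php?$args;
    access_log off;
    log_not_found off;
}
```

Once that's in place and NGINX has been reloaded (`nginx -s reload`), you can verify from the command line with `curl -i https://example.com/robots.txt` (substitute your own domain). A 200 response with the expected rules means the server is fine; a 404 or 5xx points to a server-side problem rather than an issue with your SEO tools.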
This GitHub post in particular goes over how you can allow access to all User-agents in a robots.txt file.
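For reference, the standard allow-all robots.txt looks like this; an empty Disallow rule permits crawling of everything:

```
User-agent: *
Disallow:
```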
Beyond the information I've provided, you may also want to reach out to NGINX's forums, where knowledgeable users may be able to provide more in-depth assistance.