For some reason, a robots.txt file keeps appearing in the root folder of one of my accounts. Since I run AdSense ads, blocking crawlers is the last thing I want. Why could this be happening? Is there some automated "crawler protection" in place?
Also, is there a way for me to find out what is creating a file? (Is it done by a script or via an FTP session or whatever.)
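To at least pin down *when* the file appears (so I could correlate it with FTP or cron logs), I've been considering something like the crude stdlib-only poller sketched below. The path and interval are just illustrative assumptions; a real inotify-based watcher would be better, but this is the idea:

```python
# Hedged sketch: poll for a file appearing and report a timestamp,
# so its creation time can be matched against FTP/cron log entries.
# The path and polling interval are assumptions for illustration only.
import os
import time


def watch_for_file(path, interval=1.0, max_checks=None):
    """Yield a timestamped note each time the file newly appears."""
    seen = False
    checks = 0
    while max_checks is None or checks < max_checks:
        exists = os.path.exists(path)
        if exists and not seen:
            # Record the file's mtime at the moment we first notice it.
            mtime = os.path.getmtime(path)
            yield "%s %s created (mtime %.0f)" % (
                time.strftime("%Y-%m-%d %H:%M:%S"), path, mtime
            )
        seen = exists
        checks += 1
        time.sleep(interval)


if __name__ == "__main__":
    for event in watch_for_file("public_html/robots.txt"):
        print(event)
```

This only tells me when the file shows up, not which process wrote it; for that I'd presumably need server-side auditing, which is part of what I'm asking about.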