Various search engine bots, mainly from Google (googlebot) and MSN, are causing massive bandwidth usage on my vBulletin 3 forum; I'm already at 22 GB of usage this month, when I rarely go past 8 GB.
To make a long question short: my vB3 install lives in a subdirectory. How would I construct a robots.txt file that denies all search engine bots access to just that subdirectory and everything in it?
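For example, if the forum lived at /forum/ (that path is just a placeholder for my actual directory), would something like this be the right approach? I understand the robots.txt file itself has to sit in the web root, not in the subdirectory.

    # Apply to all crawlers
    User-agent: *
    # Block the forum subdirectory and everything under it
    Disallow: /forum/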