I noticed the other day that my bandwidth had been spiking at an alarming rate. I live in absolute paranoia that I'll wake up one day to a horrid screen that says, "Your site has been suspended...". So when I saw that my site had used more bandwidth in a day than it usually does in a week, I freaked out.
Looking through my stats, it seems to be Googlebot; it's listed as responsible for roughly the same amount of data I found so unusual. Fine, I would love to be indexed. I have about 90GB of bandwidth a month, I think, and even though I'm alarmed that just a few days into the month I'm already close to using a GB (I usually use only 4-6GB the whole month), I think I should hold up okay. I only have 2GB worth of content, after all... just how many times is Googlebot going to visit me?
That covers the bandwidth limit. My real concern is: what is a visit from Googlebot actually like? This is shared hosting, after all. I've read up on the situation and seen that many people have had their sites shut down for exceeding their monthly limit - fair enough, not cool, but that situation has two easy solutions: buy more bandwidth or just wait until the next month rolls over. But has anyone heard of sites being shut down because Googlebot caused problems in a shared environment? Does Googlebot behave like one user going from one page to the next, or like thousands of users all hitting my site at once?
This thing seems like a blessing and a curse. On one hand, I'd love to have my site indexed, since it isn't indexed at all so far. On the other hand, I certainly don't want to lose it altogether because it causes stability issues and I get booted off my server. It's a good thing I'm still working on my site, so I have a lot of unused bandwidth to spare. If I had launched it already, I'd probably be sweating even more.
Should I worry? I read the other posts here about the robots.txt trick to stop it. I actually don't want to block Googlebot from visiting, unless it might get the site shut down, or if it will be doing this every day for a year or more.
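In case it helps anyone reading later: assuming the trick people mean is robots.txt, a minimal sketch of it would be something like the two lines below - though this is the nuclear option and the opposite of what I want, since it tells Googlebot not to crawl anything at all.

User-agent: Googlebot
Disallow: /

As I understand it, Googlebot ignores the Crawl-delay directive, so if I only wanted to slow it down rather than block it, I'd apparently have to use the crawl rate setting in Google's webmaster tools instead - happy to be corrected on that.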