MicrostockGroup

Title: Robots.txt and url server errors
Post by: cathyslife on September 02, 2013, 21:19
Last night I got an email from Google saying it couldn't crawl my site because it was missing the robots.txt file. I used Fetch as Google, and it found the file just fine, so I submitted it to the index. Tonight I got another email saying I have a large number of URLs that can't be accessed, 238 to be exact. Any idea what might be causing this, or how I might fix it?
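For reference, robots.txt only needs to be a plain text file at the site root (e.g. http://www.example.com/robots.txt, with example.com standing in for your own domain) that returns a normal 200 response. A minimal allow-everything version looks something like the sketch below; the sitemap line is optional and the URL is a placeholder:

    # Allow all crawlers to fetch everything on the site.
    User-agent: *
    Disallow:

    # Optional: point crawlers at the sitemap (placeholder URL).
    Sitemap: http://www.example.com/sitemap.xml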
Title: Re: Robots.txt and url server errors
Post by: Kerioak~Christine on September 03, 2013, 01:15
I would be interested to know the answer to this as well, as I had similar emails, but when I tried to access the folders they were usually there.
Title: Re: Robots.txt and url server errors
Post by: cathyslife on September 03, 2013, 06:32
Since it happened the day we were troubleshooting, my best guess is that the site was down when Googlebot arrived. Then, because of the unusual activity, Bluehost throttled my account, which may have caused pages to load very slowly until the bot gave up. These things might not all have happened at the same instant, but over the course of one or two days. But I'd like to hear ideas from someone with more knowledge about it than I have.
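One way to confirm whether the flagged URLs are reachable now is to re-check them directly. Here is a minimal sketch in Python, assuming the URLs from the Webmaster Tools crawl errors report have been copied into a text file named crawl_errors.txt (that filename and the timeout value are just placeholders):

    # Re-check the URLs Google flagged to see whether they respond now.
    # Assumes the flagged URLs are listed one per line in crawl_errors.txt.
    import urllib.request
    import urllib.error

    with open("crawl_errors.txt") as f:
        urls = [line.strip() for line in f if line.strip()]

    for url in urls:
        try:
            # A short timeout keeps the check from hanging on a throttled host.
            with urllib.request.urlopen(url, timeout=10) as response:
                print(url, response.status)
        except urllib.error.HTTPError as e:
            print(url, "HTTP error:", e.code)
        except urllib.error.URLError as e:
            print(url, "unreachable:", e.reason)

If most of the URLs come back 200 now, that points to a temporary outage or throttling at crawl time rather than broken links.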
Title: Re: Robots.txt and url server errors
Post by: cathyslife on September 03, 2013, 14:34
Anyone?
Title: Re: Robots.txt and url server errors
Post by: Redneck on September 03, 2013, 15:21
Didn't you answer the question already?
Crawling errors usually happen when a site has downtime or when links are broken.
Title: Re: Robots.txt and url server errors
Post by: cathyslife on September 03, 2013, 15:46
Didn't you answer the question already?
Crawling errors usually happen when a site has downtime or when links are broken.


Thanks for confirming. I answered with my guess, which isn't the same as knowing. :)