MicrostockGroup

Microstock Photography Forum - General => Symbiostock => Symbiostock - Technical Support => Topic started by: click_click on July 11, 2013, 16:37

Title: SEO disaster need help/advice
Post by: click_click on July 11, 2013, 16:37
I had SEO Ultimate running until I realized that some SEO features that "I need" (apparently, like meta keywords) wouldn't work.

Since YOAST has been brought up before, I installed it and just got overwhelmed by it.

Anywho, I had my meta robots tag sorted out before I installed YOAST. Now I apparently have to use a robots.txt file on the server to give indexing instructions to those little critters.

All fine and dandy, but since I need to "learn" how to write the robots.txt, I started to realize that I must have been sharing my entire server contents all these years by allowing the robots to index everything and follow all links.

Here is my question: my Symbiostock site is running in the root folder of my main hosting package. I also have a few subfolders/domains that have their own top-level domains. Of course I don't want those to be indexed under the main URL/subdomain/etc.

So do I have to write exclusions for each and every subfolder on my server that I don't want the bots to crawl? Or do the bots only crawl the readable links on my actual Symbiostock page?

I have no clue how all this works and what these robots can actually see. Do they see just the links on my index.php page and all subsequent links from those pages, or can the bots see my entire folder structure on my web host, as if I had logged in via FTP or the file manager?

I hope I've explained the issue clearly. If not, please ask me what to clarify.

The code I wanted to add to the robots.txt would have been:

Quote
User-agent: *
Disallow:
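For what it's worth, a `Disallow:` line with an empty value (as above) is the permissive form: it tells all bots they may crawl everything. If the goal is instead to keep bots out of specific subfolders, the rules would look something like the sketch below (the folder names here are only placeholders; substitute the actual subfolder names on your server):

Quote
User-agent: *
Disallow: /subfolder-one/
Disallow: /subfolder-two/

Each `Disallow:` path is matched as a prefix against the requested URL path, so one line per subfolder you want excluded is enough.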
Title: Re: SEO disaster need help/advice
Post by: Ron on July 11, 2013, 16:40
This is what I get from Webmaster Tools

Quote
Blocked URLs
If your site has content you don't want Google or other search engines to access, use a robots.txt file to specify how search engines should crawl your site's content.
Check to see that your robots.txt is working as expected. (Any changes you make to the robots.txt content below will not be saved.)

User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/


http://semmickphoto.com/robots.txt
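As a sanity check, rules like the ones quoted above can be tested locally with Python's standard-library `urllib.robotparser`, which applies the same prefix-matching logic that well-behaved crawlers use. The page URLs below are only example paths, not real pages on the site:

```python
from urllib.robotparser import RobotFileParser

# Mirror the robots.txt quoted above (the WordPress defaults).
rules = """\
User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/
""".splitlines()

rp = RobotFileParser()
rp.parse(rules)

# Ordinary pages remain crawlable...
print(rp.can_fetch("*", "http://semmickphoto.com/some-image-page/"))  # True

# ...while the disallowed folders are blocked for all user agents.
print(rp.can_fetch("*", "http://semmickphoto.com/wp-admin/"))  # False
```

This is a quick way to verify a robots.txt before uploading it, without waiting for Webmaster Tools to re-fetch the file.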
Title: Re: SEO disaster need help/advice
Post by: Ron on July 11, 2013, 16:41
What's up with the avatar?