I had SEO Ultimate running until I realized that some SEO features I "need" (apparently meta keywords, for one) wouldn't work.
Since Yoast has been brought up here before, I installed it and promptly got overwhelmed by it.
Anywho, I had my meta robots tag sorted out before I installed Yoast. Now I apparently have to use a robots.txt file on the server to give those little critters their crawling instructions.
All fine and dandy, but since I need to "learn" how to write the robots.txt, I'm starting to realize that I must have been exposing my entire server contents all these years by allowing the robots to index everything and follow all links.
Here is my question: my Symbiostock site is running in the root folder of my main hosting package. I also have a few subfolders on the server that are mapped to their own top-level domains. Of course I don't want those to be indexed under the main URL, as subdomains, etc.
So do I have to write exclusions for each and every subfolder on my server that I don't want the bots to crawl? Or do the bots only crawl the links that are actually readable on my Symbiostock pages?
I have no clue how all this works or what these robots can actually see. Do they only see the links on my index.php page and all subsequent links from those pages, or can the bots see my entire folder structure on my web host, as if they had logged in via FTP or the file manager?
I hope I've explained the issue clearly. If not, please ask and I'll clarify.
The code I wanted to add to the robots.txt would have been:
User-agent: *
Disallow:
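(As I understand it, an empty Disallow like that allows everything.) If per-folder exclusions are the way to go, I'm guessing it would look something like this — the folder names here are just made-up placeholders, not my real ones:

User-agent: *
Disallow: /otherdomain1/
Disallow: /otherdomain2/

That is, allow the bots to crawl everything except those two subfolders. Does that look right?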