Per Wikipedia: The Robots Exclusion Standard, also known as the Robots Exclusion Protocol or robots.txt protocol, is a convention to prevent cooperating web crawlers and other web robots from accessing all or part of a website which is otherwise publicly viewable. Robots are often used by search engines to categorize and archive web sites, or by webmasters to proofread source code. The standard is different from, but can be used in conjunction with, Sitemaps, a robot inclusion standard for websites.
It sounds complicated, but it's not. It's just a convention that tells Google (and any other cooperating crawler) which directories you don't want crawled or indexed.
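For example, a minimal robots.txt (placed at the root of the site) might look like this; the directory names here are just placeholders:

```
User-agent: *
Disallow: /wp-admin/
Disallow: /private/
```

`User-agent: *` means the rules apply to all crawlers, and each `Disallow` line names a path they're asked to stay out of. Note that it's purely a request: well-behaved bots honor it, but it's not access control.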
You can learn more at http://www.robotstxt.org/robotstxt.html
You can generate your own at http://tools.seobook.com/robots-txt/generator/
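Once you've generated a file, you can sanity-check it without deploying anything. Here's a small sketch using Python's standard-library `urllib.robotparser`; the rules and URLs are made-up examples:

```python
from urllib.robotparser import RobotFileParser

# Example robots.txt rules (hypothetical paths, not from any real site)
rules = """User-agent: *
Disallow: /private/
Allow: /
""".splitlines()

rp = RobotFileParser()
rp.parse(rules)

# Check whether a generic crawler ("*") may fetch each URL
print(rp.can_fetch("*", "http://example.com/private/page.html"))  # False
print(rp.can_fetch("*", "http://example.com/index.html"))         # True
```

Running this against your own rules before uploading them is a quick way to catch a `Disallow` line that blocks more than you intended.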
Since your reptilerob.org site uses WordPress, you could consider installing a plugin like: http://wordpress.org/extend/plugins/irobotstxt-seo/
Google actually offers a Robots.txt Generator. You can learn more about that here: http://searchengineland.com/google-offers-robotstxt-generator-13653
Let me know if you have any questions.