RapidBot is an incredibly easy tool for creating a robots.txt file for your site directly from RapidWeaver. The robots.txt file is retrieved by search engines, like Google and Bing, and tells them which pages should be indexed and which must be ignored. A robot (also known as a spider or web crawler) is a program that automatically traverses the web's hypertext structure by retrieving a document and recursively retrieving all documents it references.
You decide which spiders may visit your site. Choose a crawler from our presets or define a custom one, then type a folder name or select it using the default link picker. RapidBot is perfectly integrated with RapidWeaver!
Provide specific indexing rules with RapidBot. Enable or disable crawling of files and folders in a natural way: RapidBot translates your directives into rules that search engine bots understand.
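For example, directives like "keep all crawlers out of one folder, and keep Googlebot out of another" would typically be translated into a robots.txt file along these lines (the folder names here are just placeholders, not output from RapidBot itself):

```
# Rules for all crawlers
User-agent: *
Disallow: /private/

# Additional rule for Google's crawler only
User-agent: Googlebot
Disallow: /drafts/
```

Each `User-agent` line names a crawler (with `*` matching any of them), and the `Disallow` lines beneath it list the paths that crawler should skip.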
Don't forget that instructing search engine bots helps add visibility to your site and lets people reach you more efficiently by excluding irrelevant content. RapidBot is a must-have SEO tool! Only if search engines know what to do with your pages can they give you a good ranking.
Foreground has made an outstanding RapidWeaver plugin that removes the need to create a robots.txt file manually. I use it and find it very user-friendly and thorough.