RapidBot is an incredibly easy tool for creating a robots.txt file for your site directly from RapidWeaver.
## Your first SEO tool

The robots.txt file is retrieved by search engines, such as Google and Bing, and used to state which pages should be indexed and which must be ignored. A robot (also known as a spider or web crawler) is a program that automatically traverses the web's hypertext structure by retrieving a document and recursively retrieving all the documents it references.
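As a minimal sketch of the kind of file RapidBot generates (the folder names here are purely illustrative), a robots.txt might look like this:

```
# Applies to all crawlers: keep these folders out of the index
User-agent: *
Disallow: /drafts/
Disallow: /private/

# Google's crawler may index everything
User-agent: Googlebot
Allow: /
```

Each `User-agent` line names a crawler, and the `Disallow`/`Allow` lines beneath it state which paths that crawler should skip or may visit.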
## You're in control

You decide which crawlers may visit your site. Choose a crawler from our presets, or specify a custom one. Then type a folder name or select it with the default link picker. RapidBot is perfectly integrated with RapidWeaver!
## Don't get lost

Provide specific indexing rules with RapidBot. Enable or disable crawling of files and folders in a natural way: RapidBot translates your choices into directives that search engine bots understand.
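As a sketch of that translation, disabling crawling of a folder for a single crawler (the bot and folder names here are illustrative) would come out as:

```
# Block only Bing's crawler from a specific folder
User-agent: Bingbot
Disallow: /assets/
```

The same choices made in the RapidBot interface are written out as these plain-text directives, so no knowledge of the robots.txt syntax is required.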
## Teach bots how you want to be reached

Remember that instructing search engine bots helps you add visibility to your site and lets people reach you more efficiently by excluding irrelevant content. RapidBot is a must-have SEO tool! Only if search engines know what to do with your pages can they give you a good ranking.
Version 3.0.0
Released 2014-11-04