Sitemaps and Robots

Jul 26, 2016

A sitemap is an XML file that lists the URLs of all of your site’s pages. Search engines read it to discover and index your content more reliably; even when your pages are well linked to one another, a sitemap helps crawlers find everything, including new or hard-to-reach pages. A robots.txt file complements this by telling search engine crawlers which parts of your site they may or may not visit.
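To make this concrete, here is a minimal sketch of a sitemap following the sitemaps.org XML format. The example.com URLs and dates are placeholders for your own pages:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page you want search engines to know about -->
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2016-07-26</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/about</loc>
  </url>
</urlset>
```

A robots.txt file lives at the root of your site. The sketch below is one common setup (the /private/ folder is just an example): it lets all crawlers visit the site, asks them to skip one folder, and points them at the sitemap:

```
# Apply these rules to all crawlers
User-agent: *
# Ask crawlers not to visit this folder
Disallow: /private/
# Tell crawlers where to find the sitemap
Sitemap: https://www.example.com/sitemap.xml
```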
