The update allows webmasters to host sitemap files on any domain, using a robots.txt file to indicate the location of the sitemap file and the domain it represents.
For example, MSN.com offers content across subdomains such as health.msn.com, travel.msn.com and moneycentral.msn.com, but until now could not host all of its sitemaps in one central location, such as sitemaps.msn.com.
"Until now the protocol did not support this scenario; each sitemap would have needed to be hosted directly under the domain it described," Fabrice Canel, program manager for Microsoft's Live Search Crawler, said. "This update now introduces support for this scenario, with the requirement that you simply include a reference to the sitemap in your robots.txt file."
Canel offered moneycentral.msn.com/robots.txt as an example, which could now include the following line: Sitemap: https://sitemaps.msn.com/index_moneycentral.msn.com.xml. He added that all URLs in the sitemap file must belong to the same domain as the robots.txt file (i.e. moneycentral.msn.com in this example).
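Put together, a minimal robots.txt served from moneycentral.msn.com might look like the sketch below; the comment lines and the User-agent record are illustrative additions rather than part of Canel's example:

    # robots.txt served from https://moneycentral.msn.com/robots.txt
    User-agent: *
    Disallow:

    # Cross-domain sitemap reference per the updated protocol
    Sitemap: https://sitemaps.msn.com/index_moneycentral.msn.com.xml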
Multiple "Sitemap:" references are supported, although Microsoft recommends keeping robots.txt files under 1 MB in size and limiting individual sitemap files to under 10 MB, whether they are in XML, RSS or Text formats.
One feature that comes as a direct result of webmaster feedback is the ability to ping Live Search to notify it of updates to sitemaps.
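As a rough illustration of how such a ping could be issued, the short Python sketch below sends an HTTP GET with the sitemap's location as a query parameter. The endpoint address and parameter name are hypothetical placeholders, not details confirmed by the article, so substitute whatever Live Search actually documents:

    import urllib.parse
    import urllib.request

    # Hypothetical ping endpoint and parameter name; the article does not
    # specify the real Live Search URL, so check Microsoft's documentation.
    PING_ENDPOINT = "http://webmaster.live.com/ping.aspx"

    def ping_sitemap(sitemap_url: str) -> int:
        # The sitemap location is passed URL-encoded as a query parameter.
        query = urllib.parse.urlencode({"siteMap": sitemap_url})
        with urllib.request.urlopen(f"{PING_ENDPOINT}?{query}") as response:
            return response.status  # 200 would indicate the ping was accepted

    if __name__ == "__main__":
        print(ping_sitemap("https://sitemaps.msn.com/index_moneycentral.msn.com.xml"))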
In support of the changes to the Sitemap Protocol, Microsoft offers a Sitemap Forum for webmasters.