I was perplexed to find two different sitemaps in Google Sites:
http://sites.google.com/site/(name of the site)/system/feeds/sitemap
http://sites.google.com/site/(name of the site)/system/app/pages/sitemap/hierarchy
Now I am ready to ask the bewildering question: what are sitemaps? I have always thought they were things that allow crawlers to see what is on a site, which the former example seems to confirm. The latter example challenged my understanding. What are they, actually? Is the former only for human beings while the latter is for crawlers? Are there even more types of sitemaps?
The term sitemap can have two slightly different meanings:
A) The sitemap for humans
A webpage to give your users an overview of your site. This is what your example
http://sites.google.com/site/(name of the site)/system/app/pages/sitemap/hierarchy
is. Just paste it into the URL bar of your browser and see for yourself.
B) The sitemap for machines
This kind of sitemap is a machine-readable (txt or XML) list of the URLs that comprise a website. It is the kind of sitemap that allows crawlers to see what is on the site.
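For illustration, a minimal XML sitemap of this kind might look like the following sketch (the URL and dates are hypothetical placeholders; only <loc> is required by the protocol):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- one <url> entry per page the crawler should know about -->
  <url>
    <loc>http://example.com/some-page</loc>
    <!-- the remaining elements are optional hints for crawlers -->
    <lastmod>2012-01-01</lastmod>
    <changefreq>weekly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>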
You can even have multiple sitemaps of this kind, for much the same reason that we don't usually put all source code in one file: it is easier to manage if you split the sitemap into multiple files, as sketched below.
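The sitemaps protocol ties such split files together with a sitemap index file that simply lists the individual sitemaps. A minimal sketch, assuming two hypothetical sitemap files:

<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- each <sitemap> entry points to one of the split sitemap files -->
  <sitemap>
    <loc>http://example.com/sitemap-pages.xml</loc>
  </sitemap>
  <sitemap>
    <loc>http://example.com/sitemap-feeds.xml</loc>
  </sitemap>
</sitemapindex>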
In your example, the "feeds" part of the URL indicates that this sitemap contains URLs for just the RSS feeds.
To learn what it contains, you will have to take a closer look at it. One way to do this is to download the file (for example with wget or curl) and open it in your favourite text editor.
On my Google site this file looks like this:
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
</urlset>
It is nearly empty. The reason is probably that I have just created the site and no feed entries exist.
A sitemap can be:

a) A visual representation of the site structure, intended for a human audience.

b) A file, intended for web crawlers (Google, Yahoo, etc.), that informs them which URLs are available for crawling. It is often provided in conjunction with a robots.txt file, which tells crawlers which URLs may be indexed and which may not.

The most common format for sitemaps is XML, which lets you specify the importance of each URL and how frequently it changes. You can read the specification at http://www.sitemaps.org/protocol.php. An uncommon but possible format is a plain text file that simply separates URLs with newlines. It is not as flexible as the XML format, so XML is preferred for SEO efforts.

You can have multiple XML sitemaps and link them together in a sitemap index. This is often used by large sites, as the sitemap protocol limits the size of a single sitemap file to 10 MB. You can also use RSS or Atom feeds to notify crawlers about URLs; the drawback of this approach is that you can only notify them about the newest URLs.
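As a concrete illustration of the robots.txt connection: the sitemaps protocol also lets you advertise a sitemap's location through a Sitemap: line in robots.txt, so crawlers can discover it automatically. A minimal sketch (the URLs and paths are hypothetical placeholders):

# robots.txt
User-agent: *
Disallow: /private/
# tell crawlers where to find the machine-readable sitemap
Sitemap: http://example.com/sitemap.xml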