7 Days to Better Your SEO
(Search Engine Optimization)
Day 3: Sitemaps
Today I want to talk about Sitemaps. A Sitemap is basically a list of all the pages on your website, and it gives webmasters an easy way to tell search engines which pages on their sites are available for crawling. Sitemaps are not "Site Indexes," although people sometimes use the terms interchangeably: a Sitemap organizes pages from top to bottom, while a Site Index sorts and stores page information alphabetically. Either one is usually a regular web page on your site that lists all of your other pages. They are good to have because they serve people first, letting visitors find pages easily, and the search engines will crawl them as well. A Sitemap or Site Index is especially important if your site uses a lot of Adobe Flash or any other medium that a search engine has trouble indexing. It also doubles as a navigational aid, giving a summary of a website's content at a glance.
Google introduced a new concept in 2005: XML Sitemaps. XML Sitemaps (supported by MSN, Yahoo, Ask and Google) let the major search engines receive updates when the content on a web page is edited or updated. Web crawlers usually discover pages from links within the site and from other sites; Sitemaps supplement this by letting crawlers that support the protocol pick up every URL in the Sitemap and learn about those URLs from the associated metadata. Using the Sitemap protocol does not guarantee that your pages will be included in search engines, but it gives web crawlers hints to do a better job of crawling your site. A newer feature of XML Sitemaps is that they can record how frequently a specific page is updated and store the last time its content changed.
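To make this concrete, here is what a minimal Sitemap file following the sitemaps.org protocol looks like. The URLs, dates and values below are only placeholders for illustration:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2008-11-01</lastmod>
    <changefreq>weekly</changefreq>
    <priority>0.8</priority>
  </url>
  <url>
    <loc>http://www.example.com/about.html</loc>
    <lastmod>2008-10-15</lastmod>
    <changefreq>monthly</changefreq>
    <priority>0.5</priority>
  </url>
</urlset>
```

Only the `<loc>` tag is required for each URL; the other tags are the optional metadata that tells crawlers when and how often a page changes.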
Sitemaps were introduced because the number and complexity of websites added to the internet every day is far beyond what it was even 15 years ago. Search engines were skipping over websites because there was simply too much information to process. Now Google, MSN, Ask and Yahoo can find information faster and more efficiently.
So now the question is: how do you build an XML Sitemap for the search engines? Luckily there are plenty of programs out there that will do it for you quite easily. Some are free and some cost a nominal amount. Just go to CNET's Download.com and search for the term "sitemap" and you will find plenty. I have used SoftPLUS GSite Crawler and SiteMap Pro in the past and they both work very well. GSite Crawler was free the last time I checked (you are encouraged to make a donation to support future development), while SiteMap Pro was $59.
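If you are curious what these generator programs are doing under the hood, here is a rough sketch in Python. The page list is made up for illustration; a real generator would crawl your site to discover the URLs and modification dates:

```python
import xml.etree.ElementTree as ET

# Namespace defined by the sitemaps.org protocol.
SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(pages):
    """Build a sitemaps.org-style XML document from (url, lastmod) pairs."""
    urlset = ET.Element("urlset", xmlns=SITEMAP_NS)
    for url, lastmod in pages:
        entry = ET.SubElement(urlset, "url")
        ET.SubElement(entry, "loc").text = url
        ET.SubElement(entry, "lastmod").text = lastmod
    return ET.tostring(urlset, encoding="unicode")

# Hypothetical page list -- a real tool would discover these by crawling.
pages = [
    ("http://www.example.com/", "2008-11-01"),
    ("http://www.example.com/contact.html", "2008-10-20"),
]
print(build_sitemap(pages))
```

You would save the output as sitemap.xml at the root of your site so the search engines can find it.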
Before I close, I want to touch on two terms I referred to in this post.
Metadata is simply "data about data": it describes information about the information you are using. In a Sitemap, for example, the metadata may describe when a page was last updated, how often it usually changes, and how important it is relative to the other URLs on the site.
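In a Sitemap, those three pieces of metadata correspond to the optional `<lastmod>`, `<changefreq>`, and `<priority>` tags on each URL entry. The values below are only examples:

```xml
<url>
  <loc>http://www.example.com/news.html</loc>
  <lastmod>2008-11-03</lastmod>   <!-- when the page was last updated -->
  <changefreq>daily</changefreq>  <!-- how often it usually changes -->
  <priority>1.0</priority>        <!-- importance relative to other URLs -->
</url>
```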
XML (Extensible Markup Language) is a general-purpose specification for creating custom markup languages. The term "extensible" indicates that a markup-language designer has significant freedom in the choice of markup elements. Simply put, it is a format that a programmer (or program) uses to create a sitemap or any other file built on XML. The search engines, in this example, then know how to read the file because they know how to read XML.
I will be back Monday with Day 4: Writing Good Content