Free Online XML Sitemap Generator


About XML Sitemap Generator

Automatically crawl your site and create an XML sitemap to submit to Google in just a click.

Sitemaps take many forms. Sometimes they include brief descriptions of the different pages and the content they contain; sometimes they are nothing more than a long, somewhat generic list of page links. Some people create sitemaps with the sole purpose of giving their visitors a comprehensive web page directory. Others create them simply to make certain the search engine crawlers find each and every available page on their website.

Google Sitemaps…

Like all search engine crawlers, GoogleBot is out there with the express purpose of gathering valuable data that can be added to Google's searchable index. The sooner it can return with new and updated information, the better, both for Google and for the people who use its search engine. With that in mind, the Google sitemap service offers a twofold solution.

First, it lightens GoogleBot's burden of constantly crawling the same places over and over again looking for new and updated content. With a system that tells the bot when and where to crawl, a great deal of time is saved, and that time can be spent far more efficiently. Rather than wasting it on pages that have not been (and might never be) updated or changed, the bot can zero in on pages that have valuable, current content to add to the search database.

Second, for webmasters, Google Sitemaps offers a way to send immediate notification whenever a change or addition takes place on their websites. This not only increases the possibility of getting pages indexed faster, but also ensures that GoogleBot can easily locate the pages that are available and bypass any pages that aren't meant to be public.

As for the sitemap files themselves, there are two types you can implement. The first is the typical list of individual pages (just as any other sitemap would display). The second serves as an index, listing multiple sitemaps in the event you have more than one. Both are shown in the examples below.

The limit is 50,000 URLs per sitemap, and a sitemap index can list a maximum of 1,000 sitemaps, so a single index can cover up to 50 million URLs.

Google accepts plain text versions but gives higher priority to sitemaps written in XML format. That's because the XML version includes valuable notification options that can be associated with each URL.
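A plain text sitemap is nothing more than a list of fully qualified URLs, one per line. For example:

http://www.example.com/
http://www.example.com/page1.html
http://www.example.com/page2.html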

Here is a brief explanation of each of those options.

Last Modified <lastmod>

Allows you to specify the exact time and date a page was last changed or updated. The value should conform to the W3C Datetime format, a profile of ISO 8601 (you can read the specification at http://www.w3.org/TR/NOTE-datetime). If you choose not to include the time, the format for the date alone is YYYY-MM-DD. March 9, 2006, for example, would be written as <lastmod>2006-03-09</lastmod>.
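If you generate your sitemap programmatically, most languages can produce both forms directly. Here is a small Python sketch (illustrative only, with placeholder output) that formats the current UTC time both ways:

from datetime import datetime, timezone

now = datetime.now(timezone.utc)

# Date-only form: YYYY-MM-DD
print(now.strftime("%Y-%m-%d"))           # e.g. 2006-03-09

# Full W3C Datetime form with a UTC offset,
# matching values such as 2004-12-23T18:00:15+00:00
print(now.isoformat(timespec="seconds"))  # e.g. 2006-03-09T18:00:15+00:00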

Change Frequency <changefreq>

Allows you to specify how often a page will change or be updated. Valid values are always, hourly, daily, weekly, monthly, yearly, and never. Be aware, however, that the value is merely used as a guide and not a command. It’s possible that any given page can be crawled more or less frequently than the specified value.

Priority <priority>

Allows you to specify a number that indicates how important you consider a page to be in relation to all the other pages on your website. Valid values range from a low of 0.0 to a high of 1.0 (the default priority of a page is 0.5).

Keep in mind that the priority you set has no bearing on the search results position your page achieves (if any). It merely tells GoogleBot which pages you consider most important when it crawls your website.

XML Sitemap Example

<?xml version="1.0" encoding="UTF-8"?>

<urlset xmlns="http://www.google.com/schemas/sitemap/0.84">

<url>

<loc>http://www.example.com/</loc>

<lastmod>2005-01-01</lastmod>

<changefreq>monthly</changefreq>

<priority>0.8</priority>

</url>

<url>

<loc>http://www.example.com/page1.html</loc>

<changefreq>weekly</changefreq>

</url>

<url>

<loc>http://www.example.com/page2.html</loc>

<lastmod>2004-12-23</lastmod>

<changefreq>weekly</changefreq>

</url>

<url>

<loc>http://www.example.com/page3.html</loc>

<lastmod>2004-12-23T18:00:15+00:00</lastmod>

<priority>0.3</priority>

</url>

<url>

<loc>http://www.example.com/page4.html</loc>

<lastmod>2004-11-23</lastmod>

</url>

</urlset>

Sitemap Index Example

<?xml version="1.0" encoding="UTF-8"?>

<sitemapindex xmlns="http://www.google.com/schemas/sitemap/0.84">

<sitemap>

<loc>http://www.example.com/sitemap1.xml.gz</loc>

<lastmod>2004-10-01T18:23:17+00:00</lastmod>

</sitemap>

<sitemap>

<loc>http://www.example.com/sitemap2.xml.gz</loc>

<lastmod>2005-01-01</lastmod>

</sitemap>

</sitemapindex>

Notice the additional .gz extension. To reduce bandwidth, you have the option of compressing your sitemap files using gzip. Uncompressed sitemap files cannot exceed ten megabytes.
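As a quick sketch of how that compression might be done, assuming the uncompressed file is named sitemap1.xml (a placeholder), the following Python snippet checks the size limit and writes a gzipped copy:

import gzip
import os
import shutil

SRC = "sitemap1.xml"     # hypothetical uncompressed sitemap
DST = "sitemap1.xml.gz"  # compressed copy to upload

# Uncompressed sitemaps must stay under the ten-megabyte limit.
assert os.path.getsize(SRC) <= 10 * 1024 * 1024, "sitemap exceeds 10 MB uncompressed"

# Copy the file through a gzip stream to produce the .gz version.
with open(SRC, "rb") as f_in, gzip.open(DST, "wb") as f_out:
    shutil.copyfileobj(f_in, f_out)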

Naturally, if you have a relatively small website, managing your sitemap won't be difficult or overly time-consuming. But even then, a program that automates the process of updating and delivering the sitemap would still be beneficial.
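As a minimal sketch of that kind of automation, the following Python snippet builds a sitemap like the example above from a hand-maintained page list (the URLs, dates, and output filename are placeholders) using the standard library's ElementTree module:

import xml.etree.ElementTree as ET

# Hypothetical page list: (URL, last modified, change frequency, priority).
# Use None for any optional value you want to omit.
PAGES = [
    ("http://www.example.com/",           "2005-01-01", "monthly", "0.8"),
    ("http://www.example.com/page1.html", None,         "weekly",  None),
]

urlset = ET.Element("urlset", xmlns="http://www.google.com/schemas/sitemap/0.84")

for loc, lastmod, changefreq, priority in PAGES:
    url = ET.SubElement(urlset, "url")
    ET.SubElement(url, "loc").text = loc
    # The optional tags are emitted only when a value is supplied.
    if lastmod:
        ET.SubElement(url, "lastmod").text = lastmod
    if changefreq:
        ET.SubElement(url, "changefreq").text = changefreq
    if priority:
        ET.SubElement(url, "priority").text = priority

# Writes the XML declaration and the urlset to disk.
ET.ElementTree(urlset).write("sitemap.xml", encoding="UTF-8", xml_declaration=True)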

Of course, you probably don't have just one small website. You most likely have (or will have at some point) numerous websites with hundreds if not thousands of pages each, and under those circumstances an automated system would definitely be an asset.