XML Sitemap Generator

Generate XML sitemaps from your URL list with priority and change frequency settings

Safe generation: no data is sent to any server

Last updated: March 2026

What is an XML Sitemap?

An XML sitemap is a structured file that lists all the important URLs on your website, providing search engines with a roadmap for discovering and crawling your content. Following the Sitemaps Protocol (sitemaps.org), this standardized XML format includes metadata for each URL such as the last modification date, change frequency, and relative priority. Google, Bing, Yandex, and other major search engines use sitemaps as a primary mechanism for content discovery and efficient crawling.

Think of an XML sitemap as a table of contents for search engine crawlers. While search engines can discover pages by following internal links, a sitemap explicitly declares which URLs you consider important and when they were last updated. This is especially critical for large websites with thousands of pages, new websites with few inbound links, sites with deep page architectures where some content is many clicks from the homepage, and dynamically generated pages that may not be easily discoverable through standard link-following.

Google supports several sitemap formats and extensions. A standard sitemap can include up to 50,000 URLs or be up to 50MB in size (uncompressed). For larger sites, you can use a sitemap index file that references multiple individual sitemaps. Specialized sitemap types include image sitemaps, video sitemaps, and news sitemaps, each providing additional metadata that helps Google surface your content in specialized search verticals like Google Images, Google Video, and Google News.
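A minimal sitemap following the protocol looks like this (the URL and dates below are placeholders, not output from this tool):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
    <lastmod>2026-03-01</lastmod>
    <changefreq>daily</changefreq>
    <priority>1.0</priority>
  </url>
</urlset>
```

Only the loc element is required; lastmod, changefreq, and priority are optional hints.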

How to Use This Tool

Generate a valid XML sitemap from your URL list:

  1. Enter your base URL - Provide your website's root URL (e.g., https://example.com). Relative paths in your URL list will be prepended with this base URL to create full, absolute URLs in the sitemap.
  2. Set the default priority - Use the slider to set a priority value between 0.0 and 1.0. The priority is relative to your own pages: 1.0 is the highest importance and 0.5 is the default. A common scheme is the homepage at 1.0, main category pages at 0.8, and individual content pages at 0.5-0.6. Google has stated it largely ignores the priority value, but Bing and other engines may still reference it.
  3. Choose the change frequency - Select how often the content at these URLs typically changes: "daily" for news sites and blogs, "weekly" for regularly updated content, "monthly" for stable pages, or "yearly" for archived content. This is advisory and helps crawlers schedule revisits.
  4. Add your URLs - Enter one URL per line. You can use relative paths (like /about or /blog/my-post) or full absolute URLs. The tool automatically resolves relative paths against your base URL. Up to 500 URLs are supported per generation.
  5. Generate and download - Click "Generate Sitemap" to create the XML output. Review the sitemap for accuracy, then download the file as sitemap.xml and upload it to your website's root directory.
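The steps above can be sketched in Python. This is a simplified illustration of the same logic, not the tool's actual implementation; `generate_sitemap` and the 500-URL cap mirror the behavior described here:

```python
from urllib.parse import urljoin
from xml.etree import ElementTree as ET

def generate_sitemap(base_url, paths, priority=0.5, changefreq="weekly"):
    """Build a sitemaps.org <urlset> from relative paths or absolute URLs."""
    urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    for path in paths[:500]:  # mirror the tool's 500-URL limit per generation
        entry = ET.SubElement(urlset, "url")
        # urljoin resolves "/about" against the base URL; absolute URLs pass through unchanged
        ET.SubElement(entry, "loc").text = urljoin(base_url, path)
        ET.SubElement(entry, "changefreq").text = changefreq
        ET.SubElement(entry, "priority").text = f"{priority:.1f}"
    return '<?xml version="1.0" encoding="UTF-8"?>\n' + ET.tostring(urlset, encoding="unicode")

print(generate_sitemap("https://example.com", ["/", "/about", "/blog/my-post"], priority=0.8))
```

Save the output as sitemap.xml and upload it to your site root, just as in step 5.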

Why XML Sitemaps Matter for SEO

XML sitemaps are a fundamental technical SEO asset that directly influences how search engines discover, crawl, and index your content:

Faster indexation of new content: When you publish new pages, a sitemap with a current lastmod date signals to search engines that fresh content is available. Google's crawling systems use sitemap data to prioritize which URLs to visit. Sites that submit updated sitemaps through Google Search Console often see new content indexed within hours rather than days, a critical advantage for time-sensitive content like news articles, product launches, and event pages.

Crawl budget optimization: Every website has a crawl budget, the number of pages Googlebot will crawl in a given time period. Your sitemap communicates which pages matter most, helping search engines allocate crawl resources efficiently. By only including canonical, indexable pages in your sitemap (excluding noindex pages, redirects, and low-value URLs), you direct crawl budget toward content that actually contributes to your organic traffic.

Orphan page discovery: Orphan pages are URLs that exist on your site but are not linked from any other page. Without an XML sitemap, these pages are effectively invisible to search engines, since crawlers primarily discover pages by following links. Your sitemap ensures that every important page is discoverable, regardless of your internal linking structure. This is especially important after site migrations, URL restructuring, or when working with large databases of dynamically generated content.

Search Console insights: Submitting a sitemap to Google Search Console unlocks valuable indexation reports. You can see how many of your submitted URLs are indexed, which ones have errors or warnings, and which are excluded from the index. This data is essential for monitoring the health of your site's presence in Google and diagnosing indexation problems early before they impact traffic.

FAQ

How many URLs can an XML sitemap contain?

A single XML sitemap file can contain up to 50,000 URLs and must not exceed 50MB (uncompressed). If your site has more than 50,000 URLs, create a sitemap index file that references multiple individual sitemap files. The sitemap index itself can reference up to 50,000 individual sitemaps, supporting up to 2.5 billion URLs in total. For most websites, a single sitemap file is sufficient.
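A sitemap index file uses the same namespace but lists sitemaps instead of pages. A minimal example, with placeholder filenames:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>https://example.com/sitemap-pages.xml</loc>
    <lastmod>2026-03-01</lastmod>
  </sitemap>
  <sitemap>
    <loc>https://example.com/sitemap-blog.xml</loc>
  </sitemap>
</sitemapindex>
```

You submit only the index file; search engines then fetch each referenced sitemap.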

Should I include every page in my sitemap?

No. Only include URLs that you want search engines to index. Exclude noindex pages, paginated result pages, URL parameter variations, redirected URLs, error pages, and duplicate content. Including non-indexable URLs wastes crawl budget and can send confusing signals. Your sitemap should be a curated list of your site's most important, canonical, indexable pages.

Does Google use the priority and changefreq values?

Google has stated that it largely ignores the priority value in sitemaps, as it determines crawl priority through its own algorithms. The changefreq value is used as a hint for scheduling re-crawls but is not strictly followed. The lastmod date is the most useful metadata element: Google uses accurate lastmod values to identify which pages have been updated and may need re-crawling. Always ensure your lastmod dates reflect actual content changes, not automated timestamp updates.
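The lastmod value uses W3C Datetime format: either a plain date or a full date-time with timezone offset. A quick sketch in Python (the date is a placeholder):

```python
from datetime import datetime, timezone

# Record the actual content-change time, not an automated build timestamp.
published = datetime(2026, 3, 1, 9, 30, tzinfo=timezone.utc)

print(published.strftime("%Y-%m-%d"))  # date-only form: 2026-03-01
print(published.isoformat())           # full form: 2026-03-01T09:30:00+00:00
```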

Where should I place my sitemap file and how do I submit it?

Place the sitemap at your website root (https://example.com/sitemap.xml) for best discoverability. Submit it through three channels: (1) Google Search Console under the Sitemaps section, (2) Bing Webmaster Tools, and (3) reference it in your robots.txt file with a "Sitemap: https://example.com/sitemap.xml" directive. The robots.txt reference ensures any compliant crawler can find your sitemap automatically, even without manual submission to individual search engines.
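A robots.txt with the Sitemap directive can be as simple as this (substitute your own domain):

```
User-agent: *
Allow: /

Sitemap: https://example.com/sitemap.xml
```

The Sitemap line is independent of any User-agent block and must use the full absolute URL.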