Help:Database download

This page explains how to download an XML dump of the entire site, or of one specific page. If you are trying to download the entire site, please use the XML dump instead of a web crawler.

What is MediaWiki?


The Wikimedia Foundation hosts Wikibooks on servers running the free software MediaWiki. MediaWiki is a PHP application that serves the wiki to web browsers and stores the pages in a MySQL database. MediaWiki was originally developed for our sister project, Wikipedia. It is now used by all wiki projects hosted by the Wikimedia Foundation, including Wikibooks itself.

To learn more about MediaWiki or to download a copy, visit the MediaWiki website.

How is Wikibooks copyrighted?


Contributors to Wikibooks own the copyright to their contributions, but they all grant permission under the GNU Free Documentation License and the CC-BY-SA 3.0 license for copying, modifying, and reusing our textbooks and other content. Some embedded images and media are under other free licenses, and some copyrighted content is used under "fair use". For more information, see the page Wikibooks:Copyrights.

How do I download my copy?


Because Wikibooks is free documentation, you are allowed to copy our content from our MediaWiki server to your own computer. You might want to download a machine-readable copy designed for human modification, that is, the Transparent Version. Or you might want to load a copy into your own installation of MediaWiki to run a mirror. Our XML dumps fulfill both purposes.

Understanding different formats

  • Wiki markup is the format in which authors write pages for MediaWiki. This is what you see when you use the "Edit this page" tab on Wikibooks.
  • HTML is the format that web browsers understand. This is what you obtain when you use the "Save Page" function of your web browser.
  • An XML dump is a format from MediaWiki that contains the complete wiki markup of a page, plus additional information about the page such as its title. Thus an XML dump is the best format for copying the wiki markup from Wikibooks. You can also reimport an XML dump into your own copy of MediaWiki.
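An XML dump can be read with any ordinary XML parser. The sketch below, in Python, extracts each page's title and wiki markup from a simplified sample; real dumps wrap these elements in an XML namespace and carry much more metadata per revision, so tag lookups on an actual dump need adjusting accordingly.

```python
import xml.etree.ElementTree as ET

# Simplified stand-in for an XML dump. Real dumps declare an XML namespace
# on the <mediawiki> root, which this sketch omits for brevity.
sample = """<mediawiki>
  <page>
    <title>Main Page</title>
    <revision>
      <text>'''Welcome''' to Wikibooks.</text>
    </revision>
  </page>
</mediawiki>"""

root = ET.fromstring(sample)
for page in root.iter("page"):
    title = page.findtext("title")          # page title element
    markup = page.findtext("revision/text")  # the raw wiki markup
    print(title, "->", markup)
```

The same loop works unchanged on a multi-page dump, since `iter("page")` visits every `<page>` element in document order.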

Downloading the ENTIRE site


A dump of the entire site is available. To download the entire collection of textbooks:

  1. Visit the Wikimedia dumps site
  2. Look for "enwikibooks"
  3. Pick an XML dump and download it

Note that "enwiki" is the English Wikipedia, and does not contain any part of "enwikibooks".

Downloading some pages


The Special:Export feature allows you to download one or more pages from Wikibooks. To download some pages:

  1. Visit Special:Export
  2. Follow the instructions to submit a query
  3. Save the resulting XML to disk

... or just use Special:Export/Main Page to quickly get the XML dump of the Main Page, for example.
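Export URLs of that form can also be built programmatically. The helper below is a sketch of how such a URL might be assembled: it assumes the usual MediaWiki convention of underscores in place of spaces in page titles, and the `en.wikibooks.org` base URL is hard-coded only for illustration.

```python
from urllib.parse import quote

def export_url(title, base="https://en.wikibooks.org/wiki/Special:Export/"):
    # MediaWiki titles use underscores in place of spaces; quote() then
    # percent-encodes any remaining characters that are not URL-safe.
    return base + quote(title.replace(" ", "_"))

print(export_url("Main Page"))
# https://en.wikibooks.org/wiki/Special:Export/Main_Page
```

Fetching the resulting URL with any HTTP client returns the same XML you would get by saving the Special:Export result from a browser.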