Best MySQL Backup Solutions for MediaWiki (2025)

Your MediaWiki installation powers knowledge bases, documentation sites, internal wikis, or public collaborative platforms. Years of accumulated knowledge, thousands of contributors, and millions of edits represent irreplaceable institutional memory.

MediaWiki's strength is its complete revision history - every edit to every page preserved forever. This same feature makes databases massive and backups challenging. A wiki that started at 500MB can grow to 50GB-500GB as revision tables accumulate. Losing this data means losing not just current content but the entire evolution of your knowledge base.

This guide compares MySQL backup solutions specifically for MediaWiki's unique characteristics - massive revision tables, continuous editing activity, and the critical importance of preserving complete historical data.

Why MediaWiki Needs Specialized Database Protection

Exponentially Growing Revision Tables

MediaWiki stores every version of every page in the revision table. A page edited 100 times creates 100 revision records. Active wikis accumulate millions of revisions over time. The revision table becomes the largest single table in your database, often 60-80% of total database size. A wiki with 10,000 current pages might have 500,000+ revisions spanning years. Backup solutions must handle these massive, continuously growing tables efficiently.
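If you're not sure how large your own revision table has grown, a quick query against information_schema reports per-table sizes. Here is a minimal sketch using the pymysql driver (the connection details and the wikidb schema name are placeholders for your own setup):

    # Report the largest tables in a MediaWiki database, sizes in GB.
    # pymysql driver assumed; host, credentials, and schema name are placeholders.
    import pymysql

    conn = pymysql.connect(host="localhost", user="wiki", password="secret", database="wikidb")
    with conn.cursor() as cur:
        cur.execute(
            """
            SELECT table_name,
                   ROUND((data_length + index_length) / 1024 / 1024 / 1024, 2) AS size_gb
            FROM information_schema.tables
            WHERE table_schema = %s
            ORDER BY (data_length + index_length) DESC
            LIMIT 10
            """,
            ("wikidb",),
        )
        for table, size_gb in cur.fetchall():
            print(f"{table}: {size_gb} GB")
    conn.close()

On most established wikis, the revision and text tables dominate this list.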

Continuous Collaborative Editing

Unlike traditional websites with scheduled content updates, wikis have continuous editing activity. Users edit pages 24/7 across time zones. Each edit creates new revision records, updates page metadata, generates recent changes entries, and modifies linking structures. Your database never has "quiet periods" for backups. You need backup approaches that don't impact editor experience during active periods.

Critical Importance of Complete History

For wikis, history isn't just nice to have - it's fundamental. Revision history enables rollbacks after vandalism, tracks attribution for licensing, allows comparing versions, and provides accountability for collaborative work. Losing revision history destroys core wiki functionality. Some organizations face legal requirements to preserve edit history for audit trails. Your backup solution must preserve every revision with perfect fidelity.

Complex Inter-Table Relationships

MediaWiki maintains intricate relationships between tables: pages link to revisions, revisions link to text storage, recent changes track page updates, category tables organize content, and page links create the wiki's navigation structure. These relationships must remain perfectly consistent. A backup that corrupts these relationships creates a non-functional wiki even if individual tables seem intact.
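As a concrete illustration, a referential check like the sketch below surfaces revisions that no longer point at an existing page - one symptom of an inconsistent or partial restore (pymysql again, with placeholder credentials):

    # Count revisions whose parent page row is missing, a sign of broken
    # page/revision consistency after a bad restore.
    # pymysql driver assumed; connection details are placeholders.
    import pymysql

    conn = pymysql.connect(host="localhost", user="wiki", password="secret", database="wikidb")
    with conn.cursor() as cur:
        cur.execute(
            """
            SELECT COUNT(*)
            FROM revision r
            LEFT JOIN page p ON r.rev_page = p.page_id
            WHERE p.page_id IS NULL
            """
        )
        print("Orphaned revisions:", cur.fetchone()[0])
    conn.close()

A healthy wiki should report zero; anything else suggests the backup or restore broke the page-to-revision relationship.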

Knowledge Preservation Requirements

Many wikis serve as organizational knowledge bases - documentation that can't be recreated. Technical wikis document systems only the contributors understand. Internal wikis contain institutional knowledge accumulated over years. Community wikis preserve collective contributions from hundreds or thousands of people. The cost of data loss isn't just restore time - it's permanently lost human knowledge and collaboration.

Top MySQL Backup Solutions for MediaWiki Databases

We've evaluated the most common backup approaches MediaWiki users rely on, from manual methods to automated services. Here's what works, what doesn't, and where each solution fits best.

Manual mysqldump Backups

How it works: mysqldump exports your MediaWiki database to SQL files. You schedule dumps via cron and transfer them to remote storage. This is the traditional approach for MediaWiki backups.
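A minimal version of this approach looks something like the following Python sketch (credentials, paths, and the --single-transaction flag are assumptions to adapt to your environment; the inline password is for illustration only):

    # Nightly mysqldump of a MediaWiki database, gzip-compressed, intended to
    # run from cron. --single-transaction avoids long InnoDB table locks but
    # still produces a slow, full-size dump every run.
    import gzip
    import subprocess
    from datetime import datetime

    DB = "wikidb"
    outfile = f"/backups/{DB}-{datetime.now():%Y%m%d-%H%M}.sql.gz"

    dump = subprocess.Popen(
        ["mysqldump", "--single-transaction", "--quick", "-u", "wiki", "-psecret", DB],
        stdout=subprocess.PIPE,
    )
    with gzip.open(outfile, "wb") as f:
        for chunk in iter(lambda: dump.stdout.read(1024 * 1024), b""):
            f.write(chunk)
    if dump.wait() != 0:
        raise SystemExit("mysqldump failed")
    print(f"Wrote {outfile}")

Scheduled from cron (for example, a nightly 0 2 * * * entry) and paired with an upload step, this is workable for small wikis, but dump and restore times grow with the database.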

Advantages

  • Free and universally available
  • Works on any hosting environment
  • Documented in MediaWiki guides
  • SQL format is portable
  • Can be combined with XML dumps

Limitations

  • Very slow for large wikis (hours for 50GB+)
  • Database locks slow the wiki during backup
  • No automated verification
  • Restore is extremely slow (hours/days)
  • Storage inefficient (full dumps each time)
  • Difficult to test regularly

Best for: Small wikis (under 2GB), personal wikis with infrequent editing, or situations where backup time isn't critical.

Not suitable for: Active wikis, large knowledge bases (10GB+), or wikis where backup operations can't impact editor experience.

MediaWiki XML Dumps

How it works: MediaWiki's dumpBackup.php script exports pages and revisions to XML format. This is how Wikipedia creates its public dumps. The XML includes page content and revision history but not all database tables.
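A typical invocation, wrapped here in a small Python sketch (the install path and output file are placeholders, and flag names can vary between MediaWiki versions, so check your version's maintenance-script documentation):

    # Export full page history to compressed XML using MediaWiki's dumpBackup.php.
    # Install path and output file are placeholders; --full exports every
    # revision, while --current would export only the latest version of each page.
    import subprocess

    MEDIAWIKI = "/var/www/mediawiki"
    subprocess.run(
        [
            "php",
            f"{MEDIAWIKI}/maintenance/dumpBackup.php",
            "--full",
            "--output=gzip:/backups/wiki-pages.xml.gz",
        ],
        check=True,
    )

Remember that this captures page content only; user accounts, permissions, and other tables still need a separate database backup.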

Advantages

  • Human-readable XML format
  • Can import to different MediaWiki versions
  • Doesn't lock database during export
  • Portable between different databases
  • Good for content migration

Limitations

  • Extremely slow (slower than mysqldump)
  • Doesn't back up all tables (users, permissions, etc.)
  • Restore requires manual import scripts
  • Can't do point-in-time recovery
  • Large storage requirements (verbose XML)
  • Not suitable as primary backup method

Best for: Content migrations, public content sharing, or supplementary backups alongside database backups.

Not suitable for: Primary backup strategy, complete wiki restoration, or situations requiring rapid recovery.

Hosting Provider Snapshot Backups

How it works: Many hosting providers offer automated server/VM snapshots that capture the entire disk state, including databases. Snapshots typically run daily and are stored on provider infrastructure.

Advantages

  • Often included with hosting
  • Captures entire server (database + files)
  • Quick snapshot operation
  • Simple restore process
  • No database-specific configuration

Limitations

  • Usually limited to daily backups
  • 24-hour recovery point objective
  • May not be database-consistent
  • Expensive storage for large wikis
  • Limited retention (7-30 days)
  • Tied to hosting provider

Best for: Baseline protection, wikis where 24-hour data loss is acceptable, or combined with other backup strategies.

Not suitable for: Active wikis with continuous editing, situations requiring sub-daily recovery points, or long-term retention requirements.

Custom Backup Scripts

How it works: Many MediaWiki administrators write custom scripts combining mysqldump, file copying, compression, and rotation logic. Scripts typically run via cron and upload to S3 or similar storage.
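A stripped-down example of what these scripts usually look like, sketched in Python with boto3 (bucket name, credentials, and paths are placeholders; real scripts also need rotation, logging, and alerting):

    # Custom backup script: dump the database, compress it, upload to S3.
    # Placeholders throughout. Note that with shell=True the exit code reflects
    # gzip rather than mysqldump, so a failed dump can pass silently - exactly
    # the kind of gap custom scripts tend to have.
    import subprocess
    from datetime import datetime

    import boto3

    DB, BUCKET = "wikidb", "my-wiki-backups"
    local = f"/tmp/{DB}-{datetime.now():%Y%m%d}.sql.gz"

    subprocess.run(
        f"mysqldump --single-transaction -u wiki -psecret {DB} | gzip > {local}",
        shell=True,
        check=True,
    )
    boto3.client("s3").upload_file(local, BUCKET, f"mediawiki/{local.split('/')[-1]}")
    print("Upload complete")

Everything beyond this core - retention, verification, restore testing, alerting on failure - is where the real engineering effort goes, and where most custom scripts fall short.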

Advantages

  • Customized to your specific needs
  • Can combine database and file backups
  • Full control over retention and storage
  • No ongoing costs beyond storage
  • Can exclude non-critical tables

Limitations

  • Requires significant development time
  • Maintenance burden (updates, fixes)
  • No verification unless coded
  • Error handling often inadequate
  • Monitoring/alerting manual
  • Knowledge transfer issues if admin leaves

Best for: Organizations with dedicated systems administrators, unique requirements not met by existing solutions, or strong preference for internal tools.

Not suitable for: Small teams, wikis without dedicated admin resources, or situations where reliability is more important than customization.

DBCalm: Purpose-Built MySQL Backup for MediaWiki

We built DBCalm to handle the challenges MediaWiki creates for database backups. Instead of slow full dumps that lock your database and take hours, DBCalm uses physical incremental backups every 15 minutes via Mariabackup/XtraBackup to protect your content and complete revision history efficiently.

How DBCalm Protects Your MediaWiki Site

Continuous Incremental Backups Every 15 Minutes

DBCalm creates physical incremental backups every 15 minutes using Mariabackup/XtraBackup. This means you're never more than 15 minutes away from your most recent backup, dramatically reducing potential data loss from vandalism, accidents, or system failures.

Unlike full dumps that copy your entire 50GB+ database repeatedly, these physical incremental backups only capture what changed at the file level. Even with continuous editing, a 20GB MediaWiki database might only generate 300-800MB of changes per day, making backups extremely storage-efficient.
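Under the hood this is the standard Mariabackup full-plus-incremental cycle. A simplified sketch of one cycle (directories and credentials are placeholders; DBCalm's scheduling, encryption, and off-site storage are not shown):

    # One full base backup followed by an incremental that captures only the
    # pages changed since the full. Directories and credentials are placeholders.
    import subprocess

    CREDS = ["--user=backup", "--password=secret"]

    # Full base backup (taken once, then periodically refreshed).
    subprocess.run(
        ["mariabackup", "--backup", "--target-dir=/backups/full", *CREDS],
        check=True,
    )

    # Incremental backup (repeated every 15 minutes against the previous state).
    subprocess.run(
        [
            "mariabackup",
            "--backup",
            "--target-dir=/backups/inc-0001",
            "--incremental-basedir=/backups/full",
            *CREDS,
        ],
        check=True,
    )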

Optimized for Large Revision Tables

MediaWiki's revision tables can reach tens or hundreds of gigabytes. Traditional mysqldump takes hours to dump and hours more to restore these massive tables. DBCalm's physical backups handle large revision tables efficiently:

  • Initial full backup: 20-40 minutes for 20GB wiki (vs 2-4 hours with mysqldump)
  • Incremental backups: 30-90 seconds regardless of database size
  • Restore time: 25-50 minutes for 20GB (vs 3-6 hours with SQL import)

Zero Impact on Wiki Performance

Traditional mysqldump creates read locks that slow wiki queries during backup. For large wikis, this means sluggish page loads and edit conflicts during backup windows. DBCalm's physical backups don't lock tables - editors won't notice backup operations even during peak activity.

Point-in-Time Recovery Every 15 Minutes

With backups running every 15 minutes, you can restore to any 15-minute interval. This is critical for wikis because:

  • If vandalism starts at 2:20 PM, restore to 2:15 PM before damage began
  • If a bot goes rogue at 10:05 AM, restore to 10:00 AM before incorrect mass edits
  • If an extension update corrupts data at 3:40 PM, restore to 3:30 PM before the update
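Restoring to a chosen interval means replaying the incremental chain up to that point and no further. A rough sketch of the Mariabackup prepare-and-restore sequence (directories are placeholders; the database server must be stopped and its data directory emptied before the copy-back step, and exact prepare options vary by version):

    # Apply incrementals onto the full backup up to the chosen point in time,
    # then copy the prepared data back into the empty data directory.
    # Directories are placeholders; stop MariaDB/MySQL before --copy-back.
    import subprocess

    FULL = "/backups/full"
    incrementals_up_to_target = ["/backups/inc-0001", "/backups/inc-0002"]  # e.g. everything up to 2:15 PM

    subprocess.run(["mariabackup", "--prepare", f"--target-dir={FULL}"], check=True)
    for inc in incrementals_up_to_target:
        subprocess.run(
            ["mariabackup", "--prepare", f"--target-dir={FULL}", f"--incremental-dir={inc}"],
            check=True,
        )
    subprocess.run(["mariabackup", "--copy-back", f"--target-dir={FULL}"], check=True)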

Automated Backup Verification

DBCalm automatically verifies every backup by restoring it and running validation queries against the data. For MediaWiki, this includes checking for recent revisions and verifying critical table integrity. You'll know immediately if a backup is corrupted, not when you need it during an emergency.
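The validation step boils down to restoring the backup onto a scratch instance and confirming the data looks like a living wiki. A sketch of the kind of checks involved (pymysql, the scratch-instance connection details, and the specific tables checked are assumptions, not DBCalm's exact implementation):

    # Sanity checks against a restored copy: core tables respond and the most
    # recent revision is present. Credentials and port are placeholders.
    import pymysql

    conn = pymysql.connect(host="127.0.0.1", port=3307, user="verify",
                           password="secret", database="wikidb")
    with conn.cursor() as cur:
        for table in ("page", "revision", "user"):
            cur.execute(f"SELECT COUNT(*) FROM `{table}`")
            print(table, "rows:", cur.fetchone()[0])

        # MediaWiki stores rev_timestamp as a YYYYMMDDHHMMSS string.
        cur.execute("SELECT MAX(rev_timestamp) FROM revision")
        print("Latest revision:", cur.fetchone()[0])
    conn.close()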

Long-Term Retention for Knowledge Preservation

Wikis need longer retention than typical websites because historical data has permanent value. DBCalm supports flexible retention policies:

  • 15-minute backups: Keep for 14 days (recent recovery)
  • Daily backups: Keep for 1 year (historical reference)
  • Monthly backups: Keep for 7+ years (permanent archive)
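In practice a tiered policy like this is just a classification rule applied when old backups are pruned. A toy sketch of that logic, mirroring the example tiers above (the boundaries are illustrative, not DBCalm's defaults):

    # Decide whether a backup taken at backup_time should still be kept,
    # following the example tiers above. Purely illustrative.
    from datetime import datetime, timedelta

    def keep(backup_time: datetime, now: datetime) -> bool:
        age = now - backup_time
        if age <= timedelta(days=14):
            return True                      # keep every 15-minute backup
        if age <= timedelta(days=365):
            # keep one backup per day (the midnight one in this sketch)
            return backup_time.hour == 0 and backup_time.minute == 0
        # beyond a year, keep one backup per month indefinitely
        return backup_time.day == 1 and backup_time.hour == 0 and backup_time.minute == 0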

MediaWiki Recovery Scenarios

Scenario 1: Vandalism Attack
A vandal starts deleting content and making destructive edits at 2:22 PM. You notice at 2:35 PM. With daily backups, you'd lose all day's legitimate edits across thousands of pages. With DBCalm's 15-minute backups, you restore to 2:15 PM and only lose 7 minutes - likely zero legitimate edits during that brief window.

Scenario 2: Database Corruption from Extension
An extension update corrupts the page table at 11:45 AM. Your wiki won't load. With DBCalm, you restore to 11:30 AM, and your wiki is back online in 30 minutes with only 15 minutes of edits lost. With daily backups, you lose all morning's collaborative work.

MediaWiki-Specific Considerations

  • Revision table efficiency: Physical backups handle massive revision tables 5-10x faster than SQL dumps
  • Zero editor impact: No table locks means editors never notice backup operations
  • Complete history preservation: Every revision backed up with perfect fidelity
  • Large wiki optimization: Efficiently backs up wikis with millions of revisions and 100GB+ databases
  • Extension compatibility: Works with all MediaWiki extensions and custom table structures

Complete MediaWiki Backup Comparison

Solution          | Backup Frequency   | Recovery Point | Encryption   | Starting Price   | Best For
mysqldump         | Daily (typically)  | 24 hours       | Manual setup | Free             | Small wikis
XML Dumps         | Weekly (typically) | Varies         | No           | Free             | Content migration
Hosting Snapshots | Daily              | 24 hours       | Optional     | Included         | Baseline protection
Custom Scripts    | Varies             | Varies         | Manual setup | Free (time cost) | Custom needs
DBCalm            | Every 15 minutes   | 15 minutes     | AES-256      | $49/month        | Active wikis

Frequently Asked Questions

Can I backup MediaWiki while users are editing?

Yes, but the method matters. Traditional mysqldump creates locks that can slow your database during backup, causing slow page loads and edit conflicts during backup windows.

Physical backup systems like DBCalm use Mariabackup/XtraBackup and don't lock tables, so editors never notice backup operations even during peak editing periods.

Do MediaWiki backups include full revision history?

Yes. Database backups capture the entire revision table which contains every version of every page ever created or edited. This is the largest table in most wikis but essential for wiki functionality.

DBCalm's incremental backups efficiently handle large revision tables by only storing changes, making even 100GB+ wikis practical to backup every 15 minutes.

How long does it take to restore a MediaWiki database?

DBCalm's physical backups are typically 5-10x faster than importing SQL dump files, critical for minimizing downtime.

  • 5GB wiki: 8-12 minutes with physical backups, 40-90 minutes with SQL dumps
  • 20GB wiki: 30-50 minutes with physical backups, 3-6 hours with SQL dumps
  • 100GB wiki: 2-4 hours with physical backups, 15-24 hours with SQL dumps

Ready to Protect Your MediaWiki Site?

Try DBCalm SaaS

Fully managed MySQL backup solution with 15-minute incremental backups, automated verification, and expert support.

  • 15-minute recovery points
  • Automated backup testing
  • 24/7 monitoring and alerts
  • Expert support team

Starting at $29/month (50% off for first 200 customers)

Get Early Access

Deploy Open Source

Self-host DBCalm on your own infrastructure. Same backup engine, full control, zero monthly fees.

  • Complete source code access
  • Deploy anywhere with MySQL access
  • No vendor lock-in
  • Community support

Free and open source (MIT License)

View on GitHub

Questions? Contact our team to discuss your MediaWiki backup needs.

Additional MediaWiki Backup Resources