Revision as of 02:37, 1 December 2017
MediaWiki Back-up Process
Contributors: Ted Havelka <ted@cs.pdx.edu>
Started: 2016-04-12 Tuesday
Overview and Purpose
Note: this article is in progress!
Ted is writing this article to cover the back-up and recovery steps for a personal or small-scale MediaWiki site. Once the back-up and restore steps are tested and well understood, a further purpose of this article is to automate them as far as possible.
This article describes the situation where the wiki to restore is supported by and running on a Linux-based host, and the target host for the back-up copy of the wiki is also a Linux-based host.
- Install the latest stable MySQL server and client on the computer which will host the wiki back-up,
- Create a database for the wiki, and a MySQL user with a password and privileges on the new database,
- Download and uncompress the latest stable release of MediaWiki,
- Rename the uncompressed MediaWiki top directory to a short name such as 'wiki',
- For remote configuration via browser, move this renamed MediaWiki file set to /var/www/html,
- Complete the seven to ten steps via MediaWiki's web interface found in the mw-config subdirectory,
- Copy media files from the original wiki to the path defined in $wgUploadPath,
- Use a web browser and navigate through several pages of the restored wiki on the newer target host.
When downloading MediaWiki, use GNU Privacy Guard (the gpg utility) to verify the integrity of the downloaded files. See the MediaWiki public keys page for the commands to issue to obtain the public gpg keys.
The question now is how many of these steps can be automated in a shell script? . . . - TMH
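As a first pass at that question, the dump-and-copy steps can be collected into a short shell script. The sketch below is only a starting point, not a tested back-up tool: the database name, user, image path, and destination host are placeholder values, and the script is merely written to a file and syntax-checked here rather than run against a live wiki.

```shell
# Sketch of an automated back-up script; hostnames, paths, and the
# database name below are placeholders, not values from a real wiki.
cat > wiki-backup.sh <<'EOF'
#!/bin/sh
# Hypothetical settings -- adjust for the wiki being backed up.
DB_NAME="wikidb"
DB_USER="wikiuser"
WIKI_IMAGES="/var/www/html/wiki/images"
DEST="backup-host:/srv/wiki-backups"
STAMP=$(date +%Y-%m-%d)

# 1. Dump the wiki database (prompts for the MySQL password)
#    and compress the dump in one pipeline.
mysqldump -h localhost -u "$DB_USER" -p \
    --default-character-set=utf8 "$DB_NAME" \
    | gzip > "wiki-database-backup-$STAMP.sql.gz"

# 2. Copy the compressed dump and the media files to the back-up host.
rsync -avzpt "wiki-database-backup-$STAMP.sql.gz" "$DEST/"
rsync -avzpt "$WIKI_IMAGES"/ "$DEST/images/"
EOF

# Confirm the sketch at least parses cleanly.
bash -n wiki-backup.sh && echo "syntax OK"
```

The interactive steps, such as the mw-config web pages and browsing the restored wiki, resist this kind of automation and would remain manual.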
Creating Wiki Database And Wiki User
ted@back-up-server:/var/www/wiki$ mysql -u root -p
Enter password:
Welcome to the MySQL monitor.  Commands end with ; or \g.
Your MySQL connection id is 7
Server version: 5.7.13-0ubuntu0.16.04.2 (Ubuntu)

Copyright (c) 2000, 2016, Oracle and/or its affiliates. All rights reserved.

Oracle is a registered trademark of Oracle Corporation and/or its
affiliates. Other names may be trademarks of their respective
owners.

Type 'help;' or '\h' for help. Type '\c' to clear the current input statement.

mysql> show databases;
+--------------------+
| Database           |
+--------------------+
| information_schema |
| mysql              |
| performance_schema |
| sys                |
+--------------------+
4 rows in set (0.04 sec)

mysql> CREATE DATABASE wikidb;
Query OK, 1 row affected (0.03 sec)

mysql> grant all privileges on wikidb.* to 'wikiuser'@'localhost' identified by 'wiki_pass_phrase';
Query OK, 0 rows affected, 1 warning (0.03 sec)
Wiki Database Back Up
$ mysqldump -h localhost -u wikiuser -p --default-character-set=utf8 wikidb > /home/ted/local-wiki-database-backup-2016-05-03.sql
This file may be gzip'd or otherwise compressed. In the case of our example wiki, compression reduces the database file size from 148 MB to 36 MB, about a four-fold reduction.
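A minimal sketch of the compress step, using a small stand-in file rather than a real dump, and verifying that the archive decompresses back to the original bytes:

```shell
# Stand-in for a real database dump; the file name is illustrative.
printf 'CREATE TABLE demo (id INT);\n' > wiki-backup-demo.sql
cp wiki-backup-demo.sql wiki-backup-demo.orig

# Compress at maximum level (gzip replaces the .sql file with .sql.gz),
# then confirm the round trip reproduces the original contents.
gzip -9 wiki-backup-demo.sql
gunzip -c wiki-backup-demo.sql.gz | cmp - wiki-backup-demo.orig \
    && echo "round trip OK"
```

A compressed dump can later be fed straight back to the client without an intermediate file, e.g. gunzip -c backup.sql.gz | mysql wikidb.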
Wiki Database Restoration
At least some of the time, the MySQL database dump file will contain, within its first several lines, a comment naming the database backed up by the given file. There may not, however, be any MySQL statement to create a database by that name, or to use a database by that name. One of these MySQL statements will likely need to be added manually to the file before invoking the MySQL client to run the statements contained in the back-up file.
Figure x - first twenty lines of a MySQL database back-up file:

user@localhost:~/archive$ cat local-wiki-database-backup-2016-12-28.sql | head -n 20
-- MySQL dump 10.13  Distrib 5.1.54, for debian-linux-gnu (x86_64)
--
-- Host: localhost    Database: wikidb
-- ------------------------------------------------------
-- Server version	5.1.54-1ubuntu4

/*!40101 SET @OLD_CHARACTER_SET_CLIENT=@@CHARACTER_SET_CLIENT */;
/*!40101 SET @OLD_CHARACTER_SET_RESULTS=@@CHARACTER_SET_RESULTS */;
/*!40101 SET @OLD_COLLATION_CONNECTION=@@COLLATION_CONNECTION */;
/*!40101 SET NAMES utf8 */;
/*!40103 SET @OLD_TIME_ZONE=@@TIME_ZONE */;
/*!40103 SET TIME_ZONE='+00:00' */;
/*!40014 SET @OLD_UNIQUE_CHECKS=@@UNIQUE_CHECKS, UNIQUE_CHECKS=0 */;
/*!40014 SET @OLD_FOREIGN_KEY_CHECKS=@@FOREIGN_KEY_CHECKS, FOREIGN_KEY_CHECKS=0 */;
/*!40101 SET @OLD_SQL_MODE=@@SQL_MODE, SQL_MODE='NO_AUTO_VALUE_ON_ZERO' */;
/*!40111 SET @OLD_SQL_NOTES=@@SQL_NOTES, SQL_NOTES=0 */;

--
-- Table structure for table `archive`
--
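One way to add the missing statement without editing the dump by hand is to prepend it at the shell. The sketch below uses a tiny stand-in dump file; with a real dump, only the file names would change.

```shell
# Stand-in dump file; a real dump would come from mysqldump.
printf -- '-- MySQL dump 10.13\n-- Host: localhost    Database: wikidb\n' \
    > dump-demo.sql

# Prepend CREATE DATABASE / USE so the client restores into wikidb.
{ printf 'CREATE DATABASE IF NOT EXISTS wikidb;\nUSE wikidb;\n'
  cat dump-demo.sql; } > dump-with-header.sql

# The new header now precedes the original dump contents.
head -n 2 dump-with-header.sql
```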
Database dumps can then be redirected at the shell prompt to the MySQL client:
$ mysql [database_name] < database-back-up.sql
http://dev.mysql.com/doc/refman/5.5/en/mysql-batch-commands.html
Restoring Media Files
While the wiki database may be large, on the order of tens or hundreds of megabytes, the media files such as PDFs, images, videos and audio clips are not embedded in the database. These files are organized in a wiki directory defined by the LocalSettings.php variable named $wgUploadPath.
As of 2017-10-04 WED, the third and final command below is sufficient to copy media files from the wiki being backed up to the new target wiki:
Figure x - using rsync to back up and restore media files of a wiki site:
ted@back-up-server:/var/www/html/wiki/images$ date
Tue Jan  3 18:45:40 PST 2017
ted@back-up-server:/var/www/html/wiki/images$ pwd
/var/www/html/local-wiki/images
ted@back-up-server:/var/www/html/wiki/images$ rsync -avzpt ted@original-server:/var/www/wiki/images/* .