## I really want to keep the page history, any options? ##
Yes, but it's clunky and may not scale well for a large number of pages.

Start by creating your new wiki (see the import section) and then clone it locally.
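For example, assuming a GitHub or GitLab style wiki whose backing repository is exposed as `yourrepo.wiki.git` (the exact URL depends on your host and project path), the clone step might look like this:

```
# Hypothetical URL; substitute your own host, group and project name.
git clone https://gitlab.example.com/yourgroup/yourrepo.wiki.git yourrepo.wiki
```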
Step 2:

Extract your content via Special:Export.
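On most MediaWiki installs you can also fetch the export non-interactively; the URL below is only a sketch (hostname, script path and page title are placeholders), and full-history export via `history=1` may be limited or disabled depending on the MediaWiki version and site configuration:

```
# Hypothetical wiki URL and page title; adjust for your own wiki.
# history=1 asks for every revision rather than just the current one.
curl -o Webmaster_FAQ.xml \
  'https://wiki.example.org/index.php?title=Special:Export&pages=Webmaster_FAQ&history=1&action=submit'
```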
Step 3:

Break it up and commit the individual revisions (use whatever tools you like; this example uses awk and other common CLI tools on Linux, but you should also be able to run them via WSL):

```
awk '
BEGIN { file = "Webmaster_FAQ"; commit_rev = 0 }

# Count each revision as it starts.
/<revision>/ { ++commit_rev }

# Accumulate the dump seen so far into a working copy, kept separate
# from the input file so the dump is never overwritten while being read.
{ print > (file "_work.xml") }

# At the end of every revision, convert the accumulated copy to Markdown and commit it.
/<\/revision>/ {
    cmd1 = "pandoc -s -f mediawiki -t gfm " file "_work.xml -o yourrepo.wiki/" file ".md"
    system(cmd1)
    cmd2 = "cd yourrepo.wiki; git add " file ".md; "
    cmd2 = cmd2 "git commit -s -m \"Adding " file " revision " commit_rev "\"; cd .."
    system(cmd2)
}' Webmaster_FAQ.xml
```
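Each `</revision>` in the dump triggers one pandoc conversion and one commit, so you end up with one Git commit per wiki revision. You'll need to repeat the export and the awk pass for each page whose history you want to keep, which is why this approach doesn't scale well to a large number of pages.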