## I really want to keep the page history, any options? ##
Yes, but it's clunky and may not scale well for a large number of pages (hat tip to @ewillink for figuring this out).

Start by creating your new wiki (see the import section) and then clone it locally.

Now when you export your content from Wiki.eclipse.org, uncheck 'Include only the current revision, not the full history' on the Special:Export page (see the export section).

This will generate an XML file containing all of the page's revisions, each wrapped in `<revision></revision>` tags. You can then script extracting the individual revisions and committing them one by one, which will re-create the history.
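
For reference, a full-history export looks roughly like this (an abridged sketch: the schema version, element order, and all of the ids, timestamps, and usernames below are placeholders and vary by MediaWiki release and page):

```
<mediawiki xmlns="http://www.mediawiki.org/xml/export-0.10/">
  <page>
    <title>Webmaster FAQ</title>
    <revision>
      <id>1001</id>
      <timestamp>2015-03-01T12:00:00Z</timestamp>
      <contributor><username>someuser</username></contributor>
      <comment>initial draft</comment>
      <text xml:space="preserve">...page content at this revision...</text>
    </revision>
    <revision>
      <id>1002</id>
      <timestamp>2016-06-15T09:30:00Z</timestamp>
      <contributor><username>someuser</username></contributor>
      <text xml:space="preserve">...page content at the next revision...</text>
    </revision>
  </page>
</mediawiki>
```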
For example:

Step 1:

Clone your wiki

```
git clone git@gitlab.eclipse.org:path/to/yourrepo.wiki.git
```

Step 2:

Extract your content via Special:Export (the Webmaster_FAQ in this case)

Step 3:

Break it up and commit the individual revisions (use whatever tools you like; this example uses awk and other common CLI tools on Linux, but you should also be able to run them via WSL):
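
As a rough sketch of what that could look like (assumptions: the export was saved as Webmaster_FAQ.xml, the page should live in the wiki repo as Webmaster-FAQ.md, and revisions appear oldest-first in the export, which is MediaWiki's usual order; the script actually used for this migration may differ):

```
#!/usr/bin/env bash
set -euo pipefail

# Split the export into rev-0001.xml, rev-0002.xml, ...
# one chunk per <revision> block.
awk '
  /<revision>/   { in_rev = 1; n++; out = sprintf("rev-%04d.xml", n) }
  in_rev         { print > out }
  /<\/revision>/ { in_rev = 0; close(out) }
' Webmaster_FAQ.xml

# Commit each revision in order. The page text sits inside <text>...</text>
# and is XML-escaped, so undo the common entities before writing the file.
# (Naive: assumes the <text> element spans more than one line.)
for rev in rev-*.xml; do
  sed -n '/<text/,/<\/text>/p' "$rev" \
    | sed -e 's/.*<text[^>]*>//' -e 's|</text>.*||' \
    | sed -e 's/&lt;/</g' -e 's/&gt;/>/g' -e 's/&quot;/"/g' -e 's/&amp;/\&/g' \
    > Webmaster-FAQ.md
  git add Webmaster-FAQ.md
  git commit -m "Import revision from $rev"
done
```

If you want the recreated history to carry the original metadata, you could also pull the `<timestamp>` and `<contributor>` values out of each chunk and pass them to `git commit` via `--date` and `--author`.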