* [PATCH 0/5] git-remote-mediawiki: support File: import and export
@ 2012-06-26 16:04 Matthieu Moy
2012-06-26 16:04 ` [PATCH 1/5] git-remote-mediawiki: don't compute the diff when getting commit message Matthieu Moy
` (5 more replies)
0 siblings, 6 replies; 18+ messages in thread
From: Matthieu Moy @ 2012-06-26 16:04 UTC (permalink / raw)
To: git, gitster; +Cc: Pavel.Volek, Kim-Thuat.Nguyen, roucherj, Matthieu Moy
This is a rework of two patch series already discussed here:
http://thread.gmane.org/gmane.comp.version-control.git/199877
http://thread.gmane.org/gmane.comp.version-control.git/200002
I've split the commits into smaller patches; hopefully they are easier
to read, and they address the remarks already made.
This is built on top of mm/credential-plumbing which also touches
git-remote-mediawiki, but should apply to master too.
Matthieu Moy (3):
git-remote-mediawiki: don't compute the diff when getting commit
message
git-remote-mediawiki: don't "use encoding 'utf8';"
git-remote-mediawiki: split get_mw_pages into smaller functions
NGUYEN Kim Thuat (1):
git-remote-mediawiki: send "File:" attachments to a remote wiki
Pavel Volek (1):
git-remote-mediawiki: import "File:" attachments
contrib/mw-to-git/git-remote-mediawiki | 453 ++++++++++++++++++++++++++++-----
1 file changed, 390 insertions(+), 63 deletions(-)
--
1.7.11.5.g0c7e058.dirty
* [PATCH 1/5] git-remote-mediawiki: don't compute the diff when getting commit message
2012-06-26 16:04 [PATCH 0/5] git-remote-mediawiki: support File: import and export Matthieu Moy
@ 2012-06-26 16:04 ` Matthieu Moy
2012-06-26 17:47 ` Junio C Hamano
2012-06-26 16:04 ` [PATCH 2/5] git-remote-mediawiki: don't "use encoding 'utf8';" Matthieu Moy
` (4 subsequent siblings)
5 siblings, 1 reply; 18+ messages in thread
From: Matthieu Moy @ 2012-06-26 16:04 UTC (permalink / raw)
To: git, gitster; +Cc: Pavel.Volek, Kim-Thuat.Nguyen, roucherj, Matthieu Moy
---
contrib/mw-to-git/git-remote-mediawiki | 2 +-
1 file changed, 1 insertion(+), 1 deletion(-)
diff --git a/contrib/mw-to-git/git-remote-mediawiki b/contrib/mw-to-git/git-remote-mediawiki
index a34f53f..a8e6287 100755
--- a/contrib/mw-to-git/git-remote-mediawiki
+++ b/contrib/mw-to-git/git-remote-mediawiki
@@ -874,8 +874,7 @@ sub mw_push_revision {
# TODO: for now, it's just a delete+add
my @diff_info_list = split(/\0/, $diff_infos);
# Keep the first line of the commit message as mediawiki comment for the revision
- my $commit_msg = (split(/\n/, run_git("show --pretty=format:\"%s\" $sha1_commit")))[0];
+ my $commit_msg = (split(/\n/, run_git("log -1 --format=\"%s\" $sha1_commit")))[0];
chomp($commit_msg);
# Push every blob
while (@diff_info_list) {
--
1.7.11.5.g0c7e058.dirty
* [PATCH 2/5] git-remote-mediawiki: don't "use encoding 'utf8';"
2012-06-26 16:04 [PATCH 0/5] git-remote-mediawiki: support File: import and export Matthieu Moy
2012-06-26 16:04 ` [PATCH 1/5] git-remote-mediawiki: don't compute the diff when getting commit message Matthieu Moy
@ 2012-06-26 16:04 ` Matthieu Moy
2012-06-26 16:04 ` [PATCH 3/5] git-remote-mediawiki: send "File:" attachments to a remote wiki Matthieu Moy
` (3 subsequent siblings)
5 siblings, 0 replies; 18+ messages in thread
From: Matthieu Moy @ 2012-06-26 16:04 UTC (permalink / raw)
To: git, gitster; +Cc: Pavel.Volek, Kim-Thuat.Nguyen, roucherj, Matthieu Moy
The use of this statement is generally discouraged, and is too intrusive
for us: it forces the HTTP requests made by the API to contain only valid
UTF-8 characters. This would break the upload of binary files.
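As a minimal illustration of the byte-safe alternative (a sketch, not
part of the patch; "logo.png" is just a placeholder): reading with the
":raw" layer keeps the payload as plain bytes, so nothing tries to
(re-)encode it as UTF-8 on the way to the HTTP layer.

    use strict;
    use warnings;

    # Read a file as raw bytes; no I/O layer attempts UTF-8 decoding.
    open(my $fh, "<:raw", "logo.png") or die "open: $!";
    my $bytes = do { local $/; <$fh> };
    close($fh);

    # length() counts bytes here, which is what an HTTP body needs;
    # under "use encoding 'utf8';" the same data could be silently
    # upgraded and re-encoded on output, corrupting the upload.
    print STDERR "payload is ", length($bytes), " bytes\n";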
---
contrib/mw-to-git/git-remote-mediawiki | 5 ++---
1 file changed, 2 insertions(+), 3 deletions(-)
diff --git a/contrib/mw-to-git/git-remote-mediawiki b/contrib/mw-to-git/git-remote-mediawiki
index a8e6287..ed06ff7 100755
--- a/contrib/mw-to-git/git-remote-mediawiki
+++ b/contrib/mw-to-git/git-remote-mediawiki
@@ -36,11 +36,10 @@
use strict;
use MediaWiki::API;
use DateTime::Format::ISO8601;
-use encoding 'utf8';
-# use encoding 'utf8' doesn't change STDERROR
-# but we're going to output UTF-8 filenames to STDERR
+# By default, use UTF-8 to communicate with Git and the user
binmode STDERR, ":utf8";
+binmode STDOUT, ":utf8";
use URI::Escape;
use IPC::Open2;
--
1.7.11.5.g0c7e058.dirty
* [PATCH 3/5] git-remote-mediawiki: send "File:" attachments to a remote wiki
2012-06-26 16:04 [PATCH 0/5] git-remote-mediawiki: support File: import and export Matthieu Moy
2012-06-26 16:04 ` [PATCH 1/5] git-remote-mediawiki: don't compute the diff when getting commit message Matthieu Moy
2012-06-26 16:04 ` [PATCH 2/5] git-remote-mediawiki: don't "use encoding 'utf8';" Matthieu Moy
@ 2012-06-26 16:04 ` Matthieu Moy
2012-06-26 16:04 ` [PATCH 4/5] git-remote-mediawiki: split get_mw_pages into smaller functions Matthieu Moy
` (2 subsequent siblings)
5 siblings, 0 replies; 18+ messages in thread
From: Matthieu Moy @ 2012-06-26 16:04 UTC (permalink / raw)
To: git, gitster; +Cc: Pavel.Volek, Kim-Thuat.Nguyen, roucherj, Matthieu Moy
From: NGUYEN Kim Thuat <Kim-Thuat.Nguyen@ensimag.imag.fr>
The current version of git-remote-mediawiki supports only the import
and export of plain wiki pages. This patch adds the ability to export
file attachments (i.e. the content of the File: MediaWiki namespace),
which are also exposed through the MediaWiki API.
This requires a recent version of MediaWiki::API (version 0.37 works;
version 0.34 doesn't).
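As a sketch (not part of the patch, and assuming MediaWiki::API
exposes the conventional $VERSION scalar), one could fail early on
older installations:

    use MediaWiki::API;

    # MediaWiki::API versions are plain decimals (0.34, 0.37, ...),
    # so a numeric comparison is sufficient here.
    if ($MediaWiki::API::VERSION < 0.37) {
        die "MediaWiki::API >= 0.37 required, found $MediaWiki::API::VERSION\n";
    }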
Signed-off-by: Pavel Volek <Pavel.Volek@ensimag.imag.fr>
Signed-off-by: NGUYEN Kim Thuat <Kim-Thuat.Nguyen@ensimag.imag.fr>
Signed-off-by: ROUCHER IGLESIAS Javier <roucherj@ensimag.imag.fr>
Signed-off-by: Matthieu Moy <Matthieu.Moy@imag.fr>
---
contrib/mw-to-git/git-remote-mediawiki | 98 +++++++++++++++++++++++++++++++---
1 file changed, 90 insertions(+), 8 deletions(-)
diff --git a/contrib/mw-to-git/git-remote-mediawiki b/contrib/mw-to-git/git-remote-mediawiki
index ed06ff7..253b449 100755
--- a/contrib/mw-to-git/git-remote-mediawiki
+++ b/contrib/mw-to-git/git-remote-mediawiki
@@ -348,8 +348,12 @@ sub get_mw_pages {
return values(%pages);
}
+# usage: $out = run_git("command args");
+# $out = run_git("command args", "raw"); # don't interpret output as UTF-8.
sub run_git {
- open(my $git, "-|:encoding(UTF-8)", "git " . $_[0]);
+ my $args = shift;
+ my $encoding = (shift || "encoding(UTF-8)");
+ open(my $git, "-|:$encoding", "git " . $args);
my $res = do { local $/; <$git> };
close($git);
@@ -705,6 +709,63 @@ sub error_non_fast_forward {
return 0;
}
+sub mw_upload_file {
+ my $complete_file_name = shift;
+ my $new_sha1 = shift;
+ my $extension = shift;
+ my $file_deleted = shift;
+ my $summary = shift;
+ my $newrevid;
+ my $path = "File:" . $complete_file_name;
+ my %hashFiles = get_allowed_file_extensions();
+ if (!exists($hashFiles{$extension})) {
+ print STDERR "$complete_file_name is not a permitted file on this wiki.\n";
+ print STDERR "Check the configuration of file uploads in your mediawiki.\n";
+ return $newrevid;
+ }
+ # Deleting and uploading a file requires a privileged user
+ if ($file_deleted) {
+ mw_connect_maybe();
+ my $query = {
+ action => 'delete',
+ title => $path,
+ reason => $summary
+ };
+ if (!$mediawiki->edit($query)) {
+ print STDERR "Failed to delete file on remote wiki\n";
+ print STDERR "Check your permissions on the remote site. Error code:\n";
+ print STDERR $mediawiki->{error}->{code} . ':' . $mediawiki->{error}->{details};
+ exit 1;
+ }
+ } else {
+ # Don't let perl try to interpret file content as UTF-8 => use "raw"
+ my $content = run_git("cat-file blob $new_sha1", "raw");
+ if ($content ne "") {
+ mw_connect_maybe();
+ $mediawiki->{config}->{upload_url} =
+ "$url/index.php/Special:Upload";
+ $mediawiki->edit({
+ action => 'upload',
+ filename => $complete_file_name,
+ comment => $summary,
+ file => [undef,
+ $complete_file_name,
+ Content => $content],
+ ignorewarnings => 1,
+ }, {
+ skip_encoding => 1
+ } ) || die $mediawiki->{error}->{code} . ':'
+ . $mediawiki->{error}->{details};
+ my $last_file_page = $mediawiki->get_page({title => $path});
+ $newrevid = $last_file_page->{revid};
+ print STDERR "Pushed file: $new_sha1 - $complete_file_name.\n";
+ } else {
+ print STDERR "Empty file $complete_file_name not pushed.\n";
+ }
+ }
+ return $newrevid;
+}
+
sub mw_push_file {
my $diff_info = shift;
# $diff_info contains a string in this format:
@@ -717,7 +778,8 @@ sub mw_push_file {
my $summary = shift;
# MediaWiki revision number. Keep the previous one by default,
# in case there's no edit to perform.
- my $newrevid = shift;
+ my $oldrevid = shift;
+ my $newrevid;
my $new_sha1 = $diff_info_split[3];
my $old_sha1 = $diff_info_split[2];
@@ -725,9 +787,11 @@ sub mw_push_file {
my $page_deleted = ($new_sha1 eq NULL_SHA1);
$complete_file_name = mediawiki_clean_filename($complete_file_name);
- if (substr($complete_file_name,-3) eq ".mw") {
- my $title = substr($complete_file_name,0,-3);
-
+ my ($title, $extension) = $complete_file_name =~ /^(.*)\.([^\.]*)$/;
+ if (!defined($extension)) {
+ $extension = "";
+ }
+ if ($extension eq "mw") {
my $file_content;
if ($page_deleted) {
# Deleting a page usually requires
@@ -745,7 +809,7 @@ sub mw_push_file {
action => 'edit',
summary => $summary,
title => $title,
- basetimestamp => $basetimestamps{$newrevid},
+ basetimestamp => $basetimestamps{$oldrevid},
text => mediawiki_clean($file_content, $page_created),
}, {
skip_encoding => 1 # Helps with names with accentuated characters
@@ -757,7 +821,7 @@ sub mw_push_file {
$mediawiki->{error}->{code} .
' from mediwiki: ' . $mediawiki->{error}->{details} .
".\n";
- return ($newrevid, "non-fast-forward");
+ return ($oldrevid, "non-fast-forward");
} else {
# Other errors. Shouldn't happen => just die()
die 'Fatal: Error ' .
@@ -768,8 +832,11 @@ sub mw_push_file {
$newrevid = $result->{edit}->{newrevid};
print STDERR "Pushed file: $new_sha1 - $title\n";
} else {
- print STDERR "$complete_file_name not a mediawiki file (Not pushable on this version of git-remote-mediawiki).\n"
+ $newrevid = mw_upload_file($complete_file_name, $new_sha1,
+ $extension, $page_deleted,
+ $summary);
}
+ $newrevid = ($newrevid or $oldrevid);
return ($newrevid, "ok");
}
@@ -906,3 +973,18 @@ sub mw_push_revision {
print STDOUT "ok $remote\n";
return 1;
}
+
+sub get_allowed_file_extensions {
+ mw_connect_maybe();
+
+ my $query = {
+ action => 'query',
+ meta => 'siteinfo',
+ siprop => 'fileextensions'
+ };
+ my $result = $mediawiki->api($query);
+ my @file_extensions= map $_->{ext},@{$result->{query}->{fileextensions}};
+ my %hashFile = map {$_ => 1}@file_extensions;
+
+ return %hashFile;
+}
--
1.7.11.5.g0c7e058.dirty
* [PATCH 4/5] git-remote-mediawiki: split get_mw_pages into smaller functions
2012-06-26 16:04 [PATCH 0/5] git-remote-mediawiki: support File: import and export Matthieu Moy
` (2 preceding siblings ...)
2012-06-26 16:04 ` [PATCH 3/5] git-remote-mediawiki: send "File:" attachments to a remote wiki Matthieu Moy
@ 2012-06-26 16:04 ` Matthieu Moy
2012-06-26 16:04 ` [PATCH 5/5] git-remote-mediawiki: import "File:" attachments Matthieu Moy
2012-06-27 9:10 ` [PATCH 0/5 v2] git-remote-mediawiki: support File: import and export Matthieu Moy
5 siblings, 0 replies; 18+ messages in thread
From: Matthieu Moy @ 2012-06-26 16:04 UTC (permalink / raw)
To: git, gitster; +Cc: Pavel.Volek, Kim-Thuat.Nguyen, roucherj, Matthieu Moy
---
contrib/mw-to-git/git-remote-mediawiki | 106 +++++++++++++++++++--------------
1 file changed, 62 insertions(+), 44 deletions(-)
diff --git a/contrib/mw-to-git/git-remote-mediawiki b/contrib/mw-to-git/git-remote-mediawiki
index 253b449..e175c69 100755
--- a/contrib/mw-to-git/git-remote-mediawiki
+++ b/contrib/mw-to-git/git-remote-mediawiki
@@ -258,6 +258,64 @@ sub mw_connect_maybe {
}
}
+## Functions for listing pages on the remote wiki
+sub get_mw_tracked_pages {
+ my $pages = shift;
+ my @some_pages = @tracked_pages;
+ while (@some_pages) {
+ my $last = 50;
+ if ($#some_pages < $last) {
+ $last = $#some_pages;
+ }
+ my @slice = @some_pages[0..$last];
+ get_mw_first_pages(\@slice, $pages);
+ @some_pages = @some_pages[51..$#some_pages];
+ }
+}
+
+sub get_mw_tracked_categories {
+ my $pages = shift;
+ foreach my $category (@tracked_categories) {
+ if (index($category, ':') < 0) {
+ # Mediawiki requires the Category
+ # prefix, but let's not force the user
+ # to specify it.
+ $category = "Category:" . $category;
+ }
+ my $mw_pages = $mediawiki->list( {
+ action => 'query',
+ list => 'categorymembers',
+ cmtitle => $category,
+ cmlimit => 'max' } )
+ || die $mediawiki->{error}->{code} . ': '
+ . $mediawiki->{error}->{details};
+ foreach my $page (@{$mw_pages}) {
+ $pages->{$page->{title}} = $page;
+ }
+ }
+}
+
+sub get_mw_all_pages {
+ my $pages = shift;
+ # No user-provided list, get the list of pages from the API.
+ my $mw_pages = $mediawiki->list({
+ action => 'query',
+ list => 'allpages',
+ aplimit => 'max'
+ });
+ if (!defined($mw_pages)) {
+ print STDERR "fatal: could not get the list of wiki pages.\n";
+ print STDERR "fatal: '$url' does not appear to be a mediawiki\n";
+ print STDERR "fatal: make sure '$url/api.php' is a valid page.\n";
+ exit 1;
+ }
+ foreach my $page (@{$mw_pages}) {
+ $pages->{$page->{title}} = $page;
+ }
+}
+
+# Queries the wiki for a set of pages. Meant to be used within a loop
+# querying the wiki for slices of the page list.
sub get_mw_first_pages {
my $some_pages = shift;
my @some_pages = @{$some_pages};
@@ -286,6 +344,7 @@ sub get_mw_first_pages {
}
}
+# Get the list of pages to be fetched according to configuration.
sub get_mw_pages {
mw_connect_maybe();
@@ -295,55 +354,14 @@ sub get_mw_pages {
$user_defined = 1;
# The user provided a list of pages titles, but we
# still need to query the API to get the page IDs.
-
- my @some_pages = @tracked_pages;
- while (@some_pages) {
- my $last = 50;
- if ($#some_pages < $last) {
- $last = $#some_pages;
- }
- my @slice = @some_pages[0..$last];
- get_mw_first_pages(\@slice, \%pages);
- @some_pages = @some_pages[51..$#some_pages];
- }
+ get_mw_tracked_pages(\%pages);
}
if (@tracked_categories) {
$user_defined = 1;
- foreach my $category (@tracked_categories) {
- if (index($category, ':') < 0) {
- # Mediawiki requires the Category
- # prefix, but let's not force the user
- # to specify it.
- $category = "Category:" . $category;
- }
- my $mw_pages = $mediawiki->list( {
- action => 'query',
- list => 'categorymembers',
- cmtitle => $category,
- cmlimit => 'max' } )
- || die $mediawiki->{error}->{code} . ': ' . $mediawiki->{error}->{details};
- foreach my $page (@{$mw_pages}) {
- $pages{$page->{title}} = $page;
- }
- }
+ get_mw_tracked_categories(\%pages);
}
if (!$user_defined) {
- # No user-provided list, get the list of pages from
- # the API.
- my $mw_pages = $mediawiki->list({
- action => 'query',
- list => 'allpages',
- aplimit => 500,
- });
- if (!defined($mw_pages)) {
- print STDERR "fatal: could not get the list of wiki pages.\n";
- print STDERR "fatal: '$url' does not appear to be a mediawiki\n";
- print STDERR "fatal: make sure '$url/api.php' is a valid page.\n";
- exit 1;
- }
- foreach my $page (@{$mw_pages}) {
- $pages{$page->{title}} = $page;
- }
+ get_mw_all_pages(\%pages);
}
return values(%pages);
}
--
1.7.11.5.g0c7e058.dirty
* [PATCH 5/5] git-remote-mediawiki: import "File:" attachments
2012-06-26 16:04 [PATCH 0/5] git-remote-mediawiki: support File: import and export Matthieu Moy
` (3 preceding siblings ...)
2012-06-26 16:04 ` [PATCH 4/5] git-remote-mediawiki: split get_mw_pages into smaller functions Matthieu Moy
@ 2012-06-26 16:04 ` Matthieu Moy
2012-06-27 9:10 ` [PATCH 0/5 v2] git-remote-mediawiki: support File: import and export Matthieu Moy
5 siblings, 0 replies; 18+ messages in thread
From: Matthieu Moy @ 2012-06-26 16:04 UTC (permalink / raw)
To: git, gitster; +Cc: Pavel.Volek, Kim-Thuat.Nguyen, roucherj, Matthieu Moy
From: Pavel Volek <Pavel.Volek@ensimag.imag.fr>
Add the symmetrical feature to the "File:" export support in the previous
patch. Download files from the wiki as needed, and feed them into the
fast-import stream. Import both the file itself, and the corresponding
description page.
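For one imported revision of a file page, the fast-import stream looks
roughly like this (a sketch reconstructed from the code below; the
ref, names, and byte counts are made-up examples):

    commit refs/mediawiki/origin/master
    committer Anonymous <anonymous@example.com> 1340700000 +0000
    data 25
    *Empty MediaWiki Message*
    M 644 inline File:Logo.png.mw
    data 31
    Description page for the logo.
    M 644 inline Logo.png
    data 8192
    <raw file bytes>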
Signed-off-by: Pavel Volek <Pavel.Volek@ensimag.imag.fr>
Signed-off-by: NGUYEN Kim Thuat <Kim-Thuat.Nguyen@ensimag.imag.fr>
Signed-off-by: ROUCHER IGLESIAS Javier <roucherj@ensimag.imag.fr>
Signed-off-by: Matthieu Moy <Matthieu.Moy@imag.fr>
---
contrib/mw-to-git/git-remote-mediawiki | 244 +++++++++++++++++++++++++++++++--
1 file changed, 236 insertions(+), 8 deletions(-)
diff --git a/contrib/mw-to-git/git-remote-mediawiki b/contrib/mw-to-git/git-remote-mediawiki
index e175c69..93c9135 100755
--- a/contrib/mw-to-git/git-remote-mediawiki
+++ b/contrib/mw-to-git/git-remote-mediawiki
@@ -13,9 +13,6 @@
#
# Known limitations:
#
-# - Only wiki pages are managed, no support for [[File:...]]
-# attachments.
-#
# - Poor performance in the best case: it takes forever to check
# whether we're up-to-date (on fetch or push) or to fetch a few
# revisions from a large wiki, because we use exclusively a
@@ -36,6 +33,7 @@
use strict;
use MediaWiki::API;
use DateTime::Format::ISO8601;
+use FileHandle;
# By default, use UTF-8 to communicate with Git and the user
binmode STDERR, ":utf8";
@@ -72,6 +70,9 @@ chomp(@tracked_pages);
my @tracked_categories = split(/[ \n]/, run_git("config --get-all remote.". $remotename .".categories"));
chomp(@tracked_categories);
+# Import media files too.
+my $import_media = run_git("config --get --bool remote.". $remotename .".mediaimport");
+
my $wiki_login = run_git("config --get remote.". $remotename .".mwLogin");
# TODO: ideally, this should be able to read from keyboard, but we're
# inside a remote helper, so our stdin is connect to git, not to a
@@ -87,6 +88,9 @@ my $shallow_import = run_git("config --get --bool remote.". $remotename .".shall
chomp($shallow_import);
$shallow_import = ($shallow_import eq "true");
+# Cache for MediaWiki namespace ids.
+my %namespace_id;
+
# Dumb push: don't update notes and mediawiki ref to reflect the last push.
#
# Configurable with mediawiki.dumbPush, or per-remote with
@@ -363,6 +367,13 @@ sub get_mw_pages {
if (!$user_defined) {
get_mw_all_pages(\%pages);
}
+ if ($import_media) {
+ if ($user_defined) {
+ get_linked_mediafiles(\%pages);
+ } else {
+ get_all_mediafiles(\%pages);
+ }
+ }
return values(%pages);
}
@@ -379,6 +390,152 @@ sub run_git {
}
+sub get_all_mediafiles {
+ my $pages = shift;
+ # Get the list of all media file pages from the API; they live
+ # in a different namespace, and only one namespace can be
+ # queried at a time
+ my $mw_pages = $mediawiki->list({
+ action => 'query',
+ list => 'allpages',
+ apnamespace => get_mw_namespace_id("File"),
+ aplimit => 'max'
+ });
+ if (!defined($mw_pages)) {
+ print STDERR "fatal: could not get the list of pages for media files.\n";
+ print STDERR "fatal: '$url' does not appear to be a mediawiki\n";
+ print STDERR "fatal: make sure '$url/api.php' is a valid page.\n";
+ exit 1;
+ }
+ foreach my $page (@{$mw_pages}) {
+ $pages->{$page->{title}} = $page;
+ }
+}
+
+sub get_linked_mediafiles {
+ my $pages = shift;
+ my @titles = map $_->{title}, values(%{$pages});
+
+ # The query is split into small batches because of the MW API
+ # limit on the number of links returned (500 links max).
+ my $batch = 10;
+ while (@titles) {
+ if ($#titles < $batch) {
+ $batch = $#titles;
+ }
+ my @slice = @titles[0..$batch];
+
+ # pattern 'page1|page2|...' required by the API
+ my $mw_titles = join('|', @slice);
+
+ # Media files can be included in or linked from a page;
+ # get all of them.
+ my $query = {
+ action => 'query',
+ prop => 'links|images',
+ titles => $mw_titles,
+ plnamespace => get_mw_namespace_id("File"),
+ pllimit => 'max'
+ };
+ my $result = $mediawiki->api($query);
+
+ while (my ($id, $page) = each(%{$result->{query}->{pages}})) {
+ my @titles;
+ if (defined($page->{links})) {
+ my @link_titles = map $_->{title}, @{$page->{links}};
+ push(@titles, @link_titles);
+ }
+ if (defined($page->{images})) {
+ my @image_titles = map $_->{title}, @{$page->{images}};
+ push(@titles, @image_titles);
+ }
+ if (@titles) {
+ get_mw_first_pages(\@titles, \%{$pages});
+ }
+ }
+
+ @titles = @titles[($batch+1)..$#titles];
+ }
+}
+
+sub get_mw_mediafile_for_page_revision {
+ # Name of the file on Wiki, with the prefix.
+ my $mw_filename = shift;
+ my $timestamp = shift;
+ my %mediafile;
+
+ # Check whether a media file with the given timestamp exists
+ # on MediaWiki. If so, download it.
+ my $query = {
+ action => 'query',
+ prop => 'imageinfo',
+ titles => $mw_filename,
+ iistart => $timestamp,
+ iiend => $timestamp,
+ iiprop => 'timestamp|archivename|url',
+ iilimit => 1
+ };
+ my $result = $mediawiki->api($query);
+
+ my ($fileid, $file) = each ( %{$result->{query}->{pages}} );
+ # If not defined, there is no revision of the file for the
+ # given timestamp.
+ if (defined($file->{imageinfo})) {
+ # Get real name of media file.
+ my $filename;
+ if (index($mw_filename, 'File:') == 0) {
+ $filename = substr $mw_filename, 5;
+ } else {
+ $filename = substr $mw_filename, 6;
+ }
+ $mediafile{title} = $filename;
+
+ my $fileinfo = pop(@{$file->{imageinfo}});
+ $mediafile{timestamp} = $fileinfo->{timestamp};
+ # If this is an old version of the file, the file has to be
+ # obtained from the archive. Otherwise it can be downloaded
+ # by the MediaWiki API download() function.
+ if (defined($fileinfo->{archivename})) {
+ $mediafile{content} = download_mw_mediafile_from_archive($fileinfo->{url});
+ } else {
+ $mediafile{content} = download_mw_mediafile($mw_filename);
+ }
+ }
+ return %mediafile;
+}
+
+sub download_mw_mediafile_from_archive {
+ my $url = shift;
+ my $file;
+
+ my $ua = LWP::UserAgent->new;
+ my $response = $ua->get($url);
+ if ($response->is_success) {
+ $file = $response->decoded_content;
+ } else {
+ print STDERR "Error downloading a file from archive.\n";
+ }
+
+ return $file;
+}
+
+sub download_mw_mediafile {
+ my $filename = shift;
+
+ $mediawiki->{config}->{files_url} = $url;
+
+ my $file_content = $mediawiki->download( { title => $filename } );
+ if (!defined($file_content)) {
+ print STDERR "\tFile \'$filename\' could not be downloaded.\n";
+ exit 1;
+ } elsif ($file_content eq "") {
+ print STDERR "\tFile \'$filename\' does not exist on the wiki.\n";
+ exit 1;
+ } else {
+ return $file_content;
+ }
+}
+
sub get_last_local_revision {
# Get note regarding last mediawiki revision
my $note = run_git("notes --ref=$remotename/mediawiki show refs/mediawiki/$remotename/master 2>/dev/null");
@@ -569,6 +726,11 @@ sub import_file_revision {
my %commit = %{$commit};
my $full_import = shift;
my $n = shift;
+ my $mediafile = shift;
+ my %mediafile;
+ if ($mediafile) {
+ %mediafile = %{$mediafile};
+ }
my $title = $commit{title};
my $comment = $commit{comment};
@@ -588,6 +750,10 @@ sub import_file_revision {
if ($content ne DELETED_CONTENT) {
print STDOUT "M 644 inline $title.mw\n";
literal_data($content);
+ if (%mediafile) {
+ print STDOUT "M 644 inline $mediafile{title}\n";
+ literal_data($mediafile{content});
+ }
print STDOUT "\n\n";
} else {
print STDOUT "D $title.mw\n";
@@ -683,12 +849,11 @@ sub mw_import_ref {
$n++;
+ my $page_title = $result->{query}->{pages}->{$pagerevid->{pageid}}->{title};
my %commit;
$commit{author} = $rev->{user} || 'Anonymous';
$commit{comment} = $rev->{comment} || '*Empty MediaWiki Message*';
- $commit{title} = mediawiki_smudge_filename(
- $result->{query}->{pages}->{$pagerevid->{pageid}}->{title}
- );
+ $commit{title} = mediawiki_smudge_filename($page_title);
$commit{mw_revision} = $pagerevid->{revid};
$commit{content} = mediawiki_smudge($rev->{'*'});
@@ -699,9 +864,25 @@ sub mw_import_ref {
}
$commit{date} = DateTime::Format::ISO8601->parse_datetime($last_timestamp);
- print STDERR "$n/", scalar(@revisions), ": Revision #$pagerevid->{revid} of $commit{title}\n";
+ # Distinguish between ordinary wiki pages and media files.
+ my @prefix = split(":", $page_title);
- import_file_revision(\%commit, ($fetch_from == 1), $n);
+ my %mediafile;
+ if ($prefix[0] eq "File" || $prefix[0] eq "Image") {
+ # The name of the file is the same as the media page.
+ my $filename = $page_title;
+ %mediafile = get_mw_mediafile_for_page_revision($filename, $rev->{timestamp});
+ }
+ # If this revision of the media page corresponds to a new version
+ # of the file, make one common commit for both the file and its
+ # page. Otherwise, commit only the page.
+ print STDERR "$n/", scalar(@revisions), ": Revision #$pagerevid->{revid} of $commit{title}\n";
+ if (%mediafile) {
+ print STDERR "\tDownloading file $mediafile{title}, version $mediafile{timestamp}\n";
+ import_file_revision(\%commit, ($fetch_from == 1), $n, \%mediafile);
+ } else {
+ import_file_revision(\%commit, ($fetch_from == 1), $n);
+ }
}
if ($fetch_from == 1 && $n == 0) {
@@ -1006,3 +1187,50 @@ sub get_allowed_file_extensions {
return %hashFile;
}
+
+# Return MediaWiki id for a canonical namespace name.
+# Ex.: "File", "Project".
+# Look for the namespace id in the local configuration
+# variables; if it is not found there, ask the MW API.
+sub get_mw_namespace_id {
+ mw_connect_maybe();
+ my $name = shift;
+
+ if (!exists $namespace_id{$name}) {
+ # Check the configuration file for a stored record of that
+ # namespace. Namespaces are stored in the form
+ # "Name_of_namespace:Id_namespace", e.g. "File:6".
+ my @temp = split(/[ \n]/, run_git("config --get-all remote."
+ . $remotename .".namespaces"));
+ chomp(@temp);
+ foreach my $ns (@temp) {
+ my ($n, $s) = split(/:/, $ns);
+ $namespace_id{$n} = $s;
+ }
+ }
+
+ if (!exists $namespace_id{$name}) {
+ # NS not found => get namespace id from MW and store it in
+ # configuration file.
+ my $query = {
+ action => 'query',
+ meta => 'siteinfo',
+ siprop => 'namespaces'
+ };
+ my $result = $mediawiki->api($query);
+
+ while (my ($id, $ns) = each(%{$result->{query}->{namespaces}})) {
+ if (defined($ns->{canonical}) && ($ns->{canonical} eq $name)) {
+ run_git("config --add remote.". $remotename
+ .".namespaces ". $name .":". $ns->{id});
+ $namespace_id{$name} = $ns->{id};
+ }
+ }
+ }
+
+ if (exists $namespace_id{$name}) {
+ return $namespace_id{$name};
+ } else {
+ die "No such namespace $name on MediaWiki.";
+ }
+}
--
1.7.11.5.g0c7e058.dirty
* Re: [PATCH 1/5] git-remote-mediawiki: don't compute the diff when getting commit message
2012-06-26 16:04 ` [PATCH 1/5] git-remote-mediawiki: don't compute the diff when getting commit message Matthieu Moy
@ 2012-06-26 17:47 ` Junio C Hamano
2012-06-27 8:32 ` Matthieu Moy
0 siblings, 1 reply; 18+ messages in thread
From: Junio C Hamano @ 2012-06-26 17:47 UTC (permalink / raw)
To: Matthieu Moy; +Cc: git, Pavel.Volek, Kim-Thuat.Nguyen, roucherj
Matthieu Moy <Matthieu.Moy@imag.fr> writes:
> ---
Not signed off yet?
> contrib/mw-to-git/git-remote-mediawiki | 2 +-
> 1 file changed, 1 insertion(+), 1 deletion(-)
>
> diff --git a/contrib/mw-to-git/git-remote-mediawiki b/contrib/mw-to-git/git-remote-mediawiki
> index a34f53f..a8e6287 100755
> --- a/contrib/mw-to-git/git-remote-mediawiki
> +++ b/contrib/mw-to-git/git-remote-mediawiki
> @@ -874,8 +874,7 @@ sub mw_push_revision {
Curious. The hunk replaces an old line with a new one, but somehow
claims to reduce 8 to 7 by one???
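(For reference: a hunk header "@@ -874,8 +874,7 @@" announces an
8-line pre-image and a 7-line post-image, so a pure one-for-one
replacement would read "-874,8 +874,8"; a line must have been dropped
somewhere.)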
> # TODO: for now, it's just a delete+add
> my @diff_info_list = split(/\0/, $diff_infos);
> # Keep the first line of the commit message as mediawiki comment for the revision
> - my $commit_msg = (split(/\n/, run_git("show --pretty=format:\"%s\" $sha1_commit")))[0];
> + my $commit_msg = (split(/\n/, run_git("log -1 --format=\"%s\" $sha1_commit")))[0];
> chomp($commit_msg);
> # Push every blob
> while (@diff_info_list) {
* Re: [PATCH 1/5] git-remote-mediawiki: don't compute the diff when getting commit message
2012-06-26 17:47 ` Junio C Hamano
@ 2012-06-27 8:32 ` Matthieu Moy
0 siblings, 0 replies; 18+ messages in thread
From: Matthieu Moy @ 2012-06-27 8:32 UTC (permalink / raw)
To: Junio C Hamano; +Cc: git, Pavel.Volek, Kim-Thuat.Nguyen, roucherj
Junio C Hamano <gitster@pobox.com> writes:
> Matthieu Moy <Matthieu.Moy@imag.fr> writes:
>
>> ---
>
> Not signed off yet?
My bad, I broke my email-sending alias.
>> contrib/mw-to-git/git-remote-mediawiki | 2 +-
>> 1 file changed, 1 insertion(+), 1 deletion(-)
>>
>> diff --git a/contrib/mw-to-git/git-remote-mediawiki b/contrib/mw-to-git/git-remote-mediawiki
>> index a34f53f..a8e6287 100755
>> --- a/contrib/mw-to-git/git-remote-mediawiki
>> +++ b/contrib/mw-to-git/git-remote-mediawiki
>> @@ -874,8 +874,7 @@ sub mw_push_revision {
>
> Curious. The hunk replaces an old line with a new one, but somehow
> claims to reduce 8 to 7 by one???
I (mis-)tweaked the patch manually. Anyway, a better/simpler one is
coming ;-).
Sorry for the noise.
--
Matthieu Moy
http://www-verimag.imag.fr/~moy/
* [PATCH 0/5 v2] git-remote-mediawiki: support File: import and export
2012-06-26 16:04 [PATCH 0/5] git-remote-mediawiki: support File: import and export Matthieu Moy
` (4 preceding siblings ...)
2012-06-26 16:04 ` [PATCH 5/5] git-remote-mediawiki: import "File:" attachments Matthieu Moy
@ 2012-06-27 9:10 ` Matthieu Moy
2012-06-27 9:10 ` [PATCH 1/5] git-remote-mediawiki: don't compute the diff when getting commit message Matthieu Moy
` (4 more replies)
5 siblings, 5 replies; 18+ messages in thread
From: Matthieu Moy @ 2012-06-27 9:10 UTC (permalink / raw)
To: git, gitster; +Cc: Pavel.Volek, Kim-Thuat.Nguyen, roucherj, Matthieu Moy
Changes since v1:
- Added forgotten Signed-Off-By
- UTF-8 related fixes
- Fix parsing of mediaimport configuration variable
Matthieu Moy (3):
git-remote-mediawiki: don't compute the diff when getting commit
message
git-remote-mediawiki: don't "use encoding 'utf8';"
git-remote-mediawiki: split get_mw_pages into smaller functions
NGUYEN Kim Thuat (1):
git-remote-mediawiki: send "File:" attachments to a remote wiki
Pavel Volek (1):
git-remote-mediawiki: import "File:" attachments
contrib/mw-to-git/git-remote-mediawiki | 466 ++++++++++++++++++++++++++++-----
1 file changed, 402 insertions(+), 64 deletions(-)
--
1.7.11.5.g0c7e058.dirty
* [PATCH 1/5] git-remote-mediawiki: don't compute the diff when getting commit message
2012-06-27 9:10 ` [PATCH 0/5 v2] git-remote-mediawiki: support File: import and export Matthieu Moy
@ 2012-06-27 9:10 ` Matthieu Moy
2012-06-27 9:10 ` [PATCH 2/5] git-remote-mediawiki: don't "use encoding 'utf8';" Matthieu Moy
` (3 subsequent siblings)
4 siblings, 0 replies; 18+ messages in thread
From: Matthieu Moy @ 2012-06-27 9:10 UTC (permalink / raw)
To: git, gitster; +Cc: Pavel.Volek, Kim-Thuat.Nguyen, roucherj, Matthieu Moy
While we're at it, simplify the code a bit: since "log --format=%s"
already prints the subject line as a single line, there is no need to
split the output to take the first line.
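For example (illustrative output):

    $ git log --no-walk --format="%s" HEAD
    Fix typo in documentation

so run_git() returns the subject followed by a single newline, and one
chomp() is enough.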
Signed-off-by: Matthieu Moy <Matthieu.Moy@imag.fr>
---
contrib/mw-to-git/git-remote-mediawiki | 4 ++--
1 file changed, 2 insertions(+), 2 deletions(-)
diff --git a/contrib/mw-to-git/git-remote-mediawiki b/contrib/mw-to-git/git-remote-mediawiki
index a34f53f..781dfe2 100755
--- a/contrib/mw-to-git/git-remote-mediawiki
+++ b/contrib/mw-to-git/git-remote-mediawiki
@@ -873,8 +873,8 @@ sub mw_push_revision {
# TODO: we could detect rename, and encode them with a #redirect on the wiki.
# TODO: for now, it's just a delete+add
my @diff_info_list = split(/\0/, $diff_infos);
- # Keep the first line of the commit message as mediawiki comment for the revision
- my $commit_msg = (split(/\n/, run_git("show --pretty=format:\"%s\" $sha1_commit")))[0];
+ # Keep the subject line of the commit message as mediawiki comment for the revision
+ my $commit_msg = run_git("log --no-walk --format=\"%s\" $sha1_commit");
chomp($commit_msg);
# Push every blob
while (@diff_info_list) {
--
1.7.11.5.g0c7e058.dirty
* [PATCH 2/5] git-remote-mediawiki: don't "use encoding 'utf8';"
2012-06-27 9:10 ` [PATCH 0/5 v2] git-remote-mediawiki: support File: import and export Matthieu Moy
2012-06-27 9:10 ` [PATCH 1/5] git-remote-mediawiki: don't compute the diff when getting commit message Matthieu Moy
@ 2012-06-27 9:10 ` Matthieu Moy
2012-06-27 9:10 ` [PATCH 3/5] git-remote-mediawiki: send "File:" attachments to a remote wiki Matthieu Moy
` (2 subsequent siblings)
4 siblings, 0 replies; 18+ messages in thread
From: Matthieu Moy @ 2012-06-27 9:10 UTC (permalink / raw)
To: git, gitster; +Cc: Pavel.Volek, Kim-Thuat.Nguyen, roucherj, Matthieu Moy
The use of this statement is generally discouraged, and is too intrusive
for us: it forces the HTTP requests made by the API to contain only valid
UTF-8 characters. This would break the upload of binary files.
Signed-off-by: Matthieu Moy <Matthieu.Moy@imag.fr>
---
contrib/mw-to-git/git-remote-mediawiki | 5 ++---
1 file changed, 2 insertions(+), 3 deletions(-)
diff --git a/contrib/mw-to-git/git-remote-mediawiki b/contrib/mw-to-git/git-remote-mediawiki
index 781dfe2..f2e841e 100755
--- a/contrib/mw-to-git/git-remote-mediawiki
+++ b/contrib/mw-to-git/git-remote-mediawiki
@@ -36,11 +36,10 @@
use strict;
use MediaWiki::API;
use DateTime::Format::ISO8601;
-use encoding 'utf8';
-# use encoding 'utf8' doesn't change STDERROR
-# but we're going to output UTF-8 filenames to STDERR
+# By default, use UTF-8 to communicate with Git and the user
binmode STDERR, ":utf8";
+binmode STDOUT, ":utf8";
use URI::Escape;
use IPC::Open2;
--
1.7.11.5.g0c7e058.dirty
* [PATCH 3/5] git-remote-mediawiki: send "File:" attachments to a remote wiki
2012-06-27 9:10 ` [PATCH 0/5 v2] git-remote-mediawiki: support File: import and export Matthieu Moy
2012-06-27 9:10 ` [PATCH 1/5] git-remote-mediawiki: don't compute the diff when getting commit message Matthieu Moy
2012-06-27 9:10 ` [PATCH 2/5] git-remote-mediawiki: don't "use encoding 'utf8';" Matthieu Moy
@ 2012-06-27 9:10 ` Matthieu Moy
2012-06-27 9:10 ` [PATCH 4/5] git-remote-mediawiki: split get_mw_pages into smaller functions Matthieu Moy
2012-06-27 9:10 ` [PATCH 5/5] git-remote-mediawiki: import "File:" attachments Matthieu Moy
4 siblings, 0 replies; 18+ messages in thread
From: Matthieu Moy @ 2012-06-27 9:10 UTC (permalink / raw)
To: git, gitster; +Cc: Pavel.Volek, Kim-Thuat.Nguyen, roucherj, Matthieu Moy
From: NGUYEN Kim Thuat <Kim-Thuat.Nguyen@ensimag.imag.fr>
The current version of git-remote-mediawiki supports only the import
and export of plain wiki pages. This patch adds the ability to export
file attachments (i.e. the content of the File: MediaWiki namespace),
which are also exposed through the MediaWiki API.
This requires a recent version of MediaWiki::API (version 0.37 works;
version 0.34 doesn't).
Signed-off-by: Pavel Volek <Pavel.Volek@ensimag.imag.fr>
Signed-off-by: NGUYEN Kim Thuat <Kim-Thuat.Nguyen@ensimag.imag.fr>
Signed-off-by: ROUCHER IGLESIAS Javier <roucherj@ensimag.imag.fr>
Signed-off-by: Matthieu Moy <Matthieu.Moy@imag.fr>
---
contrib/mw-to-git/git-remote-mediawiki | 98 +++++++++++++++++++++++++++++++---
1 file changed, 90 insertions(+), 8 deletions(-)
diff --git a/contrib/mw-to-git/git-remote-mediawiki b/contrib/mw-to-git/git-remote-mediawiki
index f2e841e..3405772 100755
--- a/contrib/mw-to-git/git-remote-mediawiki
+++ b/contrib/mw-to-git/git-remote-mediawiki
@@ -348,8 +348,12 @@ sub get_mw_pages {
return values(%pages);
}
+# usage: $out = run_git("command args");
+# $out = run_git("command args", "raw"); # don't interpret output as UTF-8.
sub run_git {
- open(my $git, "-|:encoding(UTF-8)", "git " . $_[0]);
+ my $args = shift;
+ my $encoding = (shift || "encoding(UTF-8)");
+ open(my $git, "-|:$encoding", "git " . $args);
my $res = do { local $/; <$git> };
close($git);
@@ -705,6 +709,63 @@ sub error_non_fast_forward {
return 0;
}
+sub mw_upload_file {
+ my $complete_file_name = shift;
+ my $new_sha1 = shift;
+ my $extension = shift;
+ my $file_deleted = shift;
+ my $summary = shift;
+ my $newrevid;
+ my $path = "File:" . $complete_file_name;
+ my %hashFiles = get_allowed_file_extensions();
+ if (!exists($hashFiles{$extension})) {
+ print STDERR "$complete_file_name is not a permitted file on this wiki.\n";
+ print STDERR "Check the configuration of file uploads in your mediawiki.\n";
+ return $newrevid;
+ }
+ # Deleting and uploading a file requires a privileged user
+ if ($file_deleted) {
+ mw_connect_maybe();
+ my $query = {
+ action => 'delete',
+ title => $path,
+ reason => $summary
+ };
+ if (!$mediawiki->edit($query)) {
+ print STDERR "Failed to delete file on remote wiki\n";
+ print STDERR "Check your permissions on the remote site. Error code:\n";
+ print STDERR $mediawiki->{error}->{code} . ':' . $mediawiki->{error}->{details};
+ exit 1;
+ }
+ } else {
+ # Don't let perl try to interpret file content as UTF-8 => use "raw"
+ my $content = run_git("cat-file blob $new_sha1", "raw");
+ if ($content ne "") {
+ mw_connect_maybe();
+ $mediawiki->{config}->{upload_url} =
+ "$url/index.php/Special:Upload";
+ $mediawiki->edit({
+ action => 'upload',
+ filename => $complete_file_name,
+ comment => $summary,
+ file => [undef,
+ $complete_file_name,
+ Content => $content],
+ ignorewarnings => 1,
+ }, {
+ skip_encoding => 1
+ } ) || die $mediawiki->{error}->{code} . ':'
+ . $mediawiki->{error}->{details};
+ my $last_file_page = $mediawiki->get_page({title => $path});
+ $newrevid = $last_file_page->{revid};
+ print STDERR "Pushed file: $new_sha1 - $complete_file_name.\n";
+ } else {
+ print STDERR "Empty file $complete_file_name not pushed.\n";
+ }
+ }
+ return $newrevid;
+}
+
sub mw_push_file {
my $diff_info = shift;
# $diff_info contains a string in this format:
@@ -717,7 +778,8 @@ sub mw_push_file {
my $summary = shift;
# MediaWiki revision number. Keep the previous one by default,
# in case there's no edit to perform.
- my $newrevid = shift;
+ my $oldrevid = shift;
+ my $newrevid;
my $new_sha1 = $diff_info_split[3];
my $old_sha1 = $diff_info_split[2];
@@ -725,9 +787,11 @@ sub mw_push_file {
my $page_deleted = ($new_sha1 eq NULL_SHA1);
$complete_file_name = mediawiki_clean_filename($complete_file_name);
- if (substr($complete_file_name,-3) eq ".mw") {
- my $title = substr($complete_file_name,0,-3);
-
+ my ($title, $extension) = $complete_file_name =~ /^(.*)\.([^\.]*)$/;
+ if (!defined($extension)) {
+ $extension = "";
+ }
+ if ($extension eq "mw") {
my $file_content;
if ($page_deleted) {
# Deleting a page usually requires
@@ -745,7 +809,7 @@ sub mw_push_file {
action => 'edit',
summary => $summary,
title => $title,
- basetimestamp => $basetimestamps{$newrevid},
+ basetimestamp => $basetimestamps{$oldrevid},
text => mediawiki_clean($file_content, $page_created),
}, {
skip_encoding => 1 # Helps with names with accentuated characters
@@ -757,7 +821,7 @@ sub mw_push_file {
$mediawiki->{error}->{code} .
' from mediwiki: ' . $mediawiki->{error}->{details} .
".\n";
- return ($newrevid, "non-fast-forward");
+ return ($oldrevid, "non-fast-forward");
} else {
# Other errors. Shouldn't happen => just die()
die 'Fatal: Error ' .
@@ -768,8 +832,11 @@ sub mw_push_file {
$newrevid = $result->{edit}->{newrevid};
print STDERR "Pushed file: $new_sha1 - $title\n";
} else {
- print STDERR "$complete_file_name not a mediawiki file (Not pushable on this version of git-remote-mediawiki).\n"
+ $newrevid = mw_upload_file($complete_file_name, $new_sha1,
+ $extension, $page_deleted,
+ $summary);
}
+ $newrevid = ($newrevid or $oldrevid);
return ($newrevid, "ok");
}
@@ -906,3 +973,18 @@ sub mw_push_revision {
print STDOUT "ok $remote\n";
return 1;
}
+
+sub get_allowed_file_extensions {
+ mw_connect_maybe();
+
+ my $query = {
+ action => 'query',
+ meta => 'siteinfo',
+ siprop => 'fileextensions'
+ };
+ my $result = $mediawiki->api($query);
+ my @file_extensions= map $_->{ext},@{$result->{query}->{fileextensions}};
+ my %hashFile = map {$_ => 1}@file_extensions;
+
+ return %hashFile;
+}
--
1.7.11.5.g0c7e058.dirty
* [PATCH 4/5] git-remote-mediawiki: split get_mw_pages into smaller functions
2012-06-27 9:10 ` [PATCH 0/5 v2] git-remote-mediawiki: support File: import and export Matthieu Moy
` (2 preceding siblings ...)
2012-06-27 9:10 ` [PATCH 3/5] git-remote-mediawiki: send "File:" attachments to a remote wiki Matthieu Moy
@ 2012-06-27 9:10 ` Matthieu Moy
2012-06-27 9:10 ` [PATCH 5/5] git-remote-mediawiki: import "File:" attachments Matthieu Moy
4 siblings, 0 replies; 18+ messages in thread
From: Matthieu Moy @ 2012-06-27 9:10 UTC (permalink / raw)
To: git, gitster; +Cc: Pavel.Volek, Kim-Thuat.Nguyen, roucherj, Matthieu Moy
Signed-off-by: Matthieu Moy <Matthieu.Moy@imag.fr>
---
contrib/mw-to-git/git-remote-mediawiki | 106 +++++++++++++++++++--------------
1 file changed, 62 insertions(+), 44 deletions(-)
diff --git a/contrib/mw-to-git/git-remote-mediawiki b/contrib/mw-to-git/git-remote-mediawiki
index 3405772..76d8824 100755
--- a/contrib/mw-to-git/git-remote-mediawiki
+++ b/contrib/mw-to-git/git-remote-mediawiki
@@ -258,6 +258,64 @@ sub mw_connect_maybe {
}
}
+## Functions for listing pages on the remote wiki
+sub get_mw_tracked_pages {
+ my $pages = shift;
+ my @some_pages = @tracked_pages;
+ while (@some_pages) {
+ my $last = 50;
+ if ($#some_pages < $last) {
+ $last = $#some_pages;
+ }
+ my @slice = @some_pages[0..$last];
+ get_mw_first_pages(\@slice, $pages);
+ @some_pages = @some_pages[51..$#some_pages];
+ }
+}
+
+sub get_mw_tracked_categories {
+ my $pages = shift;
+ foreach my $category (@tracked_categories) {
+ if (index($category, ':') < 0) {
+ # Mediawiki requires the Category
+ # prefix, but let's not force the user
+ # to specify it.
+ $category = "Category:" . $category;
+ }
+ my $mw_pages = $mediawiki->list( {
+ action => 'query',
+ list => 'categorymembers',
+ cmtitle => $category,
+ cmlimit => 'max' } )
+ || die $mediawiki->{error}->{code} . ': '
+ . $mediawiki->{error}->{details};
+ foreach my $page (@{$mw_pages}) {
+ $pages->{$page->{title}} = $page;
+ }
+ }
+}
+
+sub get_mw_all_pages {
+ my $pages = shift;
+ # No user-provided list, get the list of pages from the API.
+ my $mw_pages = $mediawiki->list({
+ action => 'query',
+ list => 'allpages',
+ aplimit => 'max'
+ });
+ if (!defined($mw_pages)) {
+ print STDERR "fatal: could not get the list of wiki pages.\n";
+ print STDERR "fatal: '$url' does not appear to be a mediawiki\n";
+ print STDERR "fatal: make sure '$url/api.php' is a valid page.\n";
+ exit 1;
+ }
+ foreach my $page (@{$mw_pages}) {
+ $pages->{$page->{title}} = $page;
+ }
+}
+
+# Queries the wiki for a set of pages. Meant to be used within a loop
+# querying the wiki for slices of the page list.
sub get_mw_first_pages {
my $some_pages = shift;
my @some_pages = @{$some_pages};
@@ -286,6 +344,7 @@ sub get_mw_first_pages {
}
}
+# Get the list of pages to be fetched according to configuration.
sub get_mw_pages {
mw_connect_maybe();
@@ -295,55 +354,14 @@ sub get_mw_pages {
$user_defined = 1;
# The user provided a list of pages titles, but we
# still need to query the API to get the page IDs.
-
- my @some_pages = @tracked_pages;
- while (@some_pages) {
- my $last = 50;
- if ($#some_pages < $last) {
- $last = $#some_pages;
- }
- my @slice = @some_pages[0..$last];
- get_mw_first_pages(\@slice, \%pages);
- @some_pages = @some_pages[51..$#some_pages];
- }
+ get_mw_tracked_pages(\%pages);
}
if (@tracked_categories) {
$user_defined = 1;
- foreach my $category (@tracked_categories) {
- if (index($category, ':') < 0) {
- # Mediawiki requires the Category
- # prefix, but let's not force the user
- # to specify it.
- $category = "Category:" . $category;
- }
- my $mw_pages = $mediawiki->list( {
- action => 'query',
- list => 'categorymembers',
- cmtitle => $category,
- cmlimit => 'max' } )
- || die $mediawiki->{error}->{code} . ': ' . $mediawiki->{error}->{details};
- foreach my $page (@{$mw_pages}) {
- $pages{$page->{title}} = $page;
- }
- }
+ get_mw_tracked_categories(\%pages);
}
if (!$user_defined) {
- # No user-provided list, get the list of pages from
- # the API.
- my $mw_pages = $mediawiki->list({
- action => 'query',
- list => 'allpages',
- aplimit => 500,
- });
- if (!defined($mw_pages)) {
- print STDERR "fatal: could not get the list of wiki pages.\n";
- print STDERR "fatal: '$url' does not appear to be a mediawiki\n";
- print STDERR "fatal: make sure '$url/api.php' is a valid page.\n";
- exit 1;
- }
- foreach my $page (@{$mw_pages}) {
- $pages{$page->{title}} = $page;
- }
+ get_mw_all_pages(\%pages);
}
return values(%pages);
}
--
1.7.11.5.g0c7e058.dirty
* [PATCH 5/5] git-remote-mediawiki: import "File:" attachments
2012-06-27 9:10 ` [PATCH 0/5 v2] git-remote-mediawiki: support File: import and export Matthieu Moy
` (3 preceding siblings ...)
2012-06-27 9:10 ` [PATCH 4/5] git-remote-mediawiki: split get_mw_pages into smaller functions Matthieu Moy
@ 2012-06-27 9:10 ` Matthieu Moy
2012-06-27 14:21 ` [PATCH 5/5 v3] " Matthieu Moy
4 siblings, 1 reply; 18+ messages in thread
From: Matthieu Moy @ 2012-06-27 9:10 UTC (permalink / raw)
To: git, gitster; +Cc: Pavel.Volek, Kim-Thuat.Nguyen, roucherj, Matthieu Moy
From: Pavel Volek <Pavel.Volek@ensimag.imag.fr>
Add the symmetrical feature to the "File:" export support in the previous
patch. Download files from the wiki as needed, and feed them into the
fast-import stream. Import both the file itself, and the corresponding
description page.
Signed-off-by: Pavel Volek <Pavel.Volek@ensimag.imag.fr>
Signed-off-by: NGUYEN Kim Thuat <Kim-Thuat.Nguyen@ensimag.imag.fr>
Signed-off-by: ROUCHER IGLESIAS Javier <roucherj@ensimag.imag.fr>
Signed-off-by: Matthieu Moy <Matthieu.Moy@imag.fr>
---
contrib/mw-to-git/git-remote-mediawiki | 255 +++++++++++++++++++++++++++++++--
1 file changed, 247 insertions(+), 8 deletions(-)
diff --git a/contrib/mw-to-git/git-remote-mediawiki b/contrib/mw-to-git/git-remote-mediawiki
index 76d8824..15a55ba 100755
--- a/contrib/mw-to-git/git-remote-mediawiki
+++ b/contrib/mw-to-git/git-remote-mediawiki
@@ -13,9 +13,6 @@
#
# Known limitations:
#
-# - Only wiki pages are managed, no support for [[File:...]]
-# attachments.
-#
# - Poor performance in the best case: it takes forever to check
# whether we're up-to-date (on fetch or push) or to fetch a few
# revisions from a large wiki, because we use exclusively a
@@ -36,6 +33,7 @@
use strict;
use MediaWiki::API;
use DateTime::Format::ISO8601;
+use FileHandle;
# By default, use UTF-8 to communicate with Git and the user
binmode STDERR, ":utf8";
@@ -72,6 +70,11 @@ chomp(@tracked_pages);
my @tracked_categories = split(/[ \n]/, run_git("config --get-all remote.". $remotename .".categories"));
chomp(@tracked_categories);
+# Import media files too.
+my $import_media = run_git("config --get --bool remote.". $remotename .".mediaimport");
+chomp($import_media);
+$import_media = ($import_media eq "true");
+
my $wiki_login = run_git("config --get remote.". $remotename .".mwLogin");
# TODO: ideally, this should be able to read from keyboard, but we're
# inside a remote helper, so our stdin is connect to git, not to a
@@ -87,6 +90,9 @@ my $shallow_import = run_git("config --get --bool remote.". $remotename .".shall
chomp($shallow_import);
$shallow_import = ($shallow_import eq "true");
+# Cache for MediaWiki namespace ids.
+my %namespace_id;
+
# Dumb push: don't update notes and mediawiki ref to reflect the last push.
#
# Configurable with mediawiki.dumbPush, or per-remote with
@@ -363,6 +369,14 @@ sub get_mw_pages {
if (!$user_defined) {
get_mw_all_pages(\%pages);
}
+ if ($import_media) {
+ print STDERR "Getting media files for selected pages...\n";
+ if ($user_defined) {
+ get_linked_mediafiles(\%pages);
+ } else {
+ get_all_mediafiles(\%pages);
+ }
+ }
return values(%pages);
}
@@ -379,6 +393,152 @@ sub run_git {
}
+sub get_all_mediafiles {
+ my $pages = shift;
+ # Get the list of all media file pages from the API; they live
+ # in a different namespace, and only one namespace can be
+ # queried at a time
+ my $mw_pages = $mediawiki->list({
+ action => 'query',
+ list => 'allpages',
+ apnamespace => get_mw_namespace_id("File"),
+ aplimit => 'max'
+ });
+ if (!defined($mw_pages)) {
+ print STDERR "fatal: could not get the list of pages for media files.\n";
+ print STDERR "fatal: '$url' does not appear to be a mediawiki\n";
+ print STDERR "fatal: make sure '$url/api.php' is a valid page.\n";
+ exit 1;
+ }
+ foreach my $page (@{$mw_pages}) {
+ $pages->{$page->{title}} = $page;
+ }
+}
+
+sub get_linked_mediafiles {
+ my $pages = shift;
+ my @titles = map $_->{title}, values(%{$pages});
+
+ # The query is split into small batches because of the MW API
+ # limit on the number of links returned (500 links max).
+ my $batch = 10;
+ while (@titles) {
+ if ($#titles < $batch) {
+ $batch = $#titles;
+ }
+ my @slice = @titles[0..$batch];
+
+ # pattern 'page1|page2|...' required by the API
+ my $mw_titles = join('|', @slice);
+
+ # Media files can be included in or linked from a page;
+ # get all of them.
+ my $query = {
+ action => 'query',
+ prop => 'links|images',
+ titles => $mw_titles,
+ plnamespace => get_mw_namespace_id("File"),
+ pllimit => 'max'
+ };
+ my $result = $mediawiki->api($query);
+
+ while (my ($id, $page) = each(%{$result->{query}->{pages}})) {
+ my @titles;
+ if (defined($page->{links})) {
+ my @link_titles = map $_->{title}, @{$page->{links}};
+ push(@titles, @link_titles);
+ }
+ if (defined($page->{images})) {
+ my @image_titles = map $_->{title}, @{$page->{images}};
+ push(@titles, @image_titles);
+ }
+ if (@titles) {
+ get_mw_first_pages(\@titles, \%{$pages});
+ }
+ }
+
+ @titles = @titles[($batch+1)..$#titles];
+ }
+}
+
+sub get_mw_mediafile_for_page_revision {
+ # Name of the file on Wiki, with the prefix.
+ my $mw_filename = shift;
+ my $timestamp = shift;
+ my %mediafile;
+
+ # Check whether a media file with the given timestamp exists
+ # on MediaWiki. If so, download it.
+ my $query = {
+ action => 'query',
+ prop => 'imageinfo',
+ titles => $mw_filename,
+ iistart => $timestamp,
+ iiend => $timestamp,
+ iiprop => 'timestamp|archivename|url',
+ iilimit => 1
+ };
+ my $result = $mediawiki->api($query);
+
+ my ($fileid, $file) = each ( %{$result->{query}->{pages}} );
+ # If not defined, there is no revision of the file for the
+ # given timestamp.
+ if (defined($file->{imageinfo})) {
+ # Get real name of media file.
+ my $filename;
+ if (index($mw_filename, 'File:') == 0) {
+ $filename = substr $mw_filename, 5;
+ } else {
+ $filename = substr $mw_filename, 6;
+ }
+ $mediafile{title} = $filename;
+
+ my $fileinfo = pop(@{$file->{imageinfo}});
+ $mediafile{timestamp} = $fileinfo->{timestamp};
+ # If this is an old version of the file, the file has to be
+ # obtained from the archive. Otherwise it can be downloaded
+ # by the MediaWiki API download() function.
+ if (defined($fileinfo->{archivename})) {
+ $mediafile{content} = download_mw_mediafile_from_archive($fileinfo->{url});
+ } else {
+ $mediafile{content} = download_mw_mediafile($mw_filename);
+ }
+ }
+ return %mediafile;
+}
+
+sub download_mw_mediafile_from_archive {
+ my $url = shift;
+ my $file;
+
+ my $ua = LWP::UserAgent->new;
+ my $response = $ua->get($url);
+ if ($response->is_success) {
+ $file = $response->decoded_content;
+ } else {
+ print STDERR "Error downloading a file from archive.\n";
+ }
+
+ return $file;
+}
+
+sub download_mw_mediafile {
+ my $filename = shift;
+
+ $mediawiki->{config}->{files_url} = $url;
+
+ my $file_content = $mediawiki->download( { title => $filename } );
+ if (!defined($file_content)) {
+ print STDERR "\tFile \'$filename\' could not be downloaded.\n";
+ exit 1;
+ } elsif ($file_content eq "") {
+ print STDERR "\tFile \'$filename\' does not exist on the wiki.\n";
+ exit 1;
+ } else {
+ return $file_content;
+ }
+}
+
sub get_last_local_revision {
# Get note regarding last mediawiki revision
my $note = run_git("notes --ref=$remotename/mediawiki show refs/mediawiki/$remotename/master 2>/dev/null");
@@ -482,6 +642,14 @@ sub literal_data {
print STDOUT "data ", bytes::length($content), "\n", $content;
}
+sub literal_data_raw {
+ # Output possibly binary content.
+ my ($content) = @_;
+ binmode STDOUT, ":raw";
+ print STDOUT "data ", bytes::length($content), "\n", $content;
+ binmode STDOUT, ":utf8";
+}
+
sub mw_capabilities {
# Revisions are imported to the private namespace
# refs/mediawiki/$remotename/ by the helper and fetched into
@@ -569,6 +737,11 @@ sub import_file_revision {
my %commit = %{$commit};
my $full_import = shift;
my $n = shift;
+ my $mediafile = shift;
+ my %mediafile;
+ if ($mediafile) {
+ %mediafile = %{$mediafile};
+ }
my $title = $commit{title};
my $comment = $commit{comment};
@@ -588,6 +761,10 @@ sub import_file_revision {
if ($content ne DELETED_CONTENT) {
print STDOUT "M 644 inline $title.mw\n";
literal_data($content);
+ if (%mediafile) {
+ print STDOUT "M 644 inline $mediafile{title}\n";
+ literal_data_raw($mediafile{content});
+ }
print STDOUT "\n\n";
} else {
print STDOUT "D $title.mw\n";
@@ -683,12 +860,11 @@ sub mw_import_ref {
$n++;
+ my $page_title = $result->{query}->{pages}->{$pagerevid->{pageid}}->{title};
my %commit;
$commit{author} = $rev->{user} || 'Anonymous';
$commit{comment} = $rev->{comment} || '*Empty MediaWiki Message*';
- $commit{title} = mediawiki_smudge_filename(
- $result->{query}->{pages}->{$pagerevid->{pageid}}->{title}
- );
+ $commit{title} = mediawiki_smudge_filename($page_title);
$commit{mw_revision} = $pagerevid->{revid};
$commit{content} = mediawiki_smudge($rev->{'*'});
@@ -699,9 +875,25 @@ sub mw_import_ref {
}
$commit{date} = DateTime::Format::ISO8601->parse_datetime($last_timestamp);
- print STDERR "$n/", scalar(@revisions), ": Revision #$pagerevid->{revid} of $commit{title}\n";
+ # Distinguish between ordinary wiki pages and media files.
+ my @prefix = split(":", $page_title);
- import_file_revision(\%commit, ($fetch_from == 1), $n);
+ my %mediafile;
+ if ($prefix[0] eq "File" || $prefix[0] eq "Image") {
+ # The name of the file is the same as the media page.
+ my $filename = $page_title;
+ %mediafile = get_mw_mediafile_for_page_revision($filename, $rev->{timestamp});
+ }
+ # If this revision of the media page corresponds to a new version
+ # of the file, make one common commit for both the file and its
+ # page. Otherwise, commit only the page.
+ print STDERR "$n/", scalar(@revisions), ": Revision #$pagerevid->{revid} of $commit{title}\n";
+ if (%mediafile) {
+ print STDERR "\tDownloading file $mediafile{title}, version $mediafile{timestamp}\n";
+ import_file_revision(\%commit, ($fetch_from == 1), $n, \%mediafile);
+ } else {
+ import_file_revision(\%commit, ($fetch_from == 1), $n);
+ }
}
if ($fetch_from == 1 && $n == 0) {
@@ -1006,3 +1198,50 @@ sub get_allowed_file_extensions {
return %hashFile;
}
+
+# Return MediaWiki id for a canonical namespace name.
+# Ex.: "File", "Project".
+# Look for the namespace id in the local configuration
+# variables; if it is not found there, ask the MW API.
+sub get_mw_namespace_id {
+ mw_connect_maybe();
+ my $name = shift;
+
+ if (!exists $namespace_id{$name}) {
+ # Check the configuration file for a stored record of that
+ # namespace. Namespaces are stored in the form
+ # "Name_of_namespace:Id_namespace", e.g. "File:6".
+ my @temp = split(/[ \n]/, run_git("config --get-all remote."
+ . $remotename .".namespaces"));
+ chomp(@temp);
+ foreach my $ns (@temp) {
+ my ($n, $s) = split(/:/, $ns);
+ $namespace_id{$n} = $s;
+ }
+ }
+
+ if (!exists $namespace_id{$name}) {
+ # NS not found => get namespace id from MW and store it in
+ # configuration file.
+ my $query = {
+ action => 'query',
+ meta => 'siteinfo',
+ siprop => 'namespaces'
+ };
+ my $result = $mediawiki->api($query);
+
+ while (my ($id, $ns) = each(%{$result->{query}->{namespaces}})) {
+ if (defined($ns->{canonical}) && ($ns->{canonical} eq $name)) {
+ run_git("config --add remote.". $remotename
+ .".namespaces ". $name .":". $ns->{id});
+ $namespace_id{$name} = $ns->{id};
+ }
+ }
+ }
+
+ if (exists $namespace_id{$name}) {
+ return $namespace_id{$name};
+ } else {
+ die "No such namespace $name on MediaWiki.";
+ }
+}
--
1.7.11.5.g0c7e058.dirty
* [PATCH 5/5 v3] git-remote-mediawiki: import "File:" attachments
2012-06-27 9:10 ` [PATCH 5/5] git-remote-mediawiki: import "File:" attachments Matthieu Moy
@ 2012-06-27 14:21 ` Matthieu Moy
2012-07-04 12:53 ` [PATCH v4] " Matthieu Moy
0 siblings, 1 reply; 18+ messages in thread
From: Matthieu Moy @ 2012-06-27 14:21 UTC (permalink / raw)
To: git, gitster; +Cc: Pavel.Volek, Kim-Thuat.Nguyen, roucherj, Matthieu Moy
From: Pavel Volek <Pavel.Volek@ensimag.imag.fr>
Add the symmetrical feature to the "File:" export support in the previous
patch. Download files from the wiki as needed, and feed them into the
fast-import stream. Import both the file itself and the corresponding
description page.
Signed-off-by: Pavel Volek <Pavel.Volek@ensimag.imag.fr>
Signed-off-by: NGUYEN Kim Thuat <Kim-Thuat.Nguyen@ensimag.imag.fr>
Signed-off-by: ROUCHER IGLESIAS Javier <roucherj@ensimag.imag.fr>
Signed-off-by: Matthieu Moy <Matthieu.Moy@imag.fr>
---
Sorry, while testing the code on another machine (the latest Ubuntu,
whereas I had been testing on Debian stable), I found a bug. Junio, can
you take this version instead? The diff against the previous version is
just this:
--- a/contrib/mw-to-git/git-remote-mediawiki
+++ b/contrib/mw-to-git/git-remote-mediawiki
@@ -645,6 +645,8 @@ sub literal_data {
sub literal_data_raw {
# Output possibly binary content.
my ($content) = @_;
+ # Avoid confusion between size in bytes and in characters
+ utf8::downgrade($content);
binmode STDOUT, ":raw";
print STDOUT "data ", bytes::length($content), "\n", $content;
binmode STDOUT, ":utf8";
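To see the confusion this avoids, here is a standalone sketch (an
illustration, not part of the patch; the sample string is made up).
bytes::length() reports the size of Perl's internal representation,
which only matches the number of octets actually printed once the
string is downgraded:

	use strict;
	use warnings;
	use bytes ();                        # for bytes::length(), as in the helper

	my $content = "caf\xe9";             # 4 characters, all code points < 256
	utf8::upgrade($content);             # simulate a string with the UTF-8 flag set
	print length($content), "\n";        # 4 (characters)
	print bytes::length($content), "\n"; # 5 (internal bytes: \xe9 takes two)
	utf8::downgrade($content);           # the fix: one byte per character again
	print bytes::length($content), "\n"; # 4, matching what ":raw" prints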
contrib/mw-to-git/git-remote-mediawiki | 257 ++++++++++++++++++++++++++++++++-
1 file changed, 249 insertions(+), 8 deletions(-)
diff --git a/contrib/mw-to-git/git-remote-mediawiki b/contrib/mw-to-git/git-remote-mediawiki
index 361dbb1..76b78bc 100755
--- a/contrib/mw-to-git/git-remote-mediawiki
+++ b/contrib/mw-to-git/git-remote-mediawiki
@@ -13,9 +13,6 @@
#
# Known limitations:
#
-# - Only wiki pages are managed, no support for [[File:...]]
-# attachments.
-#
# - Poor performance in the best case: it takes forever to check
# whether we're up-to-date (on fetch or push) or to fetch a few
# revisions from a large wiki, because we use exclusively a
@@ -36,6 +33,7 @@
use strict;
use MediaWiki::API;
use DateTime::Format::ISO8601;
+use FileHandle;
# By default, use UTF-8 to communicate with Git and the user
binmode STDERR, ":utf8";
@@ -72,6 +70,11 @@ chomp(@tracked_pages);
my @tracked_categories = split(/[ \n]/, run_git("config --get-all remote.". $remotename .".categories"));
chomp(@tracked_categories);
+# Import media files too.
+my $import_media = run_git("config --get --bool remote.". $remotename .".mediaimport");
+chomp($import_media);
+$import_media = ($import_media eq "true");
+
my $wiki_login = run_git("config --get remote.". $remotename .".mwLogin");
# TODO: ideally, this should be able to read from keyboard, but we're
# inside a remote helper, so our stdin is connected to git, not to a
@@ -87,6 +90,9 @@ my $shallow_import = run_git("config --get --bool remote.". $remotename .".shall
chomp($shallow_import);
$shallow_import = ($shallow_import eq "true");
+# Cache for MediaWiki namespace ids.
+my %namespace_id;
+
# Dumb push: don't update notes and mediawiki ref to reflect the last push.
#
# Configurable with mediawiki.dumbPush, or per-remote with
@@ -363,6 +369,14 @@ sub get_mw_pages {
if (!$user_defined) {
get_mw_all_pages(\%pages);
}
+ if ($import_media) {
+ print STDERR "Getting media files for selected pages...\n";
+ if ($user_defined) {
+ get_linked_mediafiles(\%pages);
+ } else {
+ get_all_mediafiles(\%pages);
+ }
+ }
return values(%pages);
}
@@ -379,6 +393,152 @@ sub run_git {
}
+sub get_all_mediafiles {
+ my $pages = shift;
+ # Fetch the list of media file pages from the API; they live
+ # in a separate namespace, and only one namespace can be
+ # queried at a time.
+ my $mw_pages = $mediawiki->list({
+ action => 'query',
+ list => 'allpages',
+ apnamespace => get_mw_namespace_id("File"),
+ aplimit => 'max'
+ });
+ if (!defined($mw_pages)) {
+ print STDERR "fatal: could not get the list of pages for media files.\n";
+ print STDERR "fatal: '$url' does not appear to be a mediawiki\n";
+ print STDERR "fatal: make sure '$url/api.php' is a valid page.\n";
+ exit 1;
+ }
+ foreach my $page (@{$mw_pages}) {
+ $pages->{$page->{title}} = $page;
+ }
+}
+
+sub get_linked_mediafiles {
+ my $pages = shift;
+ my @titles = map $_->{title}, values(%{$pages});
+
+ # The query is split into small batches because of the MW API limit
+ # on the number of links returned (500 links max).
+ my $batch = 10;
+ while (@titles) {
+ if ($#titles < $batch) {
+ $batch = $#titles;
+ }
+ my @slice = @titles[0..$batch];
+
+ # pattern 'page1|page2|...' required by the API
+ my $mw_titles = join('|', @slice);
+
+ # Media files can be included in or linked from a page;
+ # fetch all of them.
+ my $query = {
+ action => 'query',
+ prop => 'links|images',
+ titles => $mw_titles,
+ plnamespace => get_mw_namespace_id("File"),
+ pllimit => 'max'
+ };
+ my $result = $mediawiki->api($query);
+
+ while (my ($id, $page) = each(%{$result->{query}->{pages}})) {
+ my @titles;
+ if (defined($page->{links})) {
+ my @link_titles = map $_->{title}, @{$page->{links}};
+ push(@titles, @link_titles);
+ }
+ if (defined($page->{images})) {
+ my @image_titles = map $_->{title}, @{$page->{images}};
+ push(@titles, @image_titles);
+ }
+ if (@titles) {
+ get_mw_first_pages(\@titles, \%{$pages});
+ }
+ }
+
+ @titles = @titles[($batch+1)..$#titles];
+ }
+}
+
+sub get_mw_mediafile_for_page_revision {
+ # Name of the file on the wiki, with the prefix.
+ my $mw_filename = shift;
+ my $timestamp = shift;
+ my %mediafile;
+
+ # Search MediaWiki for a media file with the given
+ # timestamp. If one exists, download the file.
+ my $query = {
+ action => 'query',
+ prop => 'imageinfo',
+ titles => $mw_filename,
+ iistart => $timestamp,
+ iiend => $timestamp,
+ iiprop => 'timestamp|archivename|url',
+ iilimit => 1
+ };
+ my $result = $mediawiki->api($query);
+
+ my ($fileid, $file) = each ( %{$result->{query}->{pages}} );
+ # If not defined, there is no revision of the file for the
+ # given timestamp.
+ if (defined($file->{imageinfo})) {
+ # Get real name of media file.
+ my $filename;
+ if (index($mw_filename, 'File:') == 0) {
+ $filename = substr $mw_filename, 5;
+ } else {
+ $filename = substr $mw_filename, 6;
+ }
+ $mediafile{title} = $filename;
+
+ my $fileinfo = pop(@{$file->{imageinfo}});
+ $mediafile{timestamp} = $fileinfo->{timestamp};
+ # If this is an old version of the file, the file has to be
+ # obtained from the archive. Otherwise it can be downloaded
+ # by the MediaWiki API download() function.
+ if (defined($fileinfo->{archivename})) {
+ $mediafile{content} = download_mw_mediafile_from_archive($fileinfo->{url});
+ } else {
+ $mediafile{content} = download_mw_mediafile($mw_filename);
+ }
+ }
+ return %mediafile;
+}
+
+sub download_mw_mediafile_from_archive {
+ my $url = shift;
+ my $file;
+
+ my $ua = LWP::UserAgent->new;
+ my $response = $ua->get($url);
+ if ($response->code) {
+ $file = $response->decoded_content;
+ } else {
+ print STDERR "Error downloading a file from archive.\n";
+ }
+
+ return $file;
+}
+
+sub download_mw_mediafile {
+ my $filename = shift;
+
+ $mediawiki->{config}->{files_url} = $url;
+
+ my $file_content = $mediawiki->download( { title => $filename } );
+ if (!defined($file_content)) {
+ print STDERR "\tFile \'$filename\' could not be downloaded.\n";
+ exit 1;
+ } elsif ($file_content eq "") {
+ print STDERR "\tFile \'$filename\' does not exist on the wiki.\n";
+ exit 1;
+ } else {
+ return $file_content;
+ }
+}
+
sub get_last_local_revision {
# Get note regarding last mediawiki revision
my $note = run_git("notes --ref=$remotename/mediawiki show refs/mediawiki/$remotename/master 2>/dev/null");
@@ -482,6 +642,16 @@ sub literal_data {
print STDOUT "data ", bytes::length($content), "\n", $content;
}
+sub literal_data_raw {
+ # Output possibly binary content.
+ my ($content) = @_;
+ # Avoid confusion between size in bytes and in characters
+ utf8::downgrade($content);
+ binmode STDOUT, ":raw";
+ print STDOUT "data ", bytes::length($content), "\n", $content;
+ binmode STDOUT, ":utf8";
+}
+
sub mw_capabilities {
# Revisions are imported to the private namespace
# refs/mediawiki/$remotename/ by the helper and fetched into
@@ -569,6 +739,11 @@ sub import_file_revision {
my %commit = %{$commit};
my $full_import = shift;
my $n = shift;
+ my $mediafile = shift;
+ my %mediafile;
+ if ($mediafile) {
+ %mediafile = %{$mediafile};
+ }
my $title = $commit{title};
my $comment = $commit{comment};
@@ -588,6 +763,10 @@ sub import_file_revision {
if ($content ne DELETED_CONTENT) {
print STDOUT "M 644 inline $title.mw\n";
literal_data($content);
+ if (%mediafile) {
+ print STDOUT "M 644 inline $mediafile{title}\n";
+ literal_data_raw($mediafile{content});
+ }
print STDOUT "\n\n";
} else {
print STDOUT "D $title.mw\n";
@@ -683,12 +862,11 @@ sub mw_import_ref {
$n++;
+ my $page_title = $result->{query}->{pages}->{$pagerevid->{pageid}}->{title};
my %commit;
$commit{author} = $rev->{user} || 'Anonymous';
$commit{comment} = $rev->{comment} || '*Empty MediaWiki Message*';
- $commit{title} = mediawiki_smudge_filename(
- $result->{query}->{pages}->{$pagerevid->{pageid}}->{title}
- );
+ $commit{title} = mediawiki_smudge_filename($page_title);
$commit{mw_revision} = $pagerevid->{revid};
$commit{content} = mediawiki_smudge($rev->{'*'});
@@ -699,9 +877,25 @@ sub mw_import_ref {
}
$commit{date} = DateTime::Format::ISO8601->parse_datetime($last_timestamp);
- print STDERR "$n/", scalar(@revisions), ": Revision #$pagerevid->{revid} of $commit{title}\n";
+ # Differentiate classic pages from media files.
+ my @prefix = split(":", $page_title);
- import_file_revision(\%commit, ($fetch_from == 1), $n);
+ my %mediafile;
+ if ($prefix[0] eq "File" || $prefix[0] eq "Image") {
+ # The name of the file is the same as that of the media page.
+ my $filename = $page_title;
+ %mediafile = get_mw_mediafile_for_page_revision($filename, $rev->{timestamp});
+ }
+ # If this is a revision of the media page for a new version
+ # of a file, do one common commit for both the file and the
+ # media page. Otherwise, commit only the page.
+ print STDERR "$n/", scalar(@revisions), ": Revision #$pagerevid->{revid} of $commit{title}\n";
+ if (%mediafile) {
+ print STDERR "\tDownloading file $mediafile{title}, version $mediafile{timestamp}\n";
+ import_file_revision(\%commit, ($fetch_from == 1), $n, \%mediafile);
+ } else {
+ import_file_revision(\%commit, ($fetch_from == 1), $n);
+ }
}
if ($fetch_from == 1 && $n == 0) {
@@ -1006,3 +1200,50 @@ sub get_allowed_file_extensions {
return %hashFile;
}
+
+# Return MediaWiki id for a canonical namespace name.
+# Ex.: "File", "Project".
+# Look for the namespace id in the local configuration
+# variables; if it is not found, ask the MW API.
+sub get_mw_namespace_id {
+ mw_connect_maybe();
+ my $name = shift;
+
+ if (!exists $namespace_id{$name}) {
+ # Check the configuration file to see whether the record for that
+ # namespace is already stored. Namespaces are stored in the form:
+ # "Name_of_namespace:Id_namespace", ex.: "File:6".
+ my @temp = split(/[ \n]/, run_git("config --get-all remote."
+ . $remotename .".namespaces"));
+ chomp(@temp);
+ foreach my $ns (@temp) {
+ my ($n, $s) = split(/:/, $ns);
+ $namespace_id{$n} = $s;
+ }
+ }
+
+ if (!exists $namespace_id{$name}) {
+ # NS not found => get namespace id from MW and store it in
+ # configuration file.
+ my $query = {
+ action => 'query',
+ meta => 'siteinfo',
+ siprop => 'namespaces'
+ };
+ my $result = $mediawiki->api($query);
+
+ while (my ($id, $ns) = each(%{$result->{query}->{namespaces}})) {
+ if (defined($ns->{canonical}) && ($ns->{canonical} eq $name)) {
+ run_git("config --add remote.". $remotename
+ .".namespaces ". $name .":". $ns->{id});
+ $namespace_id{$name} = $ns->{id};
+ }
+ }
+ }
+
+ if (exists $namespace_id{$name}) {
+ return $namespace_id{$name};
+ } else {
+ die "No such namespace $name on MediaWiki.";
+ }
+}
--
1.7.11.1.147.g47a574d
^ permalink raw reply related [flat|nested] 18+ messages in thread
* [PATCH v4] git-remote-mediawiki: import "File:" attachments
2012-06-27 14:21 ` [PATCH 5/5 v3] " Matthieu Moy
@ 2012-07-04 12:53 ` Matthieu Moy
2012-07-05 6:58 ` Junio C Hamano
0 siblings, 1 reply; 18+ messages in thread
From: Matthieu Moy @ 2012-07-04 12:53 UTC (permalink / raw)
To: git, gitster
Cc: Pavel Volek, NGUYEN Kim Thuat, ROUCHER IGLESIAS Javier, Matthieu Moy
From: Pavel Volek <Pavel.Volek@ensimag.imag.fr>
Add the symmetrical feature to the "File:" export support in the previous
patch. Download files from the wiki as needed, and feed them into the
fast-import stream. Import both the file itself and the corresponding
description page.
Signed-off-by: Pavel Volek <Pavel.Volek@ensimag.imag.fr>
Signed-off-by: NGUYEN Kim Thuat <Kim-Thuat.Nguyen@ensimag.imag.fr>
Signed-off-by: ROUCHER IGLESIAS Javier <roucherj@ensimag.imag.fr>
Signed-off-by: Matthieu Moy <Matthieu.Moy@imag.fr>
Signed-off-by: Junio C Hamano <gitster@pobox.com>
---
This is meant to replace commit 6a9e55b0fc5df40 in branch
mm/mediawiki-file-attachments in pu.
The main problem I fixed is support for non-English wikis, where
media files can live in namespaces whose names differ from Image and
File. While I was there, I reworked the code in various places; the
new code should be simpler and easier to read.
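To make the non-English case concrete, here is a standalone sketch
(an illustration with made-up titles and a made-up id, not an excerpt
from the patch). The siteinfo query returns both the canonical
namespace name and its local alias, so get_mw_namespace_id() resolves
e.g. the French "Fichier" and the canonical "File" to the same id:

	use strict;
	use warnings;

	# Hypothetical cache content; real ids depend on the wiki.
	my %namespace_id = ("File" => 6, "Fichier" => 6);

	my $page_title = "Fichier:Logo.png";  # hypothetical media page title
	my ($namespace, $filename) = $page_title =~ /^([^:]*):(.*)$/;
	if ($namespace && $namespace_id{$namespace} == $namespace_id{"File"}) {
		print "media file: $filename\n";  # prints "media file: Logo.png"
	}

(Media import itself stays opt-in: nothing is fetched unless
remote.<name>.mediaimport is set to true.)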
contrib/mw-to-git/git-remote-mediawiki | 241 +++++++++++++++++++++++++++++++--
1 file changed, 232 insertions(+), 9 deletions(-)
diff --git a/contrib/mw-to-git/git-remote-mediawiki b/contrib/mw-to-git/git-remote-mediawiki
index 361dbb1..063a978 100755
--- a/contrib/mw-to-git/git-remote-mediawiki
+++ b/contrib/mw-to-git/git-remote-mediawiki
@@ -13,9 +13,6 @@
#
# Known limitations:
#
-# - Only wiki pages are managed, no support for [[File:...]]
-# attachments.
-#
# - Poor performance in the best case: it takes forever to check
# whether we're up-to-date (on fetch or push) or to fetch a few
# revisions from a large wiki, because we use exclusively a
@@ -72,6 +69,11 @@ chomp(@tracked_pages);
my @tracked_categories = split(/[ \n]/, run_git("config --get-all remote.". $remotename .".categories"));
chomp(@tracked_categories);
+# Import media files too.
+my $import_media = run_git("config --get --bool remote.". $remotename .".mediaimport");
+chomp($import_media);
+$import_media = ($import_media eq "true");
+
my $wiki_login = run_git("config --get remote.". $remotename .".mwLogin");
# TODO: ideally, this should be able to read from keyboard, but we're
# inside a remote helper, so our stdin is connected to git, not to a
@@ -261,7 +263,13 @@ sub mw_connect_maybe {
## Functions for listing pages on the remote wiki
sub get_mw_tracked_pages {
my $pages = shift;
- my @some_pages = @tracked_pages;
+ get_mw_page_list(\@tracked_pages, $pages);
+}
+
+sub get_mw_page_list {
+ my $page_list = shift;
+ my $pages = shift;
+ my @some_pages = @$page_list;
while (@some_pages) {
my $last = 50;
if ($#some_pages < $last) {
@@ -363,6 +371,14 @@ sub get_mw_pages {
if (!$user_defined) {
get_mw_all_pages(\%pages);
}
+ if ($import_media) {
+ print STDERR "Getting media files for selected pages...\n";
+ if ($user_defined) {
+ get_linked_mediafiles(\%pages);
+ } else {
+ get_all_mediafiles(\%pages);
+ }
+ }
return values(%pages);
}
@@ -379,6 +395,123 @@ sub run_git {
}
+sub get_all_mediafiles {
+ my $pages = shift;
+ # Fetch the list of media file pages from the API; they live
+ # in a separate namespace, and only one namespace can be
+ # queried at a time.
+ my $mw_pages = $mediawiki->list({
+ action => 'query',
+ list => 'allpages',
+ apnamespace => get_mw_namespace_id("File"),
+ aplimit => 'max'
+ });
+ if (!defined($mw_pages)) {
+ print STDERR "fatal: could not get the list of pages for media files.\n";
+ print STDERR "fatal: '$url' does not appear to be a mediawiki\n";
+ print STDERR "fatal: make sure '$url/api.php' is a valid page.\n";
+ exit 1;
+ }
+ foreach my $page (@{$mw_pages}) {
+ $pages->{$page->{title}} = $page;
+ }
+}
+
+sub get_linked_mediafiles {
+ my $pages = shift;
+ my @titles = map $_->{title}, values(%{$pages});
+
+ # The query is split into small batches because of the MW API limit
+ # on the number of links returned (500 links max).
+ my $batch = 10;
+ while (@titles) {
+ if ($#titles < $batch) {
+ $batch = $#titles;
+ }
+ my @slice = @titles[0..$batch];
+
+ # pattern 'page1|page2|...' required by the API
+ my $mw_titles = join('|', @slice);
+
+ # Media files can be included in or linked from a page;
+ # fetch all of them.
+ my $query = {
+ action => 'query',
+ prop => 'links|images',
+ titles => $mw_titles,
+ plnamespace => get_mw_namespace_id("File"),
+ pllimit => 'max'
+ };
+ my $result = $mediawiki->api($query);
+
+ while (my ($id, $page) = each(%{$result->{query}->{pages}})) {
+ my @media_titles;
+ if (defined($page->{links})) {
+ my @link_titles = map $_->{title}, @{$page->{links}};
+ push(@media_titles, @link_titles);
+ }
+ if (defined($page->{images})) {
+ my @image_titles = map $_->{title}, @{$page->{images}};
+ push(@media_titles, @image_titles);
+ }
+ if (@media_titles) {
+ get_mw_page_list(\@media_titles, $pages);
+ }
+ }
+
+ @titles = @titles[($batch+1)..$#titles];
+ }
+}
+
+sub get_mw_mediafile_for_page_revision {
+ # Name of the file on the wiki, with the prefix.
+ my $filename = shift;
+ my $timestamp = shift;
+ my %mediafile;
+
+ # Check whether a media file with the given timestamp
+ # exists on MediaWiki. If so, download the file.
+ my $query = {
+ action => 'query',
+ prop => 'imageinfo',
+ titles => "File:" . $filename,
+ iistart => $timestamp,
+ iiend => $timestamp,
+ iiprop => 'timestamp|archivename|url',
+ iilimit => 1
+ };
+ my $result = $mediawiki->api($query);
+
+ my ($fileid, $file) = each( %{$result->{query}->{pages}} );
+ # If not defined, there is no revision of the file for the
+ # given timestamp.
+ if (defined($file->{imageinfo})) {
+ $mediafile{title} = $filename;
+
+ my $fileinfo = pop(@{$file->{imageinfo}});
+ $mediafile{timestamp} = $fileinfo->{timestamp};
+ # Mediawiki::API's download function doesn't support https URLs
+ # and can't download old versions of files.
+ print STDERR "\tDownloading file $mediafile{title}, version $mediafile{timestamp}\n";
+ $mediafile{content} = download_mw_mediafile($fileinfo->{url});
+ }
+ return %mediafile;
+}
+
+sub download_mw_mediafile {
+ my $url = shift;
+
+ my $response = $mediawiki->{ua}->get($url);
+ if ($response->code == 200) {
+ return $response->decoded_content;
+ } else {
+ print STDERR "Error downloading mediafile from :\n";
+ print STDERR "URL: $url\n";
+ print STDERR "Server response: " . $response->code . " " . $response->message . "\n";
+ exit 1;
+ }
+}
+
sub get_last_local_revision {
# Get note regarding last mediawiki revision
my $note = run_git("notes --ref=$remotename/mediawiki show refs/mediawiki/$remotename/master 2>/dev/null");
@@ -482,6 +615,16 @@ sub literal_data {
print STDOUT "data ", bytes::length($content), "\n", $content;
}
+sub literal_data_raw {
+ # Output possibly binary content.
+ my ($content) = @_;
+ # Avoid confusion between size in bytes and in characters
+ utf8::downgrade($content);
+ binmode STDOUT, ":raw";
+ print STDOUT "data ", bytes::length($content), "\n", $content;
+ binmode STDOUT, ":utf8";
+}
+
sub mw_capabilities {
# Revisions are imported to the private namespace
# refs/mediawiki/$remotename/ by the helper and fetched into
@@ -569,6 +712,11 @@ sub import_file_revision {
my %commit = %{$commit};
my $full_import = shift;
my $n = shift;
+ my $mediafile = shift;
+ my %mediafile;
+ if ($mediafile) {
+ %mediafile = %{$mediafile};
+ }
my $title = $commit{title};
my $comment = $commit{comment};
@@ -588,6 +736,10 @@ sub import_file_revision {
if ($content ne DELETED_CONTENT) {
print STDOUT "M 644 inline $title.mw\n";
literal_data($content);
+ if (%mediafile) {
+ print STDOUT "M 644 inline $mediafile{title}\n";
+ literal_data_raw($mediafile{content});
+ }
print STDOUT "\n\n";
} else {
print STDOUT "D $title.mw\n";
@@ -683,12 +835,11 @@ sub mw_import_ref {
$n++;
+ my $page_title = $result->{query}->{pages}->{$pagerevid->{pageid}}->{title};
my %commit;
$commit{author} = $rev->{user} || 'Anonymous';
$commit{comment} = $rev->{comment} || '*Empty MediaWiki Message*';
- $commit{title} = mediawiki_smudge_filename(
- $result->{query}->{pages}->{$pagerevid->{pageid}}->{title}
- );
+ $commit{title} = mediawiki_smudge_filename($page_title);
$commit{mw_revision} = $pagerevid->{revid};
$commit{content} = mediawiki_smudge($rev->{'*'});
@@ -699,9 +850,17 @@ sub mw_import_ref {
}
$commit{date} = DateTime::Format::ISO8601->parse_datetime($last_timestamp);
+ # Differentiate classic pages from media files.
+ my ($namespace, $filename) = $page_title =~ /^([^:]*):(.*)$/;
+ my %mediafile;
+ if ($namespace && get_mw_namespace_id($namespace) == get_mw_namespace_id("File")) {
+ %mediafile = get_mw_mediafile_for_page_revision($filename, $rev->{timestamp});
+ }
+ # If this is a revision of the media page for a new version
+ # of a file, do one common commit for both the file and the
+ # media page. Otherwise, commit only the page.
print STDERR "$n/", scalar(@revisions), ": Revision #$pagerevid->{revid} of $commit{title}\n";
-
- import_file_revision(\%commit, ($fetch_from == 1), $n);
+ import_file_revision(\%commit, ($fetch_from == 1), $n, \%mediafile);
}
if ($fetch_from == 1 && $n == 0) {
@@ -1006,3 +1165,67 @@ sub get_allowed_file_extensions {
return %hashFile;
}
+
+# In-memory cache for MediaWiki namespace ids.
+my %namespace_id;
+
+# Namespaces whose id is cached in the configuration file
+# (to avoid duplicates)
+my %cached_mw_namespace_id;
+
+# Return MediaWiki id for a canonical namespace name.
+# Ex.: "File", "Project".
+sub get_mw_namespace_id {
+ mw_connect_maybe();
+ my $name = shift;
+
+ if (!exists $namespace_id{$name}) {
+ # Check the configuration file to see whether the record for that
+ # namespace is already cached. Namespaces are stored in the form:
+ # "Name_of_namespace:Id_namespace", ex.: "File:6".
+ my @temp = split(/[ \n]/, run_git("config --get-all remote."
+ . $remotename .".namespaceCache"));
+ chomp(@temp);
+ foreach my $ns (@temp) {
+ my ($n, $id) = split(/:/, $ns);
+ $namespace_id{$n} = $id;
+ $cached_mw_namespace_id{$n} = 1;
+ }
+ }
+
+ if (!exists $namespace_id{$name}) {
+ print STDERR "Namespace $name not found in cache, querying the wiki ...\n";
+ # NS not found => get namespace id from MW and store it in
+ # configuration file.
+ my $query = {
+ action => 'query',
+ meta => 'siteinfo',
+ siprop => 'namespaces'
+ };
+ my $result = $mediawiki->api($query);
+
+ while (my ($id, $ns) = each(%{$result->{query}->{namespaces}})) {
+ if (defined($ns->{id}) && defined($ns->{canonical})) {
+ $namespace_id{$ns->{canonical}} = $ns->{id};
+ if ($ns->{'*'}) {
+ # alias (e.g. French Fichier: as alias for canonical File:)
+ $namespace_id{$ns->{'*'}} = $ns->{id};
+ }
+ }
+ }
+ }
+
+ my $id = $namespace_id{$name};
+
+ if (defined $id) {
+ # Store explicitly requested namespaces on disk
+ if (!exists $cached_mw_namespace_id{$name}) {
+ run_git("config --add remote.". $remotename
+ .".namespaceCache \"". $name .":". $id ."\"");
+ $cached_mw_namespace_id{$name} = 1;
+ }
+ return $id;
+ } else {
+ die "No such namespace $name on MediaWiki.";
+ }
+}
--
1.7.11.1.147.g47a574d
^ permalink raw reply related [flat|nested] 18+ messages in thread
* Re: [PATCH v4] git-remote-mediawiki: import "File:" attachments
2012-07-04 12:53 ` [PATCH v4] " Matthieu Moy
@ 2012-07-05 6:58 ` Junio C Hamano
2012-07-05 7:42 ` Matthieu Moy
0 siblings, 1 reply; 18+ messages in thread
From: Junio C Hamano @ 2012-07-05 6:58 UTC (permalink / raw)
To: Matthieu Moy; +Cc: git, Pavel Volek, NGUYEN Kim Thuat, ROUCHER IGLESIAS Javier
Matthieu Moy <Matthieu.Moy@imag.fr> writes:
> This is meant to replace commit 6a9e55b0fc5df40 in branch
> mm/mediawiki-file-attachments in pu.
Bad timing for our mails to cross; it is already on 'next'.
Would the following be a good "incremental update" on top of the
named commit?
-- >8 --
From: Matthieu Moy <Matthieu.Moy@imag.fr>
Date: Wed, 4 Jul 2012 14:53:36 +0200
Subject: [PATCH] git-remote-mediawiki: improve support for non-English Wikis
Media files can live in namespaces whose names differ from Image
and File. While at it, rework the code to make it simpler and easier
to read.
Signed-off-by: Matthieu Moy <Matthieu.Moy@imag.fr>
Signed-off-by: Junio C Hamano <gitster@pobox.com>
---
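As an aside, the cache round-trip provided by the new namespaceCache
key amounts to the following (a sketch with hypothetical values; the
actual ids depend on the wiki). The first successful lookup stores an
entry such as namespaceCache = "File:6" in .git/config, and later runs
parse it back instead of querying siteinfo:

	use strict;
	use warnings;

	# Hypothetical output of 'git config --get-all remote.origin.namespaceCache'
	my @temp = ("File:6", "Project:4");
	my %namespace_id;
	foreach my $ns (@temp) {
		my ($n, $id) = split(/:/, $ns);
		$namespace_id{$n} = $id;
	}
	print $namespace_id{"File"}, "\n";    # prints 6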
contrib/mw-to-git/git-remote-mediawiki | 140 ++++++++++++++-------------------
1 file changed, 61 insertions(+), 79 deletions(-)
diff --git a/contrib/mw-to-git/git-remote-mediawiki b/contrib/mw-to-git/git-remote-mediawiki
index 76b78bc..063a978 100755
--- a/contrib/mw-to-git/git-remote-mediawiki
+++ b/contrib/mw-to-git/git-remote-mediawiki
@@ -33,7 +33,6 @@
use strict;
use MediaWiki::API;
use DateTime::Format::ISO8601;
-use FileHandle;
# By default, use UTF-8 to communicate with Git and the user
binmode STDERR, ":utf8";
@@ -90,9 +89,6 @@ my $shallow_import = run_git("config --get --bool remote.". $remotename .".shall
chomp($shallow_import);
$shallow_import = ($shallow_import eq "true");
-# Cache for MediaWiki namespace ids.
-my %namespace_id;
-
# Dumb push: don't update notes and mediawiki ref to reflect the last push.
#
# Configurable with mediawiki.dumbPush, or per-remote with
@@ -267,7 +263,13 @@ sub mw_connect_maybe {
## Functions for listing pages on the remote wiki
sub get_mw_tracked_pages {
my $pages = shift;
- my @some_pages = @tracked_pages;
+ get_mw_page_list(\@tracked_pages, $pages);
+}
+
+sub get_mw_page_list {
+ my $page_list = shift;
+ my $pages = shift;
+ my @some_pages = @$page_list;
while (@some_pages) {
my $last = 50;
if ($#some_pages < $last) {
@@ -443,17 +445,17 @@ sub get_linked_mediafiles {
my $result = $mediawiki->api($query);
while (my ($id, $page) = each(%{$result->{query}->{pages}})) {
- my @titles;
+ my @media_titles;
if (defined($page->{links})) {
my @link_titles = map $_->{title}, @{$page->{links}};
- push(@titles, @link_titles);
+ push(@media_titles, @link_titles);
}
if (defined($page->{images})) {
my @image_titles = map $_->{title}, @{$page->{images}};
- push(@titles, @image_titles);
+ push(@media_titles, @image_titles);
}
- if (@titles) {
- get_mw_first_pages(\@titles, \%{$pages});
+ if (@media_titles) {
+ get_mw_page_list(\@media_titles, $pages);
}
}
@@ -463,16 +465,16 @@ sub get_linked_mediafiles {
sub get_mw_mediafile_for_page_revision {
# Name of the file on the wiki, with the prefix.
- my $mw_filename = shift;
+ my $filename = shift;
my $timestamp = shift;
my %mediafile;
- # Search MediaWiki for a media file with the given
- # timestamp. If one exists, download the file.
+ # Check whether a media file with the given timestamp
+ # exists on MediaWiki. If so, download the file.
my $query = {
action => 'query',
prop => 'imageinfo',
- titles => $mw_filename,
+ titles => "File:" . $filename,
iistart => $timestamp,
iiend => $timestamp,
iiprop => 'timestamp|archivename|url',
@@ -480,62 +482,33 @@ sub get_mw_mediafile_for_page_revision {
};
my $result = $mediawiki->api($query);
- my ($fileid, $file) = each ( %{$result->{query}->{pages}} );
+ my ($fileid, $file) = each( %{$result->{query}->{pages}} );
# If not defined, there is no revision of the file for the
# given timestamp.
if (defined($file->{imageinfo})) {
- # Get real name of media file.
- my $filename;
- if (index($mw_filename, 'File:') == 0) {
- $filename = substr $mw_filename, 5;
- } else {
- $filename = substr $mw_filename, 6;
- }
$mediafile{title} = $filename;
my $fileinfo = pop(@{$file->{imageinfo}});
$mediafile{timestamp} = $fileinfo->{timestamp};
- # If this is an old version of the file, the file has to be
- # obtained from the archive. Otherwise it can be downloaded
- # by the MediaWiki API download() function.
- if (defined($fileinfo->{archivename})) {
- $mediafile{content} = download_mw_mediafile_from_archive($fileinfo->{url});
- } else {
- $mediafile{content} = download_mw_mediafile($mw_filename);
- }
+ # Mediawiki::API's download function doesn't support https URLs
+ # and can't download old versions of files.
+ print STDERR "\tDownloading file $mediafile{title}, version $mediafile{timestamp}\n";
+ $mediafile{content} = download_mw_mediafile($fileinfo->{url});
}
return %mediafile;
}
-sub download_mw_mediafile_from_archive {
+sub download_mw_mediafile {
my $url = shift;
- my $file;
- my $ua = LWP::UserAgent->new;
- my $response = $ua->get($url);
- if ($response->code) {
- $file = $response->decoded_content;
+ my $response = $mediawiki->{ua}->get($url);
+ if ($response->code == 200) {
+ return $response->decoded_content;
} else {
- print STDERR "Error downloading a file from archive.\n";
- }
-
- return $file;
-}
-
-sub download_mw_mediafile {
- my $filename = shift;
-
- $mediawiki->{config}->{files_url} = $url;
-
- my $file_content = $mediawiki->download( { title => $filename } );
- if (!defined($file_content)) {
- print STDERR "\tFile \'$filename\' could not be downloaded.\n";
- exit 1;
- } elsif ($file_content eq "") {
- print STDERR "\tFile \'$filename\' does not exist on the wiki.\n";
+ print STDERR "Error downloading mediafile from :\n";
+ print STDERR "URL: $url\n";
+ print STDERR "Server response: " . $response->code . " " . $response->message . "\n";
exit 1;
- } else {
- return $file_content;
}
}
@@ -878,24 +851,16 @@ sub mw_import_ref {
$commit{date} = DateTime::Format::ISO8601->parse_datetime($last_timestamp);
# Differentiate classic pages from media files.
- my @prefix = split(":", $page_title);
-
+ my ($namespace, $filename) = $page_title =~ /^([^:]*):(.*)$/;
my %mediafile;
- if ($prefix[0] eq "File" || $prefix[0] eq "Image") {
- # The name of the file is the same as that of the media page.
- my $filename = $page_title;
+ if ($namespace && get_mw_namespace_id($namespace) == get_mw_namespace_id("File")) {
%mediafile = get_mw_mediafile_for_page_revision($filename, $rev->{timestamp});
}
# If this is a revision of the media page for a new version
# of a file, do one common commit for both the file and the
# media page. Otherwise, commit only the page.
print STDERR "$n/", scalar(@revisions), ": Revision #$pagerevid->{revid} of $commit{title}\n";
- if (%mediafile) {
- print STDERR "\tDownloading file $mediafile{title}, version $mediafile{timestamp}\n";
- import_file_revision(\%commit, ($fetch_from == 1), $n, \%mediafile);
- } else {
- import_file_revision(\%commit, ($fetch_from == 1), $n);
- }
+ import_file_revision(\%commit, ($fetch_from == 1), $n, \%mediafile);
}
if ($fetch_from == 1 && $n == 0) {
@@ -1201,28 +1166,35 @@ sub get_allowed_file_extensions {
return %hashFile;
}
+# In-memory cache for MediaWiki namespace ids.
+my %namespace_id;
+
+# Namespaces whose id is cached in the configuration file
+# (to avoid duplicates)
+my %cached_mw_namespace_id;
+
# Return MediaWiki id for a canonical namespace name.
# Ex.: "File", "Project".
-# Look for the namespace id in the local configuration
-# variables; if it is not found, ask the MW API.
sub get_mw_namespace_id {
mw_connect_maybe();
my $name = shift;
if (!exists $namespace_id{$name}) {
# Check the configuration file to see whether the record for that
- # namespace is already stored. Namespaces are stored in the form:
+ # namespace is already cached. Namespaces are stored in the form:
# "Name_of_namespace:Id_namespace", ex.: "File:6".
my @temp = split(/[ \n]/, run_git("config --get-all remote."
- . $remotename .".namespaces"));
+ . $remotename .".namespaceCache"));
chomp(@temp);
foreach my $ns (@temp) {
- my ($n, $s) = split(/:/, $ns);
- $namespace_id{$n} = $s;
+ my ($n, $id) = split(/:/, $ns);
+ $namespace_id{$n} = $id;
+ $cached_mw_namespace_id{$n} = 1;
}
}
if (!exists $namespace_id{$name}) {
+ print STDERR "Namespace $name not found in cache, querying the wiki ...\n";
# NS not found => get namespace id from MW and store it in
# configuration file.
my $query = {
@@ -1233,16 +1205,26 @@ sub get_mw_namespace_id {
my $result = $mediawiki->api($query);
while (my ($id, $ns) = each(%{$result->{query}->{namespaces}})) {
- if (defined($ns->{canonical}) && ($ns->{canonical} eq $name)) {
- run_git("config --add remote.". $remotename
- .".namespaces ". $name .":". $ns->{id});
- $namespace_id{$name} = $ns->{id};
- }
+ if (defined($ns->{id}) && defined($ns->{canonical})) {
+ $namespace_id{$ns->{canonical}} = $ns->{id};
+ if ($ns->{'*'}) {
+ # alias (e.g. French Fichier: as alias for canonical File:)
+ $namespace_id{$ns->{'*'}} = $ns->{id};
+ }
+ }
}
}
- if (exists $namespace_id{$name}) {
- return $namespace_id{$name};
+ my $id = $namespace_id{$name};
+
+ if (defined $id) {
+ # Store explicitly requested namespaces on disk
+ if (!exists $cached_mw_namespace_id{$name}) {
+ run_git("config --add remote.". $remotename
+ .".namespaceCache \"". $name .":". $id ."\"");
+ $cached_mw_namespace_id{$name} = 1;
+ }
+ return $id;
} else {
die "No such namespace $name on MediaWiki.";
}
--
1.7.11.1.243.g7462176
^ permalink raw reply related [flat|nested] 18+ messages in thread
* Re: [PATCH v4] git-remote-mediawiki: import "File:" attachments
2012-07-05 6:58 ` Junio C Hamano
@ 2012-07-05 7:42 ` Matthieu Moy
0 siblings, 0 replies; 18+ messages in thread
From: Matthieu Moy @ 2012-07-05 7:42 UTC (permalink / raw)
To: Junio C Hamano
Cc: git, Pavel Volek, NGUYEN Kim Thuat, ROUCHER IGLESIAS Javier
Junio C Hamano <gitster@pobox.com> writes:
> Matthieu Moy <Matthieu.Moy@imag.fr> writes:
>
>> This is meant to replace commit 6a9e55b0fc5df40 in branch
>> mm/mediawiki-file-attachments in pu.
>
> Bad timing for our mails to cross; it is already on 'next'.
>
> Would the following be a good "incremental update" on top of the
> named commit?
Yes, that is perfect.
Thanks,
--
Matthieu Moy
http://www-verimag.imag.fr/~moy/
^ permalink raw reply [flat|nested] 18+ messages in thread
end of thread, other threads:[~2012-07-05 7:43 UTC | newest]
Thread overview: 18+ messages
2012-06-26 16:04 [PATCH 0/5] git-remote-mediawiki: support File: import and export Matthieu Moy
2012-06-26 16:04 ` [PATCH 1/5] git-remote-mediawiki: don't compute the diff when getting commit message Matthieu Moy
2012-06-26 17:47 ` Junio C Hamano
2012-06-27 8:32 ` Matthieu Moy
2012-06-26 16:04 ` [PATCH 2/5] git-remote-mediawiki: don't "use encoding 'utf8';" Matthieu Moy
2012-06-26 16:04 ` [PATCH 3/5] git-remote-mediawiki: send "File:" attachments to a remote wiki Matthieu Moy
2012-06-26 16:04 ` [PATCH 4/5] git-remote-mediawiki: split get_mw_pages into smaller functions Matthieu Moy
2012-06-26 16:04 ` [PATCH 5/5] git-remote-mediawiki: import "File:" attachments Matthieu Moy
2012-06-27 9:10 ` [PATCH 0/5 v2] git-remote-mediawiki: support File: import and export Matthieu Moy
2012-06-27 9:10 ` [PATCH 1/5] git-remote-mediawiki: don't compute the diff when getting commit message Matthieu Moy
2012-06-27 9:10 ` [PATCH 2/5] git-remote-mediawiki: don't "use encoding 'utf8';" Matthieu Moy
2012-06-27 9:10 ` [PATCH 3/5] git-remote-mediawiki: send "File:" attachments to a remote wiki Matthieu Moy
2012-06-27 9:10 ` [PATCH 4/5] git-remote-mediawiki: split get_mw_pages into smaller functions Matthieu Moy
2012-06-27 9:10 ` [PATCH 5/5] git-remote-mediawiki: import "File:" attachments Matthieu Moy
2012-06-27 14:21 ` [PATCH 5/5 v3] " Matthieu Moy
2012-07-04 12:53 ` [PATCH v4] " Matthieu Moy
2012-07-05 6:58 ` Junio C Hamano
2012-07-05 7:42 ` Matthieu Moy