SYNOPSIS
use MediaWiki::Bot qw(:constants);
my $bot = MediaWiki::Bot->new({
assert => 'bot',
host => 'de.wikimedia.org',
login_data => { username => "Mike's bot account", password => "password" },
});
my $revid = $bot->get_last("User:Mike.lifeguard/sandbox", "Mike.lifeguard");
print "Reverting to $revid\n" if defined($revid);
$bot->revert('User:Mike.lifeguard', $revid, 'rvv');
DESCRIPTION
MediaWiki::Bot is a framework that can be used to write bots which
interface with the MediaWiki API (http://en.wikipedia.org/w/api.php).
METHODS
Initialization
new
my $bot = MediaWiki::Bot->new({
host => 'en.wikipedia.org',
operator => 'Mike.lifeguard',
});
Calling MediaWiki::Bot->new() will create a new MediaWiki::Bot object.
The only parameter is a hashref with keys:
* agent sets a custom useragent. It is recommended to use operator
instead, which is all we need to do the right thing for you. If you
really want to do it yourself, see
https://meta.wikimedia.org/wiki/User-agent_policy for guidance on
what information must be included.
* assert sets a parameter for the AssertEdit extension (commonly
'bot')
Refer to http://mediawiki.org/wiki/Extension:AssertEdit.
* operator allows the bot to send you a message when it fails an
assert. This is also the recommended way to customize the user agent
string, which is required by the Wikimedia Foundation. A warning will
be emitted if you omit this.
* maxlag allows you to set the maxlag parameter (default is the
recommended 5s).
Please refer to the MediaWiki
<https://www.mediawiki.org/wiki/Manual:Maxlag_parameter>
documentation prior to changing this from the default.
* protocol allows you to specify 'http' or 'https' (default is
'https')
* host sets the domain name of the wiki to connect to (e.g.
'en.wikipedia.org')
* path sets the path to api.php (with no leading or trailing slash,
e.g. 'w')
* login_data is a hashref of credentials to pass to "login" (see
"login" for more information).
* debug - whether to provide debug output (default is 0).
1 provides some more warnings; 2 provides further detail on internal
operations.
For example:
my $bot = MediaWiki::Bot->new({
assert => 'bot',
protocol => 'https',
host => 'en.wikimedia.org',
agent => sprintf(
'PerlWikiBot/%s (https://metacpan.org/MediaWiki::Bot; User:Mike.lifeguard)',
MediaWiki::Bot->VERSION
),
login_data => { username => "Mike's bot account", password => "password" },
});
For backward compatibility, you can specify up to three parameters:
my $bot = MediaWiki::Bot->new('My custom useragent string', $assert, $operator);
This form is deprecated, will never do auto-login or autoconfiguration,
and emits deprecation warnings.
For further reading:
* MediaWiki::Bot wiki
<https://github.com/MediaWiki-Bot/MediaWiki-Bot/wiki>
* Installing MediaWiki::Bot
<https://github.com/MediaWiki-Bot/MediaWiki-Bot/wiki/Install>
* Creating a new bot
<https://github.com/MediaWiki-Bot/MediaWiki-Bot/wiki/Creating-a-new-bot>
* Setting the wiki
<https://github.com/MediaWiki-Bot/MediaWiki-Bot/wiki/Setting-the-wiki>
* Where is api.php
<https://github.com/MediaWiki-Bot/MediaWiki-Bot/wiki/Where-is-api.php>
set_wiki
Set what wiki to use. The parameter is a hashref with keys:
* host - the domain name (e.g. en.wikipedia.org)
* path - the part of the path before api.php (e.g. 'w')
* protocol is either 'http' or 'https'.
If you don't set a parameter, its previous value is used. If it has
never been set, the default settings are 'https', 'en.wikipedia.org'
and 'w'.
For example:
$bot->set_wiki({
protocol => 'https',
host => 'en.wikimedia.org',
path => 'wikipedia/meta/w',
});
For backward compatibility, you can specify up to two parameters:
$bot->set_wiki($host, $path);
This form is deprecated and will emit deprecation warnings.
login
This method takes a hashref with keys username and password at a
minimum. See "Single User Login" and "Basic authentication" for
additional options.
Logs the $username in, optionally using $password. First, an attempt
will be made to use cookies to log in. If this fails, an attempt will
be made to use the password provided to log in, if any. If the login
was successful, returns true; false otherwise.
$bot->login({
username => $username,
password => $password,
}) or die "Login failed";
Once logged in, attempt to do some simple auto-configuration. At
present, this consists of:
* Warning if the account doesn't have the bot flag, and isn't a sysop
account.
* Setting an appropriate default assert.
You can skip this autoconfiguration by passing autoconfig => 0
For backward compatibility, you can call this as
$bot->login($username, $password);
This form is deprecated, and will emit deprecation warnings. It will
never do autoconfiguration or SUL login.
Single User Login
On WMF wikis, do_sul specifies whether to log in on all projects. The
default is false. But even when false, you still get a CentralAuth
cookie for, and are thus logged in on, all languages of a given domain
(*.wikipedia.org, for example). When set, a login is done on each WMF
domain so you are logged in on all ~800 content wikis. Since
*.wikimedia.org is not possible, we explicitly include meta, commons,
incubator, and wikispecies.
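For example (a minimal sketch; the credentials are placeholders):
$bot->login({
    username => $username,
    password => $password,
    do_sul   => 1,
}) or die "Login failed";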
Basic authentication
If you need to supply basic auth credentials, pass a hashref of data as
described by LWP::UserAgent:
$bot->login({
username => $username,
password => $password,
basic_auth => { netloc => "private.wiki.com:80",
realm => "Authentication Realm",
uname => "Basic auth username",
pass => "password",
}
}) or die "Couldn't log in";
Bot passwords
MediaWiki::Bot doesn't yet support the more complicated (but more
secure) OAuth login flow for bots. Instead, we support a simpler "bot
password", which is a generated password connected to a
(possibly-reduced) set of on-wiki privileges, and IP ranges from which
it can be used.
To create one, visit Special:BotPasswords on the wiki. Enter a label
for the password, then select the privileges you want to use with that
password. This set should be as restricted as possible; most bots only
edit existing pages. Keeping the set of privileges as restricted as
possible limits the possible damage if the password were ever
compromised.
Submit the form, and you'll be given a new "username" that looks like
"AccountUsername@bot_password_label", and a generated bot password. To
log in, provide those to MediaWiki::Bot verbatim.
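For example (a sketch; the label and the generated password are
placeholders):
$bot->login({
    username => 'AccountUsername@bot_password_label',
    password => 'generated_bot_password',
}) or die "Bot password login failed";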
References: API:Login <https://www.mediawiki.org/wiki/API:Login>,
Logging in
<https://github.com/MediaWiki-Bot/MediaWiki-Bot/wiki/Logging-in>
logout
$bot->logout();
The logout method logs the bot out of the wiki. This invalidates all
login cookies.
References: API:Logging out <https://www.mediawiki.org/wiki/API:Logout>
Getting information about pages
diff
This allows retrieval of a diff from the API. The return is a scalar
containing the HTML table of the diff. Options are passed as a hashref
with keys:
* title is the title to use. Provide either this or revid.
* revid is any revid to diff from. If you also specified title, only
title will be honoured.
* oldid is an identifier to diff to. This can be a revid, or the
special values 'cur', 'prev' or 'next'
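For example (a minimal sketch; the revision id is a placeholder):
my $diff_html = $bot->diff({ revid => 123456, oldid => 'prev' });
print $diff_html if defined $diff_html;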
References: API:Properties#revisions
<https://www.mediawiki.org/wiki/API:Properties#revisions_.2F_rv>
get_history
my @hist = $bot->get_history($title);
my @hist = $bot->get_history($title, $additional_params);
Returns an array containing the history of the specified page $title.
The optional hash ref $additional_params can be used to tune the query
by API parameters, such as 'rvlimit' to return only 'rvlimit' number of
revisions (default is as many as possible, but may be limited per
query) or 'rvdir' to set the chronological direction.
Example:
my @hist = $bot->get_history('Main Page', {'rvlimit' => 10, 'rvdir' => 'older'})
The array returned contains hashrefs with keys: revid, user, comment,
minor, timestamp_date, and timestamp_time.
For backward compatibility, you can specify up to four parameters:
my @hist = $bot->get_history($title, $limit, $revid, $direction);
This form is deprecated, and will emit deprecation warnings.
References: Getting page history
<https://github.com/MediaWiki-Bot/MediaWiki-Bot/wiki/Getting-page-history>,
API:Properties#revisions
<https://www.mediawiki.org/wiki/API:Properties#revisions_.2F_rv>
get_history_step_by_step
my @hist = $bot->get_history_step_by_step($title);
my @hist = $bot->get_history_step_by_step($title, $additional_params);
Same as get_history(), but instead of returning the full history at
once, it lets you loop through it.
The optional call-by-reference hash ref $additional_params can be used
to loop through a page's full history by using the 'continue' param
returned by the API.
Example:
my $ready;
my $filter_params = {};
while(!$ready){
my @hist = $bot->get_history_step_by_step($page, $filter_params);
if(@hist == 0 || !defined($filter_params->{'continue'})){
$ready = 1;
}
# do something with @hist
}
References: Getting page history
<https://github.com/MediaWiki-Bot/MediaWiki-Bot/wiki/Getting-page-history>,
API:Properties#revisions
<https://www.mediawiki.org/wiki/API:Properties#revisions_.2F_rv>
what_links_here
Returns an array containing a list of all pages linking to $page.
Additional optional parameters are:
* One of: all (default), redirects, or nonredirects.
* A namespace number to search (pass an arrayref to search in
multiple namespaces)
* An "Options hashref".
A typical query:
my @links = $bot->what_links_here("Meta:Sandbox",
undef, 1,
{ hook=>\&mysub }
);
sub mysub{
my ($res) = @_;
foreach my $hash (@$res) {
my $title = $hash->{'title'};
my $is_redir = $hash->{'redirect'};
print "Redirect: $title\n" if $is_redir;
print "Page: $title\n" unless $is_redir;
}
}
Transclusions are no longer handled by what_links_here() - use
"list_transclusions" instead.
References: Listing incoming links
<https://github.com/MediaWiki-Bot/MediaWiki-Bot/wiki/Listing-incoming-links>,
API:Backlinks <https://www.mediawiki.org/wiki/API:Backlinks>
get_id
Returns the id of the specified $page_title. Returns undef if page does
not exist.
my $pageid = $bot->get_id("Main Page");
die "Page doesn't exist\n" if !defined($pageid);
References: API:Properties#info
<https://www.mediawiki.org/wiki/API:Properties#info_.2F_in>
get_pages
Returns the text of the specified pages in a hashref. Content of undef
means page does not exist. Also handles redirects or article names that
use namespace aliases.
my @pages = ('Page 1', 'Page 2', 'Page 3');
my $thing = $bot->get_pages(\@pages);
foreach my $page (keys %$thing) {
my $text = $thing->{$page};
print "$text\n" if defined($text);
}
References: Fetching page text
<https://github.com/MediaWiki-Bot/MediaWiki-Bot/wiki/Fetching-page-text>,
API:Properties#revisions
<https://www.mediawiki.org/wiki/API:Properties#revisions_.2F_rv>
prefixindex
This returns an array of hashrefs containing page titles that start
with the given $prefix. The hashref has keys 'title' and 'redirect'
(present if the page is a redirect, not present otherwise).
Additional parameters are:
* One of all, redirects, or nonredirects
* A single namespace number (unlike "linksearch" etc, which can
accept an arrayref of numbers).
* $options_hashref as described in "Options hashref".
my @prefix_pages = $bot->prefixindex("User:Mike.lifeguard");
# Or, the more efficient equivalent
my @prefix_pages = $bot->prefixindex("Mike.lifeguard", 2);
foreach my $hashref (@prefix_pages) {
    my $title = $hashref->{'title'};
    if ($hashref->{'redirect'}) {
        print "$title is a redirect\n";
    }
    else {
        print "$title is not a redirect\n";
    }
}
References: API:Allpages <https://www.mediawiki.org/wiki/API:Allpages>
get_protection
Returns data on page protection as an array of up to two hashrefs. Each
hashref has a type, level, and expiry. Levels are 'sysop' and
'autoconfirmed'; types are 'move' and 'edit'; expiry is a timestamp.
Additionally, the key 'cascade' will exist if cascading protection is
used.
my $page = 'Main Page';
$bot->edit({
page => $page,
text => rand(),
summary => 'test',
}) unless $bot->get_protection($page);
You can also pass an arrayref of page titles to do bulk queries:
my @pages = ('Main Page', 'User:Mike.lifeguard', 'Project:Sandbox');
my $answer = $bot->get_protection(\@pages);
foreach my $title (keys %$answer) {
my $protected = $answer->{$title};
print "$title is protected\n" if $protected;
print "$title is unprotected\n" unless $protected;
}
References: API:Properties#info
<https://www.mediawiki.org/wiki/API:Properties#info_.2F_in>
get_last
Returns the revid of the last revision to $page not made by $user.
undef is returned if no result was found, as would be the case if the
page is deleted.
my $revid = $bot->get_last('User:Mike.lifeguard/sandbox', 'Mike.lifeguard');
if (defined $revid) {
print "Reverting to $revid\n";
$bot->revert('User:Mike.lifeguard', $revid, 'rvv');
}
References: API:Properties#revisions
<https://www.mediawiki.org/wiki/API:Properties#revisions_.2F_rv>
recent_edit_to_page
my ($timestamp, $user) = $bot->recent_edit_to_page($title);
Returns timestamp and username for most recent (top) edit to $page.
References: API:Properties#revisions
<https://www.mediawiki.org/wiki/API:Properties#revisions_.2F_rv>
get_users
my @recent_editors = $bot->get_users($title, $limit, $revid, $direction);
Gets the most recent editors to $page, up to $limit, starting from
$revision and going in $direction.
References: API:Properties#revisions
<https://www.mediawiki.org/wiki/API:Properties#revisions_.2F_rv>
get_text
Returns the wikitext of the specified $page_title. The first parameter
$page_title is the only required one.
The second parameter is a hashref with the following independent
optional keys:
* rvstartid - if defined, this function returns the text of that
revision, otherwise the newest revision will be used.
* rvsection - if defined, returns the text of that section. Otherwise
the whole page text will be returned.
* pageid - this is an output parameter and can be used to fetch the
id of a page without needing to call "get_id" separately. Note that
any value you pass in is ignored and will be overwritten by this
function.
* rv... - any param starting with 'rv' will be forwarded to the API
call.
A blank page will return wikitext of "" (which evaluates to false in
Perl, but is defined); a nonexistent page will return undef (which also
evaluates to false in Perl, but is obviously undefined). You can
distinguish between blank and nonexistent pages by using defined:
# simple example
my $wikitext = $bot->get_text('Page title');
print "Wikitext: $wikitext\n" if defined $wikitext;
# advanced example
my $options = {'rvstartid' => 123456, 'rvsection' => 2};
$wikitext = $bot->get_text('Page title', $options);
die "error, see API error message\n" unless defined $options->{'pageid'};
warn "page doesn't exist\n" if $options->{'pageid'} == MediaWiki::Bot::PAGE_NONEXISTENT;
print "Wikitext: $wikitext\n" if defined $wikitext;
References: Fetching page text
<https://github.com/MediaWiki-Bot/MediaWiki-Bot/wiki/Fetching-page-text>,
API:Properties#revisions
<https://www.mediawiki.org/wiki/API:Properties#revisions_.2F_rv>
For backward-compatibility the params revid and section_number may also
be given as scalar parameters:
my $wikitext = $bot->get_text('Page title', 123456, 2);
print "Wikitext: $wikitext\n" if defined $wikitext;
This form is deprecated, and will emit deprecation warnings.
is_protected
This is a synonym for "get_protection", which should be used in
preference.
This method is deprecated and will emit deprecation warnings.
Modifying pages
edit
my $text = $bot->get_text('My page');
$text .= "\n\n* More text\n";
$bot->edit({
page => 'My page',
text => $text,
summary => 'Adding new content',
section => 'new',
});
This method edits a wiki page, and takes a hashref of data with keys:
* page - the page title to edit
* text - the page text to write
* summary - an edit summary
* minor - whether to mark the edit as minor or not (boolean)
* bot - whether to mark the edit as a bot edit (boolean)
* assertion - usually 'bot', but see
http://mediawiki.org/wiki/Extension:AssertEdit.
* section - edit a single section (identified by number) instead of
the whole page
An MD5 hash is sent to guard against data corruption while in transit.
You can also call this as:
$bot->edit($page, $text, $summary, $is_minor, $assert, $markasbot);
This form is deprecated and will emit deprecation warnings.
CAPTCHAs
If a CAPTCHA <https://en.wikipedia.org/wiki/CAPTCHA> is encountered,
the call to edit will return false, with the error code set to
ERR_CAPTCHA and the details informing you that solving a CAPTCHA is
required for this action. The information you need to actually solve
the captcha (for example the URL for the image) is given in
$bot->{error}->{captcha} as a hash reference. You will want to grab the
keys 'url' (a relative URL to the image) and 'id' (the ID of the
CAPTCHA). Once you have solved the CAPTCHA (presumably by interacting
with a human), retry the edit, adding captcha_id and captcha_solution
parameters:
my $edit = {page => 'Main Page', text => 'got your nose'};
my $edit_status = $bot->edit($edit);
if (not $edit_status) {
if ($bot->{error}{code} == ERR_CAPTCHA) {
my @captcha_uri = split /\Q?/, $bot->{error}{captcha}{url}, 2;
my $image = URI->new(sprintf '%s://%s%s?%s' =>
$bot->{protocol}, $bot->{host}, $captcha_uri[0], $captcha_uri[1],
);
require Term::ReadLine;
my $term = Term::ReadLine->new('Solve the captcha');
$term->ornaments(0);
my $answer = $term->readline("Please solve $image and type the answer: ");
# Add new CAPTCHA params to the edit we're attempting
$edit->{captcha_id} = $bot->{error}{captcha}{id};
$edit->{captcha_solution} = $answer;
$edit_status = $bot->edit($edit);
}
}
References: Editing pages
<https://github.com/MediaWiki-Bot/MediaWiki-Bot/wiki/Editing-pages>,
API:Edit <https://www.mediawiki.org/wiki/API:Edit>, API:Tokens
<https://www.mediawiki.org/wiki/API:Tokens>
move
$bot->move($from_title, $to_title, $reason, $options_hashref);
This moves a wiki page.
If you wish to specify more options (like whether to suppress creation
of a redirect), use $options_hashref, which has keys:
* movetalk specifies whether to attempt to move the talk page.
* noredirect specifies whether to suppress creation of a redirect.
* movesubpages specifies whether to move subpages, if applicable.
* watch and unwatch add or remove the page and the redirect from your
watchlist.
* ignorewarnings ignores warnings.
my @pages = ("Humor", "Rumor");
foreach my $page (@pages) {
my $to = $page;
$to =~ s/or$/our/;
$bot->move($page, $to, "silly 'merricans");
}
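To suppress creation of a redirect (a sketch; assumes the account has
the needed right):
$bot->move('Old title', 'New title', 'housekeeping', { noredirect => 1 });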
References: API:Move <https://www.mediawiki.org/wiki/API:Move>
patrol
$bot->patrol($rcid);
Marks a page or revision identified by the $rcid as patrolled. To mark
several RCIDs as patrolled, you may pass an arrayref of them. Returns
false and sets $bot->{error} if the account cannot patrol.
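For example (a sketch; the RC ids are placeholders):
my @rcids = (12345, 12346, 12347);
$bot->patrol(\@rcids)
    or warn "Couldn't patrol: $bot->{error}{details}\n";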
References: API:Patrol <https://www.mediawiki.org/wiki/API:Patrol>
purge_page
Purges the server cache of the specified $page. Returns true on
success; false on failure. Pass an array reference to purge multiple
pages.
If you really care, a true return value is the number of pages
successfully purged. You could check that it is the same as the number
you wanted to purge - maybe some pages don't exist, or you passed
invalid titles, or you aren't allowed to purge the cache:
my @to_purge = ('Main Page', 'A', 'B', 'C', 'Very unlikely to exist');
my $size = scalar @to_purge;
print "all-at-once:\n";
my $success = $bot->purge_page(\@to_purge);
if ($success == $size) {
print "@to_purge: OK ($success/$size)\n";
}
else {
my $missed = @to_purge - $success;
print "We couldn't purge $missed pages (list was: "
. join(', ', @to_purge)
. ")\n";
}
# OR
print "\n\none-at-a-time:\n";
foreach my $page (@to_purge) {
my $ok = $bot->purge_page($page);
print "$page: $ok\n";
}
References: Purging the server cache
<https://github.com/MediaWiki-Bot/MediaWiki-Bot/wiki/Purging-the-server-cache>,
API:Purge <https://www.mediawiki.org/wiki/API:Purge>
revert
Reverts the specified $page_title to $revid, with an edit summary of
$summary. A default edit summary will be used if $summary is omitted.
my $revid = $bot->get_last("User:Mike.lifeguard/sandbox", "Mike.lifeguard");
print "Reverting to $revid\n" if defined($revid);
$bot->revert('User:Mike.lifeguard', $revid, 'rvv');
References: API:Edit <https://www.mediawiki.org/wiki/API:Edit>
undo
$bot->undo($title, $revid, $summary, $after);
Reverts the specified $revid, with an edit summary of $summary, using
the undo function. To undo all revisions from $revid up to but not
including this one, set $after to another revid. If not set, just undo
the one revision ($revid).
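For example (a sketch; the title and revision id are placeholders):
$bot->undo('Project:Sandbox', 123456, 'Undoing a test edit');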
References: API:Edit <https://www.mediawiki.org/wiki/API:Edit>
Files / Images
get_image
$buffer = $bot->get_image('File:Foo.jpg', { width=>256, height=>256 });
Download an image from a wiki. This is derived from a similar function
in MediaWiki::API. This one allows the image to be scaled down by
passing a hashref with height & width parameters.
It returns raw data in the original format. You may simply spew it to a
file, or process it directly with a library such as Imager.
use File::Slurp qw(write_file);
my $img_data = $bot->get_image('File:Foo.jpg');
write_file( 'Foo.jpg', {binmode => ':raw'}, \$img_data );
Images are scaled proportionally. (height/width) will remain constant,
except for rounding errors.
Height and width parameters describe the maximum dimensions. A 400x200
image will never be scaled to greater dimensions. You can scale it
yourself; having the wiki do it is just lazy & selfish.
References: API:Properties#imageinfo
<https://www.mediawiki.org/wiki/API:Properties#imageinfo_.2F_ii>
image_usage
Gets a list of pages which include a certain $image. Include the File:
namespace prefix to avoid incurring an extra round-trip (which will
also emit a deprecation warning).
Additional parameters are:
* A namespace number to fetch results from (or an arrayref of
multiple namespace numbers)
* One of all, redirects, or nonredirects.
* $options is a hashref as described in the section for "linksearch".
my @pages = $bot->image_usage("File:Albert Einstein Head.jpg");
Or, make use of the "Options hashref" to do incremental processing:
$bot->image_usage("File:Albert Einstein Head.jpg",
undef, undef,
{ hook=>\&mysub, max=>5 }
);
sub mysub {
my $res = shift;
foreach my $page (@$res) {
my $title = $page->{'title'};
print "$title\n";
}
}
References: API:Imageusage
<https://www.mediawiki.org/wiki/API:Imageusage>
global_image_usage($image, $results, $filterlocal)
Returns an array of hashrefs of data about pages which use the given
image.
my @data = $bot->global_image_usage('File:Albert Einstein Head.jpg');
The keys in each hashref are title, url, and wiki. $results is the
maximum number of results that will be returned (not the maximum number
of requests that will be sent, like max in the "Options hashref"); the
default is to attempt to fetch 500 (set to 0 to get all results).
$filterlocal will filter out local uses of the image.
References: Extension:GlobalUsage#API
<https://www.mediawiki.org/wiki/Extension:GlobalUsage#API>
links_to_image
A backward-compatible call to "image_usage". You can provide only the
image title.
This method is deprecated and will emit deprecation warnings.
test_image_exists
Checks if an image exists at $page.
* FILE_NONEXISTENT (0) means "Nothing there"
* FILE_LOCAL (1) means "Yes, an image exists locally"
* FILE_SHARED (2) means "Yes, an image exists on Commons
<http://commons.wikimedia.org>"
* FILE_PAGE_TEXT_ONLY (3) means "No image exists, but there is text
on the page"
If you pass in an arrayref of images, you'll get out an arrayref of
results.
use MediaWiki::Bot::Constants;
my $exists = $bot->test_image_exists('File:Albert Einstein Head.jpg');
if ($exists == FILE_NONEXISTENT) {
print "Doesn't exist\n";
}
elsif ($exists == FILE_LOCAL) {
print "Exists locally\n";
}
elsif ($exists == FILE_SHARED) {
print "Exists on Commons\n";
}
elsif ($exists == FILE_PAGE_TEXT_ONLY) {
print "Page exists, but no image\n";
}
References: API:Properties#imageinfo
<https://www.mediawiki.org/wiki/API:Properties#imageinfo_.2F_ii>
upload
$bot->upload({ data => $file_contents, summary => 'uploading file' });
$bot->upload({ file => $file_name, title => 'Target filename.png' });
Upload a file to the wiki. Specify the file by either giving the
filename, which will be read in, or by giving the data directly.
References: API:Upload <https://www.mediawiki.org/wiki/API:Upload>
upload_from_url
Upload a file to the wiki directly from a URL. Specify the URL, the new
filename, and a summary; the summary and new filename are optional.
$bot->upload_from_url({
url => 'http://some.domain.ext/pic.png',
title => 'Target_filename.png',
summary => 'uploading new pic',
});
If uploading from a URL is enabled on your target wiki (i.e.
$wgAllowCopyUploads is set to true in LocalSettings.php) and you have
the appropriate user rights, you can use this function to upload files
to your wiki directly from a remote server.
References: API:Upload#Uploading_from_URL
<https://www.mediawiki.org/wiki/API:Upload#Uploading_from_URL>
Recent changes
update_rc
This method is deprecated and will emit deprecation warnings. Replace
calls to update_rc with calls to the newer "recentchanges", which
returns all available data, including rcid.
Returns an array containing the $limit most recent changes to the
wiki's main namespace. The array contains hashrefs with keys title,
revid, old_revid, and timestamp.
my @rc = $bot->update_rc(5);
foreach my $hashref (@rc) {
my $title = $hashref->{'title'};
print "$title\n";
}
The "Options hashref" is also available:
# Use a callback for incremental processing:
my $options = { hook => \&mysub, };
$bot->update_rc($options);
sub mysub {
my ($res) = @_;
foreach my $hashref (@$res) {
my $page = $hashref->{'title'};
print "$page\n";
}
}
recentchanges($wiki_hashref, $options_hashref)
Returns an array of hashrefs containing recentchanges data.
The first parameter is a hashref with the following keys:
* ns - the namespace number, or an arrayref of numbers to specify
several; default is the main namespace (i.e. 0)
* limit - the number of rows to fetch; default is 50
* user - only list changes by this user
* show - itself a hashref where the key is a category and the value
is a boolean. If true, the category will be included; if false,
excluded. The categories are kinds of edits: minor, bot, anon,
redirect, patrolled. See "rcshow" at
http://www.mediawiki.org/wiki/API:Recentchanges#Parameters.
An "Options hashref" can be used as the second parameter:
my @rc = $bot->recentchanges({ ns => 4, limit => 100 });
foreach my $hashref (@rc) {
print $hashref->{title} . "\n";
}
# Or, use a callback for incremental processing:
$bot->recentchanges({ ns => [0,1], limit => 500 }, { hook => \&mysub });
sub mysub {
my ($res) = @_;
foreach my $hashref (@$res) {
my $page = $hashref->{title};
print "$page\n";
}
}
The hashref returned might contain the following keys:
* ns - the namespace number
* revid
* old_revid
* timestamp
* rcid - can be used with "patrol"
* pageid
* type - one of edit, new, log (there may be others)
* title
For backwards compatibility, the previous method signature is still
supported:
$bot->recentchanges($ns, $limit, $options_hashref);
This form is deprecated and will emit deprecation warnings.
References: API:Recentchanges
<https://www.mediawiki.org/wiki/API:Recentchanges>
Users
count_contributions
my $count = $bot->count_contributions($user);
Uses the API to count $user's contributions.
References: API:Users <https://www.mediawiki.org/wiki/API:Users>
timed_count_contributions
($timed_edits_count, $total_count) = $bot->timed_count_contributions($user, $days);
Uses the API to count $user's contributions over the last $days days,
and the user's total contribution count (if needed).
Example: To get the user's contributions for the last 30 and 365 days,
plus the total number of edits, you could write something like this:
my ($last30days, $total) = $bot->timed_count_contributions($user, 30);
my ($last365days) = $bot->timed_count_contributions($user, 365);
You could also get the total number of edits by calling
count_contributions separately:
my $total = $bot->count_contributions($user);
and using timed_count_contributions only for the timed count, but that
would mean one more call to the server (and more server load), which you
can avoid because timed_count_contributions already returns both values.
References: Extension:UserDailyContribs
<https://www.mediawiki.org/wiki/Extension:UserDailyContribs>
get_allusers
my @users = $bot->get_allusers($limit, $user_group, $options_hashref);
Returns an array of all users. The default $limit is 500. Optionally
specify a $user_group (like 'sysop') to list only members of that
group. The last optional parameter is an "Options hashref".
References: API:Allusers <https://www.mediawiki.org/wiki/API:Allusers>
contributions
my @contribs = $bot->contributions($user, $namespace, $options, $from, $to);
Returns an array of hashrefs of data for the user's contributions.
$namespace can be an arrayref of namespace numbers. $options can be
specified as in "linksearch". $from and $to are optional timestamps.
ISO 8601 date and time is recommended: 2001-01-15T14:56:00Z, see
https://www.mediawiki.org/wiki/Timestamp for all possible formats. Note
that $from (=ucend) has to be before $to (=ucstart), unlike direct API
access.
Specify an arrayref of users to get results for multiple users.
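For example (a minimal sketch; the returned hashref keys are not
enumerated here, so Data::Dumper is used to inspect them):
use Data::Dumper;
my @contribs = $bot->contributions('Mike.lifeguard', 0);
print Dumper($contribs[0]) if @contribs;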