GET request to obtain revision information for one or more pages.
The documentation below is based on the output of Special:ApiHelp/query+revisions, automatically generated by the pre-release version of MediaWiki running on this site (MediaWiki.org). Get revision information.
May be used in several ways:

1. Get revisions for a set of pages, by setting the titles or pageids parameter.
2. Get revisions for a given set of revisions, by setting the revids parameter.
3. Get revisions of the current revision of all pages matching a generator.
Specific parameters:

rvprop: Which properties to get for each revision. The content property returns the content of each revision slot; the deprecated parsetree property returns the XML parse tree of the content (requires content model wikitext). For performance reasons, if either of these options is used, rvlimit is enforced to 50.
rvslots: Which revision slots to return data for, when slot-related properties are included in rvprops. If omitted, data from the main slot will be returned in a backwards-compatible format.
rvcontentformat-{slot}: Content serialization format used for output of content.
rvlimit: Limit how many revisions will be returned. If rvprop=content, rvprop=parsetree, rvdiffto or rvdifftotext is used, the limit is 50. If rvparse is used, the limit is 1.
rvexpandtemplates: Deprecated. Use action=expandtemplates instead. Expand templates in revision content (requires rvprop=content).
rvgeneratexml: Deprecated. Use action=expandtemplates or action=parse instead. Generate XML parse tree for revision content (requires rvprop=content).
rvparse: Deprecated. Use action=parse instead. Parse revision content (requires rvprop=content). For performance reasons, if this option is used, rvlimit is enforced to 1.
rvsection: Only retrieve the content of the section with this identifier.
rvdiffto: Deprecated. Use action=compare instead. Revision ID to diff each revision to. Use prev, next and cur for the previous, next and current revision respectively. For performance reasons, if this option is used, rvlimit is enforced to 50.
rvdifftotext: Deprecated. Use action=compare instead. Text to diff each revision to. Only diffs a limited number of revisions. Overrides rvdiffto. If rvsection is set, only that section will be diffed against this text. For performance reasons, if this option is used, rvlimit is enforced to 50.
rvdifftotextpst: Deprecated. Use action=compare instead. Perform a pre-save transform on the text before diffing it. Only valid when used with rvdifftotext.
rvcontentformat: Deprecated. Serialization format used for rvdifftotext and expected for output of content.
rvstartid: Start enumeration from the timestamp of the revision with this ID. The revision must exist, but need not belong to this page.
rvendid: Stop enumeration at the timestamp of the revision with this ID. The revision must exist, but need not belong to this page.
rvstart: From which revision timestamp to start enumeration.
rvend: Enumerate up to this timestamp.
rvdir: In which direction to enumerate: newer lists oldest first (rvstart has to be before rvend); older lists newest first (this is the default).
rvuser: Only include revisions made by this user.
rvexcludeuser: Exclude revisions made by this user.
rvtag: Only list revisions tagged with this tag.
rvcontinue: When more results are available, use this to continue. More detailed information on how to continue queries can be found on mediawiki.org.
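Long histories arrive in batches: when a response contains a continue object, the client merges it into the original request parameters and repeats the query until the object disappears. The sketch below, written against the same endpoint and formatversion=2 response shape as the demos on this page, factors the merge step into a pure helper; the session argument is expected to behave like a requests.Session.

```python
# Sketch of the rvcontinue loop, assuming formatversion=2 (so
# data["query"]["pages"] is a list). `session` is e.g. requests.Session().

def merge_continue(params, data):
    """Return a copy of `params` with the response's continuation values applied."""
    merged = dict(params)
    merged.update(data.get("continue", {}))
    return merged

def iter_revisions(session, url, params):
    """Yield every revision of the queried page, batch by batch."""
    while True:
        data = session.get(url=url, params=params).json()
        for page in data["query"]["pages"]:
            yield from page.get("revisions", [])
        if "continue" not in data:  # last batch reached
            return
        params = merge_continue(params, data)
```

Each batch carries an rvcontinue token inside the continue object; resubmitting the original parameters plus that token resumes the enumeration where the previous batch stopped.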
The request above obtains revision data for the pages with titles API and Main Page.
Response:

{
    "batchcomplete": true,
    "query": {
        "pages": [
            {
                "pageid": 1423,
                "ns": 0,
                "title": "Main Page",
                "revisions": [
                    {
                        "user": "Bdk",
                        "timestamp": "2005-09-16T01:14:43Z",
                        "comment": "Reverted edit of 82.36.210.14, changed back to last version by Brion VIBBER"
                    }
                ]
            },
            {
                "pageid": 55332,
                "ns": 0,
                "title": "API",
                "revisions": [
                    {
                        "user": "Mainframe98",
                        "timestamp": "2017-08-19T18:23:42Z",
                        "comment": "Reverted edits by [[Special:Contributions/Sankaran kumar|Sankaran kumar]] ([[User talk:Sankaran kumar|talk]]) to last revision by [[User:Shirayuki|Shirayuki]]"
                    }
                ]
            }
        ]
    }
}
#!/usr/bin/python3

"""
    get_pages_revisions.py

    MediaWiki API Demos
    Demo of `Revisions` module: Get revision data with content
    for pages with titles [[API]] and [[Main Page]]

    MIT License
"""

import requests

S = requests.Session()

URL = "https://www.mediawiki.org/w/api.php"

PARAMS = {
    "action": "query",
    "prop": "revisions",
    "titles": "API|Main Page",
    "rvprop": "timestamp|user|comment|content",
    "rvslots": "main",
    "formatversion": "2",
    "format": "json"
}

R = S.get(url=URL, params=PARAMS)
data = R.json()

pages = data["query"]["pages"]

for page in pages:
    print(page["revisions"])
<?php
/*
    get_pages_revisions.php

    MediaWiki API Demos
    Demo of `Revisions` module: Get revision data with content
    for pages with titles [[API]] and [[Main Page]]

    MIT License
*/

$endPoint = "https://www.mediawiki.org/w/api.php";
$params = [
    "action" => "query",
    "prop" => "revisions",
    "titles" => "API|Main Page",
    "rvprop" => "timestamp|user|comment|content",
    "rvslots" => "main",
    "formatversion" => "2",
    "format" => "json"
];

$url = $endPoint . "?" . http_build_query( $params );

$ch = curl_init( $url );
curl_setopt( $ch, CURLOPT_RETURNTRANSFER, true );
$output = curl_exec( $ch );
curl_close( $ch );

$result = json_decode( $output, true );

foreach( $result["query"]["pages"] as $k => $v ) {
    var_dump( $v["revisions"] );
}
/*
    get_pages_revisions.js

    MediaWiki API Demos
    Demo of `Revisions` module: Get revision data with content
    for pages with titles [[API]] and [[Main Page]]

    MIT License
*/

var url = "https://www.mediawiki.org/w/api.php";

var params = {
    action: "query",
    prop: "revisions",
    titles: "API|Main Page",
    rvprop: "timestamp|user|comment|content",
    rvslots: "main",
    formatversion: "2",
    format: "json"
};

url = url + "?origin=*";
Object.keys(params).forEach(function(key){url += "&" + key + "=" + params[key];});

fetch(url)
    .then(function(response){return response.json();})
    .then(function(response) {
        var pages = response.query.pages;
        for (var p in pages) {
            console.log(pages[p].revisions);
        }
    })
    .catch(function(error){console.log(error);});
/*
    get_pages_revisions.js

    MediaWiki API Demos
    Demo of `Revisions` module: Get revision data with content
    for pages with titles [[API]] and [[Main Page]]

    MIT License
*/

var params = {
        action: 'query',
        prop: 'revisions',
        titles: 'API|Main Page',
        rvprop: 'timestamp|user|comment|content',
        rvslots: 'main',
        formatversion: '2',
        format: 'json'
    },
    api = new mw.Api();

api.get( params ).done( function ( data ) {
    var pages = data.query.pages,
        p;
    for ( p in pages ) {
        console.log( pages[ p ].revisions );
    }
} );

Example 2: Get last five revisions of a page filtered by date and user
The request above obtains data for the last five revisions of the page API:Geosearch made after 1 July 2018 (2018-07-01), excluding changes made by the user SSethi (WMF).
Response:

{
    "batchcomplete": "",
    "query": {
        "pages": {
            "812323": {
                "pageid": 812323,
                "ns": 104,
                "title": "API:Geosearch",
                "revisions": [
                    {
                        "user": "Shirayuki",
                        "timestamp": "2018-11-04T05:25:34Z",
                        "comment": "translation tweaks"
                    },
                    {
                        "user": "Shirayuki",
                        "timestamp": "2018-11-25T06:06:50Z",
                        "comment": "translation tweaks"
                    }
                ]
            }
        }
    }
}

Python
#!/usr/bin/python3

"""
    get_filtered_page_revisions.py

    MediaWiki API Demos
    Demo of `Revisions` module: Get data including content
    of the last 5 revisions of the title [[API:Geosearch]]
    made after 1 July 2018 (2018-07-01), excluding changes
    made by the user SSethi (WMF)

    MIT License
"""

import requests

S = requests.Session()

URL = "https://www.mediawiki.org/w/api.php"

PARAMS = {
    "action": "query",
    "prop": "revisions",
    "titles": "API:Geosearch",
    "rvlimit": "5",
    "rvprop": "timestamp|user|comment|content",
    "rvdir": "newer",
    "rvstart": "2018-07-01T00:00:00Z",
    "rvexcludeuser": "SSethi (WMF)",
    "rvslots": "main",
    "formatversion": "2",
    "format": "json"
}

r = S.get(url=URL, params=PARAMS)
data = r.json()

pages = data["query"]["pages"]

for page in pages:
    print(page["revisions"])

PHP
<?php
/*
    get_filtered_page_revisions.php

    MediaWiki API Demos
    Demo of `Revisions` module: Get data including content
    of the last 5 revisions of the title [[API:Geosearch]]
    made after July 1st 2018, excluding changes made by the
    user SSethi (WMF)

    MIT License
*/

$endPoint = "https://www.mediawiki.org/w/api.php";
$params = [
    "action" => "query",
    "prop" => "revisions",
    "titles" => "API:Geosearch",
    "rvlimit" => "5",
    "rvprop" => "timestamp|user|comment|content",
    "rvdir" => "newer",
    "rvstart" => "2018-07-01T00:00:00Z",
    "rvexcludeuser" => "SSethi (WMF)",
    "rvslots" => "main",
    "formatversion" => "2",
    "format" => "json"
];

$url = $endPoint . "?" . http_build_query( $params );

$ch = curl_init( $url );
curl_setopt( $ch, CURLOPT_RETURNTRANSFER, true );
$output = curl_exec( $ch );
curl_close( $ch );

$result = json_decode( $output, true );

foreach( $result["query"]["pages"] as $k => $v ) {
    var_dump( $v["revisions"] );
}

JavaScript
/*
    get_filtered_page_revisions.js

    MediaWiki API Demos
    Demo of `Revisions` module: Get data including content
    of the last 5 revisions of the title [[API:Geosearch]]
    made after July 1st 2018, excluding changes made by the
    user SSethi (WMF)

    MIT License
*/

var url = "https://www.mediawiki.org/w/api.php";

var params = {
    action: "query",
    prop: "revisions",
    titles: "API:Geosearch",
    rvlimit: "5",
    rvprop: "timestamp|user|comment|content",
    rvdir: "newer",
    rvstart: "2018-07-01T00:00:00Z",
    rvexcludeuser: "SSethi (WMF)",
    rvslots: "main",
    formatversion: "2",
    format: "json"
};

url = url + "?origin=*";
Object.keys(params).forEach(function(key){url += "&" + key + "=" + params[key];});

fetch(url)
    .then(function(response){return response.json();})
    .then(function(response) {
        var pages = response.query.pages;
        for (var p in pages) {
            console.log(pages[p].revisions);
        }
    })
    .catch(function(error){console.log(error);});

MediaWiki JS
/*
    get_filtered_page_revisions.js

    MediaWiki API Demos
    Demo of `Revisions` module: Get data including content
    of the last 5 revisions of the title [[API:Geosearch]]
    made after July 1st 2018, excluding changes made by the
    user SSethi (WMF)

    MIT License
*/

var params = {
        action: 'query',
        prop: 'revisions',
        titles: 'API:Geosearch',
        rvlimit: '5',
        rvprop: 'timestamp|user|comment|content',
        rvdir: 'newer',
        rvstart: '2018-07-01T00:00:00Z',
        rvexcludeuser: 'SSethi (WMF)',
        rvslots: 'main',
        formatversion: '2',
        format: 'json'
    },
    api = new mw.Api();

api.get( params ).done( function ( data ) {
    var pages = data.query.pages,
        p;
    for ( p in pages ) {
        console.log( pages[ p ].revisions );
    }
} );

Example 3: Get last revision of a page, following any redirects
The request above obtains revision data for the page AntiSpoof, following any redirects. Since AntiSpoof redirects to Extension:AntiSpoof, it will actually return revision data for Extension:AntiSpoof.
Response:

{
    "batchcomplete": true,
    "query": {
        "redirects": [
            {
                "from": "AntiSpoof",
                "to": "Extension:AntiSpoof"
            }
        ],
        "pages": [
            {
                "pageid": 8993,
                "ns": 102,
                "title": "Extension:AntiSpoof",
                "revisions": [
                    {
                        "revid": 3419761,
                        "parentid": 3053177,
                        "minor": true,
                        "user": "Shirayuki",
                        "timestamp": "2019-09-22T05:14:46Z",
                        "comment": ""
                    }
                ]
            }
        ]
    }
}

Possible errors (Code: Info):

rvdiffto: rvdiffto must be set to "prev", "next", "cur" or a non-negative number.
rvnosuchrevid: There is no revision with ID ID.
rvnosuchsection: There is no section section in rID.
rvrevids: The revids parameter may not be used with the list options (rvlimit, rvstartid, rvendid, rvdir=newer, rvuser, rvexcludeuser, rvstart, and rvend).
rvmultpages: titles, pageids or a generator was used to supply multiple pages, but the rvlimit, rvstartid, rvendid, rvdir=newer, rvuser, rvexcludeuser, rvstart, and rvend parameters may only be used on a single page.
rvaccessdenied: The current user is not allowed to read title.
rvbadparams: start and startid cannot be used together.
rvbadparams: end and endid cannot be used together.
rvbadparams: user and excludeuser cannot be used together.
invalidparammix: titles, pageids or a generator was used to supply multiple pages, but the rvlimit, rvstartid, rvendid, rvdir=newer, rvuser, rvexcludeuser, rvstart, and rvend parameters may only be used on a single page.
accessdenied: You are not allowed to view title.
badid_startid: No revision was found for parameter startid.
badid_endid: No revision was found for parameter endid.
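Errors like those above are reported inside the JSON body rather than through the HTTP status code, so a client should inspect the response for an error object before reading query. A minimal sketch of that check, written against the default (legacy) error format:

```python
# Sketch: surface API-level errors, which arrive in the response body.
# The HTTP status is typically 200 even when the query itself failed.

def check_api_error(data):
    """Return (code, info) if the decoded response reports an error, else None."""
    err = data.get("error")
    if err is not None:
        return err.get("code"), err.get("info")
    return None
```

For example, a response of {"error": {"code": "rvnosuchrevid", "info": "There is no revision with ID 0."}} yields the pair ("rvnosuchrevid", "There is no revision with ID 0."), while a successful query yields None. Requests that set errorformat or use formatversion=2 with multiple errors may instead return an errors list, which this sketch does not cover.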
Additional notes:

This module works as a subset of page information, with pages specified through the pageids or titles parameter. Individual revisions are specified by the revids parameter. See API:Query.
MediaWiki 1.32 introduced the rvslots parameter. When the parameter is not present, the API will only return information about the main slot.
When the enumeration parameters are used, titles= must have only one title listed.

Parameter history (each row lists the parameters introduced or deprecated in a MediaWiki release, newest first):

Introduced rvslots and rvprop=roles; deprecated rvcontentformat, rvprop=parsetree, rvexpandtemplates, rvparse, rvdiffto, rvdifftotext and rvdifftotextpst (MediaWiki 1.32)
Introduced rvdifftotextpst
Introduced rvprop=parsetree; deprecated rvgeneratexml
Deprecated rvtoken
Introduced rvprop=contentmodel and rvcontentformat
Introduced rvprop=sha1
Introduced rvprop=userid and rvparse
Introduced rvprop=parsedcomment, rvprop=tags, rvdifftotext and rvtag
Introduced rvdiffto and rvcontinue
Introduced rvgeneratexml
Introduced rvsection
Introduced rvexpandtemplates and rvtoken
Introduced rvprop=ids, rvprop=flags, rvprop=size, rvuser and rvexcludeuser
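Because rvslots changes where the revision text lives in the response, code that reads content has to handle two shapes: with rvslots=main and formatversion=2 the text sits at revision["slots"]["main"]["content"], while the older backwards-compatible shape puts it directly on the revision (under "content" with formatversion=2, or under "*" with the default format). A small reader that accepts both shapes, as a sketch:

```python
# Sketch: extract revision text from a prop=revisions revision dict,
# whether or not rvslots was requested and regardless of formatversion.

def revision_content(revision):
    """Return the wikitext of a revision dict in any of the known shapes."""
    slots = revision.get("slots")
    if slots is not None:
        main = slots["main"]
        # formatversion=2 uses "content"; formatversion=1 uses "*".
        return main.get("content", main.get("*"))
    return revision.get("content", revision.get("*"))
```

For instance, revision_content({"slots": {"main": {"content": "text"}}}) and revision_content({"*": "text"}) both return "text".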