This page documents metrics provided by Wikimedia Performance. We collect metrics through real user monitoring, synthetic tests, and some backend instrumentation. The aim of collecting these metrics is to understand the user experience and to find user experience regressions.
navtiming

These are real-user metrics (RUM) collected on a sample of MediaWiki pageviews, using standard APIs that are built into every web browser. Through the webperf services, these eventually end up in Graphite under the frontend.navtiming2 prefix.

We collect two kinds of metrics: timestamp marks, measured from navigationStart, and durations, computed as the difference between two Navigation Timing attributes:
responseStart
: From navigationStart to here (Navigation Timing).
domInteractive
: From navigationStart to here (Navigation Timing).
domComplete
: From navigationStart to here (Navigation Timing).
loadEventStart
: From navigationStart to here (Navigation Timing).
loadEventEnd
: From navigationStart to here (Navigation Timing). Also known as "page load time" (PLT) or "On load", which typically corresponds with the browser's page loading indicator.
firstPaint
: From navigationStart to first-paint (Paint Timing).
firstContentfulPaint
: From navigationStart to first-contentful-paint (Paint Timing).
mediaWikiLoadEnd
: From navigationStart to when all JavaScript code queued to load on a page has arrived and finished its initial script execution. This is analogous to when the mw.loader.using(RLPAGEMODULES).then() Promise resolves.
dns
: Computed as domainLookupEnd - domainLookupStart. Our intermediary layer labels this "dnsLookup".
unload
: Computed as unloadEventEnd - unloadEventStart.
redirect
: Computed as redirectEnd - redirectStart. Our intermediary layer labels this "redirecting".
tcp
: Computed as connectEnd - connectStart. (As per the spec, browsers include the TLS handshake for HTTPS.)
ssl
: Computed as connectEnd - secureConnectionStart. (As per the spec, browsers report this as a subset of tcp.)
request
: Computed as responseStart - requestStart.
response
: Computed as responseEnd - responseStart.
processing
: Computed as domComplete - responseEnd.
onLoad
: Computed as loadEventEnd - loadEventStart.

See phab:T104902 for how we validate incoming data; data that fails validation increments the nonCompliant counter. The details of this are logged by webperf-navtiming to journalctl.
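As an illustration of how the derived durations above relate to the raw Navigation Timing attributes, here is a small sketch with made-up millisecond offsets. The authoritative implementation is webperf-navtiming; this is only a didactic restatement of the formulas listed above.

```javascript
// Sketch: compute the derived durations from a Navigation Timing entry.
// Attribute names follow the W3C Navigation Timing spec; the values here
// are invented example offsets, not real measurements.
function computeDurations(t) {
  return {
    dns: t.domainLookupEnd - t.domainLookupStart,   // labelled "dnsLookup"
    unload: t.unloadEventEnd - t.unloadEventStart,
    redirect: t.redirectEnd - t.redirectStart,      // labelled "redirecting"
    tcp: t.connectEnd - t.connectStart,             // includes TLS handshake
    ssl: t.connectEnd - t.secureConnectionStart,    // subset of tcp
    request: t.responseStart - t.requestStart,
    response: t.responseEnd - t.responseStart,
    processing: t.domComplete - t.responseEnd,
    onLoad: t.loadEventEnd - t.loadEventStart,
  };
}

// Example offsets (ms, relative to navigationStart):
const sample = {
  domainLookupStart: 5, domainLookupEnd: 25,
  unloadEventStart: 0, unloadEventEnd: 0,
  redirectStart: 0, redirectEnd: 0,
  connectStart: 25, connectEnd: 80, secureConnectionStart: 40,
  requestStart: 80, responseStart: 200, responseEnd: 260,
  domComplete: 900, loadEventStart: 900, loadEventEnd: 910,
};
console.log(computeDurations(sample).dns); // 20
```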
Save Timing

We measure the overall time it takes to process an edit submission. To save an edit in MediaWiki means to create or change a wiki page.
Backend Save Timing

Backend Save Timing measures time spent in MediaWiki PHP, from the process start (REQUEST_TIME_FLOAT) until the response is flushed to the web server for sending to the client (PRESEND). The instrumentation resides in the WikimediaEvents extension (source), and is published to Graphite under MediaWiki.timing.editResponseTime.
The metric is plotted in Grafana: Backend Save Timing Breakdown, and includes slices by account type (bot vs human), by entry point (index.php wikitext editor, vs api.php for VisualEditor and bots), and by page type or namespace (Wikipedia content, or Wikidata entity, or discussion pages).
Frontend Save Timing

Frontend Save Timing is measured as the time from pressing "Publish changes" in a web browser user interface (e.g. submitting the edit page form) until that browser receives the first byte of the server response that will render the confirmation page (e.g. the article with the edit applied and a "Post-edit" message).
This is implemented as navigationStart
(the click to submit the form over HTTP POST) to responseStart
(the first byte after the server has finished processing the edit, redirected, and responded to the subsequent GET).
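Conceptually, the measurement reduces to subtracting two timestamps on the post-edit navigation. The following sketch (not the extension's actual code) uses the legacy PerformanceTiming attributes, which are epoch-millisecond timestamps; the example numbers are invented.

```javascript
// Sketch of the Frontend Save Timing computation described above:
// navigationStart marks the form submission (HTTP POST), responseStart
// the first byte of the confirmation page after the redirect's GET.
function frontendSaveTiming(timing) {
  return timing.responseStart - timing.navigationStart;
}

// Made-up example timestamps (epoch milliseconds):
const timing = {
  navigationStart: 1700000000000,
  responseStart: 1700000001450,
};
console.log(frontendSaveTiming(timing)); // 1450
```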
Instrumented by Extension:NavigationTiming (client source code), processed by webperf-navtiming, and published to Statsd/Graphite under mw.performance.save.
The metric is plotted toward the bottom of Grafana: Save Timing, and includes slices by wiki (group1 is Wikidata/Commons, group2 is Wikipedia, per Train groups).
See also

When investigating Save Timing metrics, it may be useful to correlate with:
Synthetic testing

The synthetic tests collect the same browser metrics as real user monitoring, plus extra metrics obtained by recording a video of the screen while the page loads and then analysing when different parts are painted. From the synthetic tests you get videos, screenshots, and metrics.
[Figure: First and Largest Contentful Paint on the Barack Obama page on mobile.]

Synthetic tests can also collect trace logs from Chrome and Firefox, so you can see how much time is spent in CSS/JS and in different functions.

[Figure: Timeline trace from synthetic tests on an Android phone.]

You can see which metrics we get from synthetic tests in the synthetic test drill-down dashboard.
Visual Metrics

FirstVisualChange
: The time when something is painted within the viewport for the first time.
Logo
: The time when the Wikipedia logo is painted.
Heading
: The time when the first h1/h2 heading is painted at its final position within the viewport.
LargestImage
: The time when the largest image is painted at its final position within the viewport.
SpeedIndex
: The average time at which visible parts of the page are displayed. It is expressed in milliseconds and depends on the size of the viewport.
lastVisualChange
: The time when the last paint happens within the viewport.
VisualComplete85
: The time when 85% of the content within the viewport is painted.
VisualComplete95
: The time when 95% of the content within the viewport is painted.
VisualComplete99
: The time when 99% of the content within the viewport is painted.

Read more about Google Web Vitals. You can find a full definition of the metrics collected.
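To make the SpeedIndex definition above concrete: it is the area above the visual-completeness curve, i.e. the sum over sampling intervals of (1 - completeness) times the interval length, in milliseconds. The sketch below is only an illustration of that formula with invented progress samples, not the production implementation.

```javascript
// SpeedIndex sketch: integrate (1 - visual completeness) over time.
// samples: [{time, completeness}] sorted by time, completeness in 0..1,
// starting at { time: 0, completeness: 0 }.
function speedIndex(samples) {
  let si = 0;
  for (let i = 1; i < samples.length; i++) {
    const dt = samples[i].time - samples[i - 1].time;
    si += (1 - samples[i - 1].completeness) * dt;
  }
  return si;
}

// A page that is 50% visually complete at 500 ms and fully painted at 1000 ms:
const progress = [
  { time: 0, completeness: 0 },
  { time: 500, completeness: 0.5 },
  { time: 1000, completeness: 1 },
];
console.log(speedIndex(progress)); // 750
```

Note how a page that paints most of its content early gets a lower (better) SpeedIndex than one that paints the same content late, even if lastVisualChange is identical.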
CruX

You can explore these on our Grafana: Chrome User Experience dashboard.

[Figure: Infrastructure diagram.]

Google collects metrics within its Chrome browser from all people who have opted in by syncing with their Google account. These are used by Google Search as a real-world signal of how a website performs in practice. The data Google collects from its Chrome users is publicly available through the Chrome User Experience Report.
In order to keep track of how Wikipedia is doing from Google's point of view, we import a copy of this data once a day from the Google API and store it in a Graphite instance.
server, where we run a couple of tests and record whether we are slow, moderate, or fast. The data is collected using the Sitespeed.io CruX plugin (a similar setup as for Performance/Synthetic testing).
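The slow/moderate/fast bucketing follows the kind of thresholds Google publishes for its Web Vitals. As a hedged sketch (the Sitespeed.io CruX plugin is the authoritative source; the thresholds below are Google's published Largest Contentful Paint boundaries, used here purely for illustration):

```javascript
// Classify a Largest Contentful Paint value (in ms) into the three
// CrUX-style buckets. 2500 ms and 4000 ms are Google's published LCP
// thresholds ("good" / "needs improvement" / "poor"), mapped here onto
// the fast/moderate/slow wording used on this page.
function classifyLcp(ms) {
  if (ms <= 2500) return 'fast';
  if (ms <= 4000) return 'moderate';
  return 'slow';
}

console.log(classifyLcp(1800)); // 'fast'
console.log(classifyLcp(3000)); // 'moderate'
console.log(classifyLcp(5000)); // 'slow'
```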