Sending Data to the Performance Dashboard

The Chrome Performance Dashboard can accept data from anywhere, as long as it receives data in the right format and the sender’s IP is whitelisted.



Data Format


The endpoint that accepts new points (https://chromeperf.appspot.com/add_point) accepts HTTP POST requests. The POST request should have one parameter, called data, whose value is a JSON encoding of a list of points to add. Each point is a map of property names to values for that point.

Minimal example (only required fields)

[
  {
    "master": "SenderType",
    "bot": "platform-type",
    "test": "my_test_suite/chart_name/trace_name",
    "revision": 1234,
    "value": 18.5
  }
]
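
For illustration, a minimal sender might look like the Python sketch below. The endpoint URL and the single form-encoded data parameter come from the description above; the timeout value is an illustrative assumption.

import json
import urllib.parse
import urllib.request

# The minimal point from the example above.
points = [{
    "master": "SenderType",
    "bot": "platform-type",
    "test": "my_test_suite/chart_name/trace_name",
    "revision": 1234,
    "value": 18.5,
}]

# The endpoint expects a form-encoded POST with a single "data" parameter
# whose value is the JSON-encoded list of points.
body = urllib.parse.urlencode({"data": json.dumps(points)}).encode("utf-8")
request = urllib.request.Request(
    "https://chromeperf.appspot.com/add_point", data=body)
with urllib.request.urlopen(request, timeout=60) as response:  # timeout: illustrative
    print(response.getcode())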


Required fields are:

  • master, bot and test: These three fields in combination specify a particular test path. The master and bot are supposed to be the Buildbot master name and slave perf_id, respectively, but if the tests aren’t run by Buildbot, these can be any descriptive strings that identify the origin of the test data. Note that master and bot names can’t contain slashes, and none of the three fields can contain asterisks; a sketch that checks these rules follows this list.

  • revision: This is a number used to index the data point. It doesn’t actually have to be a “revision”: by default it is assumed to be a Chromium SVN revision, but it can be any integer, as long as it is monotonically increasing for any given graph. Note that you can choose to show something else on the X-axis of your graphs using supplemental properties, as described below.

  • value: The Y-value for this point.
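
The naming and type rules above can be checked before a point is sent. The helper below is a hypothetical convenience for senders, not part of the dashboard API:

def validate_point(point):
    # Hypothetical helper enforcing the rules described above.
    for field in ("master", "bot"):
        if "/" in point[field]:
            raise ValueError("%s may not contain '/'" % field)
    for field in ("master", "bot", "test"):
        if "*" in point[field]:
            raise ValueError("%s may not contain '*'" % field)
    if not isinstance(point["revision"], int):
        raise ValueError("revision must be an integer")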



Larger example (including some optional fields)


[
  {
    "master": "ChromiumPerf",
    "bot": "linux-release",
    "test": "sunspider/string-unpack-code/ref",
    "revision": 33241,
    "value": "18.5",
    "error": "0.5",
    "units": "ms",
    "masterid": "master.chromium.perf",
    "buildername": "Linux Builder",
    "buildnumber": 75,
    "supplemental_columns": {
      "r_webkit_rev": "167808",
      "a_default_rev": "r_webkit_rev"
    }
  },
  {
    "master": "ChromiumPerf",
    "bot": "linux-release",
    "test": "sunspider/string-unpack-code",
    "revision": 33241,
    "value": "18.4",
    "error": "0.489897948557",
    "units": "ms",
    "masterid": "master.chromium.perf",
    "buildername": "Linux Builder",
    "buildnumber": 75,
    "supplemental_columns": {
      "r_webkit_rev": "167808",
      "a_default_rev": "r_webkit_rev"
    }
  }
]


Optional fields are:

  • error: If the "value" given is a mean, you can also send an associated standard error or standard deviation here. It will be shown on the graphs as a shaded region (see the sketch after this list for one way to compute these values).

  • units: The (Y-axis) units for this point. If higher_is_better is not given, the units also determine which direction of change is considered an improvement.

  • higher_is_better: Boolean. You can use this field to explicitly define improvement direction.

  • masterid, buildername, buildnumber: Some information from the Buildbot builder. This is used to construct a link to the Buildbot logs for this point. Note that this “masterid” is actually the “master name”, e.g. chromium.perf.

  • supplemental_columns: A dictionary of other data associated with this point. You can specify other values, strings, links, and other revision or version numbers; the type of each property is expressed with a prefix. Supplemental properties that start with “d_” are extra data values. Those that start with “r_” are additional revision or version numbers, such as dot-separated version numbers or git hashes (a list of revision types is maintained in the dashboard code). Those that start with “a_” are other annotations, including the following:

    • a_default_rev: This is the name of another supplemental property which is the revision or version you want to show on the X-axis, e.g. “r_chrome_version”.

    • a_stdio_uri: With this property you can override the URI used for the link to the Buildbot output logs; if it is given, masterid, buildername and buildnumber won’t be used.
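
For example, a sender that runs a test several times might report the mean as value and the standard error of the mean as error. The helper below is illustrative; the dashboard only sees the resulting numbers:

import statistics

def mean_and_standard_error(samples):
    # Standard error of the mean: stdev divided by sqrt(sample count).
    return statistics.mean(samples), statistics.stdev(samples) / len(samples) ** 0.5

value, error = mean_and_standard_error([18.1, 18.9, 18.2])
point = {
    "master": "ChromiumPerf",      # field values reused from the example above
    "bot": "linux-release",
    "test": "sunspider/string-unpack-code",
    "revision": 33241,
    "value": str(value),
    "error": str(error),
    "units": "ms",
}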


Sending Multi-Value Data

Generally when you send data to the dashboard, you’re only sending scalar values; for each line on each chart, there is one point per revision. However, some tests produce time series data for a single run of the test. In that case, it is possible to send this data as well: as the example below shows, the point carries a “data” field holding a list of [x, y] pairs instead of a single “value”, plus a “units_x” field giving the units for the X-axis.


[
  {
    "bot": "endure-linux-dbg",
    "buildername": "Linux QA Perf (dbg)(0)",
    "buildnumber": 7625,
    "data": [[4, 529148.0], [9, 529150.0], [12, 529150.0]],
    "master": "ChromiumEndure",
    "masterid": "chromium.endure",
    "revision": 262335,
    "test": "endure/control_wpr/vm_stats/gpu_vm",
    "units": "KB",
    "units_x": "iterations"
  }
]
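
If the raw output of a run is a flat list of samples, it might be packed into the pair format like this (an illustrative helper, not dashboard code; as the example above shows, the x values only need to be increasing, not consecutive):

def to_data_pairs(samples):
    # Pair each sample with its iteration index to build the
    # [[x, y], ...] list carried in the "data" field.
    return [[i, float(y)] for i, y in enumerate(samples)]

point = {
    "master": "ChromiumEndure",
    "bot": "endure-linux-dbg",
    "test": "endure/control_wpr/vm_stats/gpu_vm",
    "revision": 262335,
    "data": to_data_pairs([529148.0, 529150.0, 529150.0]),
    "units": "KB",
    "units_x": "iterations",
}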


Providing test and unit information

In order for the dashboard to know which direction is considered an improvement and which is a regression, the units you specify must be associated with an improvement direction in chromium/src/tools/perf/unit-info.json. The units you want to use are probably already listed in that file, but if they’re not, you have to add them.
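
For reference, an entry in unit-info.json associates a unit name with its improvement direction, along the lines of the sketch below; check the file itself for the exact schema.

{
  "ms": {
    "improvement_direction": "down"
  }
}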


You can also provide a short test description and more specific result descriptions at chromium/src/tools/perf/test-info.json and chromium/src/tools/perf/trace-info.json. Doing this is helpful to other people viewing the test results.

Relevant code links


Implementations of code that sends data to the dashboard:


Whitelisting sender IPs, making data externally-visible, and monitoring


Once you’re ready to start sending data to the real perf dashboard, there are a few more things you might want to do. First, in order for the dashboard to accept the data, the sender’s IP must be whitelisted. To request whitelisting of a new sender, you can file an issue. The whitelist requirement might change if issue 266056 is implemented.


If your data is not internal-only, you can request that it be made externally visible, again by filing an issue.


Finally, if you want to monitor your test results, you can decide which tests should be monitored, who should receive alerts, and whether you want to set any special thresholds for alerting. In general, for questions you can email chrome-perf-dashboard-team@google.com.

