Cluster Telemetry

Cluster Telemetry 101

Telemetry is Chrome's performance testing framework. Using it, you can perform arbitrary actions on a set of web pages and report metrics about them.

Cluster Telemetry allows you to run Telemetry's benchmarks, with multiple repository patches applied, across Alexa's top 10k and top 100k web pages.
Developers can use the framework to measure the performance of their patch against this top slice of the web, on both Desktop and Android, using the perf page.
Developers can also use CT to gather metrics for analysis or reports across 100k web pages on CT's VM farm, using the analysis page.
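At its core, a perf run compares the same metric measured with and without the patch on each page. The sketch below is purely illustrative (the function and data names are hypothetical, not Cluster Telemetry's actual code) and shows how per-page and overall percent changes could be summarized:

```python
# Hypothetical sketch of summarizing a perf run: for each page, compare a
# metric measured without and with the patch, then report per-page and
# overall percent change. Illustrative only -- not CT's actual code.

def percent_change(baseline, with_patch):
    """Percent change of with_patch relative to baseline (positive = slower)."""
    return (with_patch - baseline) / baseline * 100.0

def summarize(results):
    """results maps page -> (baseline_ms, with_patch_ms).

    Returns per-page percent changes and the overall percent change
    computed from the summed totals.
    """
    per_page = {page: percent_change(b, p) for page, (b, p) in results.items()}
    total_baseline = sum(b for b, _ in results.values())
    total_with_patch = sum(p for _, p in results.values())
    overall = percent_change(total_baseline, total_with_patch)
    return per_page, overall

# Made-up repaint times (milliseconds) for two pages:
results = {
    "example.com": (10.0, 11.0),  # 10% slower with the patch
    "example.org": (20.0, 19.0),  # 5% faster with the patch
}
per_page, overall = summarize(results)
```

Here the regressions and improvements cancel out in the totals, so the overall change is 0% even though individual pages moved.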

Should I use CT?

If you would like to know how your patch impacts Chrome's performance across a very large repository of real-world web pages, then you should try out Cluster Telemetry.

It has been used to gather perf data for the following projects:
  • Slimming paint
  • Performance data for layer squashing and compositing overlap map
  • SkPaint in Graphics Context
  • Culling
  • New paint dictionary

Which telemetry benchmarks does CT support?

Currently the framework supports rasterize_and_record_micro and repaint. The framework can likely support more benchmarks, but these are the ones that have been tested extensively.
CT also allows you to run against unlanded or modified benchmarks.

How accurate are CT's results?

For an empty patch repaint run on Desktop, these are the results (results screenshot not reproduced here):

The overall results from Cluster Telemetry runs are accurate within a percentage point.

The per-webpage results (visible when you click on a field) do have some variance, but this has been greatly reduced by the efforts detailed here.
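One common way to quantify per-page noise is the coefficient of variation (standard deviation divided by mean) across repeated runs of the same page. The sketch below is a hypothetical illustration of that idea, not Cluster Telemetry's actual variance analysis:

```python
# Hypothetical sketch: estimate run-to-run noise for one page as the
# coefficient of variation of repeated measurements. A lower percentage
# means more repeatable results. Illustrative only -- not CT's actual code.
import statistics

def coefficient_of_variation(samples):
    """Relative spread of repeated measurements, as a percentage of the mean."""
    mean = statistics.mean(samples)
    return statistics.stdev(samples) / mean * 100.0

# Made-up repaint times (ms) for one page across five repeated runs:
runs = [12.1, 11.9, 12.3, 12.0, 11.7]
cv = coefficient_of_variation(runs)  # roughly 1.9% for these samples
```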

Framework Code and Documentation

Cluster Telemetry is primarily written in Go, with a few Python scripts. The framework lives in master/ct.
More detailed documentation is available here (for the perf page) and here (for the analysis page).

Contact Us

If you have questions, please email rmistry@ or