
Take the 2022 Google Publisher Tag developer survey

Since 2020, we've asked the Google Publisher Tag (GPT) developer community to provide their valuable feedback through an annual survey. This feedback influences what we work on each year, and has directly inspired improvements to our sample offerings, release notes, and much more.

As we look forward to next year, it's time once again to check in with our developer community to see what's working well and what can be improved. Starting today, we're kicking off the 2022 GPT developer survey.

Take the survey

The survey should take no more than 10 minutes to complete, and it will be open through the end of October 2022. Most questions are optional and your answers are completely anonymous.

Remember that the feedback you provide influences what we work on over the course of the next year, so please let us know what matters most to you. Thanks in advance for taking the time to help improve the GPT developer experience!

Take the 2021 Google Publisher Tag developer survey

In 2020, we launched the first ever Google Publisher Tag (GPT) developer survey to learn more about our community and understand how we can improve the developer experience. Feedback from users like you directly inspired improvements to our sample offerings, release notes, and much more over the course of the past year.

As we look forward to a new year, it's time once again to check in with our developer community to see what's working well and what can be improved. Starting today, we're kicking off the 2021 GPT developer survey.

Take the survey

The survey should take no more than 10 minutes to complete, and it will be open through the end of October 2021. Most questions are optional and your answers are completely anonymous.

Remember that the feedback you provide influences what we work on over the course of the next year, so please let us know what matters most to you. Thanks in advance for taking the time to help improve the GPT developer experience!

Introducing Publisher Ads Audits for Lighthouse v1.5.0

Today, we're announcing the general release of Publisher Ads Audits for Lighthouse v1.5.0. This release includes a number of enhancements and fixes; for complete details on the major highlights and smaller changes, see the release notes and the changelog on GitHub.

Try it out and let us know what you think


Publisher Ads Audits v1.5.0 is already available to use. You can try generating an updated report from the web app, or grab the latest CLI release from GitHub or npm to run locally. Updates to the Chrome DevTools version will land in Chrome Stable with the Chrome 94 release (est. Sept 2021).

Have questions about this or anything else Ad Speed related? Interested in contributing? Visit our GitHub repo.

Introducing Publisher Ads Audits for Lighthouse v1.4.1

Today, we're announcing the general release of Publisher Ads Audits for Lighthouse v1.4.1. The major highlight of this release is a pair of new audits that help ensure Google Publisher Tags are used correctly, described below. A number of smaller enhancements and fixes are also included; for complete details, see the changelog on GitHub. Additional Lighthouse changes included in this release can be found in the v6.5.0 release notes.

Helping ensure Google Publisher Tags are used correctly

The Google Publisher Tag (GPT) library is designed with flexibility in mind, to support displaying ads across the widest variety of sites. However, this flexibility means that sites sometimes end up using the library in unexpected ways or provide configurations that don't quite work the way they intend. Although the library throws various warnings and errors when these sorts of issues are encountered, they can be easy to miss if you're not looking for them. To help raise awareness, we're introducing two new audits:
  • deprecated-api-usage - this audit looks for usage of GPT API features that have been deprecated. These features are no longer actively maintained and may be removed in the future; consult the GPT release notes for alternatives to use, and see the sketch after this list for one example migration.
  • gpt-errors-overall - this audit surfaces all warnings and errors thrown by GPT during page load. The severity of these issues varies depending on a number of factors, so review the results and decide whether anything needs to be addressed for your site.
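As an example of the kind of issue the deprecated-api-usage audit flags, here's a minimal sketch of migrating off one deprecated GPT call. It assumes a page that still uses setTagForChildDirectedTreatment(), which the GPT release notes deprecated in favor of setPrivacySettings(); the ad unit path and element ID are hypothetical placeholders, and the right fix for your page depends on which APIs your report actually flags.

```javascript
// Minimal sketch: migrating off one deprecated GPT call that the
// deprecated-api-usage audit would flag. The ad unit path and element ID
// are hypothetical placeholders.
window.googletag = window.googletag || {cmd: []};

googletag.cmd.push(function() {
  // Before (deprecated): googletag.pubads().setTagForChildDirectedTreatment(1);
  // After: the same signal, expressed via the consolidated privacy settings API.
  googletag.pubads().setPrivacySettings({childDirectedTreatment: true});

  googletag.defineSlot('/1234567/example-unit', [300, 250], 'div-gpt-ad-example')
      .addService(googletag.pubads());
  googletag.enableServices();
});
```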

These new audits are being presented as informational only in this release, so they won't affect scores right now. However, we plan to give these audits weights in a future release. Users are therefore encouraged to address any newly uncovered issues now, to avoid a negative impact on their scores later.

Try it out and let us know what you think

Publisher Ads Audits v1.4.1 is available to use right now. You can try generating an updated report from the web app, or grab the latest CLI release from GitHub or npm to run locally. Updates to the Chrome DevTools version will land in Chrome Stable with the Chrome 91 release (est. May 2021).

Have questions about this or anything else Ad Speed related? Interested in contributing? Visit our GitHub repo.


Introducing Publisher Ads Audits for Lighthouse v1.3.0

Today, we're announcing the general release of Publisher Ads Audits for Lighthouse v1.3.0. This release includes a number of enhancements and bug fixes; for complete details on the major highlights and smaller changes, see the changelog on GitHub. Additional Lighthouse changes included in this release can be found in the v6.3.0 and v6.4.0 release notes.

Try it out and let us know what you think


Publisher Ads Audits v1.3.0 is available to use right now. You can try generating an updated report from the web app, or grab the latest CLI release from GitHub or npm to run locally. Updates to the Chrome DevTools version will land in Chrome Stable with the Chrome 89 release (est. March 2021).

Have questions about this or anything else Ad Speed related? Interested in contributing? Visit our GitHub repo.


Help improve the Google Publisher Tag developer experience

We’re constantly working to improve our offerings for Google Publisher Tag (GPT) developers. Whether it's writing guides, producing samples, or building tools like the Google Publisher Console and Publisher Ads Audits for Lighthouse, we strive to equip you with everything you need to succeed.

To better understand what's working and what needs improvement, we're asking our developer community for feedback. Starting today, we're launching the first Google Publisher Tag developer survey.

Take the survey


All questions in the survey are optional and your answers will be completely anonymous. We expect the survey to take about 10 minutes to complete, and it will be open through the end of September 2020.

The feedback you provide will directly impact what we work on over the course of the next year, so please let us know what matters most to you. Thanks in advance for taking the time to help improve the GPT developer experience.

Introducing Publisher Ads Audits for Lighthouse v1.2.0

Today, we're announcing the general release of Publisher Ads Audits for Lighthouse v1.2.0. The major highlight of this release is a new cumulative ad shift metric, described below. A number of smaller enhancements and bug fixes are also included; for complete details, see the changelog on GitHub.

Measuring the impact of ads on the user experience

The new cumulative ad shift metric is the first publisher ads audit aimed at measuring the impact of ads on Core Web Vitals metrics. In particular, this metric provides insight into how ads affect the stability of your page's layout. To learn more about how ads affect layout stability and steps you can take to address this, see our article on minimizing layout shift with GPT.
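
One of the simplest mitigations that article covers is reserving space for an ad slot before the creative loads, so a late-filling ad doesn't push surrounding content around. Here's a minimal sketch of that approach using standard GPT calls; the ad unit path, element ID, and reserved size are hypothetical placeholders.

```javascript
// Minimal sketch: reserve vertical space for a slot before the ad renders,
// so late-loading creatives don't shift the content below them.
// Run after the slot's div is in the DOM; path, ID, and size are placeholders.
window.googletag = window.googletag || {cmd: []};

const adDiv = document.getElementById('div-gpt-ad-example');
adDiv.style.minHeight = '250px';  // Matches the tallest creative we expect (300x250).

googletag.cmd.push(function() {
  googletag.defineSlot('/1234567/example-unit', [300, 250], 'div-gpt-ad-example')
      .addService(googletag.pubads());
  googletag.enableServices();
  googletag.display('div-gpt-ad-example');
});
```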

The APIs for measuring layout stability are new and still evolving, so it's not currently possible to attribute layout shifts to ads with 100% accuracy. We'll be updating this audit as these APIs mature, so scores may change over time.
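
If you'd like to see what those layout stability APIs report for your own pages, the browser's Layout Instability API exposes individual shift entries. The snippet below is an illustrative sketch only; matching shift sources to GPT containers by element ID is a rough heuristic, not the attribution logic this audit uses.

```javascript
// Minimal sketch: log layout shifts and roughly flag ones whose source
// elements sit inside a GPT ad container. Illustrative heuristic only.
new PerformanceObserver(function(entryList) {
  for (const entry of entryList.getEntries()) {
    if (entry.hadRecentInput) continue;  // Ignore shifts caused by user input.

    const adRelated = (entry.sources || []).some(function(source) {
      return source.node && source.node.closest &&
             source.node.closest('div[id^="div-gpt-ad"]') !== null;
    });

    console.log('Layout shift of', entry.value.toFixed(4),
                adRelated ? '(near an ad slot)' : '');
  }
}).observe({type: 'layout-shift', buffered: true});
```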

Try it out and let us know what you think

Publisher Ads Audits v1.2.0 is available to use right now. You can try generating an updated report from the web app, or grab the latest CLI release from GitHub or npm to run locally. Updates to the Chrome DevTools version will land in Chrome Stable with the Chrome 87 release (mid-November).

Have questions about this or anything else Ad Speed related? Interested in contributing? Visit our GitHub repo.


Improving the accuracy of Publisher Ads Audits for Lighthouse mobile reports

Today, we're releasing an update to Publisher Ads Audits for Lighthouse that focuses on improving the real-world accuracy of mobile reports via simulated throttling. Continue reading to learn more about the new simulated throttling option and how it will affect mobile audit scores.

What is simulated throttling?

Until now, all audit scores were based on the result of loading your page on a desktop CPU over an unthrottled, wired broadband connection. While mobile audits request the mobile version of your page (by identifying as a Nexus 5X phone), the results did not reflect actual mobile CPU and bandwidth constraints.

To address this, we are introducing a simulated throttling option. When enabled, mobile audit scores are calculated by first auditing your page under normal conditions, then using this data to simulate page load performance under mobile conditions. The simulation uses the Lighthouse mobile network throttling preset, which artificially limits both CPU speed and network bandwidth. This approximates the performance of an actual mobile device loading your site over a fast 3G connection.

Note that simulated throttling only applies to mobile audits. Desktop audits will not be throttled, regardless of whether this new option is enabled.
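
To make this concrete, here's a minimal sketch of running the audits locally from Node with simulated throttling enabled. It assumes the lighthouse, chrome-launcher, and lighthouse-plugin-publisher-ads npm packages are installed; the throttling values shown approximate Lighthouse's mobile preset at the time of writing, so consult the current Lighthouse defaults rather than treating them as exact. The roughly equivalent CLI invocation is noted in the comments.

```javascript
// Minimal sketch: run Publisher Ads Audits locally with simulated mobile
// throttling. Roughly equivalent to the CLI:
//   lighthouse https://example.com --plugins=lighthouse-plugin-publisher-ads
// The URL and throttling values below are illustrative.
const lighthouse = require('lighthouse');
const chromeLauncher = require('chrome-launcher');

async function auditWithSimulatedThrottling(url) {
  const chrome = await chromeLauncher.launch({chromeFlags: ['--headless']});
  try {
    const result = await lighthouse(url, {port: chrome.port, output: 'html'}, {
      extends: 'lighthouse:default',
      plugins: ['lighthouse-plugin-publisher-ads'],
      settings: {
        throttlingMethod: 'simulate',
        // Approximates Lighthouse's mobile preset: slow 4G network, 4x CPU slowdown.
        throttling: {rttMs: 150, throughputKbps: 1638.4, cpuSlowdownMultiplier: 4},
      },
    });
    return result.report;  // The rendered HTML report.
  } finally {
    await chrome.kill();
  }
}

auditWithSimulatedThrottling('https://example.com').then(function(report) {
  console.log('Report length:', report.length);
});
```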

How does this affect me?

The new simulated throttling option will be enabled by default for all newly generated reports, matching the behavior of other performance auditing tools such as Lighthouse, PageSpeed Insights, and the Chrome DevTools Audits panel. It can be disabled on a per-report basis by toggling the option in advanced settings.

While metric thresholds have been adjusted to account for simulation, we expect this new option may cause scores to change for many pages that were previously audited without throttling. If you audit your page regularly and have established a performance baseline using the previous mobile audit behavior, you may need to set a new baseline to account for these differences.


Questions or feedback about this or anything else ad speed related? Reach out on our GitHub issue tracker.