Tag Archives: DoubleClick

What’s new with IMA iOS SDK Beta 15?

Last week, we released beta version 15 of the IMA SDK for iOS. This release includes two new features:

  • Ad buffer events via IMAAdsManager delegates
  • Debugging mode

Ad buffer events

We’re providing more information on ad buffering by introducing new buffering events via the following optional IMAAdsManagerDelegate methods:

  • adsManagerAdPlaybackReady:
  • adsManagerAdDidStartBuffering:
  • adsManager:adDidBufferToMediaTime:

Collectively, these delegate methods provide more transparency into buffer events, giving you more control over the user’s ad experience. For more detailed information on these new methods, take a look at the reference documentation.

Debugging mode

We’ve introduced a new debugging mode setting to allow for more verbose logging to the console. You can now set IMASettings.enableDebugMode to YES to enable debug mode. This should not be used in production, as it will show a watermark on the ad player.

A note about CocoaPods

If you’re using CocoaPods with the IMA SDK, please make sure to use at least version 0.38.

As always, if you have any questions, feel free to contact us via the support forum.

Relaxing constraints on ad group name uniqueness

Have you ever found it frustrating that you can never reuse an AdGroup name after removing the AdGroup, since a removed AdGroup cannot be modified? We have awesome news for you!

Now, AdGroupService doesn’t consider REMOVED AdGroup names when verifying that an AdGroup name is unique within a Campaign. If an AdGroup is in a REMOVED state, then the name of that AdGroup can be reused. This is already the case for Campaigns, and we’ve extended this relaxation of constraints to AdGroups.
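As an illustration, the relaxed check behaves like the following sketch (hypothetical Python, not actual AdWords server code; the field names and sample data are invented):

```python
# Hypothetical sketch of the relaxed uniqueness check: AdGroups in the
# REMOVED state no longer block a name from being reused in the campaign.

def name_is_available(campaign_ad_groups, candidate_name):
    """Return True if candidate_name can be used for a new AdGroup."""
    for ad_group in campaign_ad_groups:
        if ad_group["status"] == "REMOVED":
            continue  # removed AdGroups are now ignored by the check
        if ad_group["name"] == candidate_name:
            return False
    return True

groups = [
    {"name": "Spring Sale", "status": "REMOVED"},
    {"name": "Summer Sale", "status": "ENABLED"},
]
print(name_is_available(groups, "Spring Sale"))  # True: old group is REMOVED
print(name_is_available(groups, "Summer Sale"))  # False: name still in use
```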

If you have questions or need clarification, visit us on the AdWords API Forum or our Google+ page.

L’Oréal Canada finds beauty in programmatic buying


While global sales of L'Oréal Luxe makeup brand Shu Uemura were booming, reaching its target audience across North America proved challenging. By collaborating with Karl Lagerfeld (and his cat, Choupette) and using DoubleClick Bid Manager and Google Analytics Premium, the campaign delivered nearly double the anticipated revenue.

Goals


  • Re-introduce and raise awareness of the Shu Uemura cosmetics brand in North America
  • Drive North American sales of Karl Lagerfeld’s Shupette collection for Shu Uemura
  • Grow the Shu Uemura email subscriber list

Approach


  • Organized website audiences with Google Analytics Premium
  • Used programmatic buying to lead prospects down the path to purchase
  • Leveraged a range of audience data in DoubleClick Bid Manager to buy paid media in display and social channels

Results


  • Drove almost 2X the anticipated revenue
  • Exceeded CPA targets and achieved a 2,200% return on ad spend (ROAS)
  • Increased web traffic and email subscribers

To learn more about Shu Uemura’s approach, check out the full case study.

    Posted by Kelly Cox, Product Marketing, DoubleClick

    Make Mobile Work 2.0: Continuing the Mobile Conversation With Brand Marketers

    Last year we started the Make Mobile Work initiative in partnership with the IAB to foster adoption of HTML5 and cross-screen creative, and it quickly became the IAB Mobile Center’s lynchpin for marketer outreach as interest in the program accelerated.

    For 2015, we’re excited to bring back Make Mobile Work for another round of educational and practical conversations to help brand marketers succeed in our increasingly mobile-first world. The importance of HTML5 for digital marketing remains at the center of the Make Mobile Work message, building on the success of the HTML5 Week we hosted last week.

    Make Mobile Work webinars will address three important topics over the remainder of 2015. These webinars are curated with marketer business decision-makers in mind—they will keep the jargon to a minimum and focus on sharing practical examples and learning.


    The IAB’s Tech Lab is also working to update their standard ad units to reflect the file size needs of HTML5-based ads. This is a timely effort as connectivity technologies have changed along with the rise of HTML5 and it’s vital to realign buyer and seller expectations around ad file weights that will enable engaging ads, while not harming webpage or ad-load performance. Make Mobile Work will help to spread the word about this process and its outcomes and implications.

    Along with the other members of the Mobile Center, we’re looking forward to continuing to help brands large and small, novice and experienced, get the know-how they need to make mobile work for them.

    Posted by Becky Chappell, DoubleClick Product Marketing

    Learn how to design and build HTML5 with Google Web Designer

    We dubbed last week #HTML5Week and launched multiple HTML5 resources, hangouts and product updates to help make it easier to build all your ads in HTML5. Today, we are pleased to announce the launch of our Google Web Designer Certification exam and training resources. 

    Google Web Designer helps you create engaging HTML5 content. Use animations and interactive elements to bring your creative vision to life and enjoy integrations with other Google products, including a shared asset library and one-click-to-publish integration with DoubleClick Studio, compatibility with AdWords, and the ability to collaborate on works-in-progress in Google Drive. 

    The new Google Web Designer Fundamentals Certification allows new users to demonstrate proficiency and understanding of the Google Web Designer interface and features. The exam will help you learn:
    • How to use the Google Web Designer interface
    • How to create templates and animations
    • How to build interstitial ads
    • How to build advanced expandable ads
    The program consists of a step-by-step eLearning course that takes you through the basics of Google Web Designer and helps you quickly get trained in building HTML5 ads using the tool. Once you finish the eLearning and build a few test ads, you can take a certification exam to test your knowledge and demonstrate your proficiency. If you pass the certification, you can get your name listed on the Certified User list in the Rich Media Gallery.

    We hope this new certification exam helps you learn the ins and outs of the Google Web Designer tool, so that you can more easily create your HTML5 ads.

    Posted by Becky Chappell, Google Web Designer Product Marketing Manager 

    Making it easier to run mobile-friendly HTML5 ads with DoubleClick

    For the past week, we’ve shared best practices and new resources to help you learn how to build HTML5 ads. To cap off #HTML5Week, we're making it even easier for our DoubleClick Campaign Manager clients to run mobile-friendly HTML5 ads by offering unlimited file sizes...for free. 

    HTML5 ad files are much larger than Flash ads, meaning they need a larger file size (k size) allowance and should use a polite load function to be publisher-friendly.

    We launched Enhanced Formats in DoubleClick Campaign Manager early last year, as an option for media agencies to upload larger creative files to DoubleClick Campaign Manager directly. Now that all ads need to be built in HTML5 to show up properly across screens and browsers, we’ve decided to offer the unlimited file sizes free of charge. Starting August 3rd, we will automatically upgrade all DoubleClick Campaign Manager accounts to support Enhanced Formats; no action is required from clients.

    In addition to unlimited file size, Enhanced Formats offers automatic polite load functionality (which allows main web page content to load before the ad unit finishes loading), a way to easily add basic engagement metrics to standard banner ads, and the ability to track metrics such as multiple exits, interactions, backup image views, HTML5 views, and display time. You can learn more about Enhanced Formats in the help center.

    During the month of August, we’ll also be making some other changes to the way we handle campaigns that contain Flash ads:
    • In DoubleClick Campaign Manager, our systems will automatically show the HTML5 version of an ad, where available. (Before, the HTML5 version would only show in environments where Flash wasn't supported). 
    • DoubleClick Bid Manager will stop supporting bids for Flash ads in Chrome (which pauses Flash ads automatically). This means advertisers won't end up paying for ads that will be paused by default.

    While making the switch to HTML5 may seem daunting, we hope that these product updates, combined with the resources and training provided this week, will help you tackle the challenge. Here is a recap of the resources that are now available:
    • HTML5 Overview Hangout and step-by-step tutorial
    • Google Web Designer Landing Page with components and templates 
    • HTML5 Toolkit -- everything you need in one place 
    • Upcoming HTML5 Immersion Days: (All Immersion Days are hosted at Google’s offices. Please contact your DoubleClick Rep if you’d like to join.) 
      • In NYC: Monday, Jul. 27 (10am-1pm) and Thursday, Jul. 30 (10am-1pm) 
      • In LA: Monday, Jul. 27 (1:30-3:30pm) and Monday, Aug. 10 (2-4pm) 
      • In SF: Tuesday, Jul. 28 (1-3:30pm) 
      • In Chicago: Wednesday, Aug. 12 (1-3pm) 

    Posted by Becky Chappell, Product Marketing Manager, DoubleClick and Google Web Designer

    New guide for setting up AdWords API authorization using the OAuth 2.0 Playground

    For those of you who’d prefer to generate an OAuth refresh token using only a browser, there's a new guide on how to use the OAuth 2.0 Playground:

    https://developers.google.com/adwords/api/docs/guides/oauth_playground

    The guide walks you through the authorization setup required by the AdWords API for a Web application--via a browser session--without the need to execute any command-line scripts.
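    Once the Playground has produced a refresh token, your application exchanges it for short-lived access tokens. The sketch below only builds the request body for Google's OAuth 2.0 token endpoint; the client ID, secret, and token values are placeholders, and no network call is made:

```python
# Building the refresh-token exchange request for Google's OAuth 2.0 token
# endpoint. All credential values below are placeholders.
from urllib.parse import urlencode

TOKEN_ENDPOINT = "https://accounts.google.com/o/oauth2/token"

def refresh_request_body(client_id, client_secret, refresh_token):
    """Form-encoded body for a refresh_token grant."""
    return urlencode({
        "client_id": client_id,
        "client_secret": client_secret,
        "refresh_token": refresh_token,
        "grant_type": "refresh_token",
    })

body = refresh_request_body("CLIENT_ID", "CLIENT_SECRET", "1/REFRESH_TOKEN")
print("grant_type=refresh_token" in body)  # True
```

    In practice the client libraries handle this exchange for you; the point is that the refresh token from the Playground is the only long-lived credential you need to store.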

    More OAuth resources
    Still have questions? Feel free to visit us on the AdWords API Forum or our Google+ page.

    Working together to filter automated data-center traffic

    Today the Trustworthy Accountability Group (TAG) announced a new pilot blacklist to protect advertisers across the industry. This blacklist comprises data-center IP addresses associated with non-human ad requests. We're happy to support this effort along with other industry leaders—Dstillery, Facebook, MediaMath, Quantcast, Rubicon Project, TubeMogul and Yahoo—and contribute our own data-center blacklist. As mentioned to Ad Age and in our recent call to action, we believe that if we work together we can raise the fraud-fighting bar for the whole industry.

    Data-center traffic is one of many types of non-human or illegitimate ad traffic. The newly shared blacklist identifies web robots or “bots” that are being run in data centers but that avoid detection by the IAB/ABC International Spiders & Bots List. Well-behaved bots announce that they're bots as they surf the web by including a bot identifier in their declared User-Agent strings. The bots filtered by this new blacklist are different. They masquerade as human visitors by using User-Agent strings that are indistinguishable from those of typical web browsers.
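    The distinction can be sketched as follows (illustrative Python only; the bot tokens and IP addresses are invented examples, not the actual spiders list or blacklist):

```python
# Illustrative only: a declared bot identifies itself in its User-Agent
# string, while a stealth data-center bot must be caught by IP address.
# Tokens and addresses below are made up (IPs from RFC 5737 example ranges).

DECLARED_BOT_TOKENS = ("googlebot", "bingbot", "crawler", "spider")
DATA_CENTER_BLACKLIST = {"203.0.113.7", "198.51.100.22"}

def classify(user_agent, ip):
    if any(token in user_agent.lower() for token in DECLARED_BOT_TOKENS):
        return "declared bot"    # filtered by the IAB/ABC spiders list
    if ip in DATA_CENTER_BLACKLIST:
        return "suspected bot"   # caught only by the data-center blacklist
    return "presumed human"

print(classify("Mozilla/5.0 (compatible; Googlebot/2.1)", "66.249.66.1"))
# declared bot
print(classify("Mozilla/5.0 (Windows NT 6.1) AppleWebKit/537.36", "203.0.113.7"))
# suspected bot
```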

    In this post, we take a closer look at a few examples of data-center traffic to show why it’s so important to filter this traffic across the industry.
    Impact of the data-center blacklist
    When observing the traffic generated by the IP addresses in the newly shared blacklist, we found significantly distorted click metrics. In May of 2015 on DoubleClick Campaign Manager alone, we found the blacklist filtered 8.9% of all clicks. Without filtering these clicks from campaign metrics, advertiser click-through rates would have been incorrect and for some advertisers this error would have been very large.
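    To see how filtered clicks distort reported click-through rates, here is a back-of-the-envelope sketch; the 8.9% share comes from the measurement above, while the raw impression and click counts are invented for illustration:

```python
# Back-of-the-envelope CTR correction. The 8.9% filtered-click share is
# from the post; the raw counts below are invented for illustration, and
# impression filtering is ignored to keep the sketch simple.

raw_clicks = 100_000
impressions = 10_000_000
filtered_share = 0.089               # share of clicks on the blacklist

valid_clicks = raw_clicks * (1 - filtered_share)
raw_ctr = raw_clicks / impressions
valid_ctr = valid_clicks / impressions

print(f"unfiltered CTR: {raw_ctr:.3%}")    # 1.000%
print(f"filtered CTR:   {valid_ctr:.3%}")  # 0.911%
```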

    Below is a plot that shows how much click-through rates in May would have been inflated across the most impacted of DoubleClick Campaign Manager’s larger advertisers.

    Two examples of bad data-center traffic
    There are two distinct types of invalid data-center traffic: where the intent is malicious and where the impact on advertisers is accidental. In this section we consider two interesting examples where we’ve observed traffic that was likely generated with malicious intent.

    Publishers use many different strategies to increase the traffic to their sites. Unfortunately, some are willing to use any means necessary to do so. In our investigations we’ve seen instances where publishers have been running software tools in data centers to intentionally mislead advertisers with fake impressions and fake clicks.

    First example
    UrlSpirit is just one example of software that some unscrupulous publishers have been using to collaboratively drive automated traffic to their websites. Participating publishers install the UrlSpirit application on Windows machines and they each submit up to three URLs through the application’s interface. Submitted URLs are then distributed to other installed instances of the application, where Internet Explorer is used to automatically visit the list of target URLs. Publishers who have not installed the application can also leverage the network of installations by paying a fee.

    At the end of May more than 82% of the UrlSpirit installations were being run on machines in data centers. There were more than 6,500 data-center installations of UrlSpirit, with each data-center installation running in a separate virtual machine. In aggregate, the data-center installations of UrlSpirit were generating a monthly rate of at least half a billion ad requests—an average of 2,500 fraudulent ad requests per installation per day.
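    The quoted per-installation average can be sanity-checked with a quick calculation (assuming a 30-day month, which is our assumption, not the post's):

```python
# Sanity-checking the UrlSpirit rate quoted above: at least half a billion
# monthly ad requests spread across 6,500 data-center installations.

monthly_requests = 500_000_000
installations = 6_500
days_per_month = 30  # assumed for this back-of-the-envelope average

per_install_per_day = monthly_requests / installations / days_per_month
print(round(per_install_per_day))  # ~2564, roughly the 2,500/day cited
```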

    Second example
    HitLeap is another example of software that some publishers are using to collaboratively drive automated traffic to their websites. The software also runs on Windows machines, and each instance uses the Chromium Embedded Framework to automatically browse the websites of participating publishers—rather than using Internet Explorer.

    Before publishers can use the network of installations to drive traffic to their websites, they need browsing minutes. Participating publishers earn browsing minutes by running the application on their computers. Alternatively, they can simply buy browsing minutes—with bundles starting at $9 for 10,000 minutes or up to 1,000,000 minutes for $625. 

    Publishers can specify as many target URLs as they like. The number of visits they receive from the network of installations is a function of how long they want the network of bots to spend on their sites. For example, ten browsing minutes will get a publisher five visits if the publisher requests two-minute visit durations.
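    The visit arithmetic works out as a simple division; the bundle figures in the sketch below are the ones quoted above:

```python
# The HitLeap visit math described above: visits received equals browsing
# minutes spent divided by the requested visit duration.

def visits(browsing_minutes, visit_duration_minutes):
    return browsing_minutes // visit_duration_minutes

print(visits(10, 2))  # 5 visits, matching the example in the post

# Cost per visit at the cheapest bundle ($9 for 10,000 minutes),
# assuming the publisher requests two-minute visits:
bundle_price, bundle_minutes = 9.0, 10_000
cost_per_visit = bundle_price / visits(bundle_minutes, 2)
print(f"${cost_per_visit:.4f} per visit")  # $0.0018
```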

    In mid-June, at least 4,800 HitLeap installations were being run in virtual machines in data centers, with a unique IP associated with each HitLeap installation. The data-center installations of HitLeap made up 16% of the total HitLeap network, which was substantially larger than the UrlSpirit network.

    In aggregate the data-center installations of HitLeap were generating a monthly rate of at least a billion fraudulent ad requests—or an average of 1,600 ad requests per installation per day.

    Not only were these publishers collectively responsible for billions of automated ad requests, but their websites were also often extremely deceptive. For example, of the top ten webpages visited by HitLeap bots in June, nine included hidden ad slots, meaning that not only was the traffic fake, but the ads couldn’t have been seen even by legitimate human visitors.

    http://vedgre.com/7/gg.html is illustrative of these nine webpages with hidden ad slots. The webpage has no visible content other than a single 300×250px ad. This visible ad is actually in a 300×250px iframe that includes two ads, the second of which is hidden. Additionally, there are also twenty-seven 0×0px hidden iframes on this page with each hidden iframe including two ad slots. In total there are fifty-five hidden ads on this page and one visible ad. Finally, the ads served on http://vedgre.com/7/gg.html appear to advertisers as though they have been served on legitimate websites like indiatimes.com, scotsman.com, autotrader.co.uk, allrecipes.com, dictionary.com and nypost.com, because the tags used on http://vedgre.com/7/gg.html to request the ad creatives have been deliberately spoofed.
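    The slot arithmetic for this page adds up as follows:

```python
# Tallying the ad slots on the page described above: one visible 300x250
# iframe containing two ads (one hidden), plus twenty-seven 0x0 iframes
# each containing two hidden ad slots.

visible_iframe_ads = 2
hidden_iframes = 27
ads_per_hidden_iframe = 2

visible_ads = 1  # only one ad on the page can actually be seen
hidden_ads = (visible_iframe_ads - visible_ads) + hidden_iframes * ads_per_hidden_iframe
total_ads = visible_ads + hidden_ads

print(hidden_ads)  # 55 hidden ads, matching the count in the post
print(total_ads)   # 56 ad slots in total
```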

    An example of collateral damage
    Unlike the traffic described above, there is also automated data-center traffic that impacts advertising campaigns but that hasn’t been generated for malicious purposes. An interesting example of this is an advertising competitive intelligence company that is generating a large volume of undeclared non-human traffic.

    This company uses bots to scrape the web to find out which ad creatives are being served on which websites and at what scale. The company’s scrapers also click ad creatives to analyze the landing page destinations. To provide its clients with the most accurate possible intelligence, this company’s scrapers operate at extraordinary scale and they also do so without including bot identifiers in their User-Agent strings.

    While the aim of this company is not to cause advertisers to pay for fake traffic, the company’s scrapers do waste advertiser spend. They not only generate non-human impressions; they also distort the metrics that advertisers use to evaluate campaign performance—in particular, click metrics. Looking at the data across DoubleClick Campaign Manager this company’s scrapers were responsible for 65% of the automated data-center clicks recorded in the month of May.

    Going forward
    Google has always invested to prevent this and other types of invalid traffic from entering our ad platforms. By contributing our data-center blacklist to TAG, we hope to help others in the industry protect themselves. 

    We’re excited by the collaborative spirit we’ve seen working with other industry leaders on this initiative. This is an important, early step toward tackling fraudulent and illegitimate inventory across the industry and we look forward to sharing more in the future. By pooling our collective efforts and working with industry bodies, we can create strong defenses against those looking to take advantage of our ecosystem. We look forward to working with the TAG Anti-fraud working group to turn this pilot program into an industry-wide tool.


    Posted by Vegard Johnsen, Product Manager Google Ad Traffic Quality

    New study: Simpler ad tech stacks drive greater programmatic efficiency, more revenue

    Publishers’ growth in programmatic revenue is outpacing traditional direct sales for desktop and mobile across display and video advertising. New technologies like “programmatic guaranteed” are further blurring the lines between direct and programmatic channels.

    A new study by The Boston Consulting Group, commissioned by Google, found that despite this trend, many publishers are failing to appropriately capitalize on the programmatic opportunity. For example, the study found that less than 25 percent of programmatic team time is spent on value-creating activities, causing publishers to miss out on significant revenues.

    The study also closely analyzed the operations of those publishers that consistently outperform their peers in terms of value creation and efficiency, and arrived at best practices and approaches other publishers can follow to achieve similar success. Using simpler ad tech stack configurations, best-in-class publishers were on average 30% more efficient, had up to 24% higher CPMs, and delivered 10% more impressions otherwise lost to discrepancies.

    Head over to DoubleClick.com to read the full study.

    Posted by Yamini Gupta, Product Marketing team