
How to effectively A/B test power consumption for your Android app’s features

Posted by Mayank Jain - Product Manager, and Yasser Dbeis - Software Engineer; Android Studio

Android developers have been telling us they're looking for tools to help optimize power consumption for different devices on Android.

The new Power Profiler in Android Studio helps Android developers by showing power consumption happening on devices as the app is being used. Understanding power consumption across Android devices can help Android developers identify and fix power consumption issues in their apps. They can run A/B tests to compare the power consumption of different algorithms, features or even different versions of their app.

The new Power Profiler in Android Studio

Apps which are optimized for lower power consumption lead to an improved battery and thermal performance of the device, which means an improved user experience on Android.

This power consumption data is made available through the On Device Power Monitor (ODPM) on Pixel 6+ devices, segmented by subsystems called “Power Rails”. See Profileable power rails for a list of supported subsystems.

The Power Profiler can help app developers detect problems in several areas:

    • Detecting unoptimized code that is using more power than necessary.
    • Finding background tasks that are causing unnecessary CPU usage.
    • Identifying wakelocks that are keeping the device awake when they are not needed.

Once a power consumption issue has been identified, the Power Profiler can be used to test different hypotheses about why the app is consuming excessive power. For example, if the issue is caused by background tasks, the developer can try to stop the tasks from running unnecessarily or for longer than needed. If the issue is caused by wakelocks, the developer can try to release the wakelocks when the resource is not in use, or use them more judiciously. The developer can then compare the power consumption before and after the change using the Power Profiler.
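
If wakelocks turn out to be the culprit, a common pattern is to acquire them with a timeout as a safety net and release them as soon as the work completes. A minimal Kotlin sketch (the wakelock tag and helper name are hypothetical):

import android.content.Context
import android.os.PowerManager

// Sketch only: acquire a partial wakelock with a timeout and always release it.
// "app:videoPrefetch" is a hypothetical tag.
fun runShortBackgroundWork(context: Context, work: () -> Unit) {
    val powerManager = context.getSystemService(Context.POWER_SERVICE) as PowerManager
    val wakeLock = powerManager.newWakeLock(PowerManager.PARTIAL_WAKE_LOCK, "app:videoPrefetch")
    wakeLock.acquire(10 * 60 * 1000L) // auto-release after 10 minutes as a safety net
    try {
        work()
    } finally {
        if (wakeLock.isHeld) wakeLock.release()
    }
}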

In this blog post, we showcase a technique which uses A/B testing to understand how your app’s power consumption characteristics might change with different versions of the same feature - and how you can effectively measure them.

A real-life example of how the Power Profiler can be used to improve the battery life of an app.

Let’s assume you have an app through which users can purchase their favorite movies.

Sample app to demonstrate A/B testing for measuring power consumption
Video (c) copyright Blender Foundation | www.bigbuckbunny.org

As your app becomes popular and is used by more users, you realize that the high-quality 4K video on the home screen takes a long time to load every time the app is started. Because of its large size, you want to understand its impact on the device's power consumption.

Originally, this video was included in 4K quality with the best of intentions: to showcase the best possible movie highlights to your customers.

This makes you think…

    • Do you really need a 4K video banner on the home screen?
    • Does it make sense to load a 4K video over the network every time your app is run?
    • How will the power consumption characteristics of your app change if you replace the 4K video with something of lower quality (while still preserving the vivid look & feel of the video)?

This is a perfect scenario to perform an A/B test for power consumption

With an A/B test, you can test two slightly different variations of the video banner feature and choose the one with the better power consumption characteristics.

Scenario A: Run the app with the 4K video banner on screen & measure power consumption

Scenario B: Run the app with a lower resolution video banner on screen & measure power consumption

A/B Test setup

Let's take a moment and set up our Android Studio profiler to run this A/B test. We need to start the app, attach the CPU profiler to it, and trigger a system trace (where the Power Profiler will be shown).

Step 1

Create a custom “Run configuration” by clicking the 3-dot menu > Edit.

Custom run configuration

Step 2

Then select the “Profiling” tab and ensure that “Start this recording on startup” and CPU Activity > System Trace are selected. Then click “Apply”.

Edit configuration settings

Now simply run the “Profile app startup profiling with low overhead” configuration whenever you want to start the app with the CPU profiler attached.

Note on precision

For the purposes of this blog, the following example scenarios use the entire app startup window for estimating power consumption. However, you can use more advanced techniques to get more precise power readings. Some techniques to try are:

    • Isolate and measure power consumption for video playback only after a tap event on the video player
    • Use the trace markers API to mark the start and stop time for power measurement timeline - and then only measure power consumption within that marked window
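
For the second approach, custom trace sections make the measurement window easy to find and select in the system trace shown by the Power Profiler. A minimal sketch using android.os.Trace async sections (the section name and the callback names are hypothetical):

import android.os.Trace

// Hypothetical section name used to locate the measurement window in the system trace.
private const val SECTION = "VideoBannerPlayback"
private const val COOKIE = 0

fun onVideoPlaybackStarted() {
    // Marks the start of the power measurement window.
    Trace.beginAsyncSection(SECTION, COOKIE)
}

fun onVideoPlaybackStopped() {
    // Marks the end of the window; select this range in the Power Profiler timeline.
    Trace.endAsyncSection(SECTION, COOKIE)
}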

Scenario A

In this scenario, we run the app with the 4K video playing and measure power consumption for the first 30 seconds. We can optionally run scenario A multiple times and average out the readings. Once the system trace is shown in Android Studio, select the 0-30 second time range from the timeline selection panel and take a screenshot for comparison against scenario B.

Power consumption in scenario A - playing a 4K video

As you can see, the average power consumed by WLAN, the CPU cores, and memory combined is about 1,352 mW (milliwatts).

Now let's see how this power consumption changes in scenario B.

Scenario B

In this scenario, we run the app with the lower quality video playing and measure power consumption for the first 30 seconds. As before, we can optionally run scenario B multiple times and average out the power consumption readings. Again, once the system trace is shown in Android Studio, select the 0-30 second time range from the timeline selection panel.

Power consumption in scenario B - playing a lower quality video

The total power consumed by WLAN, the CPU cores (Little, Mid, and Big), and memory is about 741 mW (milliwatts).

Conclusion

All else being equal, Scenario B (with the lower quality video) consumed 741 mW of power, compared to Scenario A (with the 4K video), which required 1,352 mW.

Scenario B (lower quality video) consumed about 45% less power than Scenario A (4K), while the lower quality video shows little to no visible difference in the perceived quality of the app’s screen.

As a result of this A/B test for power consumption, you conclude that replacing the 4K video with a lower quality video on your app’s home screen not only reduces power consumption by about 45%, but also reduces the required network bandwidth and can potentially improve the thermal performance of the device.

If your app’s business logic still requires the 4K video to be shown on the app’s screen, you can explore strategies like:

    • Caching the 4K video across subsequent runs of the app.
    • Loading video on a user tap.
    • Loading an image initially and only loading the video after the screen has fully rendered (delayed loading), as sketched below.
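
For instance, a delayed-loading approach might render a static image first and swap in the video only after the first frame has been drawn. A rough Compose sketch, where ImageBanner and VideoBanner are hypothetical composables:

import androidx.compose.runtime.Composable
import androidx.compose.runtime.LaunchedEffect
import androidx.compose.runtime.getValue
import androidx.compose.runtime.mutableStateOf
import androidx.compose.runtime.remember
import androidx.compose.runtime.setValue
import androidx.compose.runtime.withFrameNanos

@Composable
fun HomeBanner() {
    var showVideo by remember { mutableStateOf(false) }
    if (showVideo) VideoBanner() else ImageBanner()

    LaunchedEffect(Unit) {
        // Defer the expensive video load until at least one frame has rendered.
        withFrameNanos { }
        showVideo = true
    }
}

@Composable fun ImageBanner() { /* static thumbnail placeholder, hypothetical */ }

@Composable fun VideoBanner() { /* video player, hypothetical */ }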

The overall power consumption numbers presented in the above A/B test might seem small, but the exercise demonstrates the techniques app developers can use to effectively A/B test power consumption for their app’s features using the Power Profiler in Android Studio.

Next Steps

The new Power Profiler is available in Android Studio Hedgehog onwards. To learn more, head over to the official documentation.

The First Beta of Android 15

Posted by Dave Burke, VP of Engineering

Android 15 logo


Today we're releasing the first beta of Android 15. With the progress we've made refining the features and stability of Android 15, it's time to open the experience up to both developers and early adopters, so you can now enroll any supported Pixel device here to get this and future Android 15 Beta and feature drop Beta updates over-the-air.

Android 15 continues our work to build a platform that helps improve your productivity, give users a premium app experience, protect user privacy and security, and make your app accessible to as many people as possible — all in a vibrant and diverse ecosystem of devices, silicon partners, and carriers.

Android delivers enhancements and new features year-round, and your feedback on the Android beta program plays a key role in helping Android continuously improve. The Android 15 developer site has lots more information about the beta, including downloads for Pixel and the release timeline. We’re looking forward to hearing what you think, and thank you in advance for your continued help in making Android a platform that works for everyone.

We’ll have lots more to share as we move through the release cycle, and be sure to tune into Google I/O where you can dive deeper into topics that interest you with over 100 sessions, workshops, codelabs, and demos.

Edge-to-edge

Apps targeting Android 15 are displayed edge-to-edge by default on Android 15 devices. This means that apps no longer need to explicitly call Window.setDecorFitsSystemWindows(false) or enableEdgeToEdge() to show their content behind the system bars, although we recommend continuing to call enableEdgeToEdge() to get the edge-to-edge experience on earlier Android releases.
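
For example, a minimal Activity that keeps calling enableEdgeToEdge() so earlier releases behave like Android 15 might look like this sketch:

import android.os.Bundle
import androidx.activity.ComponentActivity
import androidx.activity.compose.setContent
import androidx.activity.enableEdgeToEdge

class MainActivity : ComponentActivity() {
    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        // Edge-to-edge is the default once the app targets Android 15;
        // this call keeps the same behavior on earlier Android versions.
        enableEdgeToEdge()
        setContent {
            // App content; Material 3 composables handle most insets for you.
        }
    }
}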

To assist your app with going edge-to-edge, many of the Material 3 composables handle insets for you, based on how the composables are placed in your app according to the Material specifications.

a side-by-side comparison of App targets SDK 34 (left) and App targets SDK 35 (right) demonstrating edge-to-edge on an Android 15 device
On the left: App targets SDK 34 (Android 14) and is not edge-to-edge on an Android 15 device. On the right: App targets SDK 35 (Android 15) and is edge-to-edge on an Android 15 device. Note the Material 3 TopAppBar is automatically protecting the status bar, which would otherwise be transparent by default.

By default, the system bars are transparent or translucent and content draws behind them. Refer to "Handle overlaps using insets" (Views) or Window insets in Compose to see how to prevent important touch targets from being hidden by the system bars.

Smoother NFC experiences - part 2

Android 15 is working to make the tap to pay experience more seamless and reliable while continuing to support Android's robust NFC app ecosystem. In addition to the observe mode changes from Android 15 developer preview 2, apps can now register a fingerprint on supported devices so they can be notified of polling loop activity, which allows for smooth operation with multiple NFC-aware applications.

Inter-character justification

Starting with Android 15, text can be justified utilizing letter spacing by using JUSTIFICATION_MODE_INTER_CHARACTER. Inter-word justification was first introduced in Android O, but inter-character justification addresses languages that do not use white space for segmentation, such as Chinese and Japanese.

JUSTIFICATION_MODE_NONE
image shows how japanese kanji (top) and english alphabet characters (bottom) appear with JUSTIFICATION_MODE_NONE
JUSTIFICATION_MODE_INTER_WORD
image shows how japanese kanji (top) and english alphabet characters (bottom) appear with JUSTIFICATION_MODE_INTER_WORD
JUSTIFICATION_MODE_INTER_CHARACTER
image shows how japanese kanji (top) and english alphabet characters (bottom) appear with JUSTIFICATION_MODE_INTER_CHARACTER
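
In code, enabling the new mode might look like the following sketch, assuming the new constant sits alongside the existing justification modes on android.graphics.text.LineBreaker:

import android.graphics.text.LineBreaker
import android.widget.TextView

// Sketch only: enables letter-spacing-based justification on Android 15+.
fun enableInterCharacterJustification(textView: TextView) {
    textView.justificationMode = LineBreaker.JUSTIFICATION_MODE_INTER_CHARACTER
}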

App archiving

Android and Google Play announced support for app archiving last year, allowing users to free up space by partially removing infrequently used apps that were published using Android App Bundle on Google Play. Android 15 now includes OS level support for app archiving and unarchiving, making it easier for all app stores to implement it.

Apps with the REQUEST_DELETE_PACKAGES permission can call the PackageInstaller requestArchive method to request archiving a currently installed app package, which removes the APK and any cached files, but persists user data. Archived apps are returned as displayable apps through the LauncherApps APIs; users will see a UI treatment to highlight that those apps are archived. If a user taps on an archived app, the responsible installer will get a request to unarchive it, and the restoration process can be monitored by the ACTION_PACKAGE_ADDED broadcast.

App-managed profiling

Android 15 includes the all new ProfilingManager class, which allows you to collect profiling information from within your app. We're planning to wrap this with an Android Jetpack API that will simplify construction of profiling requests, but the core API will allow the collection of heap dumps, heap profiles, stack sampling, and more. It provides a callback to your app with a supplied tag to identify the output file, which is delivered to your app's files directory. The API does rate limiting to minimize the performance impact.

Better Braille

In Android 15, we've made it possible for TalkBack to support Braille displays that are using the HID standard over both USB and secure Bluetooth.

This standard, much like the one used by mice and keyboards, will help Android support a wider range of Braille displays over time.

Key management for end-to-end encryption

We are introducing the E2eeContactKeysManager in Android 15, which facilitates end-to-end encryption (E2EE) in your Android apps by providing an OS-level API for the storage of cryptographic public keys.

The E2eeContactKeysManager is designed to integrate with the platform contacts app to give users a centralized way to manage and verify their contacts' public keys.

Secured background activity launches

Android 15 brings additional changes to prevent malicious background apps from bringing other apps to the foreground, elevating their privileges, and abusing user interaction, aiming to protect users from malicious apps and give them more control over their devices. Background activity launches have been restricted since Android 10.

App compatibility

With Android 15 now in beta, we're opening up access to early-adopter users as well as developers, so if you haven't yet tested your app for compatibility with Android 15, now is the time to do it. In the weeks ahead, you can expect more users to try your app on Android 15 and raise issues they find.

To test for compatibility, install your published app on a device or emulator running Android 15 beta and work through all of your app's flows. Review the behavior changes to focus your testing. After you've resolved any issues, publish an update as soon as possible.

To give you more time to plan for app compatibility work, we’re letting you know our Platform Stability milestone well in advance.

Android 15 release timeline

At this milestone, we’ll deliver final SDK/NDK APIs and also final internal APIs and app-facing system behaviors. We’re expecting to reach Platform Stability in June 2024, and from that time you’ll have several months before the official release to do your final testing. The release timeline details are here.

Get started with Android 15

Today's beta release has everything you need to try the Android 15 features, test your apps, and give us feedback. Now that we've entered the beta phase, you can enroll any supported Pixel device here to get this and future Android Beta updates over-the-air. If you don’t have a Pixel device, you can use the 64-bit system images with the Android Emulator in Android Studio. If you're already in the Android 14 QPR beta program on a supported device, or have installed the developer preview, you'll automatically get updated to Android 15 Beta 1.

For the best development experience with Android 15, we recommend that you use the latest version of Android Studio Jellyfish (or more recent Jellyfish+ versions). Once you’re set up, here are some of the things you should do:

    • Try the new features and APIs - your feedback is critical during the early part of the developer preview and beta program. Report issues in our tracker on the feedback page.
    • Test your current app for compatibility - learn whether your app is affected by changes in Android 15; install your app onto a device or emulator running Android 15 and extensively test it.

We’ll update the beta system images and SDK regularly throughout the Android 15 release cycle. Read more here.

For complete information, visit the Android 15 developer site.


Java and OpenJDK are trademarks or registered trademarks of Oracle and/or its affiliates.

Google Drive cut code and development time in half with Jetpack Compose and new architecture

Posted by Nick Butcher – Product Manager for Jetpack Compose, and Florina Muntenescu – Developer Relations Engineer

As one of the world’s most popular cloud-based storage services, Google Drive lets people do more than just store their files online. With Drive, users can synchronize, share, search, edit, and even pin specified files and content for safe and secure offline use.

Recently, Drive’s developers revamped the application’s home screen to provide a more seamless experience across devices, matching updates made to Google Drive’s web version. However, the app’s previous architecture and codebase would’ve prevented the team from completing the updates in a timely manner.

Instead of struggling with the app’s previous tech stack to implement the update, the Drive team rebuilt the home page from the ground up using Android’s recommended architecture and Jetpack Compose, Android’s modern declarative toolkit for creating native UI.

“Compose, combined with architecture improvements, cut our development time nearly in half.” — Dale Hawkins, Senior software engineer and tech lead at Google Drive

Experimenting with Kotlin and Compose

The Drive team experimented with Kotlin — which the Compose toolkit is built with — for several months before planning the app’s home screen rebuild. Drive’s developers liked Kotlin’s improved syntax and null enforcement, making it easier to produce code.

“We had been using RxJava, but started looking into replacing that with coroutines,” said Dale Hawkins, the features team lead for Google Drive. “This led to a more natural alignment between coroutines and Jetpack Compose. After a deep dive into Compose, we came away with a clear understanding of how Compose has numerous benefits over the Views-based approach.”

Following the Kotlin exploration, Dale experimented with Jetpack Compose. “I was pleased with how easy it was to build the UI using Compose. So I continued the experiment after that week,” said Dale. “I eventually rewrote the feature using Compose.”

Using Compose

Shortly after experimenting with Jetpack Compose, the Drive team decided to use it to completely rebuild the app’s home screen UI.

“We wanted to make some major changes to match the ones being done for the web version, but that project had a several-month head start. We wanted to release the Android version shortly after the web changes went live to ensure our users have a seamless Google Drive experience across devices,” said Dale.

The Drive team's experimentation and testing with Jetpack Compose showed that the new toolkit was powerful and reliable and that it would enable them to move faster. With this in mind, the Drive team decided to step away from their old codebase and embrace Jetpack Compose for the app’s home screen update. Not only would it be quicker and easier, but it would also better prepare the team to easily make future UI changes.

Using Android’s guidance for architecture

Before going all-in with Jetpack Compose, Drive developers wanted to restructure the application by implementing a completely new app architecture. Drive developers followed Android’s official architecture guidance to apply structural changes, paving the way for the new Kotlin codebase.

“The recommended architecture reinforces good separation between layers,” said Quintin Knudsen, an Android engineer for Google Drive. “We work in a highly dynamic environment and need to be able to adjust to any app changes. Using well-defined and independent layers helps isolate any changes or UI requirements. The recommendations from Android offered sound ways to structure the layers.” With a clear separation between the app’s data and UI layers, developers could work in parallel to significantly speed up testing and development.

Drive developers also relied on Mappers and UseCases when creating the new architecture. These patterns allowed them to create flexible code that is easier to manage. They also exposed flows from their ViewModels to make the UI respond immediately to any data changes, making it much simpler to implement and understand UI updates.
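
A simplified, hypothetical sketch of that pattern (not Drive's actual code): a UseCase exposes a Flow from the data layer, a Mapper converts entities to UI models, and the ViewModel exposes state for the Compose UI to collect.

import androidx.lifecycle.ViewModel
import androidx.lifecycle.viewModelScope
import kotlinx.coroutines.flow.Flow
import kotlinx.coroutines.flow.SharingStarted
import kotlinx.coroutines.flow.map
import kotlinx.coroutines.flow.stateIn

// Hypothetical types, for illustration only.
data class FileEntity(val name: String)
data class FileUiModel(val title: String)

fun interface ObserveRecentFilesUseCase { operator fun invoke(): Flow<List<FileEntity>> }
fun interface FileUiMapper { fun toUiModels(files: List<FileEntity>): List<FileUiModel> }

class HomeViewModel(
    observeRecentFiles: ObserveRecentFilesUseCase,
    mapper: FileUiMapper,
) : ViewModel() {
    // The UI layer collects this StateFlow and recomposes whenever data changes.
    val files = observeRecentFiles()
        .map(mapper::toUiModels)
        .stateIn(viewModelScope, SharingStarted.WhileSubscribed(5_000), emptyList())
}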

Less code, faster development

With the app’s newly improved architecture and Jetpack Compose, the Drive team was able to develop the app’s new home screen in less than half the time that they expected. They also implemented the new code and finished quality assurance testing nearly seven weeks ahead of schedule.

“Thanks to Compose, we had the groundwork done within a couple of weeks. We delivered a great implementation over a month ahead of schedule, and it’s been praised by product, UX, and even other engineering teams,” said Dale.

Despite having fewer features, the original home screen required over 12,000 lines of code. The new Compose-based home screen has many new features and only required 5,100 lines of code—a 57% reduction. Having less code makes it much easier for developers to maintain the app and implement any updates.

Testing the new UI in Jetpack Compose also required significantly less code. Before Compose, Drive developers used roughly 9,000 lines of code to test about 62% of the UI. With Compose, it took only 2,200 lines to test over 80% of the new UI.

“The original home screen required over 12,000 lines of code. The Compose-based home screen only required 5,100 lines of code. That’s a 57% reduction.” — Dale Hawkins, Senior software engineer and tech lead at Google Drive

Looking forward

A new and improved app architecture paired with Jetpack Compose allowed Drive developers to rebuild the app’s home screen UI faster and easier than they could’ve imagined. The Drive team plans to expand its use of Compose within the application for things like supporting large dynamic displays and text resizing.

“As we work on new projects, we’re taking the opportunity to update older UI code to make use of our new architecture and Compose. The new code will be objectively better and features will be easier to write, test, and maintain,” said Dale.

Get started

Improve app architecture using Android’s official architecture guidance and optimize your UI development with Jetpack Compose.

Android Studio uses Gemini Pro to make Android development faster and easier

Posted by Sandhya Mohan – Product Manager, Android Studio

As part of the next chapter of our Gemini era, we announced we were bringing Gemini to more products. Today we’re excited to announce that Android Studio is using the Gemini 1.0 Pro model to make Android development faster and easier, and we’ve seen significant improvements in response quality over the last several months through our internal testing. In addition, we are making this transition more apparent by announcing that Studio Bot is now called Gemini in Android Studio.

Gemini in Android Studio is an AI-powered coding assistant that can be accessed directly in the IDE. It helps you develop high-quality Android apps faster by generating code for your app, providing complex code completions, answering your questions, finding relevant resources, adding code comments, and more, all without ever having to leave Android Studio. It is available in 180+ countries and territories in Android Studio Jellyfish.

If you were already using Studio Bot in the canary channel, you’ll continue experiencing the same helpful and powerful features, but you’ll notice improved quality in responses compared to earlier versions.

Ask Gemini your Android development questions

Gemini in Android Studio can understand natural language, so you can ask development questions in your own words. You can enter your questions in the chat window ranging from very simple and open-ended ones to specific problems that you need help with.

Here are some examples of the types of queries it can answer:

    • How do I add camera support to my app?
    • Using Compose, I need a login screen with the following: a username field, a password field, a 'Sign In' button, a 'Forgot Password?' link. I want the password field to obscure the input.
    • What's the best way to get location on Android?
    • I have an 'orders' table with columns like 'order_id', 'customer_id', 'product_id', 'price', and 'order_date'. Can you help me write a query that calculates the average order value per customer over the last month?
Moving image demonstrating a conversation in Android Studio

Gemini in Android Studio remembers the context of the conversation, so you can also ask follow-up questions, such as “Can you give me the code for this in Kotlin?” or “Can you show me how to do it in Compose?”

Code faster with AI powered Code Completions

Gemini in Android Studio can help you be more productive by providing powerful AI code completions. You can receive multi-line code completion suggestions, suggestions for code comments, and help adding documentation to your code.

Moving image demonstrating code completion in Android Studio

Designed with privacy in mind

Gemini in Android Studio was designed with privacy in mind. Gemini is only available after you log in and enable it. You don’t need to send your code context to take advantage of most features. By default, Gemini in Android Studio’s chat responses are purely based on conversation history, and you control whether you want to share additional context for customized responses. You can update this anytime in Android Studio > Settings at a granular, per-project level. We also provide a way for you to exclude certain files and folders through an .aiexclude file. Much like our work on other AI projects, we stick to a set of AI Principles that hold us accountable. Learn more here.

image of share settings in Android Studio

Build a Generative AI app using the Gemini API starter template

Not only does Android Studio use Gemini to help you be more productive, it can also help you take advantage of Gemini models to create AI-powered features in your applications. Get started in minutes using the Gemini API starter template, available in the canary release channel of Android Studio under File > New Project > Gemini API Starter. You can also use the code sample available at File > Import Sample > Google Generative AI sample.

The Gemini API is multimodal, meaning it can support image and text inputs. For example, it can support conversational chat, summarization, translation, caption generation, and more, using both text and image inputs.

image of starter templates in Android Studio
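
As a rough illustration of the kind of call the starter template is built around, here is a hedged Kotlin sketch using the Google AI client SDK; the model name and the BuildConfig.apiKey field are assumptions, so check the template for the exact setup:

import com.google.ai.client.generativeai.GenerativeModel

// Sketch only: simple text-in, text-out call to the Gemini API.
suspend fun summarize(input: String): String? {
    val model = GenerativeModel(
        modelName = "gemini-pro",     // assumed text model name
        apiKey = BuildConfig.apiKey   // assumed to be provided at build time
    )
    return model.generateContent("Summarize this: $input").text
}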

Try Gemini in Android Studio

Gemini in Android Studio is still in preview, but we have added many feature improvements — and now a major model update — since we released the experience in May 2023. It is currently no-cost for developers to try out. Now is a great time to test it and let us know what you think, before we release this experience to stable.


Stay updated on the latest by following us on LinkedIn, Medium, YouTube, or X. Let's build the future of Android apps together!

Battling Impersonation Scams: Monzo’s Innovative Approach

Posted by Todd Burner – Developer Relations Engineer

Cybercriminals continue to invest in advanced financial fraud scams, costing consumers more than $1 trillion in losses. According to the 2023 Global State of Scams Report by the Global Anti-Scam Alliance, 78 percent of mobile users surveyed experienced at least one scam in the last year. Of those surveyed, 45 percent said they had experienced more scams in the last 12 months.


The Global Scam Report also found that phone calls are the top method to initiate a scam. Scammers frequently employ social engineering tactics to deceive mobile users.

The key places these scammers want individuals to take action are the tools that give access to their money. This means financial services are frequently targeted. As cybercriminals push forward with more scams, and their reach extends globally, it’s important to innovate in the response.

One such innovator is Monzo, who have been able to tackle scam calls through a unique impersonation detection feature in their app.

Monzo’s Innovative Approach

Founded in 2015, Monzo is the largest digital bank in the UK with presence in the US as well. Their mission is to make money work for everyone with an ambition to become the one app customers turn to to manage their entire financial lives.

Monzo logo

Impersonation fraud is an issue that the entire industry is grappling with, and Monzo decided to take action and introduce an industry-first tool. An impersonation scam is a very common social engineering tactic in which a criminal pretends to be someone else so they can encourage you to send them money. These scams often involve urgent pretenses that pose a risk to a user’s finances or dangle an opportunity for quick wealth. Under this pressure, fraudsters convince users to disable security safeguards and ignore proactive warnings for potential malware, scams, and phishing.

Call Status Feature

Android offers multiple layers of spam and phishing protection for users, including caller ID and spam protection in the Phone by Google app. Monzo’s team wanted to enhance that protection by leveraging their in-house telephone systems. By integrating them with their mobile application infrastructure, they could help their customers confirm in real time, in a privacy-preserving way, when they’re actually talking to a member of Monzo’s customer support team.

If someone calls a Monzo customer stating they are from the bank, their users can go into the app to verify this. In the Monzo app’s Privacy & Security section, users can see the ‘Monzo Call Status’, letting them know if there is an active call ongoing with an actual Monzo team member.

“We’ve built this industry-first feature using our world-class tech to provide an additional layer of comfort and security. Our hope is that this could stop instances of impersonation scams for Monzo customers from happening in the first place and impacting customers.” 

- Priyesh Patel, Senior Staff Engineer, Monzo’s Security team

Keeping Customers Informed

If a user is not talking to a member of Monzo’s customer support team, they will see that, along with some helpful information. If the ‘Monzo Call Status’ shows that they are not speaking to Monzo, the feature tells them to hang up right away and report it to Monzo’s team. Customers can start a scam report directly from the call status feature in the app.

screen grab of Monzo call status alerting the customer that the call the customer is receiving is not coming from Monzo. The customer is being advised to end the call

If a genuine call is ongoing, the customer will see that information.

screen grab of Monzo call status confirming to the customer that the call the customer is receiving is coming from Monzo.

How does it work?

Monzo integrated a few systems to help inform their customers. A cross-functional team was put together to build the solution.

Monzo’s in-house technology stack meant that the systems that power their app and customer service phone calls can easily communicate with one another. This allowed them to link the two and share details of customer service calls with their app, accurately and in real-time.

The team then worked to identify edge cases, like when the user is offline. In this situation, Monzo recommends that customers don’t speak to anyone claiming they’re from Monzo until they’re connected to the internet again and can check the call status within the app.

screen grab of Monzo call status displaying warning while the customer is offline letting the customer know the app is unable to verify whether or not the call is coming from Monzo, so it is safer not to answer.

Results and Next Steps

The feature has proven highly effective in safeguarding customers, and received universal praise from industry experts and consumer champions.

“Since we launched Call Status, we receive an average of around 700 reports of suspected fraud from our customers through the feature per month. Now that it’s live and helping protect customers, we’re always looking for ways to improve Call Status - like making it more visible and easier to find if you’re on a call and you want to quickly check that who you’re speaking to is who they say they are.” 

- Priyesh Patel, Senior Staff Engineer, Monzo’s Security team

Final Advice

Monzo continues to invest and innovate in fraud prevention. The call status feature brings together both technological innovation and customer education to achieve its success, and gives their customers a way to catch scammers in action.

A layered security approach is a great way to protect users. Android and Google Play provide layers like app sandboxing, Google Play Protect, and privacy preserving permissions, and Monzo has built an additional one in a privacy-preserving way.

To learn more about Android and Play’s protections and to further protect your app check out these resources:

#WeArePlay | Meet the founders changing women’s lives: Women’s History Month Stories

Posted by Leticia Lago – Developer Marketing

In celebration of Women’s History Month, we’re spotlighting the founders behind groundbreaking apps and games from around the world - made by women or for women. Let's discover four of my favorites in this latest batch of nine #WeArePlay stories.


Múkami Kinoti Kimotho

Royelles Revolution / Royelles Revolution: Gaming For Girls (USA)

Múkami Kinoti Kimotho – Royelles Revolution / Royelles- Gaming For Girls | USA

Múkami's journey began when she noticed the lack of representation for girls in the gaming industry. Determined to change this narrative, she created Royelles, a game designed to inspire girls and non-binary people to pursue careers in STEAM (science, technology, engineering, art, math) fields. The game is anchored in fierce female avatars, including a character voiced by real-life NASA scientist Mara. Royelles is revolutionizing the gaming landscape and empowering the next generation of innovators. Múkami is excited to release more gamified stories and learning modules, and a range of extended reality and AI-powered avatars based on the game’s characters.

"If we're going to effectively educate Gen Z and Gen Alpha, we have to meet them in the metaverse and leverage gamified play as a means of driving education, awareness, inspiration and empowerment.” 

- Múkami

Leonika Sari Njoto Boedioetomo

Reblood: Blood Services App (Indonesia)

Leonika Sari Njoto Boedioetomo – Reblood / Blood Services App | Indonesia

When her university friend needed an urgent blood transfusion but discovered there was none available in the blood bank, Leonika became aware of the blood donation shortage in Indonesia. Her mission to address this led her to create Reblood, an app connecting blood donors with those in need. With over 140,000 blood donations facilitated to date, Reblood is not only saving lives but also promoting healthier lifestyles with a recently added feature that allows people to find the most affordable medical checkups.

“Our goal is to save more lives by raising awareness of blood donation in Indonesia and promoting healthier lifestyles for blood donors.” 

- Leonika

Luciane Antunes dos Santos and Renato Hélio Rauber

CARSUL / Car Sul: Urban Mobility App (Brazil)

Luciane Antunes dos Santos and Renato Hélio Rauber – Car Sul: Urban Mobility App | Brazil

Luciane was devastated when she lost her son in a car accident. This loss led her and her husband Renato to develop Carsul, an urban mobility app prioritizing safety and security. By providing safe transportation options and partnering with government health programs to chauffeur patients long distances to larger hospitals, Carsul is not only preventing accidents but also saving lives. Luciane and Renato's dedication to protecting others from the pain they've experienced is ongoing, and they plan to expand to more cities in Brazil.

“Carsul was born from this story of loss, inspiring me to protect other lives. Redefining myself in this way is very rewarding.” 

- Luciane

Diariata (Diata) N'Diaye

Resonantes / App-Elles: Safety App for Women (France)

Diariata (Diata) N'Diaye – Resonantes /App-Elles: Safety App for Women | France

After hearing the stories of young people who had experienced abuse similar to her own, spoken word artist Diata developed App-Elles – an app that allows women to send alerts when they're in danger. By connecting users with support networks and professional services, App-Elles is empowering women to reclaim their safety and seek help when needed. Diata also runs writing and recording workshops to help victims overcome their experiences with violence and has plans to expand her app with the introduction of a discreet wearable that sends out alerts.

“I realized from my work on the ground that there were victims of violence who needed help and support systems. This was my inspiration to create App-Elles." 

- Diata


Discover more #WeArePlay stories and share your favorites.




The Second Developer Preview of Android 15

Posted by Dave Burke, VP of Engineering

Android 15 logo


Today marks the second chapter of the Android 15 story with the release of Android 15 Developer Preview 2!

Android 15 continues our work to build a platform that helps improve your productivity while giving you new capabilities to produce superior media and AI experiences, take advantage of device form factors, minimize battery impact, maximize smooth app performance, and protect user privacy and security, all on the most diverse lineup of devices out there.

Android continues to add features enabling your apps to take advantage of premium device hardware, including the latest telecommunications features, high-end media capabilities, dazzling displays, foldable/flippable form factors, and AI processing.

Your feedback on the Android 15 Developer Preview and Beta program plays a key role in helping Android continuously improve. The Android 15 developer site has more information about the preview, including downloads for Pixel and detailed documentation about changes. This preview is just the beginning, and we’ll have lots more to share as we move through the release cycle. Thank you in advance for your help in making Android a platform that works for everyone.

Updating Android communications

Android 15 updates the platform to give your app access to the latest advances in communication.

Satellite support

Android 15 continues to extend platform support for satellite connectivity and includes some UI elements to ensure a consistent user experience across the satellite connectivity landscape.

screenshot of a mobile Android device showing notification when device connects to satellite
Notification when device connects to satellite

Apps can use ServiceState.isUsingNonTerrestrialNetwork() to detect when a device is connected to a satellite, giving them more awareness of why full network services may be unavailable. Additionally, Android 15 provides support for SMS/MMS applications as well as preloaded RCS applications to use satellite connectivity for sending and receiving messages.
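
A minimal sketch of that connectivity check (requires the READ_PHONE_STATE permission; the helper name is hypothetical):

import android.content.Context
import android.telephony.TelephonyManager

// Sketch only: lets the app explain degraded functionality while on satellite.
fun isUsingSatellite(context: Context): Boolean {
    val telephonyManager = context.getSystemService(TelephonyManager::class.java)
    return telephonyManager?.serviceState?.isUsingNonTerrestrialNetwork ?: false
}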

Smoother NFC experiences

Android 15 is working to make the tap to pay experience more seamless and reliable while continuing to support Android's robust NFC app ecosystem. On supported devices, apps can request the NfcAdapter enter observe mode, where the device will listen but not respond to NFC readers, sending the app's NFC service PollingFrame objects to process. The PollingFrame objects can be used to auth ahead of the first communication to the NFC reader, allowing for a one tap transaction in many cases.

Developer productivity

While most of our work to improve your productivity centers around tools like Android Studio, Jetpack Compose, and the Android Jetpack libraries, we always look for ways in the platform to help you more easily realize your vision.

PDF Improvements

screenshot of a mobile Android device showing search enabled for PDF files
Enable searching embedded PDF files with updates to PdfRenderer

Android 15 Developer Preview 2 includes an early preview of substantial improvements to the PdfRenderer APIs, giving apps capabilities to incorporate advanced features such as rendering password-protected files, annotations, form editing, searching, and selection with copy. Linearized PDF optimizations are supported to speed local PDF viewing and reduce resource use.

The PdfRenderer has been moved to a module that can be updated using Google Play system updates independent of the platform release, and we're supporting these changes back to Android R by creating a compatible pre-Android 15 version of the API surface, called PdfRendererPreV.

We value your feedback on the enhancements we've made to the PdfRenderer API surface, and we plan to make it much easier to incorporate these APIs into your app with an upcoming Android Jetpack library. Stay tuned.

Automatic language switching refinements

Android 14 added on-device multi-language audio recognition with automatic switching between languages, but this can cause words to get dropped, especially when languages switch with less of a pause between the two utterances. Android 15 has added additional controls to allow apps to help tune this switching for their use case. EXTRA_LANGUAGE_SWITCH_INITIAL_ACTIVE_DURATION_TIME_MILLIS confines the automatic switching to the beginning of the audio session, while EXTRA_LANGUAGE_SWITCH_MATCH_SWITCHES deactivates the language switching after a defined number of switches. This can be a useful refinement, particularly if the expectation is that there will be a single language spoken during the session that should be autodetected.

Granular line break controls

Starting in Android 15, the TextView and the underlying line breaker can preserve the given portion of text in the same line to improve readability. You can take advantage of this line break customization by using the <nobreak> tag in string resources or createNoBreakSpan. Similarly, you can preserve words from hyphenation by using the <nohyphen> tag or createNoHyphenationSpan.

Examples and screenshots:

<resources>
    <string name="pixel8pro">The power and brains behind Pixel 8 Pro.</string>
</resources>
text reads: The power and brains behind Pixel 8 Pro.
<resources>
    <string name="pixel8pro">The power and brains behind <nobreak>Pixel 8 Pro.</nobreak></string>
</resources>
text reads: The power and brains behind Pixel 8 Pro.

Expanded IntentFilter Functionality

Android 15 builds in support for more precise Intent resolution through UriRelativeFilterGroup, which contains a set of UriRelativeFilter objects forming a set of Intent matching rules that must each be satisfied, including URL query parameters, URL fragments, and blocking/exclusion rules. This helps applications better keep up with the dynamic demands of web-hosted deep links.

These rules can be defined in the AndroidManifest with the new <uri-relative-filter-group> tag, which can optionally include an android:allow attribute. These groups can contain <data> tags that use existing data tag attributes as well as the new android:query and android:fragment attributes.

An example of the AndroidManifest syntax that will be supported:

<intent-filter>
  <action android:name="android.intent.action.VIEW" />
  <category android:name="android.intent.category.BROWSABLE" />
  <data android:scheme="http" />
  <data android:scheme="https" />
  <data android:domain="astore.com" />
  <uri-relative-filter-group>
    <data android:pathPrefix="/auth" />
    <data android:query="region=na" />
  </uri-relative-filter-group>
  <uri-relative-filter-group android:allow="false">
    <data android:pathPrefix="/auth" />
    <data android:query="mobileoptout=true" />
  </uri-relative-filter-group>
  <uri-relative-filter-group android:allow="false">
    <data android:pathPrefix="/auth" />
    <data android:fragmentPrefix="faq" />
  </uri-relative-filter-group>
</intent-filter>

More OpenJDK API support

Android 15 continues to add OpenJDK APIs. Developer Preview 2 includes support for additional math/strictmath methods, lots of util updates including sequenced collection/map/set, ByteBuffer support in Deflater, and security key updates. These APIs are updated on over a billion devices running Android 12 through Android 15 via Google Play system updates, so you can target the latest programming features.

Giving your app more flexibility on more screens

Android 15 gives your apps the support to get the most out of Android's form factors, including large screens, flippables, and foldables.

Cover screen support

Your app can declare a property that Android 15 uses to allow your Application or Activity to be presented on the small cover screens of supported flippable devices. These screens are too small to be considered as compatible targets for Android apps to run on, but your app can opt in to supporting them, making your app available in more places.

A more private, secure Android

We're always looking to give users more transparency and control over their data while enhancing the core security features of the platform.

Screen record detection

Android 15 adds support for apps to detect that they are being recorded. A callback is invoked whenever the app transitions between being visible or invisible within a screen recording. (An app is considered visible if activities owned by the registering process's UID are being recorded.) This way, if your app is performing a sensitive operation, you can inform the user that they're being recorded.

val mCallback = Consumer<Int> { state ->
  if (state == SCREEN_RECORDING_STATE_VISIBLE) {
    // we're being recorded
  } else {
    // we’re not being recorded
  }
}

override fun onStart() {
   super.onStart()
   val initialState =
      windowManager.addScreenRecordingCallback(mainExecutor, mCallback)
   mCallback.accept(initialState)
}

override fun onStop() {
    super.onStop()
    windowManager.removeScreenRecordingCallback(mCallback)
}

Making Android more efficient

We are introducing new APIs that can help you gather insights about your apps, continuing to optimize the way background applications work, and providing APIs to help make tasks in your app more efficient to execute.

ApplicationStartInfo API

App startup on Android has always been a bit of a mystery. There was no easy way to know within your app whether it started from a cold, warm, or hot state. It was difficult to know how long your app spent during the various launch phases: forking the process, calling onCreate, drawing the first frame, and more. When your application class was instantiated, you had no way of knowing whether the app started from a broadcast, a content provider, a job, a backup, boot complete, an alarm, or an Activity.

The ApplicationStartInfo API on Android 15 gives you all of this and more. You can even choose to add your own timestamps into the flow to make it easy to collect timing data in one place. In addition to collecting metrics, you can use ApplicationStartInfo to help directly optimize app startup; for example, you can eliminate the costly instantiation of UI-related libraries within your Application class when your app is starting up due to a broadcast.

Changes to package stopped state

Android 15 includes several improvements to the PackageManager’s Stopped State. Apps that are in a Stopped State should only be leaving this state through direct user action. Furthermore, apps entering the Stopped State will have their PendingIntents removed. To help developers re-register their pending intents, apps will now receive the BOOT_COMPLETED broadcast once they are removed from the Stopped State. Lastly, the new ApplicationStartInfo will also include the ApplicationStartInfo.wasForceStopped() to let developers know that their app was put into the Stopped State.

Detailed app size information

Android has offered an API, StorageStats.getAppBytes(), that summarizes the installed size of an app as a single number of bytes, which is a sum of the APK size, the size of files extracted from the APK, and files that were generated on the device such as ahead-of-time (AOT) compiled code. This number is not very insightful in terms of how your app is using storage.

Android 15 adds the StorageStats.getAppBytesByDataType([type]) API, which allows you to get insight into how your app is using up all that space, including apk file splits, AOT and speedup related code, dex metadata, libraries, and guided profiles.

Changes to foreground services

Android 14 began requiring Foreground Service Types. The documentation mentions that the dataSync Foreground Service type will be deprecated in a future version of Android.

To support migrating away from the dataSync Foreground Service type, Android 15 includes the mediaProcessing Foreground Service type, which is used to perform time-consuming operations on media assets, like converting media to different formats. In a future Beta release, this service will have a runtime limit of 6 hours.

SQLite database

Android 15 introduces new SQLite APIs that expose advanced features from the underlying SQLite engine that target specific performance issues that can manifest in apps.

Developers should consult best practices for SQLite performance to get the most out of their SQLite database, especially when working with large databases or when running latency-sensitive queries.

    • Row counts and IDs: new APIs were added to retrieve the count of changed rows or the last inserted row ID without issuing an additional query. getLastChangedRowCount() will return the number of rows that were inserted, updated, or deleted by the most recent SQL statement within the current transaction, while getTotalChangedRowCount() will return the count on the current connection. getLastInsertRowId() will return the “rowid” of the last row to be inserted on the current connection.
    • Raw statements: issue a raw SQLite statement, bypassing convenience wrappers and any additional processing overhead that they may incur.
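
A hedged sketch of the row-count APIs named in the first bullet, assuming they are exposed as methods on SQLiteDatabase in Android 15:

import android.database.sqlite.SQLiteDatabase
import android.util.Log

// Sketch only: the class hosting these new methods is an assumption.
fun insertOrder(db: SQLiteDatabase) {
    db.beginTransaction()
    try {
        db.execSQL("INSERT INTO orders(product_id, price) VALUES (42, 9.99)")
        Log.d("OrdersDb", "last insert rowid: ${db.getLastInsertRowId()}")
        Log.d("OrdersDb", "rows changed by last statement: ${db.getLastChangedRowCount()}")
        Log.d("OrdersDb", "total rows changed on this connection: ${db.getTotalChangedRowCount()}")
        db.setTransactionSuccessful()
    } finally {
        db.endTransaction()
    }
}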

Media refinements

Each release of Android focuses on improving the media experience.

HDR Headroom Control

side by side images of SDR content
The image on the left shows a view with SDR content. The image on the right simulates perceived headroom issues with SDR and HDR mixed content, which we can avoid by setting the desired HDR headroom.

Android 15 chooses HDR headroom that is appropriate for the underlying device capabilities and bit-depth of the panel; for pages that have lots of SDR content such as a messaging app displaying a single HDR thumbnail, this can end up adversely influencing the perceived brightness of the SDR content. Android 15 allows you to control the HDR headroom with setDesiredHdrHeadroom to strike a balance between SDR and HDR content.
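
A minimal sketch, assuming the control is exposed as Window.setDesiredHdrHeadroom as described above; the 1.5f value is an arbitrary example:

import android.app.Activity

// Sketch only: cap HDR at roughly 1.5x SDR white so a single HDR thumbnail
// does not make the surrounding SDR content look dim.
fun limitHdrHeadroom(activity: Activity) {
    activity.window.setDesiredHdrHeadroom(1.5f)
}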

Loudness Control

moving image of Droid wearing headphones and bopping his head rhythmically
Android 15 introduces support for the CTA-2075 loudness standard to help you avoid audio loudness inconsistencies and ensure users don't have to constantly adjust volume when switching between content. The system leverages known characteristics of the output devices (headphones, speaker) along with loudness metadata available in AAC audio content to intelligently adjust the audio loudness and dynamic range compression levels.

To enable this feature, you need to ensure loudness metadata is available in your AAC content and enable the platform feature in your app. For this, you instantiate a LoudnessCodecController object by calling its create factory method with the audio session ID from the associated AudioTrack; this automatically starts applying audio updates. You can pass an OnLoudnessCodecUpdateListener to modify/filter loudness parameters before they are applied on the MediaCodec.

// media contains metadata of type MPEG_4 OR MPEG_D
val mediaCodec = ...
val audioTrack = AudioTrack.Builder()
                                .setSessionId(sessionId)
                                .build()
...
// create new loudness controller that applies the parameters to the MediaCodec
try {
   val lcController = LoudnessCodecController.create(sessionId)
   // starts applying audio updates for each added MediaCodec
   // (addMediaCodec is assumed here to register the codec with the controller)
   lcController.addMediaCodec(mediaCodec)
} catch (e: IllegalArgumentException) {
   // the audio session ID or codec configuration was invalid
}

AndroidX media3 ExoPlayer will soon be updated to leverage LoudnessCodecController APIs for a seamless app integration.

Use Spatializer instead of Virtualizer

Android 12 included the Spatializer class, which enables querying the capabilities and behavior of sound spatialization on the device. In Android 15, we're deprecating the Virtualizer class; instead use AudioAttributes.Builder.setSpatializationBehavior to characterize how you want your content to be played when spatialization is supported.
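
For example, instead of attaching a Virtualizer effect, you can declare the desired behavior on the AudioAttributes for your track (sketch only):

import android.media.AudioAttributes

// Sketch only: ask the platform to spatialize this content when supported.
val spatializedAttributes: AudioAttributes = AudioAttributes.Builder()
    .setUsage(AudioAttributes.USAGE_MEDIA)
    .setContentType(AudioAttributes.CONTENT_TYPE_MOVIE)
    .setSpatializationBehavior(AudioAttributes.SPATIALIZATION_BEHAVIOR_AUTO)
    .build()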

AndroidX media3 ExoPlayer 1.0 enables spatial audio by default for multichannel audio when the device supports it. See the blog post and documentation for more information, including APIs to control the feature.

User Experience

AutomaticZenRules allow apps to customize Attention Management (Do Not Disturb) rules and decide when to activate/deactivate them. Android 15 greatly enhances these rules with the goal of improving the user experience. It does this by:

    • Adding types to AutomaticZenRule, allowing the system to apply special treatment to some rules
    • Adding an icon to AutomaticZenRule, helping make the modes more recognizable
    • Adding a triggerDescription string to AutomaticZenRule that describes the conditions under which the rule should become active for the user
    • Adding ZenDeviceEffects to AutomaticZenRule, allowing rules to trigger things like grayscale display, night mode, or dimming the wallpaper

Behavior changes

Because backward compatibility is so important to us, we try to limit impactful behavior changes, but some are inevitable.

Elegant fonts everywhere

Once your app targets Android 15, the elegantTextHeight TextView attribute becomes true by default, replacing the compact font used by default for scripts with large vertical metrics with one that is much more readable. The compact font was introduced to prevent breaking layouts; Android 13 prevents many of these breakages by allowing the text layout to stretch the vertical height utilizing the fallbackLineSpacing attribute. In Android 15, the compact font still remains in the system, so your app can set elegantTextHeight to false to get the same behavior as before, but it is unlikely to be supported in upcoming releases. So, if your application supports the following scripts: Arabic, Lao, Myanmar, Tamil, Gujarati, Kannada, Malayalam, Odia, Telugu or Thai, please test your applications by setting elegantTextHeight to true, as in the sketch below.
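
A minimal way to opt in ahead of the target SDK change (sketch only):

import android.widget.TextView

// Sketch only: matches the Android 15 default so scripts with tall vertical
// metrics render with the more readable font before the target SDK changes.
fun optInToElegantTextHeight(textView: TextView) {
    textView.isElegantTextHeight = true
}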

Examples and screenshots

Default behavior as of Android 14

Default behavior as of Android 14

Default behavior for applications that target Android 15

Default behavior for applications that target Android 15

App compatibility

Android 15 release timeline

To give you more time to plan for app compatibility work, we’re letting you know our Platform Stability milestone well in advance.

At this milestone, we’ll deliver final SDK/NDK APIs and also final internal APIs and app-facing system behaviors. We’re expecting to reach Platform Stability in June 2024, and from that time you’ll have several months before the official release to do your final testing. The release timeline details are here.

Get started with Android 15

The Developer Preview has everything you need to try the Android 15 features, test your apps, and give us feedback. You can get started today by flashing a system image onto a Pixel 6, 7, or 8 series device, along with the Pixel Fold and Pixel Tablet. We are not offering sideload images for Developer Preview 2. If you don’t have a Pixel device, you can use the 64-bit system images with the Android Emulator in Android Studio. If you've already installed Android 15 Developer Preview 1, you should get an over-the-air update to Android 15 Developer Preview 2.

For the best development experience with Android 15, we recommend that you use the latest preview of Android Studio Jellyfish (or more recent Jellyfish+ versions). Once you’re set up, here are some of the things you should do:

    • Try the new features and APIs - your feedback is critical during the early part of the developer preview. Report issues in our tracker on the feedback page.
    • Test your current app for compatibility - learn whether your app is affected by changes in Android 15; install your app onto a device or emulator running Android 15 and extensively test it.

We’ll update the preview system images and SDK regularly throughout the Android 15 release cycle. This preview release is for developers only and not intended for daily or consumer use, so we're making it available by manual download only. Once you’ve manually installed a preview build, you’ll automatically get future updates over-the-air for all later previews and Betas. Read more here.

If you intend to move from the Android 14 QPR Beta program to the Android 15 Developer Preview program and don't want to have to wipe your device, we recommend that you move to Developer Preview 2 now. Otherwise, you may run into time periods where the Android 14 Beta has a more recent build date, which will prevent you from going directly to the Android 15 Developer Preview without a data wipe.

As we reach our Beta releases, we'll be inviting consumers to try Android 15 as well, and we'll open up enrollment for the Android Beta program at that time. For now, please note that the Android Beta program is not yet available for Android 15.

For complete information, visit the Android 15 developer site.

Java and OpenJDK are trademarks or registered trademarks of Oracle and/or its affiliates.

Tune in for Google I/O on May 14

Posted by Jeanine Banks – VP & General Manager, Developer X, and Head of Developer Relations

Google I/O is arriving this year on May 14th and you’re invited to join us online! I/O offers something for everyone, whether you are developing a new application, modernizing an existing one, or transforming it into a business.

The Gemini era unlocks new possibilities for developers to build creative and productive AI-enabled applications. I/O is where you’ll hear how you can get from idea to production AI applications faster. We’re excited to share what’s new for mobile, web, and multiplatform development, and how to scale your applications in the cloud. You will be able to dive deeper into topics that interest you with over 100 sessions, workshops, codelabs, and demos.

Visit the Google I/O site and register to stay informed about I/O and other related events coming soon. The livestreamed keynotes start May 14 at 10am PT, so mark your calendar.

If you haven’t already, go try out our newest Google I/O puzzle and head to @googlefordevs on Instagram if you need a hint.

Key product updates from the 2024 Google for Games Developer Summit

Posted by Aurash Mahbod – General Manager, Games on Google Play

We’re on a mission to make Google Play the best destination for gaming, and your amazing games are what make it possible. That’s why we’re investing in tools that empower you to achieve more by increasing user engagement, accelerating your business growth, and maximizing your reach.

Today, we shared more details about those investments at the Google for Games Developer Summit. Check out the Keynote session on demand, or keep reading for key product updates from the summit.

More ways to increase user engagement

We know that identity, progress, and achievements are important to gamers. With Play Games Services (PGS), gamers can switch between devices and jump back into gameplay instantly, on any device connected with PGS1. Based on your feedback, we’ve made it easier for you to integrate PGS and provide more relevant experiences for your users:

    • No database changes are required to integrate PGS, and you no longer need to store the association between in-game accounts and PGS profiles.
    • You can start automatically syncing your users’ sign-in information via PGS, including for users without a profile, ensuring their progress is available to be synced when they change to a new device2.
    • Throughout this year, we’re rolling out more ways for you to create engaging in-game content tied to achievement level or game progress. For example, you will be able to create Quests within your game to challenge players while unlocking rewards using Play Points.

To start creating these engaging experiences within your game, integrate PGS today — and stay tuned for more capabilities coming soon.
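For reference, here is a minimal sketch of what checking the automatic PGS v2 sign-in might look like with the play-services-games-v2 SDK; the application class name is a placeholder, and you should treat the snippet as an outline rather than a complete integration:

    import android.app.Activity
    import android.app.Application
    import android.util.Log
    import com.google.android.gms.games.PlayGames
    import com.google.android.gms.games.PlayGamesSdk

    class MyGameApplication : Application() {
        override fun onCreate() {
            super.onCreate()
            // Initialize PGS v2 once at startup; sign-in is then handled automatically.
            PlayGamesSdk.initialize(this)
        }
    }

    fun checkPlayGamesIdentity(activity: Activity) {
        PlayGames.getGamesSignInClient(activity).isAuthenticated().addOnCompleteListener { task ->
            val signedIn = task.isSuccessful && task.result.isAuthenticated
            if (signedIn) {
                PlayGames.getPlayersClient(activity).currentPlayer.addOnSuccessListener { player ->
                    // Use the PGS player id as the cross-device identity instead of a separate mapping.
                    Log.d("PGS", "Signed in as ${player.playerId}")
                }
            }
        }
    }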

We’ve also been working to tailor our store experience to better engage users who’ve already installed your game. Starting today, we’re rolling out enhancements to store listings to more prominently display your game updates, new content, and promotions directly within Play3. Your store listing page will source relevant content to keep your audience engaged with your game, including your latest YouTube videos, AI-generated FAQs, and more.

These improvements are currently limited to English language users and to select titles that are part of the early access program. If you’re interested in testing the enhanced store listing for your game and helping shape this feature with your feedback, you can express interest here.

moving image of Games Hub (left) and Games Home (right)

Expanded programs to accelerate growth

We’re also expanding our most popular programs to help you accelerate your business growth.

    • With over 220M members, Google Play Points is one of the largest loyalty programs in the world. Available in over 35 markets and covering the vast majority of purchase-ready, active spenders, Play Points helps you retain users and reach new ones. This year, we’re excited to announce that we’ll be launching Google Play Points in Brazil.
    • Play Points now offers the ability to set limited quantities for exclusive in-game offers, building excitement and boosting participation around limited deals. These offers have been highly effective at reaching paying users and driving repeat purchases.
    • Google Play Pass, which has grown over 120% in subscriptions over the past year, is expanding to include in-game offers from popular games. We’re launching in-game offers in 21 markets including Japan, and launching Play Pass in Korea later this year.
moving image of Google Games play pass

New ways to maximize reach

Finally, to help you reach more users with your games, we’re making it easier to scale and grow your multi-platform games across mobile, tablets, Chromebooks, and Windows PCs5.

    • We’re also making user acquisition campaigns easier and more helpful for PC games. You can now use the Play Install Referrer API for Google Play Games to achieve closed-loop marketing on PC, passing campaign or click information into your game so you can optimize marketing campaigns across ad networks, social, and other channels.

For more news from the Google for Games Developer Summit, check out g.co/gamedevsummit. You can also connect with us at GDC in San Francisco. We’ll be hosting a full day of sessions on March 19 and a week of hands-on product experience at the Moscone Center, West Hall, Level 2 Lobby from March 18 to March 22.

As always, thank you for partnering with us to bring amazing games to players everywhere.


___________________
1Subject to game availability and PC compatibility.
2This feature requires PGS SDK V2 integration. Developers can save sign-in information to PGS for users, but only retrieve this once the user has set up a profile.
3Feature availability limited to English language users and select titles that are part of the early access program.
4Google internal data. New buyer refers to users who made a purchase during this pilot experiment after making no purchase for the past 50 days.
5Windows is a trademark of the Microsoft group of companies.
6Google internal data.

Introducing Play Install Referrer for Google Play Games on PC

Posted by Arjun Dayal, Director – Google Play Games

Making informed marketing decisions relies on identifying your most valuable user acquisition channels for your games. By tracking referral data, you can understand which traffic sources send the most users to download your app from the Google Play store. These insights can help you make the most of your advertising spend and maximize ROI. That’s why in 2017, we launched the Play Install Referrer API, which gives you an easy, reliable way to track your apps’ referral information directly from the Play store.

Until now, this feature was only available for your games in the mobile Play store. Today, we’re pleased to announce support for Google Play Games on PC, allowing you to attribute conversions from your marketing activities on the Web1. If you use the Google Play Install Referrer API to track your referral sources, you can now attribute conversions to specific campaigns directly from Google Play by manually retrieving referral information, or using third-party analytics tools from Google’s App Attribution Partners.

Getting started is easy. First, generate a Google Play URL for your game’s Google Play store listing page and add a referrer query parameter for your Web campaign. Then, when a PC user clicks the link, they will be redirected to your game’s listing page on the Google Play Web store, which will give them the option to “Install on Windows.” Once the user launches your game, you’ll be able to track the referral using the Google Play Install Referrer library.
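To make both halves of that flow concrete, here is a hedged sketch. The campaign link would look something like https://play.google.com/store/apps/details?id=com.example.game&referrer=utm_source%3Dspring_sale%26utm_medium%3Dweb, where the package name and utm parameters are placeholders; the retrieval side below uses the Play Install Referrer library's InstallReferrerClient:

    import android.content.Context
    import android.util.Log
    import com.android.installreferrer.api.InstallReferrerClient
    import com.android.installreferrer.api.InstallReferrerClient.InstallReferrerResponse
    import com.android.installreferrer.api.InstallReferrerStateListener

    fun logInstallReferrer(context: Context) {
        val client = InstallReferrerClient.newBuilder(context).build()
        client.startConnection(object : InstallReferrerStateListener {
            override fun onInstallReferrerSetupFinished(responseCode: Int) {
                if (responseCode == InstallReferrerResponse.OK) {
                    // Decoded referrer string, e.g. "utm_source=spring_sale&utm_medium=web"
                    val referrer = client.installReferrer.installReferrer
                    Log.d("Referrer", "Install referrer: $referrer")
                }
                client.endConnection()
            }

            override fun onInstallReferrerServiceDisconnected() {
                // The connection can be retried later if it drops.
            }
        })
    }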

“With integration support from Adjust, developers can quickly and efficiently measure marketing campaigns for their games on Google Play Games on PC. We’re excited about the opportunity this brings for developers to broaden their games’ reach and strengthen cross-platform measurement.” 

- Gijsbert Pols, Director of Connected TV & New Channels, Adjust

Learn more about third-party referral codes for Google Play Games on PC and start optimizing your marketing performance today.


______________

1Subject to device compatibility with Google Play Games on PC.