Tag Archives: Android Developer

Android Developer Challenge: here’s what we’re looking for! (Apply by Dec. 2)

Last month, we kicked off the next Android Developer Challenge, and asked you to submit your ideas focused on helpful innovation, powered by on-device machine learning. But what exactly do we mean when we say helpful innovation? We’re glad you asked! We rounded up a few of Google’s on-device machine learning offerings, together with some great recent examples of this technology in action, to help inspire your submission. Don’t forget, submit your idea by December 2!

Using machine learning to tackle Fall Armyworm

Take Nazirini Siraji. When she and a team of developers noticed a crop pest threatening the livelihood of Ugandan farmers, they taught themselves TensorFlow to combat it. They collected training data from nearby fields in the form of images. With TensorFlow, they re-trained a MobileNet model, a technique known as transfer learning, and then used the TensorFlow Lite converter to generate a TensorFlow Lite FlatBuffer file, which they deployed in an Android app. With the app, a farmer can snap a picture of their crop, and the image is analyzed for signs of Fall Armyworm damage. Based on the results, the app suggests a possible remedy. It’s pretty cool!
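To give a sense of the deployment step, here is a minimal Kotlin sketch of loading a converted .tflite FlatBuffer from an app’s assets and running it with the TensorFlow Lite Interpreter. The file name, 224x224 input size, and two-class output are placeholders for illustration; the team’s actual code may differ.

import android.content.Context
import android.graphics.Bitmap
import java.io.FileInputStream
import java.nio.ByteBuffer
import java.nio.ByteOrder
import java.nio.MappedByteBuffer
import java.nio.channels.FileChannel
import org.tensorflow.lite.Interpreter

// Hypothetical classifier: "crop_model.tflite", the 224x224 input and the
// two-class output are placeholders, not the real app's model.
class CropClassifier(context: Context) {

    private val interpreter = Interpreter(loadModel(context, "crop_model.tflite"))

    private fun loadModel(context: Context, assetName: String): MappedByteBuffer {
        val fd = context.assets.openFd(assetName)
        FileInputStream(fd.fileDescriptor).channel.use { channel ->
            return channel.map(FileChannel.MapMode.READ_ONLY, fd.startOffset, fd.declaredLength)
        }
    }

    fun classify(bitmap: Bitmap): FloatArray {
        val scaled = Bitmap.createScaledBitmap(bitmap, 224, 224, true)
        // Pack pixels into a direct float buffer, normalized to [0, 1].
        val input = ByteBuffer.allocateDirect(4 * 224 * 224 * 3).order(ByteOrder.nativeOrder())
        val pixels = IntArray(224 * 224)
        scaled.getPixels(pixels, 0, 224, 0, 0, 224, 224)
        for (pixel in pixels) {
            input.putFloat(((pixel shr 16) and 0xFF) / 255f) // R
            input.putFloat(((pixel shr 8) and 0xFF) / 255f)  // G
            input.putFloat((pixel and 0xFF) / 255f)          // B
        }
        val output = Array(1) { FloatArray(2) } // e.g. [healthy, armyworm damage]
        interpreter.run(input, output)
        return output[0]
    }
}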

Helping doctors detect respiratory diseases using machine learning

Tambua Health is helping doctors determine the likelihood of respiratory diseases by turning any smartphone into a powerful, non-invasive screening tool. They developed an app using TensorFlow Lite that helps doctors analyze lung recordings for abnormal sounds such as wheezes, crackles, stridor, and other adventitious sounds.

adidas uses machine learning to make the shopping experience easier

Even brands are tapping the power of machine learning. Take adidas, who recently launched a new “Bring It to Me” experience for their London store. Shoppers can use Visual Lookup to scan products on their phones while they are in the store, and the app lets them check stock and request their size without the need for queues. Under the hood, ML Kit is helping power the experience. It’s another way machine learning is helping users get things done more quickly.

The benefits of on-device machine learning

Running machine learning on a user’s device comes with a number of benefits. First, you reduce the amount of data you send to your server, enhancing user privacy. And because it runs on device, it can also work offline, which is perfect for hard-to-reach places such as the middle of a rainforest, a desert, or the London Underground. Last but not least, the most exciting aspect of running your model on device is low latency, which can enable all kinds of new user experiences. Machine learning is not just for automating tasks; it can work alongside your users and give them superpowers too!

At Google, we offer a number of different technologies to help you take advantage of this:

  • ML Kit offers a turnkey SDK to help you tackle tasks with powerful Google machine learning models (see the sketch after this list)
  • The TensorFlow Lite framework lets you take a custom model and optimize it to run on Android
  • There’s also the infrastructure of Firebase / Google Cloud, which can help you train on-device models using AutoML Vision Edge for specific model types or give you the raw processing power to train your own model
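For example, the turnkey ML Kit path can be as small as the following hypothetical Kotlin helper, which labels a bitmap entirely on device. The Firebase-branded and standalone ML Kit SDKs use different artifact names, so treat these imports and options as one possible setup rather than the definitive API.

import android.graphics.Bitmap
import android.util.Log
import com.google.mlkit.vision.common.InputImage
import com.google.mlkit.vision.label.ImageLabeling
import com.google.mlkit.vision.label.defaults.ImageLabelerOptions

// Hypothetical helper: label whatever the camera sees, entirely on device.
fun labelImage(bitmap: Bitmap) {
    val image = InputImage.fromBitmap(bitmap, /* rotationDegrees = */ 0)
    val labeler = ImageLabeling.getClient(ImageLabelerOptions.DEFAULT_OPTIONS)
    labeler.process(image)
        .addOnSuccessListener { labels ->
            // Each label carries a human-readable description and a confidence score.
            labels.forEach { Log.d("MLKit", "${it.text}: ${it.confidence}") }
        }
        .addOnFailureListener { e -> Log.e("MLKit", "Labeling failed", e) }
}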

If you’ve got a great idea that can help users get things done, we want to hear from you! We’ll pick 10 concepts and provide expertise and guidance to those developers to help in their plans to bring their ideas to fruition. And once the app is ready, we’ll help showcase it in front of the billions of users on Google Play, through a collection and more. You can read more about all of the prizes here.

There’s still time to submit your idea before the December 2 deadline. We can’t wait to hear what you come up with, and to work with you on bringing helpful innovation powered by on-device machine learning to more and more users!

3 things to know about Android Studio from Android Dev Summit 2019

Posted by Deepanshu Madan, Product Manager

Last month’s #AndroidDevSummit was jam-packed with announcements and technical news...so much that we wouldn’t be surprised if you missed something. So all this month, we’ll be diving into key areas from throughout the summit so you don’t miss anything. Earlier this week, we spotlighted Kotlin and Jetpack Compose, and today, we’re highlighting Android Studio, with the top three things you should know:

#1: Support for Jetpack Compose

For the best experience developing with Jetpack Compose, use the latest Android Studio 4.0 canary and benefit from smart editor features such as New Project templates, code completion, and the ability to preview your Jetpack Compose UI immediately.

#2: What’s new in Android Studio session

We covered both new features and the successes of our quality initiative, Project Marble. On the quality front, we discussed improvements around hangs and latency, memory leak detection, automatic IDE heap sizing, and build speed. The session also includes demos of new developments and features in Android Studio, such as the Build Attribution tool, which helps you understand and diagnose problems with your build system; Java 8 library desugaring; view binding; Kotlin Android live templates; and an updated live Layout Inspector, which lets you drill into a view’s resources to find where a property value originates in the source code, complete with a 3D visualization of your view hierarchy.

#3: Android Studio Design tools

We introduced new features of the Layout and Navigation editors, including a new split view, as well as new tools such as Multi-preview, which lets you visualize your layout in different configurations, and the Motion Editor, a visual design editor for the MotionLayout layout type that makes it easier to create and preview animations. The Motion Editor provides a simple interface for manipulating elements from the MotionLayout library, which serves as the foundation for animation in Android apps. In previous releases, creating and altering these elements required manually editing constraints in XML resource files. Now, the Motion Editor can generate this XML for you, with support for start and end states, keyframes, transitions, and timelines.

You can find the entire playlist of Android Dev Summit sessions and videos here. We’ll continue to spotlight other areas later this month, so keep an eye out and follow AndroidDevelopers on Twitter. Thanks so much for letting us be a part of this experience with you!

3 things to know about Kotlin from Android Dev Summit 2019

Last month’s #AndroidDevSummit was jam-packed with announcements and technical news...so much that we wouldn’t be surprised if you missed something. So all this month, we’ll be diving into key areas from throughout the summit so you don’t miss anything. First up, we’re spotlighting Kotlin, with the top things you should know:

#1: Kotlin momentum on Android

Kotlin is at the heart of modern Android development — and we’ve been excited to see how quickly it has won over developers around the world. At Android Dev Summit we announced that nearly 60% of the top 1000 Android apps on the Play Store now use Kotlin, and we’re seeing more developers adopt it every day. Kotlin has helpful features like null safety, data classes, coroutines, and complete interoperability with the Java programming language. We’re doubling down on Kotlin with more Kotlin-first APIs even beyond AndroidX — we just released KTX extensions, including coroutines support, for Play Core. There’s never been a better time to give Kotlin a try.
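If you haven’t written Kotlin yet, here is a tiny, self-contained illustration of a few of those features: a data class, null safety, and a coroutine.

import kotlinx.coroutines.delay
import kotlinx.coroutines.runBlocking

// A data class gives you equals(), hashCode(), toString(), and copy() for free.
data class User(val name: String, val email: String?)

// A suspend function can pause and resume without blocking a thread.
suspend fun fetchUser(): User {
    delay(100) // stand-in for a network call
    return User("Ada", email = null)
}

fun main() = runBlocking {
    val user = fetchUser()
    // Null safety: the compiler makes you handle the nullable email explicitly.
    println(user.email ?: "no email on file")
}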

#2: Learn more: Getting started with Kotlin & diving into advanced Kotlin with coroutines

If you’re introducing Kotlin into an existing codebase, chances are that you’ll be calling the Java programming language from Kotlin and vice versa. At Android Dev Summit, developer advocates Murat Yener, Nicole Borrelli, and Wenbo Zhu took a look at how nullability, getters, setters, default parameters, exceptions, and more work across the two languages.
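To give a flavor of what that session covers, here is a small, hypothetical Kotlin class annotated so that default parameters and nullability translate cleanly for callers written in the Java programming language:

// Hypothetical class written to be friendly to Java callers.
class Greeter {

    // @JvmOverloads generates a Java-visible overload for each default parameter,
    // so Java code is not forced to pass every argument.
    @JvmOverloads
    fun greet(name: String = "world", excited: Boolean = false): String =
        if (excited) "Hello, $name!" else "Hello, $name."

    // Declaring the parameter as String? records the nullability in the compiled
    // class, so Java callers know null is acceptable here.
    fun describe(label: String?): String = label ?: "unlabeled"
}

// From the Java programming language, the generated overloads can be called directly:
//   new Greeter().greet();          // "Hello, world."
//   new Greeter().greet("Android"); // "Hello, Android."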

For those looking into more advanced Kotlin topics, we recommend watching Jose Alcérreca's and Yigit Boyar's talk that explains how coroutines and Flow can fit together with LiveData in your app's architecture and one on testing coroutines by Sean McQuillan and Manuel Vivo.
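As a rough sketch of the pattern those talks describe (the repository type and its functions are invented for illustration), a Flow from the data layer can be exposed to the UI as LiveData using the lifecycle-livedata-ktx extensions:

import androidx.lifecycle.LiveData
import androidx.lifecycle.ViewModel
import androidx.lifecycle.asLiveData
import androidx.lifecycle.liveData
import kotlinx.coroutines.flow.Flow

// Hypothetical repository, invented for illustration only.
interface NewsRepository {
    fun latestHeadlines(): Flow<List<String>> // a cold stream of updates
    suspend fun refresh(): List<String>       // a one-shot suspending call
}

class NewsViewModel(private val repo: NewsRepository) : ViewModel() {

    // Flow -> LiveData: collection starts and stops with active observers.
    val headlines: LiveData<List<String>> = repo.latestHeadlines().asLiveData()

    // The liveData builder runs a coroutine tied to the LiveData's active state.
    val refreshed: LiveData<List<String>> = liveData {
        emit(repo.refresh())
    }
}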

#3: Get certified in Kotlin

We announced the launch of our Associate Android Developer certification in Kotlin. Now you can prove your proficiency with modern Kotlin development on Android to your coworkers, your professional network, or even your future employer. As part of this launch, you can take this exam at a discount when using the code ADSCERT99 through January 25.

It’s especially great to hear from you, the Android community, at events like Android Dev Summit: what you want to hear more about, and how we can help with something you’re working on. We asked you to submit your burning questions on Twitter and the livestream, and developer advocates Florina Muntenescu and Sean McQuillan answered your Kotlin and coroutines questions live during our #AskAndroid segment.

You can find the entire playlist of Android Dev Summit sessions and videos here. We’ll continue to spotlight other areas later this month, so keep an eye out and follow Android Developers on Twitter. Thanks so much for letting us be a part of this experience with you!

Java is a registered trademark of Oracle and/or its affiliates.

3 things to know about Jetpack Compose from Android Dev Summit 2019

Posted by Anna-Chiara Bellini, @dr0nequeen

Last month’s #AndroidDevSummit was jam-packed with announcements and technical news...so much that we wouldn’t be surprised if you missed something. So all this month, we’ll be diving into key areas from throughout the summit so you don’t miss anything. Earlier today, we spotlighted Kotlin and now we’re diving into Jetpack Compose, with the top three things you should know:

#1: Jetpack Compose is available in Developer Preview!

Jetpack Compose is Android’s modern toolkit for building native UI. It allows developers to write beautiful Android apps in an intuitive way, writing less code and accelerating development. It's powerful, because when writing code is a pleasure, you can focus on making your apps look beautiful and giving your users the best experience. At #AndroidDevSummit, we released the Developer Preview, to enable more feedback as we work towards bringing Jetpack Compose to beta next year. All you need to do is download the Android Studio 4.0 Canary build to try it out. To give feedback, feel free to reach out through the Kotlinlang Slack channel or our Bug Tracker to let us know what you think. It's your chance to help us build the right thing!

#2: See what's new in Jetpack Compose

At Android Dev Summit we showed how we've designed Compose to simplify development of Android apps, details on the new Material UI components we are building, and insights on some of the learnings that are informing the way we think about Compose. We also showcased how to write a small app, including layout and state management for a list of elements, in just a few lines of code.
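To give a flavor of what “a list of elements with state in a few lines” looks like, here is an illustrative composable written against the Compose API as it later stabilized; the exact names in the Developer Preview differ, so treat it as a sketch rather than preview-accurate code.

import androidx.compose.foundation.clickable
import androidx.compose.foundation.layout.Column
import androidx.compose.material.Text
import androidx.compose.runtime.Composable
import androidx.compose.runtime.getValue
import androidx.compose.runtime.mutableStateOf
import androidx.compose.runtime.remember
import androidx.compose.runtime.setValue
import androidx.compose.ui.Modifier

// Illustrative only: a simple list whose selection is held in composable state.
@Composable
fun HeadlineList(headlines: List<String>) {
    var selected by remember { mutableStateOf<String?>(null) }
    Column {
        headlines.forEach { headline ->
            Text(
                text = if (headline == selected) "> $headline" else headline,
                modifier = Modifier.clickable { selected = headline }
            )
        }
    }
}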

Watch the Android Dev Summit session video to learn more.

If you want to take a look behind the scenes, we also had a tech deep dive into the inner workings of Compose.

#3: Try it out, with our tutorial, sample app, and codelab!

Jetpack Compose is still in Developer Preview, which means it's a great time to try it out and let us know what you think about it. To help you with that, you can follow our tutorial, which will take you through the first steps of building a Compose app, and also take a look at our sample app, Jetnews, that shows what is currently possible with Jetpack Compose. We also have a popular codelab available to take you through how to build UIs with Compose, how to manage state in composable functions, and data flow principles in Compose.

Jetnews sample app

...and we also heard from you!

But Android Dev Summit isn’t just about what we’ve got to say; it’s also about you telling us what you’d like to see worked on to make your life easier. And this year, one thing we heard strongly from our community was how important it is to have the right tools to manage your layouts on so many different form factors and devices. Jetpack Compose has full Android Studio support and you can iterate fast with live Previews.

Android Dev Summit is over for this year, but you can keep giving us your feedback through the Kotlinlang Slack channel and Bug Tracker.

You can find the entire playlist of Android Dev Summit sessions and videos here. We’ll continue to spotlight other areas later this month, so keep an eye out and follow AndroidDevelopers on Twitter. Thanks so much for letting us be a part of this experience with you!

One Biometric API Over all Android

Posted by Isai Damier, Android Developer Platform Engineering (@isaidamier)

Kevin Chyn, Android Framework

Curtis Belmonte, Android Framework

With the launch of Android 10 (API level 29), developers can now use the Biometric API, part of the AndroidX Biometric Library, for all their on-device user authentication needs. The Android Framework and Security team has added a number of significant features to the AndroidX Biometric Library, which makes all of the biometric behavior from Android 10 available to all devices that run Android 6.0 (API level 23) or higher. In addition to supporting multiple biometric authentication form factors, the API has made it much easier for developers to check whether a given device has biometric sensors. And if there are no biometric sensors present, the API allows developers to specify whether they want to use device credentials in their apps.

The features do not just benefit developers. Device manufacturers and OEMs have a lot to celebrate as well. The framework is now providing a friendly, standardized API for OEMs to integrate support for all types of biometric sensors on their devices. In addition, the framework has built-in support for facial authentication in Android 10 so that vendors don’t need to create a custom implementation.

A bit of background

The FingerprintManager class was introduced in Android 6.0 (API level 23). At the time -- and up until Android 9 (API level 28) -- the API provided support only for fingerprint sensors, and it offered no UI: developers needed to build their own fingerprint UI.

Based on developer feedback, Android 9 introduced a standardized fingerprint UI policy. In addition, BiometricPrompt was introduced to encompass more sensors than just fingerprint. In addition to providing a safe, familiar UI for user authentication, it enabled a small, maintainable API surface for developers to access the variety of biometric hardware available on OEM devices. OEMs can now customize the UI with necessary affordances and iconography to expose new biometrics, such as outlines for in-display sensors. With this, app developers no longer need to build customized, device-specific implementations for biometric authentication.

Then, in Android 10, the team introduced some pivotal features to turn the biometric API into a one-stop-shop for in-app user authentication. BiometricManager enables developers to check whether a device supports biometric authentication. Furthermore, the setDeviceCredentialAllowed() method was added to allow developers the option to use a device’s PIN/pattern/password instead of biometric credentials, if it makes sense for their app.

The team has now packaged every biometric feature you get in Android 10 into the androidx.biometric Gradle dependency so that a single, consistent interface is available all the way back to Android 6.0 (API level 23).

How it works

The androidx.biometric Gradle dependency is a support library for the Android framework Biometric classes. On API 29 and above, the library uses the classes under android.hardware.biometrics, FingerprintManager back to API 23, and Confirm Credential all the way back to API 21. Because of the variety of APIs, we strongly recommend using the androidx support library regardless of which API level your app targets.

To use the Biometric API in your app, do the following.

1. Add the Gradle dependency to your app module

In the snippet below, $biometric_version is the latest release of the library:

def biometric_version = '1.0.0-rc02'
implementation "androidx.biometric:biometric:$biometric_version"

2. Check whether the device supports biometric authentication

The BiometricPrompt needs to be recreated every time the Activity/Fragment is created; this should be done inside onCreate() or onCreateView() so that BiometricPrompt.AuthenticationCallback can start receiving callbacks properly.

To check whether the device supports biometric authentication, add the following logic:

val biometricManager = BiometricManager.from(context)
if (biometricManager.canAuthenticate() == BiometricManager.BIOMETRIC_SUCCESS){
   // TODO: show in-app settings, make authentication calls.
}

3. Create an instance of BiometricPrompt

The BiometricPrompt constructor requires both an Executor and an AuthenticationCallback object. The Executor allows you to specify a thread on which your callbacks should be run.

The AuthenticationCallback has three methods:

  1. onAuthenticationSucceeded() is called when the user has been authenticated using a credential that the device recognizes.
  2. onAuthenticationError() is called when an unrecoverable error occurs.
  3. onAuthenticationFailed() is called when the user is rejected, for example when a non-enrolled fingerprint is placed on the sensor, but unlike with onAuthenticationError(), the user can continue trying to authenticate.

The following snippet shows one way of implementing the Executor and how to instantiate the BiometricPrompt:

private fun instanceOfBiometricPrompt(): BiometricPrompt {
   val executor = ContextCompat.getMainExecutor(context)

   val callback = object: BiometricPrompt.AuthenticationCallback() {
       override fun onAuthenticationError(errorCode: Int, errString: CharSequence) {
           super.onAuthenticationError(errorCode, errString)
           showMessage("$errorCode :: $errString")
       }

       override fun onAuthenticationFailed() {
           super.onAuthenticationFailed()
           showMessage("Authentication failed for an unknown reason")
       }

       override fun onAuthenticationSucceeded(result: BiometricPrompt.AuthenticationResult) {
           super.onAuthenticationSucceeded(result)
           showMessage("Authentication was successful")
       }
   }

   val biometricPrompt = BiometricPrompt(this, executor, callback)
   return biometricPrompt
}

Instantiating the BiometricPrompt should be done early in the lifecycle of your fragment or activity (e.g., in onCreate or onCreateView); the first parameter passed to the constructor is that hosting Fragment or FragmentActivity. This ensures that the current instance will always properly receive authentication callbacks.

4. Build a PromptInfo object

Once you have a BiometricPrompt object, you ask the user to authenticate by calling biometricPrompt.authenticate(promptInfo). If your app requires the user to authenticate using a Strong biometric or needs to perform cryptographic operations in KeyStore, you should use authenticate(PromptInfo, CryptoObject) instead.

This call will show the user the appropriate UI, based on the type of biometric credential being used for authentication – such as fingerprint, face, or iris. As a developer you don’t need to know which type of credential is being used for authentication; the API handles all of that for you.

This call requires a BiometricPrompt.PromptInfo object. A PromptInfo is where you define the text that appears in the prompt, such as the title, subtitle, and description. Without a PromptInfo, it is not clear to the end user which app is asking for their biometric credentials. PromptInfo also allows you to specify whether devices that do not support biometric authentication may grant access through device credentials, such as the PIN, pattern, or password used to unlock the device.

Here is an example of a PromptInfo declaration:

private fun getPromptInfo(): BiometricPrompt.PromptInfo {
   val promptInfo = BiometricPrompt.PromptInfo.Builder()
       .setTitle("My App's Authentication")
       .setSubtitle("Please login to get access")
       .setDescription("My App is using Android biometric authentication")
       .setDeviceCredentialAllowed(true)
       .build()
   return promptInfo
}

For actions that require a confirmation step, such as transactions and payments, we recommend using the default option -- setConfirmationRequired(true) -- which will add a confirmation button to the UI, as shown in Figure 2.
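As a minimal illustration (the strings are made up, and the flag is shown explicitly even though true is the default), a payment-style PromptInfo might look like this:

val paymentPromptInfo = BiometricPrompt.PromptInfo.Builder()
    .setTitle("Confirm payment")
    .setDescription("Authenticate to complete your purchase")
    .setConfirmationRequired(true) // default behavior; adds the explicit confirm step
    .setDeviceCredentialAllowed(true)
    .build()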

Figure 1. Example face authentication flow using BiometricPrompt with setConfirmationRequired(false).

Figure 2. Example face authentication flow using BiometricPrompt with setConfirmationRequired(true) (default behavior).

5. Ask the user to authenticate

Now that you have all the required pieces, you can ask the user to authenticate.

val canAuthenticate = biometricManager.canAuthenticate()
if (canAuthenticate == BiometricManager.BIOMETRIC_SUCCESS) {
   biometricPrompt.authenticate(promptInfo)
} else {
   Log.d(TAG, "could not authenticate because: $canAuthenticate")
}

And that’s it! You should now be able to perform authentication, using biometric credentials, on any device that runs Android 6.0 (API level 23) or higher.

Going forward

Because the ecosystem continues to evolve rapidly, the Android Framework team is constantly thinking about how to provide long-term support for both OEMs and developers. Given the biometric library’s consistent, system-themed UI, developers don’t need to worry about device-specific requirements, and users get a more trustworthy experience.

We welcome any feedback from developers and OEMs on how to make it more robust, easier to use, and supportive of a wider range of use cases.

For in-depth examples that showcase additional use cases and demonstrate how you might integrate this library into your app, check out our repo, which contains functional sample apps that make use of the library. You can also read the associated developer guide and API reference for more information.

A modern approach to Android development, with Jetpack Compose and more!

Posted by Stephanie Cuthbertson, Director, Product Management

Modern Android Development today

Perhaps as a consequence of Android’s flexibility, we often get asked by developers: what does the Android team recommend when it comes to building apps? You’ve told us that you love our openness, but you’d also love us to marry it with an opinion about the right way to do things, and to make sure the right way is also the easiest way. So today, at the Android Dev Summit, the team wanted to answer that question for you.

We call our recommendation “modern Android development”: opinionated and powerful, for fast, easy development, taking away everything that slows you down so you can focus on building incredible experiences. You can see it in the investments we’ve made, like creating Android Studio and Jetpack. (Over 90% of our pro developers are using Android Studio today.) Kotlin and Compose are especially great recent examples. Kotlin is a modern, concise language, something you asked us for, and it is now the recommended language for Android. Compose is a modern declarative UI toolkit built for the next 10 years. And this might sound a little strange, but we chose and designed these tools to be enjoyable to use: we think that’s important too. Both Kotlin and Compose also have a critically important property: they are designed to be compatible with your existing apps. This means you can phase in Kotlin code and Compose views on your timeline.

It all starts with a great modern language: Kotlin

Modern Android starts with fantastic language support. In fact, we recently passed a milestone where almost 60% of our top 1000 apps are using Kotlin. And we’re working with JetBrains to make it even better: faster Kotlin compile speeds, incremental annotation processing with KAPT, better IDE typing latency, more lint checks, desugaring in D8 and R8, and new optimizations in R8 that are aware of Kotlin-specific bytecode patterns. And we’re releasing full IDE support for Kotlin build scripts today. If you want to grow your skills, we are launching an Advanced Android course with Kotlin on Udacity. And, for those who are already experts, we’re also launching a new Android Developer Certification in Kotlin, available at a discount for the next three months. We’re working to make all of our supported first-class languages — Kotlin, the Java programming language, and C++ — better for you and your team, with Java 8 library desugaring, NDK r21 with updated LLVM, GNU Make, Fortify enabled by default, and more.

Jetpack: Build high quality, powerful apps with less code

Jetpack is designed to solve real-world problems you face every day, and is used by over 84% of the top 10,000 Play Store apps. And we continue to make Jetpack even more helpful:

  • Benchmarking, first announced at Google I/O, is now available as a release candidate. This library makes it easier to measure the performance of your app with confidence.
  • View binding is an easier way to access views from your code. It is a type-safe solution with minimal build-time impact: no more findViewById(), and no annotation processors (see the sketch after this list).
  • CameraX simplifies the development experience and lets you focus on your app instead of the differences between the many devices in the Android ecosystem; manufacturers like Samsung, Xiaomi, Oppo, Motorola, and LG are already unifying behind it. Previewed at Google I/O, CameraX will be available in Beta in December.
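Here is the view binding sketch referenced above. Once view binding is enabled in the module’s build.gradle, a binding class is generated for each layout; the ActivityMainBinding and titleText names below assume a hypothetical activity_main.xml.

import android.os.Bundle
import androidx.appcompat.app.AppCompatActivity

// ActivityMainBinding is generated from a hypothetical res/layout/activity_main.xml
// once view binding is enabled for the module.
class MainActivity : AppCompatActivity() {

    private lateinit var binding: ActivityMainBinding

    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        binding = ActivityMainBinding.inflate(layoutInflater)
        setContentView(binding.root)

        // Views are accessed as type-safe properties: no findViewById(),
        // and no annotation processor involved.
        binding.titleText.text = "Hello, view binding"
    }
}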

Compose: Android’s new UI toolkit to build beautiful, native apps, now in developer preview

Compose makes it easy to build beautiful, native apps. It provides a declarative way to build UIs which makes your code more intuitive and concise. Inspired by Kotlin, you can adopt Compose at your own pace thanks to seamless compatibility with the existing UI toolkit.

Today we are releasing the Jetpack Compose Developer Preview. All you need to do is download the latest Preview build of Android Studio. Compose is being developed completely in the open, in AOSP. The continuous feedback we receive has led to many API improvements, and we want to thank you for providing feedback in our developer studies and the Kotlinlang Slack group. As we enter the developer preview, we need even more feedback as we work towards bringing Jetpack Compose to beta next year and making it ready for use in production apps.

Android Studio 4.0 Canary

Today we also released the first canary of Android Studio 4.0 - built hand in hand with Compose for powerful, integrated tooling support. Android Studio 4.0 includes Compose Live Preview, Code Completion, and a full sample of a Compose app. You’ll also find the new Motion Editor, Java 8 Language library desugaring, full support for KTS files, Kotlin live templates, and more.

Android App Bundles and dynamic delivery testing improvements

Just eighteen months after launch, over 270K Android App Bundles are now in production, covering 25% of all active installs. Based on your feedback, we’re making App Bundles and Dynamic Delivery much easier to test. Internal app sharing lets you share test builds of your app bundle as easily as you share APKs. You can now grant anyone on the team the ability to upload artifacts; you don’t need to sign test versions with your production app signing key, you don’t need to use version codes, and you can upload debuggable artifacts. We’re also making it possible to get download links for old versions of your app from the Play Console, whether they’re app bundles or APKs. And starting today, we’re launching offline testing of dynamic delivery with the fake split install manager, so you can replicate splits being installed by the Play Store while testing locally.

A modern distribution platform, centered around user trust

User trust and safety has always been a top priority at Google Play, with human reviewers, constant improvements to Google Play Protect, and policy updates that evolve with the threats we see. As a result, apps installed from Google Play are an order of magnitude safer than those from any other source. This year, we’ve been increasing all our detection capabilities for impersonators, repackaging, bad content, and other forms of abuse, but we know there’s more we can do, and the threats are constantly changing. With your help, we’ve reduced access to sensitive data and made Play even safer for children and families. We restricted SMS/Call log permissions to only those apps that need them as part of their core functionality, and as a result 98% fewer apps access this sensitive data. Thanks to your hard work, users are safer, and know they are safer, when they download apps that request fewer permissions.

The Android Developer Challenge!

Over ten years ago, we announced the first Android Developer Challenge. Today, modern Android is shaping the next-generation platform, so it seems fitting to announce: the Android Developer Challenge is back! The first Developer Challenge we’re announcing is Helpful Innovation and Machine Learning. Take Live Caption: for the almost 500 million people who are deaf or hard of hearing, it brings content to life, and it’s exactly the type of machine learning-powered innovation we expect to see more of someday. With your help, we can turn someday into today. You can read more about the challenge here.

So - that’s a quick tour of Modern Android and the road ahead across our developer experience! Whether you’re joining us for Android Dev Summit in person or on the livestream, we have nearly 60 sessions from over 100 speakers where we’ll go deep on everything you need to know about Android. Thank you!

Android Developer Challenge: helpful innovation, powered by On-Device Machine Learning + you!

Posted by The Android team

Android Developer Challenge banner

Developers like you have always played an important role in shaping the direction of Android, fueling the wave of Android innovation. It’s the reason that when we first launched the SDK for Android 10+ years ago, we simultaneously announced the Android Developer Challenge: a way to help reward model apps and show us what user problems you wanted to solve. As Android continues to push the boundaries into emerging areas like ML, 5G, foldables and more, we need your help to bring to life the consumer experiences that will define these new frontiers.

So we’re bringing back the Android Developer Challenge and asking you to help us unlock new experiences on Android, and help inspire other developers around these emerging technologies.

As we kick off this challenge, the first area we’ll be focusing on is On-Device Machine Learning. At Google, we’re big believers in how this new technology can open up a world of helpful innovation so you can get things done in ways you never thought possible. Take Live Caption: for the almost 500 million people who are deaf or hard of hearing, it brings content to life, and it’s exactly the type of machine learning-powered innovation we expect to see more of someday. With your help, we can turn someday into today!

Bringing your idea to life in front of billions of eyes

Got an idea? Whether it’s still a concept or ready for users, tell us how you could use Google’s help, and how it supports the mission of using machine learning to help people get something done. Join the #AndroidDeveloperChallenge topic on GitHub, and share your idea as a repository under this topic. Don’t forget to come back here and officially submit your concept.

We’ll pick 10 concepts and provide expertise and guidance to those developers to help in their plans to bring their ideas to fruition. And once the app is ready, we’ll help showcase it in front of the billions of users on Google Play, through a collection and more. Here’s what those 10 developers will get:

Expertise and development support bootcamp: We’ll work with you to provide expertise and guidance to help in your plans to bring your app from concept to reality, including:

  • An all-expenses-paid working session with a panel of experts at Google HQ in Mountain View, CA
  • Google engineer mentorship at the bootcamp, providing guidance and technical expertise to help bring your app to fruition

Exposure and street cred! Once your idea is ready for prime-time, we’ll help you get users, and celebrate you to the broader Android community, including:

  • A collection on Google Play where we’ll feature your app (apps must be ready for Google Play on May 1, and must meet Google’s minimum quality requirements)
  • Tickets to Google I/O 2020
  • And we’ll celebrate these experiences to the broader Android developer community on developers.android.com. We might even showcase you at Google I/O, in places like the sandbox, sessions, perhaps even a keynote!

Helpful innovation is an important investment area for us on the Android team, and On-Device Machine Learning has played a critical role in powering new features in the last several releases of Android. We’re just beginning to scratch the surface, and we can’t wait to see what you come up with!

Here’s how to watch the 2019 Android Dev Summit!

We’re less than 24 hours away from kicking off the 2019 Android Dev Summit, broadcasting live from the Google Events Center (MP7) in Sunnyvale, CA on October 23 & 24. We’ll be broadcasting the entire two days of the event live on YouTube, Twitter and on the Android Dev Summit website. Here’s what you need to know:

The keynote, sessions, sandbox and more, kicking off at 10AM PDT

The summit kicks off on October 23 at 10 a.m. PDT with a keynote, where you'll hear from Dave Burke, VP Engineering for Android, Stephanie Cuthbertson, Director of Product Management and others on the present and future of Android development. From there, we'll dive into two days of deep technical content from the Android engineering team, on topics such as Android platform, Android Studio, Android Jetpack, Kotlin, Google Play, and more. The full agenda is here so you can plan your summit experience.

#AskAndroid your pressing questions!

Tweet us your best questions using the hashtag #AskAndroid in the lead-up to the Android Dev Summit. We’ve gathered experts from Jetpack to Kotlin to Android 10, so we’ve got you covered. We’ll be answering your questions live between sessions on the livestream. Plus, we will share updates directly from the Google Events Center to our social channels, so be sure to follow along!

Previewing #AndroidDevSummit: Sessions, App, & Livestream Details

Posted by The #AndroidDevSummit team

In two weeks, we'll be welcoming Android developers from around the world at Android Dev Summit 2019, broadcasting live from the Google Events Center (MP7) in Sunnyvale, CA on October 23 & 24. Whether you’re joining us in person or via the livestream, we’ve got a great show planned for you; starting today, you can read more details about the keynote, sessions, codelabs, sandbox, the mobile app, and how online viewers can participate.

The keynote, sessions, sandbox and more, kicking off at 10AM PDT

The summit kicks off on October 23 at 10 a.m. PDT with a keynote, where you'll hear from Dave Burke, VP Engineering for Android, and others on the present and future of Android development. From there, we'll dive into two days of deep technical content from the Android engineering team, on topics such as Android platform, Android Studio, Android Jetpack, Kotlin, Google Play, and more.

The full agenda is now available, so you can start to plan your summit experience. We'll also have demos in the sandbox, hands-on learning with codelabs, plus exclusive content for those watching on the livestream.

Get the Android Dev Summit app on Google Play!

The official app for Android Dev Summit 2019 has started rolling out on Google Play. With the app, you can explore the agenda by searching through topics and speakers. Plan your summit experience by saving events to your personalized schedule. You’ll be able to watch the livestream and find recordings after sessions occur, and more. (By the way, the app is also an Instant app, so with one tap you can try it out first before installing!)

2019 #AndroidDevSummit app

A front-row seat from your desk, and #AskAndroid your pressing questions!

We’ll be broadcasting live on YouTube and Twitter starting October 23 at 10 a.m. PDT. In addition to a front row seat to over 25 Android sessions, there will be exclusive online-only content and an opportunity to have your most pressing Android development questions answered live, broadcast right from the event.

Tweet us your best questions using the hashtag #AskAndroid in the lead-up to the Android Dev Summit. We’ve gathered experts from Jetpack to Kotlin to Android 10, so we’ve got you covered. We’ll be answering your questions live between sessions on the livestream. Plus, we will share updates directly from the Google Events Center to our social channels, so be sure to follow along!
