Tag Archives: Android Developer

Next-generation Dex Compiler Now in Preview

Posted by James Lau, Product Manager

Android developers know that dex compilation is a key step in building an APK. This is the process of transforming .class bytecode into .dex bytecode for the Android Runtime (or Dalvik, for older versions of Android). The dex compiler mostly works under the hood in your day-to-day app development, but it directly impacts your app's build time, .dex file size, and runtime performance.

That's why we are investing in making important improvements in the dex compiler. We're excited to announce that the next-generation dex compiler, D8, is now available for preview as part of Android Studio 3.0 Beta release.

Compared with the current DX compiler, D8 compiles faster and outputs smaller .dex files, while having the same or better app runtime performance.

* Tested with benchmark project here.

How to try it?

D8 is available for preview starting with Android Studio 3.0 Beta. To try it, set the following in your project's gradle.properties file:

android.enableD8=true

We have tested D8's correctness and performance on a number of apps, and the results are encouraging. We're confident enough with the results that we are switching to use D8 as the default dex compiler for building AOSP. There are currently no known issues, but we would love to hear your feedback. You can file a bug report using this link.

What's next?

We plan to preview D8 over the next several months with the Android Studio 3.0 release. During this time, we will focus on addressing any critical bug reports we receive from the community. We plan to bring D8 out of preview and enable it as the default dex compiler in Android Studio 3.1. At that time, the DX compiler will officially be put in maintenance mode. Only critical issues with DX will be fixed moving forward.

Beyond D8, we are also working on R8, which is a Proguard replacement for whole program minification and optimization. While the R8 project has already been open sourced, it has not yet been integrated with the Android Gradle plugin. We will provide more details about R8 in the near future when we are ready to preview it with the community.

Tool developers: get your bytecode tools Java 8 ready

In April, we announced Java 8 language features with desugaring. The desugaring step currently happens immediately after Java compilation (javac) and before any bytecode reading or rewriting tools are run. Over the next couple of months, the desugar step will move to a later stage in the pipeline, as part of D8. This will allow us to further reduce the overall build time and produce more optimized code. This change means that any bytecode reading or rewriting tools will run before the desugar step. If you develop .class bytecode reading or rewriting tools for Android, you will need to make sure they can handle the Java 8 bytecode format so they can continue to work properly when we move desugaring into D8.

Happy dex'ing!

Semantic Time support now available on the Awareness APIs

Posted by Ritesh Nayak M, Product Manager

Last year at I/O we launched the Awareness API, a simple yet powerful API that let developers use signals such as Location, Weather, Time and User Activity to build contextually relevant app experiences.

Available via Google Play services, the Awareness API offers two ways to take advantage of context signals within your app. The Snapshot API lets your app request information about the user's current context, while the Fence API lets your app react when the user's context changes and matches a certain set of conditions - for example, "tell me whenever the user is walking and their headphones are plugged in".

Until now, you could specify a time fence on the Awareness APIs but were restricted to using absolute (canonical) representations of time. Based on developer feedback, we realized that the API's time fences did not support the higher-level abstractions people use when they think and talk about time. "This weekend", "on the next holiday", "after sunset" are all very common and colloquial ways of expressing time. That's why we're adding semantic time support to these APIs, starting today.

For example, if you were building a fitness app and wanted to prompt users every morning to start their routine, or a reading app that turns on night mode after dusk, you would previously have had to query a third-party API for sunrise/sunset information at the user's location and then write an Awareness fence with those canonical time values. With our latest update, you can use our TIME_INSTANT_SUNRISE and TIME_INSTANT_SUNSET constants and let the platform manage all the complexity for you.

Let's look at an example. Suppose you're building a fitness app which prompts users on Tuesdays and Thursdays around sunrise to begin their morning workout. You can set up this trigger with the following lines of code.

// A fence that is TRUE only on Tuesdays and Thursdays, around sunrise
AwarenessFence.and(
    TimeFence.aroundTimeInstant(TimeFence.TIME_INSTANT_SUNRISE,
            -10 * ONE_MINUTE_MILLIS, 5 * ONE_MINUTE_MILLIS),
    AwarenessFence.or(
        TimeFence.inIntervalOfDay(TimeFence.DAY_OF_WEEK_TUESDAY,
                0, ONE_DAY_MILLIS),
        TimeFence.inIntervalOfDay(TimeFence.DAY_OF_WEEK_THURSDAY,
                0, ONE_DAY_MILLIS)));
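Note that ONE_MINUTE_MILLIS and ONE_DAY_MILLIS are not part of the API; they're plain millisecond constants you define yourself. A minimal sketch of plausible definitions (the class name is arbitrary):

```java
public class TimeConstants {
    // Plain millisecond constants assumed by the fence snippets.
    public static final long ONE_MINUTE_MILLIS = 60 * 1000L;
    public static final long ONE_HOUR_MILLIS = 60 * ONE_MINUTE_MILLIS;
    public static final long ONE_DAY_MILLIS = 24 * ONE_HOUR_MILLIS;

    public static void main(String[] args) {
        System.out.println(ONE_DAY_MILLIS); // 86400000
    }
}
```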

One of our favorite semantic time features is public holidays. Every country, and the regions within it, has a different set of holidays. Suppose you're building a local hiking and adventure app that wants to show users activities they can enjoy on a holiday that falls on a Friday or a Monday. You can use a combination of day and holiday flags to identify this state for all your users around the world, with just a few lines of code.

// A local-time fence that is TRUE only from 9AM to 11AM on public
// holidays in the device locale that fall on Fridays or Mondays.
AwarenessFence.and(
    TimeFence.inTimeInterval(TimeFence.TIME_INTERVAL_HOLIDAY),
    AwarenessFence.or(
        TimeFence.inIntervalOfDay(TimeFence.DAY_OF_WEEK_FRIDAY,
                9 * ONE_HOUR_MILLIS, 11 * ONE_HOUR_MILLIS),
        TimeFence.inIntervalOfDay(TimeFence.DAY_OF_WEEK_MONDAY,
                9 * ONE_HOUR_MILLIS, 11 * ONE_HOUR_MILLIS)));

In both example cases, Awareness does the heavy lifting of localizing time and holidays based on the device locale settings.

We're excited to see what problems you'll solve using this powerful API. Please join our mailing list to get updates about this and other Context APIs at Google.

Java 8 Language Features Support Update

Posted by James Lau, Product Manager

Yesterday, we released Android Studio 2.4 Preview 6. Java 8 language features are now supported by the Android build system in the javac/dx compilation path. Android Studio's Gradle plugin now desugars Java 8 class files to Java 7-compatible class files, so you can use lambdas, method references and other features of Java 8.
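As a quick reminder of what desugaring covers, lambdas and method references are language features, not library APIs. Here is a plain-Java sketch (the Transform interface is just an illustration) of the two forms that desugaring rewrites into Java 7-compatible bytecode:

```java
public class LambdaDemo {
    // A simple functional interface; any single-method interface
    // can be the target of a lambda or method reference.
    interface Transform {
        String apply(String s);
    }

    static String run(Transform t, String input) {
        return t.apply(input);
    }

    public static void main(String[] args) {
        // Lambda expression.
        String a = run(s -> s.toUpperCase(), "lambda");
        // Method reference, equivalent to the lambda above.
        String b = run(String::toUpperCase, "lambda");
        System.out.println(a + " " + b); // LAMBDA LAMBDA
    }
}
```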

For those of you who tried the Jack compiler, we now support the same set of Java 8 language features but with faster build speed. You can use Java 8 language features together with tools that rely on bytecode, including Instant Run. Using libraries written with Java 8 is also supported.

We first added Java 8 desugaring in Android Studio 2.4 Preview 4. Preview 6 includes important bug fixes related to Java 8 language features support. Many of these fixes were made in response to bug reports you filed. We really appreciate your help in improving Android development tools for the community!

It's easy to try using Java 8 language features in your Android project. Just download Android Studio 2.4 Preview 6, and update your project's target and source compatibility to Java version 1.8. You can find more information in our preview documentation.
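For reference, setting source and target compatibility in a module's build.gradle typically looks like this (check the preview documentation for the exact syntax in your plugin version):

```groovy
android {
    compileOptions {
        sourceCompatibility JavaVersion.VERSION_1_8
        targetCompatibility JavaVersion.VERSION_1_8
    }
}
```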

Happy lambda'ing!

Calling all early adopters for Android Studio previews

Posted by Scott Main, Technical Writer

If you love trying out all of the newest features in Android Studio and helping us make it a better IDE, we're making it even easier to download early preview builds with a new website. Here, you can download and stay up to date on all the latest Android Studio previews and other tools announcements.



Android Studio previews give you early access to new features in all aspects of the IDE, plus early versions of other tools such as the Android Emulator and platform SDK previews. You can install multiple versions of Android Studio side-by-side, so if a bug in the preview build blocks your app development, you can keep working on the same project from the stable version.

The latest preview for Android Studio 2.4 just came out last week, and it includes new features to support development with the Android O Developer Preview. You can download and set up the O preview SDK from inside Android Studio, and then use Android O’s XML font resources and autosizing TextView in the Layout Editor.

By building your apps with the Android Studio preview, you're also helping us create a better version of Android Studio. We want to hear from you if you encounter any bugs.

5 Tips for launching successful apps and games on Google Play

Posted by Adam Gutterman, Go-To-Market Strategic Lead, Google Play Games

Last month at the Game Developers Conference (GDC), we held a developer panel focused on sharing best practices for building successful app and game businesses. Check out 5 tips for developers, both large and small, as shared by our gaming partners at Electronic Arts (EA), Hutch Games, Nix Hydra, Space Ape Games and Omnidrone.



1. Test, test, test

The best time to test is before you launch, so test boldly and test a lot! Nix Hydra recommends testing creative, including art style and messaging, as well as gameplay mechanics, onboarding flows, and anything else you're not sure about. Gathering feedback from real users in advance of launch can highlight what's working and what can be improved, to ensure your game's in the best shape possible at launch.

2. Store listing experiments

Run experiments on all of your store listing page assets. Taking bold risks instead of making assumptions allows you to see the impact of different variables with your actual user base on Google Play. Test in different regions to ensure your store listing page is optimized for each major market, as they often perform differently.

3. Early Access program

Space Ape Games recently used Early Access to test different onboarding experiences and gameplay control methods in their game. Finding the right combination led them to double-digit growth in D1 retention. Gathering these results in advance of launch helped the team fine tune and polish the game, minimizing risk before releasing to the masses.

"Early Access is cool because you can ask the big questions and get real answers from real players," says Joe Raeburn, Founding Product Guy at Space Ape Games.

Watch the Android Developer Story below to hear how Omnidrone benefits from Early Access using strong user feedback to improve retention, engagement and monetization in their game.


Mobile game developer Omnidrone benefits from Early Access.

4. Pre-registration

Electronic Arts has run more than 5 pre-registration campaigns on Google Play. Pre-registration allows them to start marketing and build awareness for titles with a clear call-to-action before launch. This gives them a running start on launch day: having built a group of users to activate upon the game's release, they see a jump in D1 installs.

5. Seek feedback

All partners strongly recommended seeking feedback early and often. Feedback tells both sides of the story, by pointing out what's broken as well as what you're doing right. Find the right time and channels to request feedback, whether they be in-game, social, email, or even through reading and responding to reviews within the Google Play store.

If you're a startup with an upcoming launch on Google Play, or you've launched an app or game recently and are interested in opportunities like Early Access and pre-registration, get in touch with us so we can work with you.

Watch sessions from Google Developer Day at GDC17 on the Android Developers YouTube channel to learn tips for success. Also, visit the Android Developers website to stay up to date with features and best practices that will help you grow a successful business on Google Play.




Publish your app with confidence from the Google Play Developer Console

Posted by Kobi Glick, Product Manager, Google Play

Publishing a new app, or app update, is an important and exciting milestone for every developer. In order to make the process smoother and more trackable, we're announcing the launch of a new way to publish apps on Google Play with some new features. The changes will give you the ability to manage your app releases with more confidence via a new manage releases page in the Google Play Developer Console.




Manage your app updates with clarity and control

The new manage releases page is where you upload alpha, beta, and production releases of your app. From here, you can see important information and the status of all your releases across tracks.

The new manage releases page.

Easier access to existing and new publishing features

Publishing an app or update is a big step, and one that every developer wants to take with confidence. To help, we've added two new features.

First, we've added a validation step that highlights potential issues before you publish. The new "review and rollout" page appears before you confirm the rollout of a new app and flags any validation errors or warnings. This new flow makes the app release process easier, especially for apps using multiple APKs. It also surfaces new information; for example, if you added new permissions to your app, the system will highlight them.


Second, it's now simpler to perform and track staged rollouts during the publishing flow. With staged rollouts, you can release your update to a growing percentage of users, giving you a chance to catch and address any issues before they affect your whole audience.
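To make the idea concrete, here's a simplified, hypothetical sketch (this is not Play's actual assignment logic) of how a staged rollout can deterministically keep the same users inside the rollout as the percentage grows:

```java
public class RolloutBucket {
    // Deterministically map a user id to a bucket in [0, 100), so the
    // same user stays in or out of the rollout as the percentage grows.
    // This is an illustration, not Play's real assignment mechanism.
    static int bucket(String userId) {
        return Math.floorMod(userId.hashCode(), 100);
    }

    static boolean inRollout(String userId, int percent) {
        return bucket(userId) < percent;
    }

    public static void main(String[] args) {
        String user = "user-42";
        // A user inside a 10% rollout remains inside any larger one.
        boolean atTen = inRollout(user, 10);
        boolean atFifty = inRollout(user, 50);
        System.out.println(!atTen || atFifty); // true
    }
}
```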

If you want to review the history of your releases, it is now possible to track them granularly and download previous APKs.

Finally, we've added a new artifacts library under manage releases, where you can find all the files that help you manage a release.

Start using the new manage releases page today

You can access the new manage releases page in the Developer Console. Visit the Google Play Developer Help Center for more information. With these changes, we're helping you to publish, track and manage your app with confidence on Google Play.



Android Things Developer Preview 2




Posted by Wayne Piekarski, Developer Advocate for IoT

Today we are releasing Developer Preview 2 (DP2) for Android Things, bringing new features and bug fixes to the platform. We are committed to providing regular updates to developers, and aim to have new preview releases approximately every 6-8 weeks.

Android Things is a comprehensive solution for building Internet of Things (IoT) products with the power of Android. Now any Android developer can quickly build a smart device using Android APIs and Google services, while staying highly secure with updates direct from Google. It includes familiar tools such as Android Studio, the Android Software Development Kit (SDK), Google Play Services, and Google Cloud Platform.

Android Things supports a System-on-Module (SoM) architecture, where a core computing module can be initially used with development boards and then easily scaled to large production runs with custom designs, while continuing to use the same Board Support Package (BSP) from Google.
New features and bug fixes

Thanks to great developer feedback from our Developer Preview 1, we have now added support for USB Audio to the Hardware Abstraction Layer (HAL) for Intel Edison and Raspberry Pi 3. NXP Pico already contains direct support for audio on device. We have also resolved many bugs related to Peripheral I/O (PIO). Other feature requests such as Bluetooth support are known issues, and the team is actively working to fix these. We have added support for the Intel Joule platform, which offers the most computing power in our lineup to date.
Native I/O and user drivers

There are many developers who use native C or C++ code to develop IoT software, and Android Things supports the standard Android NDK. We have now released a library to provide native access to the Peripheral API (PIO), so developers can easily use their existing native code. The documentation explains the new API, and the sample provides a demonstration of how to use it.
An important new feature that was made available with Android Things DP1 was support for user drivers. Developers can create a user driver in their APK, and then bind it to the framework. For example, your driver code could read a GPIO pin and trigger a regular Android KeyEvent, or read in an external GPS via a serial port and feed this into the Android location APIs. This allows any application to inject hardware events into the framework, without customizing the Linux kernel or HAL. We maintain a repository of user drivers for a variety of common hardware interfaces such as sensors, buttons, and displays. Developers are also able to create their own drivers and share them with the community.
TensorFlow for Android Things

One of the most interesting features of Android Things is the ability to easily deploy machine learning and computer vision. We have created a highly requested sample that shows how to use TensorFlow on Android Things devices. This sample demonstrates accessing the camera, performing object recognition and image classification, and speaking out the results using text-to-speech (TTS). An early-access TensorFlow inference library prebuilt for ARM and x86 is provided for you to easily add TensorFlow to any Android app with just a single line in your build.gradle file.



TensorFlow sample identifying a dog's breed (American Staffordshire terrier) 
on a Raspberry Pi 3 with camera

Feedback

Thank you to all the developers who submitted feedback for the previous developer preview. Please continue to send us your feedback by filing bug reports and feature requests, and ask any questions on Stack Overflow. To download images for Developer Preview 2, visit the Android Things download page, and find the changes in the release notes. You can also join Google's IoT Developers Community on Google+, a great resource to keep up to date and discuss ideas, which already has over 2,900 members.


Introducing the ExifInterface Support Library

With the release of the 25.1.0 Support Library, there's a new entry in the family: the ExifInterface Support Library. With significant improvements introduced in Android 7.1 to the framework's ExifInterface, it only made sense to make those available to all API 9+ devices via the Support Library's ExifInterface.

The basics are still the same: the ability to read and write Exif tags embedded within image files - now with 140 different attributes (almost 100 of them new to Android 7.1 and this Support Library!), including information about the camera itself, the camera settings, orientation, and GPS coordinates.

Camera Apps: Writing Exif Attributes

For camera apps, writing is probably the most important capability - although writing attributes is still limited to JPEG image files. Normally you wouldn't need this during capture itself - you'd instead call the Camera2 API's CaptureRequest.Builder.set() with JPEG_ORIENTATION, JPEG_GPS_LOCATION, or the equivalents in the Camera1 Camera.Parameters. However, ExifInterface allows you to make changes to the file after the fact (say, removing the location information at the user's request).

Reading Exif Attributes

For the rest of us though, reading those attributes is going to be our bread-and-butter; this is where we see the biggest improvements.

Firstly, you can read Exif data from JPEG and raw images (specifically, DNG, CR2, NEF, NRW, ARW, RW2, ORF, PEF, SRW and RAF files). Under the hood, this was a major restructuring, removing all native dependencies and building an extensive test suite to ensure that everything actually works.

For apps that receive images from other apps with a content:// URI (such as those sent by apps that target API 24 or higher), ExifInterface now works directly off of an InputStream; this allows you to easily extract Exif information directly out of content:// URIs you receive without having to create a temporary file.

Uri uri; // the URI you've received from the other app
InputStream in = null;
try {
  in = getContentResolver().openInputStream(uri);
  ExifInterface exifInterface = new ExifInterface(in);
  // Now you can extract any Exif tag you want
  // Assuming the image is a JPEG or supported raw format
} catch (IOException e) {
  // Handle any errors
} finally {
  if (in != null) {
    try {
      in.close();
    } catch (IOException ignored) {}
  }
}

Note: ExifInterface will not work with remote InputStreams, such as those returned from an HttpURLConnection. It is strongly recommended to only use it with content:// or file:// URIs.

For most attributes, you'd simply use the getAttributeInt(), getAttributeDouble(), or getAttribute() (for Strings) methods as appropriate.

One of the most important attributes when it comes to displaying images is the image orientation, stored in the aptly-named TAG_ORIENTATION, which returns one of the ORIENTATION_ constants. To convert this to a rotation angle, you can post-process the value.

int rotation = 0;
int orientation = exifInterface.getAttributeInt(
    ExifInterface.TAG_ORIENTATION,
    ExifInterface.ORIENTATION_NORMAL);
switch (orientation) {
  case ExifInterface.ORIENTATION_ROTATE_90:
    rotation = 90;
    break;
  case ExifInterface.ORIENTATION_ROTATE_180:
    rotation = 180;
    break;
  case ExifInterface.ORIENTATION_ROTATE_270:
    rotation = 270;
    break;
}
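The switch above can be wrapped in a small helper. For a self-contained illustration, the sketch below hard-codes the standard Exif orientation tag values (1 = normal, 3 = rotated 180°, 6 = rotated 90°, 8 = rotated 270°), which the ExifInterface ORIENTATION_ constants mirror; in app code you'd use the constants themselves:

```java
public class ExifRotation {
    // Standard Exif orientation tag values; the ExifInterface
    // ORIENTATION_ constants use the same numbers.
    static final int ORIENTATION_NORMAL = 1;
    static final int ORIENTATION_ROTATE_180 = 3;
    static final int ORIENTATION_ROTATE_90 = 6;
    static final int ORIENTATION_ROTATE_270 = 8;

    static int rotationDegrees(int orientation) {
        switch (orientation) {
            case ORIENTATION_ROTATE_90: return 90;
            case ORIENTATION_ROTATE_180: return 180;
            case ORIENTATION_ROTATE_270: return 270;
            default: return 0; // normal, missing, or mirrored variants
        }
    }

    public static void main(String[] args) {
        System.out.println(rotationDegrees(6)); // 90
    }
}
```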

There are some helper methods to extract values from specific Exif tags. For location data, the getLatLong() method gives you the latitude and longitude as floats and getAltitude() will give you the altitude in meters. Some images also embed a small thumbnail. You can check for its existence with hasThumbnail() and then extract the byte[] representation of the thumbnail with getThumbnail() - perfect to pass to BitmapFactory.decodeByteArray().

Working with Exif: Everything is optional

One important thing to understand about Exif data is that there are no required tags: each and every tag is optional, and some services even specifically strip Exif data. Therefore, throughout your code you should always handle cases where there is no Exif data, either because there is no data for a specific attribute or because the image format doesn't support Exif data at all (say, the ubiquitous PNG or WebP images).

Add the ExifInterface Support Library to your project with the following dependency:

compile "com.android.support:exifinterface:25.1.0"

But when an Exif attribute is exactly what you need to prevent a mis-rotated image in your app, the ExifInterface Support Library is just what you need to #BuildBetterApps

Important best practices to improve app engagement

Posted by Niko Schröer, Business Development, Google Play

Driving installs is important to growing a user base, but it's not much use if your app sits on users' devices and is rarely opened. In a competitive app landscape, it's increasingly important to engage and retain users over the long term to build a successful business. Users who are using your app more will have a higher lifetime value and be more likely to share your app. Watch my Playtime session below to hear about the tools and features other developers are using to increase app engagement. You can also read the summary of my main tips below.

1. Build a high quality app to engage Android users

Building a high quality app is the foundation of a great user experience on Android. The better your app's user experience is, the more engaged your users will be. Optimizing for material design, for example, can significantly improve user engagement as well as building for Android Wear, Auto or TV where it makes sense based on your value proposition.

To achieve high quality, we recommend you to check out the latest Android features, tips, and best practices in our Playbook for Developers.

The developer of the golf app, Hole19, tailored their app's user experience thoughtfully for Android Wear and, as a result, saw a 40% increase in user engagement compared to non-Wear users. Watch a video about Hole19's success.

2. Make your users feel at home

Personalising your app experience to make users feel at home is a good way to start a long-lasting relationship. Onboarding new users is a crucial step in this process. Onboarding should be fast and seamless and ask for minimal user input - after all, users want to start using your app as quickly as possible. Furthermore, onboarding should be a core part of the overall product experience. Use images and wording that are true to your brand, and only ask for user input when it's actually needed, to reduce friction and avoid losing users.

Freeletics, a fitness app, created an engaging user onboarding flow in which they tailored imagery and text to male and female users respectively. They also moved the registration process to a later stage in the funnel to reduce friction. The improved onboarding flow increased user activity by 58% within the first 7 days. They also implemented Google Smart Lock to seamlessly sign-in returning users.

3. Optimize feature releases as a way to increase user engagement

Introducing new features is essential to staying ahead of competition and relevant to your users to ensure they keep coming back to your app. To make new feature launches successful drivers for user engagement, follow these simple steps:

  • Define a clear objective for each release to measure your impact, e.g. increase number of users who edit a photo by at least 10%.
  • Use beta testing to gather user feedback and iterate a feature before it's rolled out to all of your users.
  • Enable the pre-launch report in the Play developer console to spot potential flaws and ensure technical stability in your alpha and beta apps.
  • Guide users to each new feature with a light onboarding experience. Visually highlight what's new and provide a short explanation of why users should care.
  • Measure performance with analytics to see if the new feature drives engagement (that you've defined as your objective).

4. Use notifications wisely

Push notifications are a popular engagement tool and rightfully so. However, there is a fine line between driving engagement and annoying users (who might then uninstall your app). Follow these guidelines to ensure your notifications are on the right side of the line:

  • Be relevant and only send messages that matter to the user in context. Be creative and true to your brand, speak your users' language, and use an authentic tone.
  • Make notifications actionable for your users and don't forget to deep link to content where applicable to save your users time.
  • Remember that not all your users are equal so personalize your message to different user cohorts with Firebase Notifications.
  • Consider timeliness of your messages to get users the right notification at the right time and with the right frequency. For example, it might be better to send a notification about something interesting to read at a time when the user normally gets out their phone – like during their commute – instead of the middle of the day, when they might be busy and dismiss a new notification.
  • Finally, give users control over what notifications they receive so that they can opt-in and opt-out of the notifications they like and don't like respectively. If users get annoyed about certain types of notifications and don't have a way to disable them, they might uninstall your app.

The Norwegian news app Aftenposten implemented a new onboarding flow that clarified which notifications were available, allowing readers to manage their preferences. This reduced uninstalls by 9.2% over 60 days and led to a 28% decrease in the number of users muting notifications completely. Read more about Aftenposten's success.

5. Reward your most engaged users

Last but not least, you should find ways to reward your most loyal users, both to retain them over time and to encourage less engaged users to engage more. These rewards can come in many shapes and forms. Start by keeping it simple, and make sure the reward adds real value for the user and fits your app's ecosystem. You can do this by:

  • Giving sneak peeks of new features by inviting them to a beta group.
  • Decorating user accounts with badges based on their behaviour.
  • Offering app-exclusive discounts or promo codes that can only be redeemed in your app.

Generally, the more you can personalize the reward the better it will work.

Find success with ongoing experimentation

A great Android app gives developers a unique opportunity to create a lasting relationship with users and build a sustainable business with happy customers. Therefore optimising apps to engage and retain your users by following these 5 tips should be front and centre of your development goals and company strategy. Find more tips and best practices by watching the sessions at this year's Playtime events.




Android Wear 2.0 for China – Developer Preview

Posted by Hoi Lam, Developer Advocate

Today at Google Developer Day China, we are happy to announce a developer preview of Android Wear 2.0 for developers creating apps for China. Android Wear 2.0 is the biggest update since our partners launched their first devices in China last year.

We're making a Developer Preview available today and plan to release additional updates in the coming months. Please send us your feedback by filing bugs or posting in our Android Wear Developers community.

Developing for the Chinese Market

With Android Wear 2.0, apps can access the internet directly on Android Wear devices. As a result, for the majority of apps, having a companion phone application is no longer necessary. This means that most developers creating apps for Android Wear 2.0 may no longer need to import the Google Play services library.

There are two situations where developers will need to import Google Play services for China:

  • Apps that require direct interaction with the paired mobile device - some experiences require Android Wear to connect directly to a paired phone. In this case, the Data Layer API introduced in Android Wear 1.0 will continue to function.
  • New FusedLocationProvider for China - we have added location detection to the SDK for Chinese developers. With the user's permission, your app can receive location updates via the FusedLocationProvider.

You can find more details about how to import the China compatible version of Google Play services library here.

Product testing for Android Wear 2.0 for China

The Android Wear 2.0 Developer Preview includes an updated SDK with tools, and system images for testing on the Huawei Watch.

To get started, follow these steps:

Give us feedback

We will update this developer preview over the next few months based on your feedback. The sooner we hear from you, the more we can include in the final release, so don't be shy!


Android Wear 2.0 for China - Developer Preview

Edited by Hoi Lam, Android Wear Developer Platform Lead

Today at the Google Developer Day in Shanghai, we officially announced a developer preview of Android Wear 2.0 built specifically for the Chinese market. Android Wear 2.0 will be the biggest update since our partners first launched their watch products.

The developer preview is live as of today. At the same time, we plan to keep updating it over the coming months. Please file any issues you encounter here, or post your thoughts in our Android Wear developer community.

Developing apps for the Chinese market

In Android Wear 2.0, apps can connect to the internet directly from the Android Wear watch. As a result, for most apps, a companion phone app is no longer necessary. This also means that most developers building apps for Android Wear 2.0 will no longer need to import the Google Play services client library.

Currently, there are two situations in which developers still need to import the Google Play services client library when developing for the Chinese market:

  • Apps that need to communicate directly with the phone - some use cases require the Android Wear watch to connect directly to a paired phone. In this case, the Data Layer API introduced in Android Wear 1.0 can still be used.
  • Using FusedLocationProvider - we have added location support to the latest China SDK. With the user's permission, your app can receive location updates via the FusedLocationProvider.

You can find more information here on how to import the China-compatible version of Google Play services.

Product testing for Android Wear 2.0 for China

The Android Wear 2.0 Developer Preview includes the latest SDK and a watch test system image (based on the Huawei Watch).

Please follow these steps to test:

Feedback

We will update the developer preview over the coming months based on your feedback. The earlier you send us your feedback, the more of it we can address in the final release. Stay tuned!