
[New eBook] Download The No-nonsense Guide to App Growth

Originally posted on the AdMob Blog.

What’s the secret to rapid growth for your app?

Play Store or App Store optimization? A sophisticated paid advertising strategy? A viral social media campaign?

While all of these strategies could help you grow your user base, the foundation for rapid growth is much more basic and fundamental—you need an engaging app.

This handbook will walk you through practical ways to increase your app’s user engagement to help you eventually transition to growth. You’ll learn how to:

  • Pick the right metric to represent user engagement
  • Look at data to audit your app and find areas to fix
  • Promote your app after you’ve reached a healthy level of user engagement

Download a free copy here.

For more tips on app monetization, be sure to stay connected on all things AdMob by following our Twitter and Google+ pages.

Posted by Raj Ajrawat, Product Specialist, AdMob

Lighting the way with BLE beacons

Originally posted on the Google Developers blog.

Posted by Chandu Thota, Engineering Director and Matthew Kulick, Product Manager

Just like lighthouses have helped sailors navigate the world for thousands of years, electronic beacons can be used to provide precise location and contextual cues within apps to help you navigate the world. For instance, a beacon can label a bus stop so your phone knows to have your ticket ready, or a museum app can provide background on the exhibit you’re standing in front of. Today, we’re beginning to roll out a new set of features to help developers build apps using this technology. This includes a new open format for Bluetooth low energy (BLE) beacons to communicate with people’s devices, a way for you to add this meaningful data to your apps and to Google services, as well as a way to manage your fleet of beacons efficiently.

Eddystone: an open BLE beacon format

Working closely with partners in the BLE beacon industry, we’ve learned a lot about the needs and the limitations of existing beacon technology. So we set out to build a new class of beacons that addresses real-life use-cases, cross-platform support, and security.

At the core of what it means to be a BLE beacon is the frame format—i.e., a language—that a beacon sends out into the world. Today, we’re expanding the range of use cases for beacon technology by publishing a new and open format for BLE beacons that anyone can use: Eddystone. Eddystone is robust and extensible: It supports multiple frame types for different use cases, and it supports versioning to make introducing new functionality easier. It’s cross-platform, capable of supporting Android, iOS or any platform that supports BLE beacons. And it’s available on GitHub under the open-source Apache v2.0 license, for everyone to use and help improve.
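
To make the idea of a frame format concrete, here is a minimal sketch of decoding an Eddystone-UID frame from the service data bytes a beacon advertises. It is written in Java, the class name is our own, and the byte offsets follow the layout published in the Eddystone spec on GitHub, so treat it as an illustration rather than a reference implementation.

// Minimal sketch: decoding an Eddystone-UID frame from BLE service data.
// Offsets follow the published Eddystone spec; verify against the GitHub repo.
public final class EddystoneUidFrame {
    public static final byte FRAME_TYPE_UID = 0x00;

    public final int txPowerAt0m;     // calibrated TX power at 0 m (signed byte)
    public final byte[] namespaceId;  // 10-byte namespace
    public final byte[] instanceId;   // 6-byte instance

    private EddystoneUidFrame(int txPower, byte[] namespaceId, byte[] instanceId) {
        this.txPowerAt0m = txPower;
        this.namespaceId = namespaceId;
        this.instanceId = instanceId;
    }

    /** Parses the service data bytes advertised under the Eddystone service UUID (0xFEAA). */
    public static EddystoneUidFrame parse(byte[] serviceData) {
        if (serviceData == null || serviceData.length < 18
                || serviceData[0] != FRAME_TYPE_UID) {
            return null; // not an Eddystone-UID frame
        }
        byte[] namespaceId = java.util.Arrays.copyOfRange(serviceData, 2, 12);
        byte[] instanceId = java.util.Arrays.copyOfRange(serviceData, 12, 18);
        return new EddystoneUidFrame(serviceData[1], namespaceId, instanceId);
    }
}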

By design, a beacon is meant to be discoverable by any nearby Bluetooth Smart device, via its identifier which is a public signal. At the same time, privacy and security are really important, so we built in a feature called Ephemeral Identifiers (EIDs) which change frequently, and allow only authorized clients to decode them. EIDs will enable you to securely do things like find your luggage once you get off the plane or find your lost keys. We’ll publish the technical specs of this design soon.


Eddystone for developers: Better context for your apps

Eddystone offers two key developer benefits: better semantic context and precise location. To support these, we’re launching two new APIs. The Nearby API for Android and iOS makes it easier for apps to find and communicate with nearby devices and beacons, such as a specific bus stop or a particular art exhibit in a museum, providing better context. And the Proximity Beacon API lets developers associate semantic location (i.e., a place associated with a lat/long) and related data with beacons, stored in the cloud. This API will also be used in existing location APIs, such as the next version of the Places API.

Eddystone for beacon manufacturers: Single hardware for multiple platforms

Eddystone’s extensible frame formats allow hardware manufacturers to support multiple mobile platforms and application scenarios with a single piece of hardware. An existing BLE beacon can be made Eddystone compliant with a simple firmware update. At the core, we built Eddystone as an open and extensible protocol that’s also interoperable, so we’ll also introduce an Eddystone certification process in the near future by closely working with hardware manufacturing partners. We already have a number of partners that have built Eddystone-compliant beacons.

Eddystone for businesses: Secure and manage your beacon fleet with ease

As businesses move from validating their beacon-assisted apps to deploying beacons at scale in places like stadiums and transit stations, hardware installation and maintenance can be challenging: which beacons are working, broken, missing or displaced? So starting today, beacons that implement Eddystone’s telemetry frame (Eddystone-TLM) in combination with the Proximity Beacon API’s diagnostic endpoint can help deployers monitor their beacons’ battery health and displacement—common logistical challenges with low-cost beacon hardware.

Eddystone for Google products: New, improved user experiences

We’re also starting to improve Google’s own products and services with beacons. Google Maps launched beacon-based transit notifications in Portland earlier this year, to help people get faster access to real-time transit schedules for specific stations. And soon, Google Now will also be able to use this contextual information to help prioritize the most relevant cards, like showing you menu items when you’re inside a restaurant.

We want to make beacons useful even when a mobile app is not available; to that end, the Physical Web project will be using Eddystone beacons that broadcast URLs to help people interact with their surroundings.

Beacons are an important way to deliver better experiences for users of your apps, whether you choose to use Eddystone with your own products and services or as part of a broader Google solution like the Places API or Nearby API. The ecosystem of app developers and beacon manufacturers is important in pushing these technologies forward and the best ideas won’t come from just one company, so we encourage you to get some Eddystone-supported beacons today from our partners and begin building!

Connect With the World Around You Through Nearby APIs

Originally posted on the Google Developers blog.

Posted by Akshay Kannan, Product Manager

Mobile phones have made it easy to communicate with anyone, whether they’re right next to you or on the other side of the world. The great irony, however, is that those interactions can often feel really awkward when you're sitting right next to someone.

Today, it takes several steps -- whether it’s exchanging contact information, scanning a QR code, or pairing via bluetooth -- to get a simple piece of information to someone right next to you. Ideally, you should be able to just turn to them and do so, the same way you do in the real world.

This is why we built Nearby. Nearby provides a proximity API, Nearby Messages, for iOS and Android devices to discover and communicate with each other, as well as with beacons.

Nearby uses a combination of Bluetooth, Wi-Fi, and inaudible sound (using the device’s speaker and microphone) to establish proximity. We’ve incorporated Nearby technology into several products, including Chromecast Guest Mode, Nearby Players in Google Play Games, and Google Tone.

With the latest release of Google Play services 7.8, the Nearby Messages API becomes available to all developers across iOS and Android devices (Gingerbread and higher). Nearby doesn’t use or require a Google Account. The first time an app calls Nearby, users get a permission dialog to grant that app access.
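
To give a feel for the calls involved, here is a minimal sketch of publishing and subscribing with the Nearby Messages API. It assumes a GoogleApiClient built with Nearby.MESSAGES_API that has already connected, and it omits error handling and the opt-in resolution flow, so treat it as an outline rather than a complete sample.

import com.google.android.gms.common.api.GoogleApiClient;
import com.google.android.gms.nearby.Nearby;
import com.google.android.gms.nearby.messages.Message;
import com.google.android.gms.nearby.messages.MessageListener;

public class NearbySharing {
    private final Message message = new Message("Hello, Nearby!".getBytes());

    private final MessageListener listener = new MessageListener() {
        @Override
        public void onFound(Message found) {
            // A nearby device (or beacon) published this message.
            String payload = new String(found.getContent());
        }
    };

    /** Call once the GoogleApiClient (built with Nearby.MESSAGES_API) is connected. */
    public void start(GoogleApiClient client) {
        Nearby.Messages.publish(client, message);      // broadcast to nearby devices
        Nearby.Messages.subscribe(client, listener);   // listen for nearby messages
    }

    public void stop(GoogleApiClient client) {
        Nearby.Messages.unpublish(client, message);
        Nearby.Messages.unsubscribe(client, listener);
    }
}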

A few of our partners have built creative experiences to show what's possible with Nearby.

Edjing uses Nearby to let DJs publish their tracklist to people around them. The audience can vote on tracks that they like, and their votes are updated in real time.

Trello uses Nearby to simplify sharing. Share a Trello board to the people around you with a tap of a button.

Pocket Casts uses Nearby to let you find and compare podcasts with people around you. Open the Nearby tab in Pocket Casts to view a list of podcasts that people around you have, as well as podcasts that you have in common with others.

Trulia uses Nearby to simplify the house hunting process. Create a board and use Nearby to make it easy for the people around you to join it.

To learn more, visit developers.google.com/nearby.

M Developer Preview Gets Its First Update

By Jamal Eason, Product Manager, Android

Earlier this summer at Google I/O, we launched the M Developer Preview. The developer preview is an early access opportunity to test and optimize your apps for the next release of Android. Today we are releasing an update to the M Developer Preview that includes fixes and updates based on your feedback.

What’s New

The Developer Preview 2 update includes the latest M release platform code and near-final APIs for you to validate your app. To provide more testing support, we have refined the Nexus system images and emulator system images with the Android platform updates. In addition to platform updates, the system images also include Google Play services 7.6.

How to Get the Update

If you are already running the M developer preview launched at Google I/O (Build #MPZ44Q) on a supported Nexus device (e.g. Nexus 5, Nexus 6, Nexus 9, or Nexus Player), the update can be delivered to your device via an over-the-air update. We expect all devices currently on the developer preview to receive the update over the next few days. We also posted a new version of the preview system image on the developer preview website. (To view the preview website in a language other than English, select the appropriate language from the language selector at the bottom of the page).

For those developers using the emulator, you can update your M preview system images via the SDK Manager in Android Studio.

What are the Major Changes?

We have addressed many issues brought up during the first phase of the developer preview. Check out the release notes for a detailed list of changes in this update. Some of the highlights of the update include:

  • Android Platform Changes:
    • Modifications to platform permissions including external storage, Wi-Fi & Bluetooth location, and changes to contacts/identity permissions. Device connections through the USB port are now set to charge-only mode by default. To access the device, users must explicitly grant permission.
  • API Changes:
    • Updated Bluetooth Stylus APIs with updated callback events. Use View.OnContextClickListener and GestureDetector.OnContextClickListener to listen for stylus button presses and perform secondary actions (see the sketch after this list).
    • Updated Media APIs with a new InputDevice.hasMicrophone() method for determining whether an input device has a microphone.
  • Fixes for developer-reported issues:
    • TextInputLayout doesn't set hint for embedded EditText. (fixed issue)
    • Camera Permission issue with Legacy Apps (fixed issue)
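
To illustrate the stylus callback mentioned in the list above, here is a minimal sketch of listening for a stylus button press on a view. It assumes you are compiling against the M preview APIs; the activity and the view chosen here are just placeholders.

import android.app.Activity;
import android.os.Bundle;
import android.view.View;

public class StylusDemoActivity extends Activity {
    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        View content = findViewById(android.R.id.content);

        // React to a stylus button press (a "context click") on the view.
        content.setOnContextClickListener(new View.OnContextClickListener() {
            @Override
            public boolean onContextClick(View v) {
                // Perform the secondary action here, e.g. show options for this item.
                return true; // the event was handled
            }
        });
    }
}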

Next Steps

With the final M release still on schedule for this fall, the platform features and API are near final. However, there is still time to report critical issues as you continue to test and validate your apps on the M Developer Preview. You can also visit our M Developer Preview community to share ideas and information.

Thanks again for your support. We look forward to seeing your apps that are ready to go for the M release this fall.

The App Developer Business Kit: Now available in 10 languages

Posted by Sean Meng, a Product Marketing Manager on the AdMob team

Today we’re excited to launch The App Developer Business Kit in 10 more languages. The website includes tips for new app developers on building, promoting and monetizing your app. Check out the Business Kit in your language.

To help you make decisions about growing your app business in other regions, we’ve added 6 new market reports providing great insights about app users in Italy, Spain, Germany, Brazil, France, and Russia. Did you know that Brazilian smartphone users engage with ads more frequently than users in the US and Japan? Or that while nearly two-thirds of French users exclusively download free apps, only 31% of Brazilian smartphone users do? Check out statistics like these about exciting regions around the world here.

Stay connected on all things mobile apps by following us on Google+ and Twitter.

Game Performance: Data-Oriented Programming

Posted by Shanee Nishry, Game Developer Advocate

To improve game performance, we’d like to highlight a programming paradigm that will help you maximize your CPU potential, make your game more efficient, and code smarter.

Before we get into the details of data-oriented programming, let’s explain the problems it solves and some common pitfalls for programmers.

Memory

The first thing a programmer must understand is that memory is slow, and the way you code affects how efficiently it is utilized. An inefficient memory layout or order of operations forces the CPU to sit idle waiting for memory before it can proceed with its work.

The easiest way to demonstrate this is with an example. Take this simple code, for instance:

char data[1000000]; // One Million bytes
unsigned int sum = 0;

for ( int i = 0; i < 1000000; ++i )
{
  sum += data[ i ];
}

An array of one million bytes is declared and iterated over one byte at a time. Now let's change things a little to illustrate the underlying hardware. The changes are the array size, the loop bound, and the loop increment:

char data[16000000]; // Sixteen Million bytes
unsigned int sum = 0;

for ( int i = 0; i < 16000000; i += 16 )
{
  sum += data[ i ];
}

The array is changed to contain sixteen million bytes and we iterate over one million of them, skipping 16 at a time.

A quick look suggests there shouldn't be any effect on performance, as the code translates to the same number of instructions and runs the same number of times; however, that is not the case. Here is the difference graph. Note that it uses a logarithmic scale--if the scale were linear, the performance difference would be too large to display on any reasonably-sized graph!


Graph in logarithmic scale

This simple change, making the loop skip 16 bytes at a time, makes the program run 5 times slower!

The average difference in performance is 5x, and it holds consistently when iterating over anywhere from 1,000 bytes up to a million bytes, sometimes increasing to as much as 7x. This is a serious change in performance.

Note: The benchmark was run on multiple hardware configurations, including a desktop with an Intel 5930K 3.50GHz CPU, a MacBook Pro Retina laptop with a 2.6 GHz Intel i7 CPU, and Android Nexus 5 and Nexus 6 devices. The results were fairly consistent across them.

If you wish to replicate the test, you may have to ensure the memory is out of the cache before running the loop, because some compilers will cache the array on declaration. Read on below to understand more about how this works.

Explanation

What happens in the example is quite simply explained once you understand how the CPU accesses data. The CPU can’t operate directly on data in RAM; the data must first be copied into the cache, a smaller but extremely fast memory that resides close to the CPU.

When the program starts, the CPU is set to run an instruction on part of the array but that data is still not in the cache, therefore causing a cache miss and forcing the CPU to wait for the data to be copied into the cache.

For simplicity's sake, assume an L1 cache line size of 16 bytes; this means 16 bytes will be copied into the cache, starting from the address requested by the instruction.

In the first code example, the program next tries to operate on the following byte, which has already been copied into the cache by the initial cache miss, so execution continues smoothly. This is also true for the next 14 bytes. 16 bytes after the first cache miss, the loop will encounter another cache miss, and the CPU will again wait for data to operate on while the next 16 bytes are copied into the cache.

In the second code sample, the loop skips 16 bytes at a time, but the hardware continues to operate in the same way. The cache copies in the 16 subsequent bytes each time it encounters a cache miss, which means the loop triggers a cache miss on every iteration and leaves the CPU waiting idle for data each time!

Note: Modern hardware implements cache prefetch algorithms to avoid incurring a cache miss on every iteration, but even with prefetching, more memory bandwidth is used and performance is lower in our example test.

In reality, cache lines tend to be larger than 16 bytes; the program would run much slower if it had to wait for data at every iteration. The Krait 400 CPU found in the Nexus 5 has an L0 data cache of 4 KB with 64 bytes per line.

If you are wondering why cache lines are so small, the main reason is that making fast memory is expensive.

Data-Oriented Design

The way to solve such performance issues is to design your data to fit into the cache and to have the program operate on that data contiguously.

This can be done by organizing your game objects inside Structures of Arrays (SoA) instead of Arrays of Structures (AoS) and pre-allocating enough memory to contain the expected data.

For example, a simple physics object in an AoS layout might look like this:

struct PhysicsObject
{
  Vec3 mPosition;
  Vec3 mVelocity;

  float mMass;
  float mDrag;
  Vec3 mCenterOfMass;

  Vec3 mRotation;
  Vec3 mAngularVelocity;

  float mAngularDrag;
};

This is a common way to represent an object in C++.

On the other hand, using SoA layout looks more like this:

class PhysicsSystem
{
private:
  size_t mNumObjects;
  std::vector< Vec3 > mPositions;
  std::vector< Vec3 > mVelocities;
  std::vector< float > mMasses;
  std::vector< float > mDrags;

  // ...
};

Let’s compare how a simple function to update object positions by their velocity would operate.

For the AoS layout, a function would look like this:

void UpdatePositions( PhysicsObject* objects, const size_t num_objects, const float delta_time )
{
  for ( size_t i = 0; i < num_objects; ++i )
  {
    objects[i].mPosition += objects[i].mVelocity * delta_time;
  }
}

The whole PhysicsObject is loaded into the cache, but only its first two member variables are used. At 12 bytes each, that amounts to 24 bytes of the cache line being utilised per iteration, while the struct itself is roughly 72 bytes (assuming 4-byte floats and no padding)--so on the Nexus 5's 64-byte cache lines, every object causes a cache miss.

Now let’s look at the SoA way. This is our iteration code:

void PhysicsSystem::SimulateObjects( const float delta_time )
{
  for ( size_t i = 0; i < mNumObjects; ++i )
  {
    mPositions[ i ] += mVelocities[i] * delta_time;
  }
}

With this code, we immediately cause 2 cache misses (one for each array), but we are then able to run smoothly for about 5.3 iterations before causing the next 2 cache misses, resulting in a significant performance increase!

The way data is sent to the hardware matters. Be aware of data-oriented design and look for places it will perform better than object-oriented code.

We have barely scratched the surface. There is more to data-oriented programming than structuring your objects. For example, the cache is also used for storing instructions and function-local data, so optimizing your functions and local variables affects cache misses and hits. We also did not mention the L2 cache, or how data-oriented design makes your application easier to multithread.

Make sure to profile your code to find out where you might want to implement data-oriented design. You can use different profilers for different architectures, including the NVIDIA Tegra System Profiler, ARM Streamline Performance Analyzer, and Intel and PowerVR PVRMonitor.

If you want to learn more about optimizing for your cache, read up on cache prefetching for the various CPU architectures.

An update on Eclipse Android Developer Tools

Posted by Jamal Eason, Product Manager, Android

Over the past few years, our team has focused on improving the development experience for building Android apps with Android Studio. Since the launch of Android Studio, we have been impressed with the excitement and positive feedback. As the official Android IDE, Android Studio gives you access to a powerful and comprehensive suite of tools to evolve your app across Android platforms, whether it's on the phone, wrist, car or TV.

To that end and to focus all of our efforts on making Android Studio better and faster, we are ending development and official support for the Android Developer Tools (ADT) in Eclipse at the end of the year. This specifically includes the Eclipse ADT plugin and Android Ant build system.

Time to Migrate

If you have not had the chance to migrate your projects to Android Studio, now is the time. To get started, download Android Studio. For many developers, migration is as simple as importing your existing Eclipse ADT projects into Android Studio with File → New → Import Project:

For more details on the migration process, check out the migration guide. Also, to learn more about Android Studio and the underlying build system, check out this overview page.

Next Steps

Over the next few months, we are migrating the rest of the standalone performance tools (e.g. DDMS, Trace Viewer) and building in additional support for the Android NDK into Android Studio.

We are focused on Android Studio so that our team can deliver a great experience on a unified development environment. Android tools inside Eclipse will continue to live on in the open source community via the Eclipse Foundation. Check out the latest Eclipse Andmore project if you are interested in contributing or learning more.

For those of you who are new to Android Studio, we are excited for you to integrate it into your development workflow. Also, if you want to contribute to Android Studio, you can check out the project source code. To follow all the updates on Android Studio, join our Google+ community.

Android Developer Story: Shifty Jelly drives double-digit growth with material design and expansion to the car and wearables

Posted by Lily Sheringham, Google Play team

Pocket Casts is a leading podcasting app on Google Play built by Australian-based mobile development company Shifty Jelly. The company recently achieved $1 million in sales for the first time, reaching more than 500K users.

According to co-founder Russell Ivanovic, the adoption of material design played a significant role in driving user engagement for Pocket Casts by streamlining the user experience. Moreover, users are now able to access the app beyond the smartphone -- in the car with Android Auto, on a watch with Android Wear, or on the TV with Google Cast. The rapid innovation of Android features helped Pocket Casts increase sales by 30 percent.

We chatted with co-founders and Android developers Russell and Philip Simpson to learn more about how they are growing their business with Android.

Here are some of the features Pocket Casts used:

  • Material Design: Learn more about material design and how it helps you create beautiful, engaging apps.
  • Android Wear: Extend your app to Android Wear devices with enhanced notifications or a standalone wearable app.
  • Android Auto: Extend your app to an interface that’s optimized for driving with Android Auto.
  • Google Cast: Let your users cast your app’s content to Google Cast devices like Chromecast, Android TV, and speakers with Google Cast built in.

And check out the Pocket Casts app on Google Play!

Fitness Apps on Android Wear

Posted by Joshua Gordon, Developer Advocate

Go for a run, improve your game, and explore the great outdoors with Android Wear! Developers are creating a diverse array of fitness apps that provide everything from pace and heart rate while running, to golf tips on your favorite course, to trail maps for hiking. Let’s take a look at the features of the open and flexible Wear platform that they use to create great user experiences.

Always-on stats

If your app supports always-on, you’ll never have to touch or twist your watch to activate the display. Running and want to see your pace? Glance at your wrist and it’s there! Runtastic, Endomondo, and MapMyRun use always-on to keep your stats visible, even in ambient mode. When it’s time for golf, I use Golfshot, which likewise uses always-on to continuously show yardage to the hole, so I never have to drop my club. Check out the doc, DevByte, and code sample to learn more.

Runtastic automatically transitions to ambient mode to conserve battery. There, it reduces the frequency at which stats are updated to about once every 10 seconds.
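
The mechanics behind always-on are the ambient-mode callbacks in the Wearable support library. Here is a minimal sketch, assuming the android.support.wearable dependency, of an activity that opts in to always-on and swaps to a simplified display while ambient; the class name and comments are illustrative rather than taken from any particular app.

import android.os.Bundle;
import android.support.wearable.activity.WearableActivity;

public class RunStatsActivity extends WearableActivity {
    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setAmbientEnabled();  // opt in to always-on (ambient) support
    }

    @Override
    public void onEnterAmbient(Bundle ambientDetails) {
        super.onEnterAmbient(ambientDetails);
        // Switch to a simplified, mostly-black layout and update less often.
    }

    @Override
    public void onUpdateAmbient() {
        super.onUpdateAmbient();
        // Called periodically in ambient mode; refresh the visible stats.
    }

    @Override
    public void onExitAmbient() {
        super.onExitAmbient();
        // Restore the full-color interactive layout.
    }
}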

Maps, routes, and markers

It's encouraging to see how much ground I’ve covered when I go for a run or ride! Using the Maps API, you can show users their route and position, and place markers on the map that they can tap to see more info you provide. All of this functionality is available using the same Maps API you’ve already worked with on Android. Check out the doc, DevByte, code sample, and blog post to learn more.

Endomondo tracks your route while you run. You can pan and zoom the map.
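
Because it is the same Maps API, drawing a route works just as it does on the phone. Here is a minimal sketch of adding a route polyline and a tappable marker to an already-initialized GoogleMap; the class name and coordinates are placeholders.

import com.google.android.gms.maps.GoogleMap;
import com.google.android.gms.maps.model.LatLng;
import com.google.android.gms.maps.model.MarkerOptions;
import com.google.android.gms.maps.model.PolylineOptions;

public class RouteRenderer {
    /** Draws a simple route and a start marker on an already-initialized map. */
    public void drawRoute(GoogleMap map) {
        LatLng start = new LatLng(45.512, -122.658);   // placeholder coordinates
        LatLng end = new LatLng(45.522, -122.675);

        map.addPolyline(new PolylineOptions()
                .add(start, end)
                .width(8f));

        map.addMarker(new MarkerOptions()
                .position(start)
                .title("Start")
                .snippet("Tap for split times"));  // shown in the marker's info window
    }
}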

Google Fit

Google Fit is an open platform designed to make it easier to write fitness apps. It provides APIs to help with many common tasks. For example, you can use the Recording API to estimate how many steps the user has taken and how many calories they've burned. You can make that data available to your app via the History API, and even access it over the web via REST, without having to write your own backend. Now, Google Fit can store data from a wide variety of exercises, from running to weightlifting. Check out the DevByte and code samples to learn more.
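
As a quick illustration of the Recording API mentioned above, here is a minimal sketch of subscribing to step data. It assumes a connected GoogleApiClient built with Fitness.RECORDING_API, and it trims the result handling down to the essentials.

import com.google.android.gms.common.api.GoogleApiClient;
import com.google.android.gms.common.api.ResultCallback;
import com.google.android.gms.common.api.Status;
import com.google.android.gms.fitness.Fitness;
import com.google.android.gms.fitness.data.DataType;

public class StepRecorder {
    /** Call once the GoogleApiClient (built with Fitness.RECORDING_API) is connected. */
    public void startRecordingSteps(GoogleApiClient client) {
        Fitness.RecordingApi.subscribe(client, DataType.TYPE_STEP_COUNT_DELTA)
                .setResultCallback(new ResultCallback<Status>() {
                    @Override
                    public void onResult(Status status) {
                        if (status.isSuccess()) {
                            // Step data is now being recorded in the background.
                        }
                    }
                });
    }
}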

Bluetooth Low Energy: pair with your watch

With the latest release of Android Wear, developers can now pair BLE devices directly with the Wearable. This is a great opportunity for all fitness apps -- and especially for running -- where carrying both a phone and the Wearable can be problematic. Imagine if your users could pair their heart rate straps or bicycle cadence sensors directly to their Wear device, and leave their phones at home. BLE is now supported by all Wear devices, and is supported by Google Fit. To learn more about it, check out this guide and DevByte.

Pack light with onboard GPS

When I’m running, carrying both a phone and a wearable can be a bit much. If you’re using an Android Wear device that supports onboard GPS, you can leave your phone at home! Since not all Wear devices have an onboard GPS sensor, you can use the FusedLocationProviderApi to seamlessly retrieve GPS coordinates from the phone if not available on the wearable. Check out this handy guide for more about detecting location on Wear.

RunKeeper supports onboard GPS if it’s available on your Wearable.
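
Here is a minimal sketch of requesting location updates through the FusedLocationProviderApi, which hands back coordinates from whichever GPS source is available. It assumes a connected GoogleApiClient built with LocationServices.API; the class name and update interval are placeholders.

import android.location.Location;
import com.google.android.gms.common.api.GoogleApiClient;
import com.google.android.gms.location.LocationListener;
import com.google.android.gms.location.LocationRequest;
import com.google.android.gms.location.LocationServices;

public class RunTracker implements LocationListener {
    /** Call once the GoogleApiClient (built with LocationServices.API) is connected. */
    public void startTracking(GoogleApiClient client) {
        LocationRequest request = LocationRequest.create()
                .setPriority(LocationRequest.PRIORITY_HIGH_ACCURACY)
                .setInterval(5000);   // update roughly every 5 seconds

        LocationServices.FusedLocationApi.requestLocationUpdates(client, request, this);
    }

    @Override
    public void onLocationChanged(Location location) {
        // Record the new point on the route, update pace, and so on.
    }
}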

Sync data transparently

When I’m back home and ready for more details on my activity, I can see them by opening the app on my phone. My favorite fitness apps transparently sync data between my Wearable and phone. To learn more about syncing data between devices, watch this DevByte on the DataLayer API.
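
One common way to do this kind of sync is the Data Layer's DataApi. Here is a minimal sketch of packing a workout summary into a data item that Google Play services syncs between the watch and the phone; the path and keys are placeholders, and it assumes a connected GoogleApiClient built with Wearable.API.

import com.google.android.gms.common.api.GoogleApiClient;
import com.google.android.gms.wearable.PutDataMapRequest;
import com.google.android.gms.wearable.PutDataRequest;
import com.google.android.gms.wearable.Wearable;

public class WorkoutSync {
    /** Call once the GoogleApiClient (built with Wearable.API) is connected. */
    public void syncWorkout(GoogleApiClient client, float distanceMeters, long durationMs) {
        PutDataMapRequest dataMap = PutDataMapRequest.create("/workout/summary"); // placeholder path
        dataMap.getDataMap().putFloat("distance_m", distanceMeters);
        dataMap.getDataMap().putLong("duration_ms", durationMs);
        dataMap.getDataMap().putLong("timestamp", System.currentTimeMillis());

        PutDataRequest request = dataMap.asPutDataRequest();
        Wearable.DataApi.putDataItem(client, request); // synced to the paired device
    }
}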

Next Steps

Android Wear gives you the tools and training you need to create exceptional fitness apps. To get started on yours, visit developer.android.com/wear and join the discussion at g.co/androidweardev.

Growing Android TV engagement with search and recommendations

Posted by Maru Ahues, Media Developer Advocate

When it comes to TV, content is king. But to enjoy great content, you first need to find it. We created Android TV with that in mind: a truly smart TV should deliver interesting content to users. Today, EPIX® joins a growing list of apps that use the Android TV platform to make it easy to enjoy movies, TV shows, sports highlights, music videos and more.

Making TV Apps Searchable

Think of your favorite movie. Now try to locate it in one of your streaming apps. If you have a few apps to choose from, it might take some hunting before you can watch that movie. With Android TV, we want to make it easier to be entertained. Finding ‘Teenage Mutant Ninja Turtles’ should be as easy as picking up the remote, saying ‘Teenage Mutant Ninja Turtles’ and letting the TV find it.

Searching for ‘Teenage Mutant Ninja Turtles’ shows results from Google Play and EPIX

You can drive users directly to content within your app by making it searchable from the Android TV search interface. Join app developers like EPIX, Sky News, YouTube, and Hulu Plus who are already making content discovery a breeze.
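
At the heart of that integration is a suggestions content provider that returns rows with the columns Android TV's global search expects. Here is a minimal sketch of building such a row; in a real app the provider would also be declared in the manifest with a searchable.xml, and the class name, image URL, and IDs here are placeholders.

import android.app.SearchManager;
import android.database.MatrixCursor;
import android.provider.BaseColumns;

public class VideoSuggestions {
    // Columns Android TV's global search expects from a suggestions content provider.
    private static final String[] COLUMNS = {
            BaseColumns._ID,
            SearchManager.SUGGEST_COLUMN_TEXT_1,            // title
            SearchManager.SUGGEST_COLUMN_TEXT_2,            // description
            SearchManager.SUGGEST_COLUMN_RESULT_CARD_IMAGE, // card image URI
            SearchManager.SUGGEST_COLUMN_INTENT_DATA_ID,    // id passed to your playback intent
    };

    /** Builds a cursor of suggestions for the query; return this from your provider's query(). */
    public MatrixCursor suggest(String query) {
        MatrixCursor cursor = new MatrixCursor(COLUMNS);
        // In a real app this row would come from your catalog, filtered by the query.
        cursor.addRow(new Object[]{
                1,
                "Teenage Mutant Ninja Turtles",
                "Example result for \"" + query + "\"",
                "https://example.com/poster.jpg",   // placeholder image URL
                "movie-1"                            // placeholder content id
        });
        return cursor;
    }
}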

Recommending TV Content

When users want suggestions for content, the recommendations row on Android TV helps them quickly access relevant content right from the home screen. Recommendations are based on the user’s recent and frequent usage behaviors, as well as content preferences.

Recommendations from installed apps, like EPIX, appear in the Android TV home screen

Android TV allows developers to create recommendations for movies, TV shows, music and other types of content. Your app can provide recommendations to users to help get your content noticed. As an example, EPIX shows Hollywood movies, NBA Game Time serves up basketball highlights, the Washington Post offers video summaries of world events, and YouTube suggests videos based on your subscriptions and viewing history.
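
Recommendations reach the home screen row by posting a notification with the recommendation category, following the pattern in the Android TV documentation. Here is a minimal sketch; the title, images, and pending intent are placeholders your app would supply.

import android.app.Notification;
import android.app.NotificationManager;
import android.app.PendingIntent;
import android.content.Context;
import android.graphics.Bitmap;
import android.support.v4.app.NotificationCompat;

public class RecommendationPublisher {
    /** Posts a single content recommendation to the Android TV home screen row. */
    public void recommend(Context context, int id, String title, String description,
                          Bitmap cardImage, PendingIntent playIntent, int smallIconRes) {
        Notification notification = new NotificationCompat.BigPictureStyle(
                new NotificationCompat.Builder(context)
                        .setContentTitle(title)
                        .setContentText(description)
                        .setLargeIcon(cardImage)
                        .setSmallIcon(smallIconRes)
                        .setContentIntent(playIntent)
                        .setLocalOnly(true)
                        .setOngoing(true)
                        .setCategory(Notification.CATEGORY_RECOMMENDATION))
                .build();

        NotificationManager manager =
                (NotificationManager) context.getSystemService(Context.NOTIFICATION_SERVICE);
        manager.notify(id, notification);
    }
}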

Less than one year after the consumer launch of Android TV, we’re already building on a simpler, smarter and more personalized TV experience, and we can’t wait to see what you create.