Tag Archives: Firebase

Announcing DevFest 2020

Posted by Jennifer Kohl, Program Manager, Developer Community Programs

On October 16-18, thousands of developers from all over the world are coming together for DevFest 2020, the largest virtual weekend of community-led learning on Google technologies.

As people around the world continue to adapt to spending more time at home, developers yearn for community now more than ever. In years past, DevFest was a series of in-person events over a season. For 2020, the community is coming together in a whole new way – virtually – over one weekend to keep developers connected when they may want it the most.

The speakers

The magic of DevFest comes from the people who organize and speak at the events - developers with various backgrounds and skill levels, all with their own unique perspectives. In different parts of the world, you can find DevFest sessions in many local languages. DevFest speakers include technologists of all kinds: kid developers, self-taught programmers from rural areas, and CEOs and CTOs of startups. DevFest also features a wide range of speakers from Google, Women Techmakers, Google Developer Experts, and more. Together, these friendly faces, with many different perspectives, create a unique and rich developer conference.

The sessions and their mission

Hosted by Google Developer Groups, this year’s sessions include technical talks and workshops from the community, and a keynote from Google Developers. Through these events, developers will learn how Google technologies help them develop, learn, and build together.

Sessions will cover multiple technologies, such as Android, Google Cloud Platform, Machine Learning with TensorFlow, Web.dev, Firebase, Google Assistant, and Flutter.


At our core, Google Developers believes community-led developer events like these are an integral part of the advancement of technology in the world.

For this reason, Google Developers supports the community-led efforts of Google Developer Groups and their annual tentpole event, DevFest. Google provides esteemed speakers from the company and custom technical content produced by developers at Google. The impact of DevFest is really driven by the grassroots, passionate GDG community organizers who volunteer their time. Google Developers is proud to support them.

The attendees

During DevFest 2019, 138,000+ developers participated across 500+ DevFests in 100 countries. While 2020 is a very different year for events around the world, GDG chapters are galvanizing their communities to come together virtually for this global moment. The excitement for DevFest continues as more people seek new opportunities to meet and collaborate with like-minded, community-oriented developers in their local towns and regions.

Join the conversation on social media with #DevFest.

Sign up for DevFest at goo.gle/devfest.





Still curious? Check out these popular talks from DevFest 2019 events around the world...

Reviewing ad issues in mobile apps with the Google Mobile Ads SDK

In order to help mobile app publishers review ad issues (e.g., out-of-memory errors caused by graphics-intensive creatives, or violations of Ad Manager policies or AdMob policies and restrictions) in production apps, we have recently added an ad response ID to the ResponseInfo and GADResponseInfo objects in the Google Mobile Ads Android SDK (v. 19.0.0) and iOS SDK (v. 7.49.0). An ad response ID is a unique string for each ad response from the AdMob or Ad Manager server, regardless of ad format. Even if the same ad is returned more than once, the ad response ID will differ each time.

You can look up an ad response ID in the Ad Review Center (AdMob, Ad Manager) to find and block the offending ad. You can also report problematic ads to Google using the ad response ID, especially when it is difficult to capture a mobile ad's click string.

Screenshot: an ad response ID shown in Android Studio logcat.

If you use Firebase, you can refer to the Firebase Crashlytics Android (AdMob, Ad Manager) or iOS (AdMob, Ad Manager) guide for logging the ad response ID. This technique can be useful for debugging production app crashes as you would have both the SDK symbols and the ad response ID data in the same log.
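For illustration, here is a minimal Android sketch of this technique, assuming a banner AdView and the FirebaseCrashlytics custom-key API; the guides linked above cover the officially supported approaches for AdMob and Ad Manager.

import com.google.android.gms.ads.AdListener;
import com.google.android.gms.ads.AdRequest;
import com.google.android.gms.ads.AdView;
import com.google.android.gms.ads.ResponseInfo;
import com.google.firebase.crashlytics.FirebaseCrashlytics;

public class BannerAdLogger {

    // Loads a banner ad and records its ad response ID in Crashlytics
    // so it appears alongside any crash report from the same session.
    public static void loadBannerWithResponseLogging(final AdView adView) {
        adView.setAdListener(new AdListener() {
            @Override
            public void onAdLoaded() {
                // ResponseInfo is only available once an ad has loaded.
                ResponseInfo responseInfo = adView.getResponseInfo();
                if (responseInfo != null) {
                    FirebaseCrashlytics.getInstance()
                            .setCustomKey("ad_response_id", responseInfo.getResponseId());
                }
            }
        });
        adView.loadAd(new AdRequest.Builder().build());
    }
}

The custom key name "ad_response_id" is just a placeholder; any key that is easy to spot in your crash reports will do.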

We hope this new feature makes it easier to troubleshoot ad issues.

If you would like to give us feedback on this feature, please post your comments and questions on our Google Mobile Ads SDK Technical Forum.


Google Play services and Firebase migrating to AndroidX

Posted by Doug Stevenson, Developer Advocate

Later this year, the Google Play services and Firebase SDKs will migrate from the Android Support libraries to androidx-packaged library artifacts. We are targeting this change for June/July of 2019. This will not only make our SDKs better, but make it easier for you to use the latest Jetpack features in your app.

If your app depends on any com.google.android.gms or com.google.firebase libraries, you should prepare for this migration. To quickly test your build with androidx-packaged library artifacts, add the following two lines to your gradle.properties file:

android.useAndroidX=true
android.enableJetifier=true

If your build still works, then you're done! You will be ready to use the new Google Play services and Firebase SDKs when they arrive. If you experience any new build issues or want more information on this migration, visit the official Jetpack migration guide. We will announce when the AndroidX migration is complete - stay tuned!

New ML Kit features easily bring Machine Learning to your apps

Posted by Brahim Elbouchikhi, Director of Product Management and Matej Pfajfar, Engineering Director

We launched ML Kit at I/O last year with the mission to simplify Machine Learning for everyone. We couldn’t be happier about the experiences that ML Kit has enabled thousands of developers to create. And more importantly, user engagement with features powered by ML Kit is growing more than 60% per month. Below is a small sample of apps we have been working with.

But there is a lot more. At I/O this year, we are excited to introduce four new features.

The Object Detection and Tracking API lets you identify the prominent object in an image and then track it in real-time. You can pair this API with a cloud solution (e.g. Google Cloud’s Product Search API) to create a real-time visual search experience.

When you pass an image or video stream to the API, it will return the coordinates of the primary object as well as a coarse classification. The API then provides a handle for tracking this object's coordinates over time.
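As a rough sketch of how this looks in code, the snippet below uses the ML Kit for Firebase class names as documented around the time of this post; ML Kit has since moved to a standalone SDK, so treat the exact names as assumptions and check the current documentation.

import android.graphics.Bitmap;
import android.graphics.Rect;
import com.google.firebase.ml.vision.FirebaseVision;
import com.google.firebase.ml.vision.common.FirebaseVisionImage;
import com.google.firebase.ml.vision.objects.FirebaseVisionObject;
import com.google.firebase.ml.vision.objects.FirebaseVisionObjectDetector;
import com.google.firebase.ml.vision.objects.FirebaseVisionObjectDetectorOptions;

public class ObjectDetectionSketch {

    public static void detectProminentObject(Bitmap frame) {
        // STREAM_MODE keeps tracking IDs stable across consecutive frames.
        FirebaseVisionObjectDetectorOptions options =
                new FirebaseVisionObjectDetectorOptions.Builder()
                        .setDetectorMode(FirebaseVisionObjectDetectorOptions.STREAM_MODE)
                        .enableClassification() // also request a coarse category
                        .build();

        FirebaseVisionObjectDetector detector =
                FirebaseVision.getInstance().getOnDeviceObjectDetector(options);

        FirebaseVisionImage image = FirebaseVisionImage.fromBitmap(frame);
        detector.processImage(image)
                .addOnSuccessListener(objects -> {
                    for (FirebaseVisionObject object : objects) {
                        // Bounding box, tracking handle, and coarse classification
                        // for each detected object.
                        Rect box = object.getBoundingBox();
                        Integer trackingId = object.getTrackingId();
                        int category = object.getClassificationCategory();
                    }
                })
                .addOnFailureListener(e -> { /* handle the error */ });
    }
}

From here you could send the cropped region to a product search backend such as Cloud Product Search to build the visual search flow described above.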

A number of partners have built experiences that are powered by this API already. For example, Adidas built a visual search experience right into their app.

The On-device Translation API allows you to use the same offline models that support Google Translate to provide fast, dynamic translation of text in your app into 58 languages. This API operates entirely on-device so the context of the translated text never leaves the device.

You can use this API to enable users to communicate with others who don't understand their language or translate user-generated content.
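To make this concrete, here is a minimal sketch using the ML Kit for Firebase translation classes as they were documented at the time; the SDK has since evolved, so treat the exact names as assumptions, and the English-to-Spanish pair is just an example.

import com.google.firebase.ml.naturallanguage.FirebaseNaturalLanguage;
import com.google.firebase.ml.naturallanguage.translate.FirebaseTranslateLanguage;
import com.google.firebase.ml.naturallanguage.translate.FirebaseTranslator;
import com.google.firebase.ml.naturallanguage.translate.FirebaseTranslatorOptions;

public class TranslationSketch {

    public static void translateEnglishToSpanish(String text) {
        FirebaseTranslatorOptions options =
                new FirebaseTranslatorOptions.Builder()
                        .setSourceLanguage(FirebaseTranslateLanguage.EN)
                        .setTargetLanguage(FirebaseTranslateLanguage.ES)
                        .build();

        final FirebaseTranslator translator =
                FirebaseNaturalLanguage.getInstance().getTranslator(options);

        // The translation model is downloaded once, then runs fully on-device.
        translator.downloadModelIfNeeded()
                .addOnSuccessListener(unused -> {
                    translator.translate(text)
                            .addOnSuccessListener(translated -> {
                                // Use the translated text, e.g. display it in the UI.
                            });
                })
                .addOnFailureListener(e -> { /* handle download or translation errors */ });
    }
}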

To the right, we demonstrate the use of ML Kit’s text recognition, language detection, and translation APIs in one experience.

We also collaborated with the Material Design team to produce a set of design patterns for integrating ML into your apps. We are open sourcing implementations of these patterns and hope that they will further accelerate your adoption of ML Kit and AI more broadly.

Our design patterns for machine learning powered features will be available on the Material.io site.

With AutoML Vision Edge, you can easily create custom image classification models tailored to your needs. For example, you may want your app to be able to identify different types of food, or distinguish between species of animals. Whatever your need, just upload your training data to the Firebase console and you can use Google’s AutoML technology to build a custom TensorFlow Lite model for you to run locally on your user's device. And if you find that collecting training datasets is hard, you can use our open source app which makes the process simpler and more collaborative.

Wrapping up

We are excited by this first year and really hope that our progress will inspire you to get started with Machine Learning. Please head over to g.co/mlkit to learn more or visit Firebase to get started right away.

Web Notifications API Support Now Available in FCM Send v1 API

Posted by Mertcan Mermerkaya, Software Engineer

We have great news for web developers that use Firebase Cloud Messaging to send notifications to clients! The FCM v1 REST API has integrated fully with the Web Notifications API. This integration allows you to set icons, images, actions and more for your Web notifications from your server! Better yet, as the Web Notifications API continues to grow and change, these options will be immediately available to you. You won't have to wait for an update to FCM to support them!

Below is a sample payload you can send to your web clients on browsers that support the Push API. This notification would be useful for a web app that supports image posting, and it can encourage users to engage with the app.

{
  "message": {
    "webpush": {
      "notification": {
        "title": "Fish Photos 🐟",
        "body": "Thanks for signing up for Fish Photos! You now will receive fun daily photos of fish!",
        "icon": "firebase-logo.png",
        "image": "guppies.jpg",
        "data": {
          "notificationType": "fishPhoto",
          "photoId": "123456"
        },
        "click_action": "https://example.com/fish_photos",
        "actions": [
          {
            "title": "Like",
            "action": "like",
            "icon": "icons/heart.png"
          },
          {
            "title": "Unsubscribe",
            "action": "unsubscribe",
            "icon": "icons/cross.png"
          }
        ]
      }
    },
    "token": "<APP_INSTANCE_REGISTRATION_TOKEN>"
  }
}

Notice that you are able to set new parameters, such as actions, which give the user different ways to interact with the notification. In the example above, users have the option to like the photo or to unsubscribe.

To handle action clicks in your app, you need to add an event listener in the default firebase-messaging-sw.js file (or your custom service worker). If an action button was clicked, event.action will contain the string that identifies the clicked action. Here's how to handle the "like" and "unsubscribe" events on the client:

// Retrieve an instance of Firebase Messaging so that it can handle background messages.
const messaging = firebase.messaging();

// Add an event listener to handle notification clicks
self.addEventListener('notificationclick', function(event) {
  if (event.action === 'like') {
    // Like button was clicked
    const photoId = event.notification.data.photoId;
    like(photoId);
  } else if (event.action === 'unsubscribe') {
    // Unsubscribe button was clicked
    const notificationType = event.notification.data.notificationType;
    unsubscribe(notificationType);
  }

  event.notification.close();
});

The SDK will still handle regular notification clicks and redirect the user to your click_action link if provided. To see more on how to handle click actions on the client, check out the guide.

Since different browsers support different parameters on different platforms, it's important to check out the browser compatibility documentation to ensure your notifications work as intended. Want to learn more about what the Send API can do? Check out the FCM Send API documentation and the Web Notifications API documentation. If you're using the FCM Send API and you incorporate the Web Notifications API in a cool way, then let us know! Find Firebase on Twitter at @Firebase, and Facebook and Google+ by searching "Firebase".

Introducing ML Kit

Posted by Brahim Elbouchikhi, Product Manager

In today's fast-moving world, people have come to expect mobile apps to be intelligent - adapting to users' activity or delighting them with surprising smarts. As a result, we think machine learning will become an essential tool in mobile development. That's why on Tuesday at Google I/O, we introduced ML Kit in beta: a new SDK that brings Google's machine learning expertise to mobile developers in a powerful, yet easy-to-use package on Firebase. We couldn't be more excited!



Machine learning for all skill levels

Getting started with machine learning can be difficult for many developers. Typically, new ML developers spend countless hours learning the intricacies of implementing low-level models, using frameworks, and more. Even for the seasoned expert, adapting and optimizing models to run on mobile devices can be a huge undertaking. Beyond the machine learning complexities, sourcing training data can be an expensive and time-consuming process, especially when considering a global audience.

With ML Kit, you can use machine learning to build compelling features, on Android and iOS, regardless of your machine learning expertise. More details below!

Production-ready for common use cases

If you're a beginner who just wants to get the ball rolling, ML Kit gives you five ready-to-use ("base") APIs that address common mobile use cases:

  • Text recognition
  • Face detection
  • Barcode scanning
  • Image labeling
  • Landmark recognition

With these base APIs, you simply pass in data to ML Kit and get back an intuitive response. For example: Lose It!, one of our early users, used ML Kit to build several features in the latest version of their calorie tracker app. Using our text recognition API and a custom-built model, their app can quickly capture nutrition information from product labels and input a food's content from an image.
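To give a sense of how simple the base APIs are to call, here is a minimal sketch of on-device text recognition with ML Kit for Firebase; the class names reflect the SDK as documented at the time of this post.

import android.graphics.Bitmap;
import com.google.firebase.ml.vision.FirebaseVision;
import com.google.firebase.ml.vision.common.FirebaseVisionImage;
import com.google.firebase.ml.vision.text.FirebaseVisionTextRecognizer;

public class TextRecognitionSketch {

    public static void recognizeText(Bitmap labelPhoto) {
        // On-device recognizer: fast and works offline.
        FirebaseVisionTextRecognizer recognizer =
                FirebaseVision.getInstance().getOnDeviceTextRecognizer();

        FirebaseVisionImage image = FirebaseVisionImage.fromBitmap(labelPhoto);
        recognizer.processImage(image)
                .addOnSuccessListener(result -> {
                    // Full recognized text; blocks, lines, and elements are
                    // also available on the result object.
                    String recognizedText = result.getText();
                })
                .addOnFailureListener(e -> { /* handle the error */ });
    }
}

Swapping in the cloud-based recognizer is a one-line change (getCloudTextRecognizer() instead of getOnDeviceTextRecognizer()).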

ML Kit gives you both on-device and Cloud APIs, all in a common and simple interface, allowing you to choose the ones that fit your requirements best. The on-device APIs process data quickly and will work even when there's no network connection, while the cloud-based APIs leverage the power of Google Cloud Platform's machine learning technology to give a higher level of accuracy.

See these APIs in action on your Firebase console.

Heads up: We're planning to release two more APIs in the coming months. First is a smart reply API allowing you to support contextual messaging replies in your app, and the second is a high-density face contour addition to the face detection API. Sign up here to give them a try!

Deploy custom models

If you're seasoned in machine learning and you don't find a base API that covers your use case, ML Kit lets you deploy your own TensorFlow Lite models. You simply upload them via the Firebase console, and we'll take care of hosting and serving them to your app's users. This way you can keep your models out of your APK/bundles which reduces your app install size. Also, because ML Kit serves your model dynamically, you can always update your model without having to re-publish your apps.

But there is more. As apps have grown to do more, their size has increased, hurting app store install rates and potentially costing users more in data overages. Machine learning can further exacerbate this trend, since models can reach tens of megabytes in size. So we decided to invest in model compression. Specifically, we are experimenting with a feature that allows you to upload a full TensorFlow model, along with training data, and receive in return a compressed TensorFlow Lite model. The technology behind this is evolving rapidly and so we are looking for a few developers to try it and give us feedback. If you are interested, please sign up here.

Better together with other Firebase products

Since ML Kit is available through Firebase, it's easy for you to take advantage of the broader Firebase platform. For example, Remote Config and A/B Testing let you experiment with multiple custom models. You can dynamically switch values in your app, making it a great fit for swapping the custom models your users receive on the fly. You can even create population segments and experiment with several models in parallel.
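For instance, a minimal sketch of this pattern might key the model choice off a Remote Config parameter; the parameter name active_model_name below is hypothetical, the fetch/activate calls have varied slightly across Remote Config SDK versions, and loading the model itself is left out.

import com.google.firebase.remoteconfig.FirebaseRemoteConfig;
import java.util.Collections;

public class ModelSwitcherSketch {

    public static void chooseCustomModel() {
        FirebaseRemoteConfig remoteConfig = FirebaseRemoteConfig.getInstance();

        // Fall back to a default model name until a fetched value is activated.
        remoteConfig.setDefaultsAsync(
                Collections.singletonMap("active_model_name", (Object) "image_classifier_v1"));

        remoteConfig.fetchAndActivate()
                .addOnCompleteListener(task -> {
                    // A/B testing or audience targeting can serve different values
                    // of this parameter to different user segments.
                    String modelName = remoteConfig.getString("active_model_name");
                    // Pass modelName to your ML Kit custom model loading code.
                });
    }
}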

Other examples include:

Get started!

We can't wait to see what you'll build with ML Kit. We hope you'll love the product as much as many of our early customers do.

Get started with the ML Kit beta by visiting your Firebase console today. If you have any thoughts or feedback, feel free to let us know - we're always listening!

Announcing new SDK versioning in Google Play services and Firebase

Posted by Doug Stevenson, Developer Advocate

Starting today, the Android SDKs for Google Play services and Firebase will be using a new build and versioning scheme. This may require some changes to the way you build your Android app, so be sure to read on for all the details.

Here's a quick summary of what's new in these SDKs:

  • All dependencies now use semantic versioning.
  • Each dependency may be updated individually, removing the need to upgrade them all simultaneously in your app.
  • Each dependency has a faster cycle for bug fixes and new features.

Beginning with version 15 of all Play services and Firebase libraries, version numbers adhere to the semantic versioning scheme. As you know, semver is an industry standard for versioning software components, so you can expect that version number changes for each library indicate the amount of change in the library.

Each Maven dependency matching com.google.android.gms:play-services-* and com.google.firebase:firebase-* is no longer required to have the same version number in order to work correctly at build time and at run time. You will be able to upgrade each dependency independently from each other. As such, a common pattern for specifying the shared version number for Play and Firebase dependencies in Gradle builds will no longer work as expected. The pattern (now anti-pattern) looks like this:

buildscript {
    ext {
        play_version = '15.0.0'
    }
}

dependencies {
    // DON'T DO THIS!!
    // The following use of the above buildscript property is no longer valid.
    implementation "com.google.android.gms:play-services-auth:${play_version}"
    implementation "com.google.firebase:firebase-auth:${play_version}"
    implementation "com.google.firebase:firebase-firestore:${play_version}"
}

The above Gradle configuration defines a buildscript property called play_version with the version of the Play and Firebase SDKs, and uses that to declare dependencies. This pattern has been helpful to keep all the dependency versions together, as previously required. However, this pattern no longer applies starting with version 15 for each library. Each dependency that you use may now be at different versions. You can expect that individual library updates may not be released at the same time - they may be updated independently.
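For contrast, the corrected form of the block above simply declares each dependency with its own version and upgrades them independently; the version numbers below are illustrative only.

dependencies {
    // Each library now follows its own semantic version and can be
    // updated on its own schedule.
    implementation 'com.google.android.gms:play-services-auth:15.0.0'
    implementation 'com.google.firebase:firebase-auth:15.0.0'
    implementation 'com.google.firebase:firebase-firestore:16.0.0'
}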

In order to support this change in versioning, the Play services Gradle plugin has been updated. If you're using this plugin, it appears like this at the bottom of build.gradle in your app module:

apply plugin: 'com.google.gms.google-services'

Here is what has changed in this plugin:

  • It checks for compatible versions of Play and Firebase libraries. This is similar to enabling the failOnVersionConflict() ResolutionStrategy.
  • Licensing information is embedded in each individual build artifact. If you use the oss-licenses plugin to manage license requirements, you should update it to the latest.

The first version of this plugin that works with the new versioning system is 3.3.0. When working with the new versions of Play and Firebase libraries, it should be added to your buildscript classpath dependencies as follows:

classpath 'com.google.gms:google-services:3.3.0'

If you're not using this plugin, but you still want strict version checking of your dependencies, you can apply this new Gradle plugin instead:

apply plugin: 'com.google.android.gms.strict-version-matcher-plugin'

In order to use this plugin, you will also need to add the following to your buildscript classpath, obtained from Google's Maven Repository:

classpath 'com.google.android.gms:strict-version-matcher-plugin:1.0.0'

If you're not using Android Studio 3.1 to develop your app, you will need to upgrade in order to get the correct version checking behavior within the IDE. Get the newest version of Android Studio here.

With these changes in place, you are now able to adopt new versions of the various SDKs more freely, without a strict requirement to update everything at once. It also enables the development teams for each SDK to ship fixes and enhancements more quickly. Going forward, you can track the releases for Play services SDKs and Firebase SDKs with the provided links.

Transitioning Google URL Shortener to Firebase Dynamic Links

Posted by Michael Hermanto, Software Engineer, Firebase

We launched the Google URL Shortener back in 2009 as a way to help people more easily share links and measure traffic online. Since then, many popular URL shortening services have emerged and the ways people find content on the Internet have also changed dramatically, from primarily desktop webpages to apps, mobile devices, home assistants, and more.

To refocus our efforts, we're turning down support for goo.gl over the coming weeks and replacing it with Firebase Dynamic Links (FDL). FDLs are smart URLs that allow you to send existing and potential users to any location within an iOS, Android or web app. We're excited to grow and improve the product going forward. While most features of goo.gl will eventually sunset, all existing links will continue to redirect to the intended destination.

For consumers

Starting April 13, 2018, anonymous users and users who have never created short links before today will not be able to create new short links via the goo.gl console. If you are looking to create new short links, we recommend you use Firebase Dynamic Links or check out popular services like Bitly and Ow.ly as an alternative.

If you have existing goo.gl short links, you can continue to use all features of goo.gl console for a period of one year, until March 30, 2019, when we will discontinue the console. You can manage all your short links and their analytics through the goo.gl console during this period.

After March 30, 2019, all links will continue to redirect to the intended destination. Your existing short links will not be migrated to the Firebase console, however, you will be able to export your link information from the goo.gl console.

For developers

Starting May 30, 2018, only projects that have accessed URL Shortener APIs before today can create short links. To create new short links, we recommend using the FDL APIs. FDL short links will automatically detect the user's platform and send the user to either the web or your app, as appropriate.
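To illustrate, here is a minimal sketch of creating a short link with the FDL Android SDK; the domain prefix and package names below are placeholders, and FDL also provides a REST API for creating short links from a server.

import android.net.Uri;
import com.google.firebase.dynamiclinks.DynamicLink;
import com.google.firebase.dynamiclinks.FirebaseDynamicLinks;

public class ShortLinkSketch {

    public static void createShortLink() {
        FirebaseDynamicLinks.getInstance().createDynamicLink()
                // The deep link your users should land on.
                .setLink(Uri.parse("https://example.com/some-page"))
                // Your project's Dynamic Links domain (placeholder).
                .setDomainUriPrefix("https://example.page.link")
                // Platform behavior: open the app if installed, otherwise fall back.
                .setAndroidParameters(
                        new DynamicLink.AndroidParameters.Builder("com.example.android").build())
                .setIosParameters(
                        new DynamicLink.IosParameters.Builder("com.example.ios").build())
                .buildShortDynamicLink()
                .addOnSuccessListener(shortLink -> {
                    // Share this URL anywhere a goo.gl link would have been used.
                    Uri shortUrl = shortLink.getShortLink();
                })
                .addOnFailureListener(e -> { /* handle the error */ });
    }
}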

If you are already calling URL Shortener APIs to manage goo.gl short links, you can continue to use them for a period of one year, until March 30, 2019, when we will discontinue the APIs.

As it is for consumers, all links will continue to redirect to the intended destination after March 30, 2019. However, existing short links will not be migrated to the Firebase console/API.

URL Shortener has been a great tool that we're proud to have built. As we look towards the future, we're excited about the possibilities of Firebase Dynamic Links, particularly when it comes to dynamic platform detection and links that survive the app installation process. We hope you are too!