Tag Archives: assistant

Everything Assistant at I/O

Posted by Mike Bifulco


We’re excited to host the first-ever virtual Google I/O conference this year, from May 18-20, 2021 – and everyone's invited! Developers around the world will join us for keynotes, technical sessions, codelabs, demos, meetups, workshops, and Ask Me Anything (AMA) sessions hosted by Googlers whose teams have been hard at work preparing new features, APIs, and tools for you to try out. We can’t wait for you to explore everything Google has to share. Given the sheer amount of content packed into those three days, this guide rounds up the sessions most relevant to building for and integrating with Google Assistant.

With that in mind, here’s a rundown of everything Assistant at Google I/O 2021:

Keynote: What’s New in Google Assistant (register)

We’ll kick off news from Assistant with our keynote session, which will be livestreamed on May 19th at 9:45am PST. Expect to hear about what’s happened in Assistant over the past year, new product announcements, feature updates, and tooling changes.

Keynote: What’s New in Smart Home (register)

In celebration of Google Assistant's 5th birthday, we'll share our Smart Home journey and the things we’ve learned along the way. We'll also dive into product vision, new product announcements, and showcase great Assistant experiences built by our developer community. Catch the Smart Home keynote on May 19th at 4:15pm PST.

Technical Sessions

Technical sessions are 15-minute deep dives into new features, tools, and other announcements from product teams. These four sessions will be available on demand, so you can watch them any time after they officially launch during the event.

Driving a Successful Launch for Conversational Actions (register)

In this session, we’ll discuss marketing activities that will help users discover and engage with what you’ve built on Google Assistant. Learn some of the basics of putting together a marketing team, a go-to-market plan, and some recommended activities for promoting engagement with your Conversational Actions.

How to Voicify Your Android App (register)

In this session, you’ll learn how to implement voice capabilities in your Android app. Get users into your app with a voice command using App Actions.

Android Shortcuts for Assistant (register)

Now that you've added a layer of voice interaction to your Android app, learn what's new with Android Shortcuts and how they can be extended to the Google Assistant.

Refreshing Widgets (register)

Widgets in Android 12 are coming with a fresh new look and feel. Come to this session to learn how you can make the most of what’s coming to Widgets, while also making them more useful and discoverable through integrations with Assistant and Assistant Auto.

Ask Me Anything (AMA)

AMAs are a great opportunity for you to have your questions fielded by Googlers. If you register for I/O, you’ll be able to pre-submit questions to any of these AMAs. Teams of Googlers will be answering audience questions live during I/O. All AMA sessions will be livestreamed at specific dates and times, so be sure to add them to your calendar.

App Actions: Ask Me Anything

May 19th, 10:15am PST (register)

This is the place to bring all of your burning questions about App Actions for Android. Our App Actions team will include Program Managers, Developer Advocates, and Engineers who are looking forward to answering your questions. Maybe you’re building an app which uses Custom Intents, or you’ve got questions about some of the new feature announcements from our Technical Sessions (see above!) - the team is looking forward to helping.

Games on Google Assistant: Ask Me Anything

May 19th, 11:00pm PST (register)

Join a panel of Googlers to ask your questions about building games with Google Assistant. Our team of Product Managers and game developers is here to help you - from designing and building games, to toolchain questions, to figuring out what types of games people are playing on their smart devices.

Workshops

This year, our workshops will be conducted online via livestream. Each workshop will be led by a Googler providing instruction alongside a team of Googler TAs, who will be there to answer your questions via live chat. Workshops will show you how to apply the things you learn at I/O by giving you hands-on experience with new tools and APIs. Each workshop has limited space for registrations, so be sure to sign up early if you’re interested.

Extend an Android app to Google Assistant with App Actions

May 19th, 11:00am PST (register)

In this intermediate codelab, learn to develop App Actions using common built-in intents, enabling users to open app features and search for in-app content with Google Assistant.

Debugging the Smart Home

May 19th, 11:30pm PST (register)

Improve your products' reliability and user experience with Google's new smart home quality tools in this intermediate codelab. Learn how to view, analyze, debug and fix issues with your smart home integrations.

Meetups

Women in Voice Meetup

May 20th, 4:00pm PST (register)

This meetup will be a chance for developers to share influential work by women in Voice AI and to discuss ways allies can help women in Voice to be more successful while building a more inclusive ecosystem.

Smart Home Developer Meetup

[Americas] May 18, 3:00pm PST (register)
[APAC] May 19th, 9:00pm PST (register)
[EMEA] May 20th, 6:00am PST (register)

This meetup will be a chance for developers interested in Smart Home to chat with the Smart Home partner engineering team about developing and debugging smart home integrations, share projects, or ask questions.

Register now

Registration for Google I/O 2021 is now open - and attending I/O 2021 is entirely free and open to all. We hope to see you there, and can’t wait to share what we’ve been working on with you. To register for the event, head over to the Google I/O registration page.

Policy changes and certification requirement updates for Smart Home Actions

Posted by Toni Klopfenstein, Developer Advocate


As more developers onboard to the Smart Home Actions platform, we have gathered feedback about the certification process for launching an Action. Today, we are pleased to announce we have updated our Actions policy to enable developers to more quickly develop their Actions, and to help streamline the certification and launch process for developers. These updates will also help to provide a consistent, cohesive experience for smart device users.

Device quality guidelines

Ensuring each device type meets quality benchmark metrics provides end users with reliable and timely responses from their smart devices. With these policy updates, minimum latency and reliability metrics have been added to each device type guide. To ensure consistent device control and timely updates to Home Graph, all cloud-controlled smart devices need to maintain a persistent connection through a hub or the device itself, and cannot rely on mobile devices or tablets.

Along with these quality benchmarks, we have also updated our guides with required and recommended traits for each device. By implementing these within an Action, developers can ensure their end users can trigger devices in a consistent manner and access the full range of device capabilities. To assist you in ensuring your Action is compliant with the updated policy, the Test Suite testing tool will now more clearly flag any device type or trait issues.
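
To make the distinction between required and recommended traits concrete, here is a minimal SYNC sketch for a single device. It is not taken from the updated guides: the device ID, names, and the trait annotations below are illustrative, so check the relevant device type guide for what your device actually needs.

```javascript
// Minimal cloud fulfillment sketch (Node.js-style HTTP handler). The device,
// IDs, and trait annotations below are illustrative; consult the device type
// guide for the required and recommended traits of your device type.
exports.smartHomeFulfillment = (req, res) => {
  const intent = req.body.inputs[0].intent;
  if (intent === 'action.devices.SYNC') {
    res.json({
      requestId: req.body.requestId,
      payload: {
        agentUserId: 'user-123',                  // illustrative
        devices: [{
          id: 'washer-1',                         // illustrative
          type: 'action.devices.types.WASHER',
          traits: [
            'action.devices.traits.OnOff',        // required per the type guide
            'action.devices.traits.StartStop',    // required per the type guide
            'action.devices.traits.RunCycle',     // recommended
          ],
          name: { name: 'Laundry washer' },
          willReportState: true,                  // needed for Report State
        }],
      },
    });
    return;
  }
  // ...QUERY, EXECUTE, and DISCONNECT handling omitted from this sketch...
};
```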

Safety and security

Smart home users care deeply about the safety and security of the devices integrated into their homes, so we have also updated our requirements for secondary user verification. This verification step must be implemented for any Action that can set a device in an unprotected state, such as unlocking a door, regardless of whether you are building a Conversational Action or a Smart Home Action. Once a secondary verification method is configured, developers can provide users a way to opt out of this flow. For any developer wishing to offer an opt-out selection to their customers, we have provided a warning message template to ensure users understand the security implications of turning this feature off.
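
As a rough illustration of how this verification step surfaces in fulfillment, here is a sketch of an EXECUTE command handler that asks for an acknowledgement before unlocking. The field names follow the smart home two-factor verification pattern; the device ID, reported states, and the unlock step itself are placeholders.

```javascript
// Hypothetical handler for an unlock command. It asks the Assistant to
// confirm the risky action with the user before carrying it out; IDs and
// states are placeholders.
function executeUnlock(deviceId, execution) {
  const challenge = execution.challenge || {};
  if (!challenge.ack) {
    // No acknowledgement yet: return a challenge instead of unlocking.
    return {
      ids: [deviceId],
      status: 'ERROR',
      errorCode: 'challengeNeeded',
      challengeNeeded: { type: 'ackNeeded' },  // use 'pinNeeded' for a PIN check
    };
  }
  // The user confirmed: send the unlock command to the device here,
  // then report the new lock state back to the Assistant.
  return {
    ids: [deviceId],
    status: 'SUCCESS',
    states: { online: true, isLocked: false, isJammed: false },
  };
}
```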

For devices that may pose heightened safety risks, such as cooking appliances, we require UL certificates or similar certification forms to be provided along with the Test Suite results before an Action can be released to production.

Works With 'Hey Google' badge

These policy updates also will affect the use of the Works With Hey Google badge. The badge will only be available for use on marketing materials for new Smart Home Direct Actions that have successfully integrated any device types referenced.

Any Conversational Actions currently using the badge will not be approved for use for any new marketing assets, including packaging/product refreshes. Any digital assets using the badge will need to be updated to remove the badge by the end of 2021.

Timeline

With the roll-out today, there will be a one-month grace period for developers to update new integrations to match the new policy requirements. For Actions currently deployed to production, compliance will be evaluated when the Action is recertified. Once integrations have been certified and launched to production, Actions will need to be recertified annually, or any time new devices or device functionality is added to the Action. Notifications for recertification will be shared with the developer account associated with your Action in the console.

This policy grace period ends April 12, 2021.

Please review the updated policy, as well as our updated docs for launching your Smart Home Action. You can also check out our policy video for more information.

We want to hear from you, so continue sharing your feedback with us through the issue tracker, and engage with other smart home developers in the /r/GoogleAssistantDev community. Follow @ActionsOnGoogle on Twitter for more of our team's updates, and tweet using #AoGDevs to share what you’re working on. We can’t wait to see what you build!

Announcing New Smart Home App Discovery Features

Posted by Toni Klopfenstein, Developer Advocate

When a user connects a smart device to the Google Assistant via the Home app, the user must select the appropriate related Action from the list of all available Actions. The user then clicks through multiple screens to complete their device setup. Today, we're releasing two new features to improve this device discovery process and drive customer adoption of your Smart Home Action through the Google Home app. App Discovery and Deep Linking are two convenience features that help users find your Google-Assistant compatible smart devices quickly and onboard faster.

App Discovery enables users to quickly find your smart home Action thanks to suggestion chips within the Google Home app. You can implement this new feature through the Actions Console by creating a verified brand link between your Action, your website, and your mobile app. App Discovery doesn't require any coding work to implement, making this a development-light feature that provides great improvements to the user experience of device linking.

In addition to helping users discover your Action directly through suggestion chips, Deep Linking enables you to guide users to your account linking flow within the Google Home app in one step. These deep links are easily added to your mobile app or web content, guiding users to your smart home integration with a single tap.

Deep Linking and App Discovery can help you create a more streamlined onboarding experience for your users, driving increased engagement and user satisfaction, and can be implemented with minimal engineering work.

To implement App Discovery and Deep Linking for your Smart Home Action, check out the developer documents, or watch the video covering these new features.

You can also check out the smart home codelabs if you are just starting to build out your Action.

We want to hear from you, so continue sharing your feedback with us through the issue tracker, and engage with other smart home developers in the /r/GoogleAssistantDev community. Follow @ActionsOnGoogle on Twitter for more of our team's updates, and tweet using #AoGDevs to share what you’re working on. We can’t wait to see what you build!

Announcing New Smart Home SHED Types and Traits

Posted by Toni Klopfenstein, Developer Advocate

Back in April, we released the first set of Smart Home Entertainment Device (SHED) types, including TV, set-top box, and remote, as well as the traits AppSelector, InputSelector, MediaState, TransportControl, and Volume. We are excited to announce the release of new SHED types and traits. These new device types and traits complement the original set we released earlier this year, and help build out a more complete solution for smart home media and gaming devices. By implementing these types and traits on your entertainment devices, you can enable users to fully access device and media controls from any Assistant surface.

SHED Types and Traits

To expand the SHED options, we've released the following new device types for Smart Home:

  • Audio-video receiver
  • Streaming box
  • Streaming stick
  • Soundbar
  • Streaming soundbar
  • Speaker

We've also released the following new trait:

  • Channel

To ensure a consistent, high-quality experience for all end users, each of these device types requires your service to report activityState and playbackState to Google using the ReportState API. This requirement improves portability between media devices and helps the Assistant better understand user intents for these devices. By implementing the complete set of recommended device traits, you can further improve the quality of your smart home Action and improve device targeting for media playback command fulfillment.
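
For illustration, here is a rough sketch of the Report State payload this requirement implies for a streaming device. The request ID, user ID, device ID, and state values are placeholders; check the ReportState reference and the SHED trait documentation for the exact enums your devices support.

```javascript
// Hypothetical Report State request body for a streaming device; all IDs
// and state values are placeholders.
const reportStateBody = {
  requestId: 'ff36a3cc',              // placeholder
  agentUserId: 'user-123',            // placeholder
  payload: {
    devices: {
      states: {
        'streaming-stick-1': {        // placeholder device ID
          online: true,
          activityState: 'ACTIVE',    // e.g. INACTIVE, STANDBY, ACTIVE
          playbackState: 'PLAYING',   // e.g. PAUSED, PLAYING, STOPPED
        },
      },
    },
  },
};

// POST this body to the Home Graph API's devices:reportStateAndNotification
// method using your service account credentials.
```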

For more information on how to implement these new device features, check out the docs and samples. You can also join us at our "Hey Google" Smart Home Virtual Summit to learn more about these new features.

We want to hear from you, so continue sharing your feedback with us through the issue tracker, and engage with other smart home developers in the /r/GoogleAssistantDev community. Follow @ActionsOnGoogle on Twitter for more of our team's updates, and tweet using #AoGDevs to share what you’re working on. We can’t wait to see what you build!

Join the "Hey Google" Smart Home Virtual Summit

Posted by Toni Klopfenstein, Developer Relations

Over the past year, we've been focused on building new tools and features to support our smart home developer community. Though we weren't able to engage with you in person at Google I/O, we are pleased to announce the "Hey Google" Smart Home Virtual Summit on July 8th - an opportunity for us to come together and dive into the exciting new and upcoming features for smart home developers and users.

Join us for the keynote, where Michele Turner, Product Management Director of the Smart Home Ecosystem, will share our recent smart home product initiatives and how developers can benefit from these capabilities. She will also introduce new tools that make it easier for you to develop with Google Assistant. We will also be hosting a partner panel, where you can hear from industry leaders on how they are navigating the impact of COVID-19 and their thoughts on the state of the industry.

Registration is FREE! Head on over to the Summit website to register and check out the schedule. Events will be held during EMEA, APAC, and AMER friendly times. We hope to see you and your colleagues there!

Finding COVID-19 testing centers in Search, Maps, and Assistant in India

Many experts agree that widespread testing is a key tool in the fight against COVID-19. That's why we're working with the Indian Council of Medical Research (ICMR) and MyGov to help people find local COVID-19 testing centres on Google Search, Assistant, and Maps.


When making a coronavirus-related search (e.g. “coronavirus testing”), people will now see a ‘Testing’ tab on the results page providing a list of nearby testing labs, along with key information and guidance needed before using their services. This includes government-mandated requirements such as:
  • Calling the national or state helplines before heading out to get tested
  • Carrying a doctor’s prescription (referral required)
  • Testing restrictions (tests are limited to certain patients)
  • Information about whether the lab is government- or private-run




    Search for ‘Covid 19 testing’ to see nearby authorized test labs, along with key recommendations (images are representational)


    On Google Maps, when people search for keywords like “covid 19 testing” or “coronavirus testing” they will see a list of nearby testing labs, with a link to Google Search for the government-mandated requirements.
    Search for ‘Covid 19 testing’ on Maps to see a list of nearby testing labs, with a link to Google Search for testing requirements (image representational)


    While this experience is designed to help people find authorized testing centers near them, it's important to follow the recommended guidelines that help determine testing eligibility before visiting. Tapping the ‘Learn more’ link leads to authoritative information from the Ministry of Health and Family Welfare (MoHFW), Government of India.


    So far we have integrated over 700 testing labs on Search, Assistant, and Maps spanning more than 300 cities, and we continue to work with ICMR as we surface more labs across the country. This experience is available in English and in eight Indian languages -- Hindi, Bengali, Telugu, Tamil, Malayalam, Kannada, Marathi, and Gujarati.


    We hope these new experiences play a part in helping people as well as healthcare workers as we collectively work toward overcoming this pandemic. 

    Posted by Jayant Baliga, Product Manager, Google Maps

    Developer Preview of Local Home SDK

    Posted by Toni Klopfenstein

    Recently at Google I/O, we gave you a sneak peek at our new Local Home SDK, a suite of local technologies to enhance your smart home integrations. Today, the SDK is live as a developer preview. We've been working hard testing the platform with our partners, including GE, LIFX, Philips Hue, TP-Link, and Wemo, and are excited to bring you these additional technologies for connecting smart devices to the Google Assistant.

    Figure 1: The local execution path

    This SDK enables developers to more deeply integrate their smart devices into the Assistant by building upon the existing Smart Home platform to create a local execution path via Google Home smart speakers and Nest smart displays. Developers can now run their business logic to control new and existing smart devices in JavaScript that executes on the smart speakers and displays, benefitting users with reduced latency and higher reliability.

    How it works:

    The SDK introduces two new intents, IDENTIFY and REACHABLE_DEVICES. The local home platform scans the user's home network via mDNS, UDP, or UPnP to discover any smart devices connected to the Assistant, and triggers IDENTIFY to verify that the device IDs match those returned from the familiar Smart Home API SYNC intent. If the detected device is a hub or bridge, REACHABLE_DEVICES is triggered and treats the hub as the proxy device for communicating locally. Once the local execution path from Google Home to a device is established, the device properties are updated in Home Graph.
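
To make the discovery flow more concrete, here is a minimal sketch of these handlers written against the Local Home SDK's JavaScript surface. The verification IDs are placeholders; real handlers would derive them from the scan data so they match the otherDeviceIds reported by your SYNC response.

```javascript
// Minimal Local Home SDK sketch (runs on the Google Home or Nest device).
// Verification IDs below are placeholders; derive them from your own scan
// data so they line up with what your cloud SYNC response reports.
const app = new smarthome.App('1.0.0');

app
  .onIdentify((request) => {
    const device = request.inputs[0].payload.device;
    // Inspect device.mdnsScanData / device.udpScanData here to identify it.
    return {
      requestId: request.requestId,
      intent: smarthome.Intents.IDENTIFY,
      payload: {
        device: {
          id: device.id || '',
          verificationId: 'local-device-1',   // placeholder
        },
      },
    };
  })
  .onReachableDevices((request) => {
    // For hubs and bridges: report the end devices reachable via this proxy.
    return {
      requestId: request.requestId,
      intent: smarthome.Intents.REACHABLE_DEVICES,
      payload: {
        devices: [{ verificationId: 'local-device-2' }],  // placeholder
      },
    };
  });
// The EXECUTE handler is registered below, before app.listen() is called.
```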

    Figure 2: The intents used for each execution path

    When a user triggers a smart home Action that has a local execution path, the Assistant sends the EXECUTE intent to the Google Nest device rather than the developer's cloud fulfillment. The developer's JavaScript app is invoked, which then triggers the Local Home SDK to send control commands to the smart device over TCP, UDP socket, or HTTP/HTTPS requests. By defaulting to local execution rather than the cloud, users experience faster fulfillment of their requests. The execution requests can still be sent to the cloud path in case local execution fails. This redundancy minimizes the possibility of a failed request, and improves the overall user experience.
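
Continuing the sketch above, a local EXECUTE handler might look roughly like this: it sends a command over TCP with the SDK's DataFlow API and reports the result back to the platform. The port, hex payload, and reported state are placeholders that depend entirely on your device's own local protocol.

```javascript
// Rough EXECUTE handler for the sketch above; the port, payload, and state
// are placeholders that depend on your device's local protocol.
app.onExecute(async (request) => {
  const command = request.inputs[0].payload.commands[0];
  const device = command.devices[0];

  const tcpCommand = new smarthome.DataFlow.TcpRequestData();
  tcpCommand.requestId = request.requestId;
  tcpCommand.deviceId = device.id;
  tcpCommand.port = 8888;                               // placeholder
  tcpCommand.operation = smarthome.Constants.TcpOperation.WRITE;
  tcpCommand.data = '4f4e';                             // hex payload, placeholder

  const response = new smarthome.Execute.Response.Builder()
      .setRequestId(request.requestId);
  try {
    await app.getDeviceManager().send(tcpCommand);
    response.setSuccessState(device.id, { on: true });  // placeholder state
  } catch (err) {
    response.setErrorState(device.id, 'hardwareFailure');
  }
  return response.build();
});

app.listen().then(() => console.log('Local execution app ready'));
```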

    Additional features of the Local Home platform include:

    • Support for all Wi-Fi-enabled device types and device traits without two-factor authentication enabled.
    • No user action required to deploy Local Home benefits to all devices.
    • Easily configure discovery protocols and the hosted JavaScript app URL through the Actions console.

    Figure 3: Local Home configuration tool in the Actions console

    JavaScript apps can be tested on-device, allowing developers to employ familiar tools like Chrome Developer Console for debugging. Because the Local Home SDK works with the existing smart home framework, you can self-certify new apps through the Test suite for smart home as well.

    Get started

    To learn more about the Local Home platform, check out the API reference, and get started adding local execution with the developer guide and samples. For general information covering how you can connect smart devices to the Google Assistant, visit the Smart Home documentation, or check out the Local Technologies for the Smart Home talk from Google I/O this year.

    You can send us any feedback you have through the bug tracker, or engage with the community at /r/GoogleAssistantDev. You can tag your posts with the flair local-home-sdk to help organize discussion.

    Flutter: a Portable UI Framework for Mobile, Web, Embedded, and Desktop

    Posted by the Flutter Team

    Today marks an important milestone for the Flutter framework, as we expand our focus from mobile to incorporate a broader set of devices and form factors. At I/O, we’re releasing our first technical preview of Flutter for web, announcing that Flutter is powering Google’s smart display platform including the Google Home Hub, and delivering our first steps towards supporting desktop-class apps with Chrome OS.

    From Mobile to Multi-Platform

For a long time, the Flutter team’s mission has been to build the best framework for developing mobile apps for iOS and Android. We believe that mobile development is ripe for improvement, with developers today forced to choose between building the same app twice for two platforms, or making compromises to use cross-platform frameworks. Flutter hits the sweet spot of enabling a single codebase to deliver beautiful, fast, tailored experiences with high developer productivity for both platforms, and we’ve been excited to see how our early efforts have flourished into one of the most popular open source projects.

As we started to home in on our 1.0 release last year, we began experimenting with broadening the scope of Flutter to other platforms. This was triggered both by internal teams within Google that are increasingly relying on Flutter and by the latent potential of the Dart platform for delivering portable experiences. In particular, a small team that was already building a web framework for Dart for internal use started an exploratory project (codename “Hummingbird”) to evaluate the technical merits of porting the Flutter engine to support the standards-based web.

    The results of this project were startling, thanks in large part to the rapid progress in web browsers like Chrome, Firefox, and Safari, which have pervasively delivered hardware-accelerated graphics, animation, and text as well as fast JavaScript execution. Within a few months of beginning the project, we had the core Flutter framework primitives working, and soon after we had demos running on mobile and desktop browsers. Along with Dart’s long pedigree of compiling for the web, this proved that we could also bring the Flutter framework and apps to run on the web.

    In parallel, the core Flutter project has been making progress to enable desktop-class apps, with input paradigms such as keyboard and mouse, window resizing, and tooling for Chrome OS app development. The exploratory work that we did for embedding Flutter into desktop-class apps running on Windows, Mac and Linux has also graduated into the core Flutter engine.

    A Portable UI Framework for All Screens

    Flutter Mobile, Web, Desktop, and Embedded

    It’s worth pausing for a moment to acknowledge the business potential of a high-performance, portable UI framework that can deliver beautiful, tailored experiences to such a broad variety of form factors from a single codebase.

    For startups, the ability to reach users on mobile, web, or desktop through the same app lets them reach their full audience from day one, rather than having limits due to technical considerations. Especially for larger organizations, the ability to deliver the same experience to all users with one codebase reduces complexity and development cost, and lets them focus on improving the quality of that experience.

    With support for mobile, desktop, and web apps, our mission expands: we want to build the best framework for developing beautiful experiences for any screen.

    Flutter for Web

    This week, we are releasing the first technical preview of Flutter for the web. While this technology is still in development, we are ready for early adopters to try it out and give us feedback. Our initial vision for Flutter on the web is not as a general purpose replacement for the document experiences that HTML is optimized for; instead we intend it as a great way to build highly interactive, graphically rich content, where the benefits of a sophisticated UI framework are keenly felt.

    To showcase Flutter for the web, we worked with the New York Times to build a demo. In addition to world-class news coverage, the New York Times is famous for its crossword and other puzzle games. Since avid puzzlers want to play on whatever device they’re using at the time, their development team was attracted to Flutter as a potential solution for their needs. Discovering that they could reach the web with the same code was a huge boon. At Google I/O this week, you can get a sneak peek of their newly refreshed KENKEN puzzle game, which runs with the same code on Android, iOS, web, Mac, and Chrome OS.

    ken-gratulations puzzle

    Here’s what Eric von Coelln, Executive Director of Puzzles at the New York Times has to say about their experiences with Flutter:

    "The New York Times Crossword has more than 400,000 stand-alone subscriptions and is a daily ritual for puzzle solvers. Along with the Crossword, we’ve grown our portfolio of digital puzzles that reaches more than two million solvers each month.

    We were already beginning to explore Flutter as a potential solution to the challenge of quickly developing engaging, high-quality mobile experiences. Now the addition of being able to publish to web makes Flutter an even more appealing option to quickly deploy across all of our user platforms. This update of our old Flash-based KenKen game into a multi-platform playable experience is something we’re excited to bring to our solvers this year.”

    There’s lots more to say about Flutter for web than we have space for here, so check out the dedicated article about Flutter for web on the Flutter blog.

    At this early stage, we’re eager to get your feedback on how you’d like to use Flutter for web. We expect to rapidly evolve the code, with a particular focus on performance, and harmonizing the codebase with the rest of the Flutter project.

    Flutter for Mobile Devices

    The core Flutter framework also receives an upgrade this week, with the immediate availability of Flutter 1.5 in our stable channel. Flutter 1.5 includes hundreds of changes in response to developer feedback, including updates for new App Store iOS SDK requirements, updates to the iOS and Material widgets, engine support for new device types, and Dart 2.3 featuring new UI-as-code language features.

    As the framework itself matures, we’re investing in building out the supporting ecosystem. The architectural model of Flutter has always prioritized a small core framework, supplemented by a rich package community. In the last few months, Google has contributed production-quality packages for web views, Google Maps, and Firebase ML Vision, and this week, we’re adding initial support for in-app payments. And with over 2,000 open source packages available for Flutter, there are options available for most scenarios.

    One particularly exciting project that we’re announcing this week at I/O is the ML Kit Custom Image Classifier. Built using Flutter and Firebase, it offers an easy-to-use app-based workflow for creating custom image classification models. You can collect training data using the phone's camera, invite others to contribute to your datasets, trigger model training, and use trained models, all from the same app.

    Flutter ML Kit: create datasets, collaborate to collect data, train model, run inference

    Flutter continues to grow in popularity and adoption. A growing roster of demanding customers including eBay, Sonos, Square, Capital One, Alibaba and Tencent are developing apps with Flutter. And they’re having fun! Here’s what Larry McKenzie, a senior developer at eBay had to say about Flutter:

    “Flutter is fast! Features that once took us multiple days to implement can be finished in a single day. Many problems we used to spend a lot of time on, simply no longer occur. Our team can now focus on creating more polished user experiences and delivering functionality. Flutter is enabling us to exceed expectations!”

    More broadly, LinkedIn recently conducted a study that showed Flutter is the single fastest-growing skill among software engineers, based on site members claiming it on their profile over the last 12 months. And in the recent 2019 StackOverflow developer survey, Flutter was listed as one of the most-loved developer frameworks.

    Flutter for Desktop

Flutter is also being used on the desktop. For some months, we’ve been working on the desktop as an experimental project. But now we’re graduating this work into the core Flutter engine, integrating it directly into the mainline repo. While these targets are not production-ready yet, we have published early instructions for developing Flutter apps to run on Mac, Windows, and Linux.

    Another quickly growing Flutter platform is Chrome OS, with millions of Chromebooks being sold every year, particularly in education. Chrome OS is a perfect environment for Flutter, both for running Flutter apps, and as a developer platform, since it supports execution of both Android and Linux apps. With Chrome OS, you can use Visual Studio Code or Android Studio to develop a Flutter app that you can test and run locally on the same device without an emulator. You can also publish Flutter apps for Chrome OS to the Play Store, where millions of others can benefit from your creation.

    Flutter for Embedded Devices

    As the final example of Flutter’s portability, we offer Flutter embedded on other devices. We recently published samples that demonstrate Flutter running directly on smaller-scale devices like Raspberry Pi, and we offer an embedding API for Flutter that allows it to be used in scenarios including home, automotive and beyond.

    Perhaps one of the most pervasive embedded platforms where Flutter is already running is on the smart display operating system that powers the likes of Google Home Hub.

    Within Google, some Google-built features for the Smart Display platform are powered by Flutter today. And the Assistant team is excited to continue to expand the portfolio of features built with Flutter for the Smart Display in the coming months; the goal this year is to use Flutter to drive the overall system UI.

    Other Resources

    We often get asked by developers how they can get started with Flutter. We are pleased today to announce a comprehensive new training course for Flutter, built by The App Brewery, authors of the highest-rated iOS training course on Udemy. Their new course has over thirty hours of content for Flutter, including videos, demos and labs, and with Google’s sponsorship, they are announcing today a time-limited discount of this course from the retail price of $199 to just $10.

Many developers are creating inspiring apps with Flutter. In the run-up to Google I/O, we ran a contest called Flutter Create to encourage developers to see what they could build with Flutter in 5KB or less of Dart code. We had over 750 unique entries from around the world, with some amazing examples that pushed the limits of what we imagined would be possible in such a small size.

    Today, we’re announcing the winners, which can be found on flutter.dev/create. Congratulations to the overall winner, Zebiao Hu, who wins a fully-loaded iMac Pro worth over $10,000!

    Flutter is no longer a mobile framework, but a multi-platform framework that can help you reach your users wherever they are. We can’t wait to see what you’ll build with Flutter on the web, desktop, mobile, and beyond!

    Actions on Google at I/O 2019: New tools for web, mobile, and smart home developers

    Posted by Chris Turkstra, Director, Actions on Google

People are using the Assistant every day to get things done more easily, creating lots of opportunities for developers on this quickly growing platform. And we’ve heard from many of you who want easier ways to connect your content across the Assistant.

    At I/O, we’re announcing new solutions for Actions on Google that were built specifically with you in mind. Whether you build for web, mobile, or smart home, these new tools will help make your content and services available to people who want to use their voice to get things done.

    Enhance your presence in Search and the Assistant

    Help people with their “how to” questions

    Every day, people turn to the internet to ask “how to” questions, like how to tie a tie, how to fix a faucet, or how to install a dog door. At I/O, we’re introducing support for How-to markup that lets you power richer and more helpful results in Search and the Assistant.

Adding How-to markup to your pages enables them to appear as rich results on mobile Search and on Google Assistant Smart Displays. This is an incredibly lightweight way for web developers and creators to connect with millions of people, giving them helpful step-by-step instructions with video, images and text. You can start seeing How-to markup results on Search today, and your content will become available on Smart Displays in the coming months.
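
For reference, here is a rough sketch of what How-to structured data looks like, expressed as a JavaScript object (on a real page it would be serialized as JSON-LD inside a script tag). The title, steps, and image URLs are invented for illustration; the full set of supported properties is covered in the How-to markup documentation.

```javascript
// Hypothetical How-to structured data; on a real page this object would be
// serialized as JSON-LD in a <script type="application/ld+json"> tag.
const howToMarkup = {
  '@context': 'https://schema.org',
  '@type': 'HowTo',
  name: 'How to install a dog door',
  step: [
    {
      '@type': 'HowToStep',
      name: 'Mark the opening',
      text: 'Trace the door template at the right height for your dog.',
      image: 'https://example.com/step1.jpg',   // illustrative
    },
    {
      '@type': 'HowToStep',
      name: 'Cut and mount the frame',
      text: 'Cut along the outline, then screw the frame into place.',
      image: 'https://example.com/step2.jpg',   // illustrative
    },
  ],
};
```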

    Here’s an example where DIY Network added markup to their existing content on the web to provide a more helpful, interactive result on both Google Search and the Assistant:

Mobile Search screenshot showing How-to markup for “how to install a dog door”

For content creators who don’t maintain a website, we created a How-to Video Template where video creators can upload a simple spreadsheet with titles, text and timestamps for their YouTube video, and we’ll handle the rest. This is a simple way to transform your existing how-to videos into interactive, step-by-step tutorials across Google Assistant Smart Displays and Android phones.

    Check out how REI is getting extra mileage out of their YouTube video:

    Laptop to Home Hub displaying How To Template for the REI compass

    How-to Video Templates are in developer preview so you can start building today, and your content will become available on Android phones and Smart Displays in the coming months.

    Easier engagement with your apps

    Help people quickly get things done with App Actions

    If you’re an app developer, people are turning to your apps every day to get things done. And we see people turn to the Assistant every day for a natural way to ask for help via voice. This offers an opportunity to use intents to create voice-based entry points from the Assistant to the right spot in your app.

    Last year, we previewed App Actions, a simple mechanism for Android developers that uses intents from the Assistant to deep link to exactly the right spot in your app. At I/O, we are announcing the release of built-in intents for four new App Action categories: Health & Fitness, Finance and Banking, Ridesharing, and Food Ordering. Using these intents, you can integrate with the Assistant in no time.

If I wanted to track my run with Nike Run Club, I could just say “Hey Google, start my run in Nike Run Club” and the app would automatically start tracking my run. Or, let’s say I just finished dinner with my friend Chad and we’re splitting the check. I can say “Hey Google, send $15 to Chad on PayPal” and the Assistant takes me right into PayPal, I log in, and all of my information is filled in – all I need to do is hit send.

    Google Pixel showing App Actions Nike Run Club

Each of these integrations was completed in less than a day with the addition of an Actions.xml file that handles the mapping of intents between your app and the Actions platform. You can start building with these new intents today and deploy to Assistant users on Android in the coming months. This is a huge opportunity to offer your fans an effortless way to engage more frequently with your apps.

    Build for devices in the home

    Take advantage of Smart Displays’ interactive screens

    Last year, we saw the introduction of the Smart Display as a new device category. The interactive visual surface opens up many new possibilities for developers.

Today, we’re introducing a developer preview of Interactive Canvas, which lets you create full-screen experiences that combine the power of voice, visuals and touch. Canvas works across Smart Displays and Android phones, and it uses open web technologies you’re likely already familiar with, like HTML, CSS and JavaScript.
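
As a rough sketch of the web side of a Canvas experience, the snippet below registers for state updates from your conversational fulfillment and sends a text query back when the user taps a button. It assumes the Interactive Canvas client library is loaded on the page; the element IDs, state fields, and query text are invented for illustration.

```javascript
// Hypothetical Interactive Canvas web app snippet. Assumes the Interactive
// Canvas client library is loaded on the page; element IDs, state fields,
// and the text query are invented for illustration.
const callbacks = {
  onUpdate(data) {
    // Called when your conversational fulfillment pushes new state to the
    // web app; render it with whatever web stack you already use.
    document.getElementById('score').textContent = data.score;
  },
};

interactiveCanvas.ready(callbacks);

// Touch input can be sent back into the conversation as a text query.
document.getElementById('guessButton').addEventListener('click', () => {
  interactiveCanvas.sendTextQuery('my guess is 42');
});
```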

    Here’s an example of what you can build when you can leverage the full screen of a Smart Display:

    Full screen of a Smart Display

    Interactive Canvas is available for building games starting today, and we’ll be adding more categories soon. Visit the Actions Console to be one of the first to try it out.

    Enable smart home devices to communicate locally

    There are now more than 30,000 connected devices that work with the Assistant across 3,500 brands, and today, we’re excited to announce a new suite of local technologies that are specifically designed to create an even better smart home.

Introducing a preview of the Local Home SDK, which enables you to run your smart home code locally on Google Home speakers and Nest displays and use their radios to communicate locally with your smart devices. This reduces cloud hops and brings a new level of speed and reliability to the smart home. We’ve been working with some amazing partners including Philips, Wemo, TP-Link, and LIFX on testing this SDK, and we’re excited to open it up to all developers next month.

    Flowchart of Local Home SDK

    Make setup more seamless

And, through the Local Home SDK, we’re improving device onboarding by providing users with a seamless setup experience, something we launched in partnership with GE smart lights this past October. So far, people have loved the ability to set up their lights in less than a minute in the Google Home app. We’re now scaling this to more partners, so go here if you’re interested.

    Make your devices smart with Assistant Connect

Also, at CES earlier this year, we previewed Google Assistant Connect, which leverages the Local Home SDK. Assistant Connect enables smart home and appliance developers to easily add Assistant functionality into their devices at low cost. It does this by offloading a lot of work onto the Assistant to complete Actions, display content and respond to commands. We've been hard at work developing the platform along with the first products built on it by Anker, Leviton and Tile. We can't wait to show you more about Assistant Connect later this year.

    New device types and traits

    For those of you creating Actions for the smart home, we’re also releasing 16 new device types and three new device traits including LockUnlock, ArmDisarm, and Timer. Head over to our developer documentation for the full list of 38 device types and 18 device traits, and check out our sample project on GitHub to start building.

    Get started with our new tools for all types of developers

    Whether you’re looking to extend the reach of your content, drive more usage in your apps, or build custom Assistant-powered experiences, you now have more tools to do so.

    If you want to learn more about how you can start building with these tools, check out our website to get started and our schedule so you can tune in to all of our developer talks that we’ll be hosting throughout the week.

    We can’t wait to build together with you!

    Google Home and Google Home Mini launches in India

    Bringing together the best of Google’s AI, software and hardware, now with a desi twist


    Whether you’re getting the kids ready for school, doing a batch of laundry, or answering the doorbell for the morning vendors, Indian homes are busy ones. From catching that Bollywood blockbuster on your smart TV, to whipping up a quick Chole Bhature, to sinking into soulful Sufi tunes at the end of a tiring day, you can now get hands-free help.


Beginning today, Indian users can welcome into their lives Google Home -- our voice-activated speaker powered by the Google Assistant. With a simple “Ok, Google” or “Hey Google”, you can get answers, turn up the music, manage everyday tasks or even control smart devices around your home.


Google Home understands Indian accents, and will respond to you with uniquely Indian contexts. What’s great about the Google Assistant is that it’s the same across all your devices, so that it works seamlessly for you wherever you need a helping hand. You can, for instance, ask it for the quickest route to the office, then tell it to push the directions to Google Maps on your smartphone, and you’re ready to navigate as you head out.


    Designed to fit seamlessly into your home
    We didn’t want Google Home to feel like a gadget, and took inspiration from consumer products that are commonly found in homes, like wine glasses, candles, and even donuts for Mini.


    The top surface has LEDs that provide visual feedback when Google Home recognizes “Hey Google”, so you know when it is listening. In those rare moments when voice won’t do, the top surface is also a capacitive touch panel. You can simply use your finger to pause the music or adjust the volume.
Google Home was designed with two microphones to enable accurate far-field voice recognition. The microphone system uses a technique called neural beamforming. We’ve simulated hundreds of thousands of noisy environments and applied machine learning to recognize patterns that allow us to filter and separate speech from noise. This allows us to deliver best-in-class voice recognition and minimize error rates -- even from across the room. Home will be available in India in the Chalk color variant.
Google Home Mini is sleek and smooth, with no corners or edges. And it's small enough to easily place anywhere in your home. It’s almost entirely enclosed in custom fabric. We created this material from scratch, right down to the yarn. It’s durable and soft, but also transparent enough to let through both light and sound. And it is available in Chalk and Charcoal, with Coral coming soon. Four LED lights underneath the fabric light up to show you when it hears you or when it’s thinking. Mini has far-field mics so it can hear you even when there’s music playing or loud noise in the background, and thanks to its circular design it can project 360-degree sound with just one speaker.


    These devices join the Made by Google family of hardware products in India, and will be available for purchase online exclusively on Flipkart, and in over 750 retail stores across the country including Reliance Digital, Croma, Bajaj Electronics, Vijay Sales, Sangeetha, and Poorvika.


    Tap into the power of Google with your Assistant
    Need answers to a problem? Ask questions, translate phrases, run simple maths calculations and look up the meaning of a word. Too busy to stay on top of the news? Ask and you shall receive the latest stories from sources such as Times of India, NDTV, Dainik Bhaskar, India Today, Aaj Tak and more. Need a helping hand in the kitchen? Find ingredient substitutes, pull up nutritional information and unit conversions without having to wash your atta-covered fingers.


    Google Home is truly ‘desi’
    With a distinctly Indian voice, your Assistant on Google Home speaks and understands your language. Ask it “Hey Google, how desi are you?”, put its cricket knowledge to the test with “Hey Google, what is a silly point?”, tell it to “Play songs from the movie Satte Pe Satta”, or even get step-by-step cooking instructions in the kitchen with, “Hey Google, give me a recipe for Dum Biryani”.


    Get personalised help for your everyday tasks
The Google Assistant on Google Home has been designed to help you get more stuff done when you have your hands full. With your permission, it will help with things like your commute, your daily schedule and more. And the best part? Up to six people can connect their accounts to one Google Home, so if you ask your Assistant to tell you about your day, it can distinguish your voice from other people in your family, and give you personalised answers. Just ask “Hey Google, tell me about my day” or say, “Hey Google, how long will it take to get to work?” and you’ll get up to speed on everything you need to know. It can wake you up in the morning (or let you snooze), set a timer while you’re baking, and so much more.


    Turn up those tunes
    Find the right rhythm for every occasion, whether you’re getting into the zone with sunrise yoga, hosting a dinner party, or burning off calories dancing with your little ones. You can play songs, playlists, artists, and albums from your favourite music subscription services like Google Play Music (with a six-month subscription, on us), along with offers from Saavn and Gaana*. You can also pair Google Home or Home Mini with your favorite Bluetooth speaker and set it to be the default output for all your music.


    Control your smart home
Google Home can help you keep track of everything going on in your home--you can control your lights, switches and more, using compatible smart devices from brands like Philips Hue, D-Link and TP-Link. Just ask your Google Home, and your Assistant will turn off the kitchen light. If you have a Chromecast, you can also use voice commands to play Netflix or YouTube on your TV and binge-watch your favorite shows. Enjoy multi-room audio by grouping Google Home devices together (with Chromecast Audio, Chromecast built-in and Bluetooth speakers) to listen to the same song in every room.


    A speaker for any occasion
    Whether you’re hosting a dinner or a solo dance party, Google Home delivers crystal-clear sound and creates an enjoyable listening experience. Plus, we designed Home to fit stylishly into any room. And you have the option to customize the base with different colors to reflect your home’s style.


With Google Home, we’re working with our partners to bring you many great launch offers: when buying Google Home or Google Home Mini on Flipkart, you get a free JioFi router along with special offers on exchange and streaming music subscriptions; when buying a Google Home at Reliance Digital or MyJio stores, you get a free JioFi router with 100GB of high-speed 4G data (worth Rs 2,499)**; and at select Philips Hue and Croma outlets, you get a Philips Hue + Google Home Mini at a special bundled price. Also, ACT Fibernet retail customers subscribing to 12-month advance rental plans of 90 Mbps and above will receive a Google Home Mini. And above all, users get 10 percent cashback when purchasing with HDFC Bank credit cards***.


Google Home and Google Home Mini will be priced at Rs 9,999 and Rs 4,499, respectively.


    It’s just the beginning...
Your Assistant on Google Home will continue to get better over time as we add more features (look out for Hindi support coming later this year!). And Google Home is open to third-party apps for the Assistant, so expect even more of your favourite services and content.

Posted by Rishi Chandra, VP, Product Management, Google Home


    Note:
    *Both available from April 10 to October 31, 2018, for all Google Home and Home Mini users in India
    **Offer valid until 30th April 2018
    ***Cashback limited to 10% of MRP