Author Archives: The Official Google Blog

Building for all learners with new apps, tools, and resources

Everyone deserves access to a quality education—no matter your background, where you live, or your abilities. We’re recognizing this on Global Accessibility Awareness Day, an effort to promote digital accessibility and inclusion for people with disabilities, by sharing new features, training, and partners, along with the many new products announced at Google I/O.

Since everyone learns in different ways, we design technology that can adapt to a broad range of needs and learning styles. For example, you can now add captions in Slides and turn on live captions in Hangouts Meet, and we’ve improved discoverability in the G Suite toolbar. By making these features available—with even more in the works—teachers can help students learn in ways that work best for them.

Working with our partners to expand access

We’re not the only ones trying to make learning more accessible, so we’ve partnered with companies that are building apps to make it easier for teachers to communicate with all students.

One of our partners, Crick Software, just launched Clicker Communicator, a child-friendly communication tool for the classroom. It bridges the gap between needs/wants and curriculum access, empowers non-verbal students with the tools to initiate and lead conversations, and enables proactive participation in the classroom. It’s one of the first augmentative and alternative communication (AAC) apps specifically created for Chromebook users.

Learn more about Clicker Communicator, an AAC app for Chromebooks.

Assessing with accessibility in mind

Teachers use locked mode, available only on managed Chromebooks, to eliminate distractions while giving Quizzes in Google Forms. Locked mode is now used millions of times per month, and many students use additional apps for accommodations when taking quizzes. We’ve been working with many developers to make sure their tools work with locked mode. One of those developers is our partner Texthelp®. Coming soon, when you enable locked mode in Quizzes in Google Forms, your students will be able to access the Read&Write for Google Chrome and EquatIO® for Google tools they rely on daily.

Another partner, Don Johnston, supports students with apps including Co:Writer, for word prediction, translation, and speech recognition, and Snap&Read, for read aloud, highlighting, and note-taking. Students signed into these extensions can use them on the quiz—even in locked mode. This integration will be rolling out over the next couple of weeks.

Learn more about the accessibility features available in locked mode, including ChromeVox, select-to-speak, and visual aids including high contrast mode and magnifiers.

Tools, training, and more resources

Assistive technology has the power to transform learning for more students, but educators need training, support, and tutorials to help their students get the most from the technology.

The new Accessibility section on our Google for Education website has information on Chromebooks and G Suite for Education, a module on our Teacher Center and printable flashcards, and EDU in 90 YouTube videos on G Suite and Chromebook accessibility features. Check out our accessibility tools and find training on how to use them to create more engaging, accessible learning experiences.

Watch the EDU in 90 episode on Chromebook accessibility features.

We love hearing stories of how technology is making learning more accessible for more learners, so please share how you're using accessibility tools to support all types of learners, along with requests for how we can continue to improve to meet the needs of more learners.

Make your smart home more accessible with new tutorials

I’m legally blind, so from the moment I pop out of bed each morning, I use technology to help me go about my day. When I wake up, I ask my Google Assistant for my custom-made morning Routine, which turns on my lights, reads my calendar and plays the news. I use other products as well, like screen readers and a refreshable braille display, to help me be as productive as possible.

I bring my understanding of what it's like to have a disability to work with me, where I lead accessibility for Google Search, Google News and the Google Assistant. I work with cross-functional teams to help fulfill Google’s mission of building products for everyone—including those of us in the disabled community.

The Assistant can be particularly useful for helping people with disabilities get things done. So today, Global Accessibility Awareness Day, we’re releasing a series of how-to videos with visual and audible directions, designed to help the accessibility community set up and get the most out of their Assistant-enabled smart devices.

You can find step-by-step tutorials to learn how to interact with your Assistant, from setting up your Assistant-enabled device to using your voice to control your home appliances, at our YouTube playlist which we’ll continue to update throughout the year.

Intro to Assistant Accessibility Videos

This playlist came out of conversations within the team about how we can use our products to make life a little easier. Many of us have some form of disability, or have a friend, co-worker or family member who does. For example, Stephanie Wilson, an engineer on the Google Home team, helped set up her parents’ smart home after her dad was diagnosed with Parkinson’s disease.

In addition to our own teammates, we're always listening to suggestions from the broader community on how we can make our products more accessible. Last week at I/O, we showed how we’re making the Google Assistant more accessible, using AI to improve products for people with a speech impairment, and added Live Caption in Android Q to give the Deaf community automatic captions for media that’s playing audio on your phone. All these changes were made after receiving feedback from people like you.

Head over to our Accessibility website to learn more, and if you have questions or feedback on accessibility within Google products, please share your feedback with us via our dedicated Disability Support team.

Three new machine learning courses

Many years ago, I took a dance lesson in Budapest to learn the csárdás, a Hungarian folk dance. The instructor shouted directions to me in enthusiastic Hungarian, a language I didn't understand, yet I still learned the dance by mimicking the instructor and the expert students. Now, I do love clear directions in a lesson—I am a technical writer, after all—but it’s remarkable what a person can learn by emulating the experts.  


In fact, you can learn a lot about machine learning by emulating the experts. That’s why we’ve teamed with ML experts to create online courses to help researchers, developers, and students. Here are three new courses:

  • Clustering: Introduces clustering techniques, which help find patterns and related groups in complex data. This course focuses on k-means, which is the most popular clustering algorithm. Although k-means is relatively easy to understand, defining similarity measures for k-means is challenging and fascinating.
  • Recommendation Systems: Teaches you how to create ML models that suggest relevant content to users, leveraging the experiences of Google's recommendation system experts. You'll discover both content-based and collaborative filtering, and uncover the mathematical alchemy of matrix factorization. To get the most out of this course, you'll need at least a little background in linear algebra.
  • Testing and Debugging: Explains the tricks that Google's ML experts use to test and debug ML models. Google's ML experts have spent thousands of hours deciphering the signals that faulty ML models emit. Learn from their mistakes.    
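To make the clustering idea concrete, here is a minimal sketch of the k-means loop the Clustering course covers—not course material, just an illustration with a made-up two-blob dataset, using plain Euclidean distance as the similarity measure (choosing a good measure for real data is the hard part the course digs into):

```python
import numpy as np

def init_centroids(points, k):
    # Greedy farthest-point initialization keeps the starting centroids spread out.
    centroids = [points[0]]
    for _ in range(1, k):
        dists = np.min(
            np.linalg.norm(points[:, None] - np.array(centroids)[None], axis=2), axis=1)
        centroids.append(points[dists.argmax()])
    return np.array(centroids)

def kmeans(points, k, iters=20):
    centroids = init_centroids(points, k)
    for _ in range(iters):
        # Assignment step: each point joins its nearest centroid.
        dists = np.linalg.norm(points[:, None] - centroids[None], axis=2)
        labels = dists.argmin(axis=1)
        # Update step: each centroid moves to the mean of its members.
        for j in range(k):
            if np.any(labels == j):
                centroids[j] = points[labels == j].mean(axis=0)
    return centroids, labels

# Two well-separated blobs should come back as two clusters.
data = np.array([[0.0, 0.0], [0.1, 0.2], [0.2, 0.1],
                 [5.0, 5.0], [5.1, 5.2], [4.9, 5.1]])
centroids, labels = kmeans(data, k=2)
```

Real workloads would use a vetted implementation rather than this loop, but the two alternating steps are the whole algorithm.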
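Similarly, the matrix factorization at the heart of the Recommendation Systems course can be sketched in a few lines—the ratings below are invented, and this gradient-descent loop is a bare-bones illustration, not the course's implementation:

```python
import numpy as np

def factorize(ratings, mask, dim=2, steps=3000, lr=0.02, reg=0.01, seed=0):
    # Factor an (n_users x n_items) matrix into U @ V.T, fitting only the
    # observed entries (mask == 1) by gradient descent.
    rng = np.random.default_rng(seed)
    n_users, n_items = ratings.shape
    U = rng.normal(scale=0.1, size=(n_users, dim))
    V = rng.normal(scale=0.1, size=(n_items, dim))
    for _ in range(steps):
        err = mask * (ratings - U @ V.T)   # error on observed entries only
        U += lr * (err @ V - reg * U)      # gradient step on user factors
        V += lr * (err.T @ U - reg * V)    # gradient step on item factors
    return U, V

# 0 marks an unobserved rating; the learned factors predict it from the rest.
ratings = np.array([[5.0, 3.0, 0.0],
                    [4.0, 0.0, 1.0],
                    [1.0, 1.0, 5.0],
                    [0.0, 1.0, 4.0]])
mask = (ratings > 0).astype(float)
U, V = factorize(ratings, mask)
predictions = U @ V.T
```

The learned user and item factors reconstruct the observed ratings and fill in the blanks, which is where the linear algebra background the course recommends comes in.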
These new courses are engaging, practical, and helpful. They build on a series of courses we released last year, starting with the Machine Learning Crash Course (MLCC), which teaches the fundamentals of ML. If you enjoyed MLCC, you're ready for these new courses. They will push you to think differently about the way you approach your work. Take these courses to copy the moves of the world's best ML experts.


We hear you: updates to Works with Nest

Last week we announced that we would stop supporting the Works with Nest (WWN) program on August 31, 2019 and transition to the Works with Google Assistant platform (WWGA). The decision to retire WWN was made to unify our efforts around third-party connected home devices under a single platform for developers to build features for a more helpful home. The goal is to simplify the experience for developers and to give you more control over how your data is shared. Since the announcement, we’ve received a lot of questions about this transition. Today we wanted to share our updated plan and clarify our approach.


First, we’re committed to supporting the integrations you value and minimizing disruptions during this transition, so here’s our updated plan for retiring WWN:

  • Your existing devices and integrations will continue working with your Nest Account; however, you won’t have access to new features that will be available with a Google Account. If we make changes to the existing WWN connections available to you with your Nest Account, we will make sure to keep you informed.

  • We’ll stop accepting new WWN connections on August 31, 2019. Once your WWN functionality is available on the WWGA platform you can migrate with minimal disruption from a Nest Account to a Google Account.

Second, we want to clarify how this transition will work for you. Moving forward, we’ll deliver a single consumer and developer experience through the Google Assistant. WWGA already works with over 3,500 partners and 30,000 devices, and integrates seamlessly with Assistant Routines. Routines allow anyone to quickly customize how their smart devices work together based on simple triggers—whether you’re leaving home or going to bed.


One of the most popular WWN features is automatically triggering routines based on Home/Away status. Later this year, we'll bring that same functionality to the Google Assistant and provide more device options for you to choose from. For example, you’ll be able to have your smart light bulbs automatically turn off when you leave your home. Routines can be created from the Google Home or Assistant apps using the hardware you already own. Plus, we’re making lots of improvements to setting up and managing Routines to make them even easier to use.

We recognize you may want your Nest devices to work with other connected ecosystems. We’re working with Amazon to migrate the Nest skill that lets you control your Nest thermostat and view your Nest camera livestream via Amazon Alexa. Additionally, we’re working with other partners to offer connected experiences that deliver more custom integrations.

For these custom integrations, partners will undergo security audits and we’ll control what data is shared and how it can be used. You’ll also have more control over which devices these partners will see by choosing the specific devices you want to share. For example, you’ll be able to share your outdoor cameras, but not the camera in your nursery, with a security partner.

We know we can't build a one-size-fits-all solution, so we're moving quickly to work with our most popular developers to create and support helpful interactions that give you the best of Google Nest. Our goal remains to give you the tools you need to make your home, and those of other Nest users, helpful in the ways that matter most to you.


Affirming the identities of teachers and students in the classroom through #ISeeMe

Editor’s note: We’re thrilled to have Kristina Joye Lyles from DonorsChoose.org as a guest author, sharing about teaming up with Google.org to launch the #ISeeMe campaign.

I joined DonorsChoose.org in 2013 and have long been working with organizations like Google.org who share our belief in the power of teachers. To date, Google.org has provided over $25 million to support classrooms on DonorsChoose.org, and last week, they committed an additional $5 million to teachers, with a focus on supporting diverse and inclusive classrooms. Together, we’re kicking off #ISeeMe, a new effort to enable teachers and students across the country to celebrate their identities in their classrooms.

As a military brat, I attended many public schools across the U.S. but only had two teachers of color from kindergarten through twelfth grade. My teachers and professors of color had a particularly strong impact on me as mentors and role models; I was encouraged to see them as leaders in our school community, and their presence alone showed me that diversity and representation matter.

My story is like those of so many others. Research shows that students benefit from seeing themselves in their teachers and learning resources. For example, black students who have just one black teacher between third and fifth grade are 33 percent more likely to stay in school. Girls who attend high schools with a higher proportion of female STEM teachers are 19 percent more likely to graduate from college with a science or math major.

With this support from Google.org, teachers who are underrepresented in today’s public school classrooms, like teachers of color and female math and science teachers, as well as all teachers looking to create more inclusive classrooms, will get the support they need and deserve. Teachers from all backgrounds can take steps toward creating classrooms that reflect their students, whether they’re selecting novels with diverse characters to discuss or taking trainings to learn more about meeting the needs of students from culturally diverse backgrounds. And we’re eager to help them bring their ideas to life so that more students can see themselves reflected in their classrooms.

I’m thrilled that many teachers on DonorsChoose.org are already coming up with inspiring ways to foster classroom environments where every student can feel important and included. Mr. Yung sees the power of food to bring his students together across different cultural backgrounds. Ms. McLeod is determined to bring her students from Lumberton, North Carolina, to the National Museum of African-American History and Culture in Washington, D.C. Mrs. Toro-Mays aspires to bring her bilingual students books with culturally relevant heroes and heroines.

We hope you’ll join us and the philanthropists of various backgrounds who have lit the torch for #ISeeMe today. If you are a public school teacher, you can set up an #ISeeMe classroom project right now at DonorsChoose.org. You can also access free inclusive classroom resources and ideas created for educators, by educators at any time in Google’s Teacher Center. And for those of you who have been inspired by a teacher, we invite you to explore classroom projects that are eligible for Google.org’s #ISeeMe donation matching—we would love to have your support for these teachers and classrooms.

New features to make audio more accessible on your phone

Smartphones are key to helping all of us get through our days, from getting directions to translating a word. But for people with disabilities, phones have the potential to do even more to connect people to information and help them perform everyday tasks. We want Android to work for all users, no matter their abilities. And on Global Accessibility Awareness Day, we’re taking another step toward this aim with updates to Live Transcribe, coming next month.


Available on 1.8 billion Android devices, Live Transcribe helps bridge the gap between the deaf and the hearing with real-time, real-world transcriptions of everyday conversations. With this update, we’re building on our machine learning and speech recognition technology to add new capabilities.


First, Live Transcribe will now show you sound events in addition to transcribing speech. You can see, for example, when a dog is barking or when someone is knocking on your door.  Seeing sound events allows you to be more immersed in the non-conversation realm of audio and helps you understand what is happening in the world. This is important to those who may not be able to hear non-speech audio cues such as clapping, laughter, music, applause, or the sound of a speeding vehicle whizzing by.


Second, you’ll now be able to copy and save transcripts, stored locally on your device for three days. This is useful not only for those with deafness or hearing loss—it also helps those who might be using real-time transcriptions in other ways, such as those learning a language for the first time or even, secondarily, journalists capturing interviews or students taking lecture notes. We’ve also made the audio visualization indicator bigger, so that users can more easily see the background audio around them.

New features of Live Transcribe

Caption: See sound events, like whistling or a dog barking, in the bottom left corner of the updated Live Transcribe.

With billions of active devices powered by Android, we’re humbled by the opportunity to build helpful tools that make the world’s information more accessible in the palm of everyone’s hand. As long as there are barriers for some people, we still have work to do. We’ll continue to release more features to enrich the lives of our accessibility community and the people around them.

Street View cars measure Amsterdam’s air quality

The quality of the air we breathe has a major impact on our health. Even in Amsterdam, a city where bikes make up 36 percent of the traffic, the average life span is cut short by a year as a result of polluted air. Information about air quality at the street level can help pinpoint areas where the quality is poor, which is useful for all types of people—whether you’re a bicyclist on your daily commute, a parent taking your children to a local park, or an urban planner designing new communities.

A Street View car in Amsterdam.

Project Air View

Building on efforts in London and Copenhagen, Google and the municipality of Amsterdam are now working together to gain insight into the city’s air quality at the street level. Amsterdam already measures air quality at several points around the city. Information from two of our Street View cars in Project Air View will augment the measurements from these fixed locations, to yield a more detailed street-by-street picture of the city’s air quality.

To take the measurements, the Street View cars will be equipped with air sensors that measure nitric oxide, nitrogen dioxide, ultra-fine dust and soot (extremely small particles that are hardly ever measured). Scientists from Utrecht University are installing the air sensors in the vehicles and working with the municipality and Google to plan the driving routes and lead the data validation and analysis. Once that work is complete, we’ll share helpful insights with the public, so that everyone—citizens, scientists, authorities and organizations—can make more informed decisions.

This research can spread awareness about air pollution and help people take action. For example, if the research shows differences in air quality between certain areas in the city, people could adjust their bike route or choose another time to exercise. Our hope is that small changes like this can help improve overall quality of life. For more information about Project Air View, visit g.co/earth/airquality.

Sharing Hawaiian food and tradition with generations to come

Highway Inn is an Oahu-based restaurant founded by Hawaii-born Japanese-American Seiichi Toguchi. At the start of World War II, Seiichi was taken from his home to an internment camp in California and assigned to work in the mess halls. There, Japanese-American chefs from around the country taught him how to cook, eventually inspiring him to open Highway Inn to share the foods he loved growing up. Seiichi passed the restaurant down to his son Bobby Toguchi, who has since passed it to his daughter, Monica Toguchi Ryan. Their family has been proudly serving authentic Hawaiian food for over 70 years.


As the third-generation owner, Monica was determined not just to honor her family traditions and legacy, but also to share with younger generations the kinds of food that keep them connected to Hawaiian and local food culture. When her grandfather started the restaurant, he relied on word of mouth to reach new customers. Now, Monica uses Google Ads and the restaurant’s Business Profile on Google to connect with customers, which has helped the business grow from one location to three across Oahu. She and her family hope to continue preserving the beauty and tradition of Hawaiian food for generations to come.


This Asian American and Pacific Islander Heritage Month, we're telling this story and others, like that of Kruti Dance Academy from Atlanta, Georgia. Highway Inn and Kruti Dance Academy are two of the many Asian American and Pacific Islander-owned small businesses having an impact on their local communities.

The importance of influence in design

Human behavior has always intrigued me—that's the reason I studied psychology as an undergraduate. At the time, I wondered how those learnings could one day apply to life in the “real world.” As it turns out, an understanding of people and human behavior is an invaluable asset when it comes to cultivating influence—especially when it comes to design.

In my role as VP of User Experience (UX) Design at Google, I’m constantly tasked with influencing others. I lead a team of designers, researchers, writers and engineers who are behind products like Google’s Shopping, Trips, Payments and Ads. To create great experiences, we must first convince the people building these products that design is elemental to delivering not just user value, but also business value. Over the years I've seen how the ability to build influence is essential to designing the best experiences.

User empathy is a fast track to influence

As UX professionals (designers, writers, researchers and front-end engineers), it’s our job to fully grasp the needs of people using our products and be the spokesperson for them. It’s easy to fall into the trap of believing that we understand our users without witnessing them actually using our products, or to believe that our personal experiences reflect those of people everywhere. Yet every time I go out into the real world and spend time with people actually using our products, I come back with an unexpected insight that changes how I initially thought about a problem.

In 2017, I took a trip to Jakarta to research the challenges of using smartphones in a region where service is relatively expensive and bandwidth is not readily available. It wasn’t until I was on the ground that I realized how degraded the experience was from what I’d pictured. Similarly, during a recent trip to Tel Aviv, I learned how difficult it is to get funding and grow a business. Developing this kind of understanding, which can only come from experience, helps motivate you to fix a problem from a different angle.

Ideally, we’d bring all of our team members into the field to have these first-hand experiences, but that approach doesn’t scale. What does scale is empathy. We can share our personal experiences, research and user stories to build greater understanding. Once we’ve built a foundation of shared understanding, we can have better influence over decisions that affect users.


Understanding people's experiences and stories helps build better products.

Inspire action with compelling stories

Research can provide the data and anecdotes that help others understand why your design meets a specific need, but how you present that data is equally important.

Creating rich stories full of photos and video clips helps expose others to how people use products and the challenges they encounter. On multiple occasions, I’ve been in a room where research clips of people interacting with a product or prototype are shared with executives and partners. Without fail, observing real people use products gets everyone animated and excited. Watching someone fumble through a task creates a sense of urgency to solve a problem that can’t be generated through data.

One way to do this is with prototyping software or animated slides that show a product flow or tell a narrative that helps people understand the pain points of a product or the ease of its well-designed experience. An interactive prototype lets people experience the full possibilities. If you’re lucky enough to work with a UX engineer, prototypes are probably already a part of your influence repertoire. There’s nothing better than prototyping and sharing a bold idea and hearing: “We need that! Let’s make it happen!”

Listen first

User experience is highly focused on empathy for users, yet we’re often so focused on people using our products that we don’t take the time to develop empathy for our colleagues. Making sure others feel seen, heard, and understood is a significant step toward influence. Similar to how we can mistakenly make assumptions about our users, we can fall into the same trap with our peers.

Too often people equate influence with asserting their perspective. Instead, influence starts with understanding the goals, motivations and frustrations of others.

It’s easy to make incorrect conclusions, so instead of rushing to make a point, start out by listening to your colleagues. Showing the courtesy of listening often begets reciprocity, and makes others more receptive to your perspective.

Our discipline is founded on exploring human connections and motivations through empathy and listening. Now you can use those tools to build influence, whether or not you work in UX.

Carmen Sandiego is back on Google Earth, gumshoe

This March, we put out the call for super sleuths to help us track down Carmen Sandiego in Google Earth. And we were blown away by the enthusiasm and speed with which people found the reformed VILE operative—who is now an ACME agent—by traveling from city to city around the globe.

You not only solved the caper, but also shared stories and memories of playing the original games, watching the shows (both old and new) and sharing the experience with friends, family and kids.

Today, we’ve teamed up with Carmen Sandiego once again—this time to help her recover Tutankhamun’s Mask. Le Chevre, a master climber and classmate of Carmen Sandiego at VILE Academy, has stolen the priceless artifact. We’re counting on gumshoes everywhere to help Carmen find him and recover the loot.

Trailer

To get your assignment, look for the special edition Pegman icon in Google Earth for Chrome, Android and iOS. Good luck, detectives!