Tag Archives: accessibility

Inside Google Korea’s new accessible office space

I don’t think I’ll ever forget the feeling of walking through the newly opened 28th floor of Google Korea. The space has been reimagined with a focus on “universal design” — meaning it was designed to be accessible to people of all abilities.

The idea for this space started a few years ago, when I was talking to other members of the Disability Alliance Employee Resource Group (ERG) in Korea about their web accessibility project — a conversation that then shifted to improving accessible design in the office. Was our office truly as accessible as it could be? Did everyone feel that they could do their best work without any restrictions due to their abilities? We pinpointed some areas for improvement, and that sparked a desire to make a change.

The Disability Alliance then partnered with Google's Real Estate & Workplace Services team to explore how we could implement some of these changes, especially as we expanded our space in Gangnam. Bit by bit, we made improvements to our existing office space, from adding braille to meeting room signs to adding drop-down thresholds for doors.

And when we had the opportunity to influence a brand new floor, we embraced the concept of universal design to co-design alongside the REWS team. Throughout the whole process, we incorporated feedback and co-designed with many people in our community — including Inho, a software engineer with a visual impairment. The design team made all designs and plans available in braille, so that anyone who was visually impaired could still review them.

Seeing our carefully thought out plans begin to take shape was incredible. Finally stepping into the finished space took my breath away, and I was so excited just thinking of how this could help so many of our colleagues thrive.

But don’t just take my word for it! Take a look at these four design details, and why they make such a difference.

We’re proud of how we've applied universal design principles in Google Korea, but we know this isn’t the end of the journey. In fact, I like to think that we’re just getting started. We’re constantly learning and seeking to understand the needs of all people — that’s how we can develop solutions that enable everyone to succeed.

Look to Speak launches in Ukraine

Nearly two years ago, Google launched Look to Speak, an Android app that allows people to use their eyes to select pre-written phrases and have them spoken aloud. Since then, the app has launched in 18 additional languages. Most recently, we made the app available in Ukrainian to help refugees and veterans of the war.

As a speech and language therapist working at Google, I’ve seen how technology can help people express their everyday needs, feelings and identity. To hear from someone about how Look to Speak can be particularly helpful in Ukraine where people are dealing with the injuries and side effects of war, I spoke with Oksana Lyalka, the founder and president of the Ukrainian Society for Speech and Language Therapy.

What is the situation like for veterans and refugees of the war in Ukraine who have speech and motor impairments?

Due to direct injuries and conditions caused by the war, the number of people with both motor and speech impairments is likely increasing. In addition, indirect effects like stress and malnutrition increase the risk of strokes, which can also lead to motor and speech impairments — and access to care remains limited. For many refugees who have left Ukraine for other countries, it's also difficult to get the help they need: many of them already have chronic impairments, and their insurance does not cover therapy for communication disorders in another country. And because communication disorders are language-specific, it's hard for them to find support in their native language outside of Ukraine.

What are the specific challenges that people are facing?

They are mainly left on their own with their speech and motor impairments, for three reasons: 1) there's a shortage of speech and language therapists; 2) even fewer of them understand what these patients deal with; and 3) therapy is costly, and not everyone has the resources to afford it, especially in wartime.

How could a tool like Look to Speak be helpful in Ukraine?

When someone has only a speech disorder, they can still write to communicate. But when there are also motor disorders like we’ve discussed, people can end up with no way to communicate. With Look to Speak, even if someone can’t communicate using their mouth, they can communicate with their eyes. This allows caregivers and others in their environment to listen and understand in new ways. Communication is a two-way process, and the Look to Speak app can act as a bridge.

The First Lady of Ukraine Olena Zelenska on the Look to Speak app:

"One of everyone’s fundamental needs is the ability to communicate and interact with those around them. For most people, it is unnoticeable and automatic, similar to breathing. However, due to various factors, a person may lose this ability and be unable to talk or use a computer, tablet, mobile phone, or other devices. Especially now, in times when the war daily multiplies the chances of finding oneself in such conditions, we as a society must unite and help each other as much as we can to overcome these terrible circumstances. One of the examples of Ukraine's responsible cooperation with world technological leaders is the localization of Google’s Look to Speak app. It helps people with motor and speech impairments to communicate using their eye movements. It is good to know that Ukrainian public organizations in health care, medical institutions, and everyone who needs it can use advanced digital solutions now adapted to the needs of Ukrainian users. I am sure that initiatives like Look to Speak will not only provide new opportunities for our citizens but will also serve as a model for other technological companies that are now supporting Ukraine."

To learn more about Look to Speak in Ukraine, watch this video in Ukrainian.

Improving accessibility led this UX researcher to Google

Welcome to the latest edition of “My Path to Google,” where we talk to Googlers, interns and alumni about how they got to Google, what their roles are like and even some tips on how to prepare for interviews.

Today’s post is all about Jerry Robinson, a user experience (UX) researcher on our Central Product Inclusion, Equity and Accessibility team.

What’s your role at Google?

I’m the lead UX researcher on the Lookout team. Lookout is an Android app that uses AI to help people who are blind or have low vision perform daily tasks faster. It can read text and detect different objects within the camera’s field of view. One of my favorite features is the food label mode, which can quickly identify food products — like whether you’re holding a can of chicken or tomato soup.

I conduct research with current and potential Lookout users to find opportunities to make the app more useful. I love this part of my job because I get to hear directly from the people using our products and share what I’ve learned with my teammates. It’s a privilege to be in a role where I can help our product teams better understand our users and carry out Google’s mission to make information universally accessible.

Jerry stands outside next to a sign with the Google logo.

Can you tell us a bit about yourself?

I graduated from Morehouse College in 2004 with a degree in accounting. After working in the banking industry for five years, I decided to go to grad school and find a career where I could make an impact on people's everyday lives. And as someone with a disability, living in a world not always designed with people like me in mind, I was interested in accessibility and ways to support people with disabilities on their terms.

I earned a Master of Science in Information Management and a PhD in Information Science. My dissertation was focused on accessible design from the perspective of people with physical disabilities finding their own ways to adapt to inaccessible situations in their everyday lives.

How did the application and interview process go for you?

I met a Googler at an assistive technology conference a few years earlier who told me about the open role. I expressed interest and connected with a recruiter, and eventually received a referral from another Googler.

My biggest concern during the interview process was communication. I have a distinct speech pattern because of my cerebral palsy, and I’ve always been concerned that potential employers might hold that against me. However, I knew that Google had an inclusive work environment. And I was confident in my ability to conduct good UX research.

The interview process actually assured me that I was a strong candidate. My interview committee asked tough questions, but they were extremely thoughtful and kind. One of them told me to think of the interview more as a conversation, while another complimented me on my presentation. I felt a level of respect from the very beginning that put me at ease and made me more certain that I wanted to work here.

Jerry smiles and sits in a Google micro-kitchen.

Jerry in a Google office micro-kitchen.

What inspires you to come in (or log on) every day?

I’m inspired by all the Google UXers I work with who are passionate about designing for everyone. Google, and the tech industry overall, needs people who are dedicated to making accessible design the norm rather than an afterthought.

What resources did you use to prepare for your interview?

My recruiter was incredibly helpful. He gave me tips about what to communicate during each interview round, including how best to present the scope, complexity and impact of my work. I practiced my final presentation several times before my last round of interviews to build up my confidence. And I went to bed early the night before to make sure I felt rested.

Any tips to share with aspiring Googlers?

Do all that you can to prepare, but also be confident in what you bring to the table. Know that you’re going through the process because you’re already a qualified candidate. Remind yourself that as often as you need to.

Expanding accessible learning with Google for Education

The need for accessible tools and equitable learning environments has become more critical than ever, as the number of students with disabilities is on the rise, including those with specific learning disabilities.

Google for Education builds accessibility features directly into its products to support the needs of all students and to help foster inclusive environments where students can learn both individually and as a group. These features provide individualized support while also giving students the resources they need to learn collaboratively.

Captions as a tool across Google products

We aim to build helpful features across all our products. One of those features is captions, which are useful not only for people who are deaf or hard of hearing, but also when a room is loud, when a student needs extra help focusing, or when someone needs support in a different language. Captions are available in Google Meet in six languages, and you can change the font size and placement on the screen. You can also access and add captions to videos in YouTube, Google Drive and Chrome. For Android users, captions are also available through Live Transcribe.

Gif of multiple types of captions across Google products

Building accessibility tools into Chromebooks

Accessibility features like Select-to-Speak, the ChromeVox screen reader and magnification are easy to use and built directly into Chromebooks, enabling every individual to do their best work. Now we've added more dictation improvements, like the ability to speak into any text field on the Chromebook simply by clicking the mic icon in the status area. You can also press Search + d to dictate, and you can now edit using just your voice.

More customization options in Google Workspace for Education

We recently announced more customization for accessibility settings in Google Docs, Sheets, Slides and Drawings, so users can set accessibility preferences for each product individually. And soon, we'll build upon that by consolidating the Docs settings for screen reader and braille support into a single setting, with a single checkbox. We'll also soon be adding improvements to voice typing in Google Docs, voice typing for speaker notes in Google Slides, and captions in Google Slides, including automatic punctuation and the ability to access these features from all browsers.

People who are blind or have low vision and use screen readers can now type a keyboard shortcut (Alt + a number, 1-7) that verbalizes the content of a Calendar event. This way, Calendar details can be heard on demand, instead of through time-consuming navigation.

Working with partners to expand accessible tools

We’re supporting teachers through our own tools and partnerships with organizations that share our mission. Many of these apps and extensions integrate with Google tools like Classroom, Google Workspace for Education, and Chromebooks.

This includes Texthelp, a company which makes extensions and tools that help people learn, understand and communicate through the use of digital learning and accessibility tools. Students can use tools like Read&Write to help with reading support, and Equatio to help with creating math equations digitally. Created by an occupational therapist, SnapType breaks down the barriers to education by helping students feel more confident and independent. Students challenged by handwriting or visual impairments can keep up in class with their peers by using SnapType to easily take a picture of their assignment and then type or dictate their schoolwork.

We’re also continually working to update our Help Center articles for screen reader users, including how to use a screen reader with Google Calendar, how to use a screen reader with Google Drive, and how to make your document or presentation more accessible. Stay up to date on the latest accessibility features from Google for Education.

New ways we’re making speech recognition work for everyone

Voice-activated technologies, like Google Home or the Google Assistant, can help people do things like make a phone call to someone, adjust the lighting in their house, or play a favorite song — all with the sound of their voice. But these technologies may not work as well for the millions of people around the world who have non-standard speech. In 2019 we launched our research initiative Project Euphonia with the aim of finding ways to leverage AI to make speech recognition technology more accessible.

Today, we’re expanding this commitment to accessibility through our involvement in the Speech Accessibility Project, a collaboration between researchers at the University of Illinois Urbana-Champaign and five technology companies, including Google. The university is working with advocacy groups, like Team Gleason and the Davis Phinney Foundation, to create datasets of impaired speech that can help accelerate improvements to automated speech recognition (ASR) for the communities these organizations support.

Since the launch of Project Euphonia, we’ve had the opportunity to work with community organizations to compile a collection of speech samples from over 2,000 people. This collection of utterances has allowed Project Euphonia researchers to adapt standard speech recognition systems to understand non-standard speech more accurately, and ultimately reduce median word error rates by an average of more than 80%. These promising results created the foundation for Project Relate, an Android app that allows people to submit samples of their voice and receive a personalized speech recognition model that more accurately understands their speech. It also encouraged the expansion of Project Euphonia to include additional languages like French, Japanese, and Spanish.
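
To make the word error rate (WER) figure above concrete: WER compares a recognizer's transcript against a reference transcript using word-level edit distance, divided by the length of the reference. The Python snippet below is a minimal, self-contained sketch of that metric; it is not Project Euphonia code, and the example transcripts are invented purely for illustration.

def word_error_rate(reference: str, hypothesis: str) -> float:
    """Word-level edit distance (substitutions + insertions + deletions),
    divided by the number of words in the reference transcript."""
    ref, hyp = reference.split(), hypothesis.split()
    # dp[i][j] = edits needed to turn the first i reference words
    # into the first j hypothesis words
    dp = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        dp[i][0] = i
    for j in range(len(hyp) + 1):
        dp[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            dp[i][j] = min(dp[i - 1][j] + 1,          # deletion
                           dp[i][j - 1] + 1,          # insertion
                           dp[i - 1][j - 1] + cost)   # substitution or match
    return dp[len(ref)][len(hyp)] / max(len(ref), 1)

# Invented example: a personalized model that fixes two of three
# misrecognized words cuts WER from 0.5 to about 0.17 on this sentence.
reference = "please turn on the kitchen lights"
before = "peas turn on a kitten lights"
after = "please turn on a kitchen lights"
print(word_error_rate(reference, before))  # 0.5
print(word_error_rate(reference, after))   # 0.166...

The reported reduction of median word error rates by more than 80% corresponds to this kind of measurement, aggregated across many speakers and utterances.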

There’s still a lot to be done to develop ASR systems that can understand everyone’s voice — regardless of speech pattern. However, it’s clear that larger, more diverse datasets and collaboration with the communities we want to reach will help get us to where we want to go. That is why we’re making it easy for Project Euphonia participants to share copies of their recordings with the Speech Accessibility Project. Our hope is that by making these datasets available to research and development teams, we can help improve communication systems for everyone, including people with disabilities.

Google Workspace Updates Weekly Recap – September 30, 2022

New updates 


There are no new updates to share this week. Please see below for a recap of published announcements. 


Previous announcements


The announcements below were published on the Workspace Updates blog earlier this week. Please refer to the original blog posts for complete details.



New Google Calendar shortcuts improve glanceability for screen reader users 
Users of screen readers can now type a keyboard shortcut that verbalizes the content of an event, such as the title, date and time, guest list, and much more. | Learn more



For a recap of announcements in the past six months, check out What’s new in Google Workspace (recent releases).

New Google Calendar shortcuts improve glanceability for screen reader users

What’s changing 

In addition to the improved announcements for braille comments and highlights available in Google Docs on Web, and the recent launch of more control over accessibility preferences, we’re introducing Announce Shortcuts for Calendar event details. 

Users of screen readers can now type a keyboard shortcut that verbalizes the content of an event, such as the title, date and time, guest list, and much more. These shortcuts greatly improve glanceability because they enable Calendar details to be heard on demand instead of through time-consuming navigation.


Getting started 

  • Admins: There is no admin control for this feature. 
  • End users: 
    • To view keyboard shortcuts in Calendar, type: 
      • Ctrl+/ on Windows and ChromeOS 
      • Cmd+/ on Mac 
    • Use the following keyboard combinations to access Announce Shortcuts: 
      • Alt + [1-7] for Windows
      • Alt + Shift + [1-7] for ChromeOS
      • Option + [1-7] for Mac
    • Visit the Help Center to learn more about using a screen reader with Google Calendar. 

Rollout pace 



Availability 

  • Available to all Google Workspace customers, as well as legacy G Suite Basic and Business customers 
  • Available to users with personal Google Accounts 

Resources 


Roadmap 

  • This feature was listed as an upcoming release.

Recovery, community and healing on the job at Google

I can still vividly remember, just shy of a year ago, scanning The Keyword and coming across the headline, “How my recovery community helps keep me sober.”

Fresh out of grad school, I had been working at Google for just three months and I had been in recovery for almost three years. It was the first time in my life I wasn’t using drugs and alcohol to cope with the stresses and insecurities of work. Before I found recovery, I thought I owed my academic and professional successes to substance abuse. I drank and used drugs “to relax,” make friends and numb the chronic depression that immobilized me otherwise. Frankly, I didn’t know if I was cut out for Google on my own.

Finding hope through community

When I opened that link and read about Google’s Recover Together website — which includes a searchable map to find nearby recovery groups and support resources for people and their families — and saw that it featured an actual Googler in recovery, I knew I was in the right place. Addiction is still too often shamed and silenced, so it’s all the more commendable for a company like Google to use its technology, finances and branding capital to bring resources to the millions of people impacted.

The compassion and dignity of that story made me feel hopeful that I could make it at Google clean and sober – but I realized I may not have to do it “on my own.” After some searching, I found that Google's Disability Alliance Employee Resource Group had a dedicated group for those in recovery from any form of addiction. I had already been taking advantage of individual counseling through Google’s Employee Assistance Program, but for me there is nothing like building community to support healing. Over the past year, the recovery group has supported me through onboarding, battling imposter syndrome and other work-related experiences that would have previously sent me searching for solace at the bottom of a bottle.

We do recover – together

It’s difficult to express gratitude for the vulnerability, courage and wisdom the recovery community has brought into my life. Part of that is why I’m so excited to amplify my personal impact and be a part of the group working this year to host a slew of events for National Recovery Month.

On September 7, Google’s internal recovery group hosted an event embodying what recovery awareness and advocacy is all about: showing up, speaking up and standing up over and over and over again. The event featured a stop from Mobilize Recovery Across America’s cross-country tour and representatives from the federal Substance Abuse and Mental Health Services Administration (SAMHSA). Attendees shared personal stories of addiction and recovery, tips for making events inclusive (like providing non-alcoholic options), information on where to dispose of prescription drugs properly, and tangible resources on how to help someone find recovery treatment or access immediate assistance (like the 988 crisis lifeline). To conclude the evening, the Google campus was lit up purple, the official color for Recovery Month.

Mobilize Recovery bus parked next to Google campus, lit purple in celebration of recovery month

Hilary Swift for Mobilize Recovery

This month, Google added new personal recovery stories, including mine, to its Recover Together site to inspire hope and combat stigma. U.S. trends and data tell us this is needed now more than ever. Comparing January through September of 2021 to the same period in 2022, U.S.-based Google searches for “AA meeting locator” and “addiction treatment near me” increased by 350% and 85%, respectively. Further, a national study by the Pew Research Center reports that nearly half of Americans have a family member or friend impacted by addiction, with a fairly even distribution across political party, gender and other markers of identity. My hope is that videos and stories like mine will help others feel less alone, and that they help people find a way to join me and the other 25 million Americans thriving in long-term recovery.

Whether you’re just beginning your journey, or well along the path, know that recovery is possible. We do not have to self-medicate in the shadows. My experience has taught me that the more we open up and reach out, the easier it all becomes.

Visit g.co/recovertogether to find recovery support groups in your area, and check out mobilizerecovery.org/ for more information.

More control over accessibility preferences in Docs, Sheets, Slides, and Drawings

Quick summary

Over the years, we’ve launched features to support our ongoing accessibility efforts to ensure our products work well for everyone. For users of screen readers, braille devices, screen magnifiers, and more, we're improving the ability to adjust your accessibility preferences for Docs, Sheets, Slides, and Drawings separately. 

Rather than having the same accessibility settings apply across these products, you’re now able to set preferences for each product individually. We expect this change to make it easier to ensure accessibility settings are personalized to best meet each user’s needs. 

Accessibility settings can now be personalized for Docs, Sheets, Slides, and Drawings

Getting started 

  • Admins: There is no admin control for this feature. 
  • End users: In your document, spreadsheet, slide deck, or drawing, navigate to Tools > Accessibility > select your preferred settings. Visit the Help Center to learn more about Accessibility.

Rollout pace 

 Availability 

  • Available to all Google Workspace customers, as well as legacy G Suite Basic and Business customers 
  • Available to users with personal Google Accounts 

 Resources 

Roadmap 

Explore, communicate and customize with Android

Android is constantly adding features to better connect you with the people and devices around you. Today, we’re introducing a set of updates to help your phone stand out as much as you do. From more expressive ways to message your friends to subtle but smart upgrades to entertainment and accessibility, these updates help make every interaction with your Android device more helpful than the last.

Add a personal touch to messaging with Gboard

Animated demonstration of a message turning into a decorative sticker at the touch of a button.

Caption: Add some flair to your messages with custom text stickers.

A picture is worth a thousand words — but Gboard can now turn your words into pictures, too. Previously available on Pixel phones, custom text stickers will soon be available to all Android Gboard users typing in English-U.S., allowing you to type what you want to say, select a design and share your message with your nearest and dearest.

Celebrate summer and Pride with new Emoji Kitchen stickers

Animation of a hand opening Messages, creating a watermelon soccer ball emoji and sending it to a contact.

Caption: Enjoy more celebratory emoji mashups to share with your friends.

New emoji mashups have arrived just in time for summer (for those of you in the Southern Hemisphere, we’ve got you covered too) with Emoji Kitchen. There are more than 1,600 new combinations to help you express your excitement — like when you want to show how much you’re looking forward to your upcoming summer vacation, or add a little hot summer twist to your usual go-to emoji (watermelon soccer ball, anyone?). We also have lots of rainbow-based stickers to help you embrace Pride Month in many unique ways.

Better conversations and connections with new accessibility features

Animated demonstration of how Sound Amplifier settings can boost audio and reduce background noise levels.

Caption: Amplify the sounds you want to hear, and filter out the sounds you don't.

Designed for and with people with hearing loss, Sound Amplifier uses your phone to amplify and filter important sounds around you. Today’s update brings improved background noise reduction, faster and more accurate sound and a revamped user interface that is easier to see.

Animated demonstration of Lookout generating a detailed description of an image sent via Gmail.

Caption: Hear a detailed description of images from just about any browser or app with Lookout Images mode.

Designed with and for people with low vision or blindness, Lookout uses your Android device’s camera to provide information about the world around you with a variety of modes. Now with the new Images mode, which uses Google’s latest machine learning model for image understanding, you can hear a description of an image by simply opening it from just about any app. In addition, enhancements to Text mode, Documents mode, Food Label mode and Explore mode are making Lookout more accurate. Lookout now also works offline without the need for Wi-Fi or data service. Download or update Lookout in Google Play to get the new features.
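
For readers curious about the general pattern behind a feature like Images mode (generate a description of an image, then read it aloud), here is a small illustrative sketch in Python. It is not Lookout's code or its on-device model; it assumes the open-source Hugging Face transformers library for captioning, the pyttsx3 package for offline text-to-speech, and a hypothetical local file named photo.jpg.

# Illustrative only: an open-source stand-in for the "describe, then speak"
# pattern, not Lookout's actual implementation.
from transformers import pipeline   # pip install transformers pillow
import pyttsx3                      # pip install pyttsx3

# Any image-to-text model works here; this model name is just an example choice.
captioner = pipeline("image-to-text", model="Salesforce/blip-image-captioning-base")

def describe_and_speak(image_path: str) -> str:
    outputs = captioner(image_path)             # e.g. [{"generated_text": "a bowl of soup on a table"}]
    description = outputs[0]["generated_text"]
    engine = pyttsx3.init()                     # offline text-to-speech engine
    engine.say(description)                     # read the description aloud
    engine.runAndWait()
    return description

print(describe_and_speak("photo.jpg"))          # "photo.jpg" is a placeholder path

On a phone, the same pattern would use an on-device vision model and the platform's own speech output rather than these desktop packages.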

Use your Google Play Points for items in your apps and games at checkout

Video demonstration of using Play Points to get an in-app item without ever leaving the app.

Caption: Use your Play Points for in-app items at checkout without ever leaving the app.

Google Play Points is a rewards program that lets you earn points and rewards for the ways you already use Google Play. You’ll soon be able to use your Play Points for in-app items at checkout, without leaving your favorite apps and games. Cover the entire item with Play Points, or split the cost between Play Points and another form of payment. This is rolling out over the coming weeks in countries where Play Points is available.

These updates add to countless ways Android already helps you connect with others and the world around you. Visit android.com to learn more about these features and more.