Tag Archives: accessibility

Four ways to share your values with shoppers

Today's shoppers are increasingly looking for businesses that share their values. According to a recent global study, purpose-driven buyers now make up the largest segment of consumers, with 44% choosing products and brands based on how well they align with their values.

If you’re a business owner who prioritizes values such as sustainability, you can let your customers know you care. Many are already using Business Profile attributes on Search and Maps to showcase their commitment to social change.

Inside Google Korea’s new accessible office space

I don’t think I’ll ever forget the feeling of walking through the newly opened 28th floor of Google Korea. The space has been reimagined with a focus on “universal design” — meaning it was designed to be accessible to people of all abilities.

The idea for this space started a few years ago, when I was talking to other members of the Disability Alliance Employee Resource Group (ERG) in Korea about their web accessibility project — a conversation that then shifted to improving accessible design in the office. Was our office truly as accessible as it could be? Did everyone feel that they could do their best work without any restrictions due to their abilities? We pinpointed some areas for improvement, and that sparked a desire to make a change.

The Disability Alliance then partnered with Google's Real Estate & Workplace Services team to explore how we could implement some of these changes, especially as we expanded our space in Gangnam. Bit by bit, we made improvements to our existing office space, from adding braille to meeting room signs to adding drop-down thresholds for doors.

And when we had the opportunity to influence a brand new floor, we embraced the concept of universal design to co-design alongside the REWS team. Throughout the whole process, we incorporated feedback and co-designed with many people in our community — including Inho, a software engineer with a visual impairment. The design team made all designs and plans available in braille, so that anyone who was visually impaired could still review them.

Seeing our carefully thought-out plans begin to take shape was incredible. Finally stepping into the finished space took my breath away, and I was so excited just thinking of how this could help so many of our colleagues thrive.

But don’t just take my word for it! Take a look at these four design details, and why they make such a difference.

We’re proud of how we've applied universal design principles in Google Korea, but we know this isn’t the end of the journey. In fact, I like to think that we’re just getting started. We’re constantly learning and seeking to understand the needs of all people — that’s how we can develop solutions that enable everyone to succeed.

Look to Speak launches in Ukraine

Nearly two years ago, Google launched Look to Speak, an Android app that allows people to use their eyes to select pre-written phrases and have them spoken aloud. Since then, the app has launched in 18 additional languages. Most recently, we made the app available in Ukrainian to help refugees and veterans of the war.

As a speech and language therapist working at Google, I’ve seen how technology can help people express their everyday needs, feelings and identity. To hear how Look to Speak can be particularly helpful in Ukraine, where people are dealing with the injuries and side effects of war, I spoke with Oksana Lyalka, the founder and president of the Ukrainian Society for Speech and Language Therapy.

What is the situation like for veterans and refugees of the war in Ukraine who have speech and motor impairments?

Due to direct injuries and conditions caused by the war, the number of people with both motor and speech impairments is likely increasing. In addition, indirect impacts like stress and malnutrition raise the risk of strokes, which can also lead to motor and speech impairments, while access to care remains limited. For many refugees who have left Ukraine, it’s difficult to get the help they need: many already have chronic impairments, and their insurance does not cover therapy for communication disorders in another country. Communication disorders are also language-specific, so it’s hard for refugees to find help in their native language outside of Ukraine.

What are the specific challenges that people are facing?

They are mainly left on their own with speech and motor impairments, for three reasons: 1) there’s a shortage of speech and language therapists; 2) even fewer therapists understand what these patients deal with; and 3) therapy is costly, and not everyone has the resources to afford it, especially in wartime.

How could a tool like Look to Speak be helpful in Ukraine?

When someone has only a speech disorder, they can still write to communicate. But when there are also motor disorders like we’ve discussed, people can end up with no way to communicate. With Look to Speak, even if someone can’t communicate using their mouth, they can communicate with their eyes. This allows caregivers and others in their environment to listen and understand in new ways. Communication is a two-way process, and the Look to Speak app can act as a bridge.

The First Lady of Ukraine Olena Zelenska on the Look to Speak app:

"One of everyone’s fundamental needs is the ability to communicate and interact with those around them. For most people, it is unnoticeable and automatic, similar to breathing. However, due to various factors, a person may lose this ability and be unable to talk or use a computer, tablet, mobile phone, or other devices. Especially now, in times when the war daily multiplies the chances of finding oneself in such conditions, we as a society must unite and help each other as much as we can to overcome these terrible circumstances. One of the examples of Ukraine's responsible cooperation with world technological leaders is the localization of Google’s Look to Speak app. It helps people with motor and speech impairments to communicate using their eye movements. It is good to know that Ukrainian public organizations in health care, medical institutions, and everyone who needs it can use advanced digital solutions now adapted to the needs of Ukrainian users. I am sure that initiatives like Look to Speak will not only provide new opportunities for our citizens but will also serve as a model for other technological companies that are now supporting Ukraine."

To learn more about Look to Speak in Ukraine, watch this video in Ukrainian.

Improving accessibility led this UX researcher to Google

Welcome to the latest edition of “My Path to Google,” where we talk to Googlers, interns and alumni about how they got to Google, what their roles are like and even some tips on how to prepare for interviews.

Today’s post is all about Jerry Robinson, a user experience (UX) researcher on our Central Product Inclusion, Equity and Accessibility team.

What’s your role at Google?

I’m the lead UX researcher on the Lookout team. Lookout is an Android app that uses AI to help people who are blind or have low vision perform daily tasks faster. It can read text and detect different objects within the camera’s field of view. One of my favorite features is the food label mode, which can quickly identify food products — like whether you’re holding a can of chicken or tomato soup.

I conduct research with current and potential Lookout users to find opportunities to make the app more useful. I love this part of my job because I get to hear directly from the people using our products and share what I’ve learned with my teammates. It’s a privilege to be in a role where I can help our product teams better understand our users and carry out Google’s mission to make information universally accessible.

Jerry stands outside next to a sign with the Google logo.

Can you tell us a bit about yourself?

I graduated from Morehouse College in 2004 with a degree in accounting. After working in the banking industry for five years, I decided to go to grad school and find a career where I could make an impact on people’s everyday lives. Also, as someone with a disability living in a world not always designed with people like me in mind, I was interested in accessibility and in ways to support people with disabilities on their own terms.

I earned a Master of Science in Information Management and a PhD in Information Science. My dissertation focused on accessible design from the perspective of people with physical disabilities who find their own ways to adapt to inaccessible situations in their everyday lives.

How did the application and interview process go for you?

A few years before I applied, I met a Googler at an assistive technology conference who told me about the open role. I expressed interest and connected with a recruiter, and eventually received a referral from another Googler.

My biggest concern during the interview process was communication. I have a distinct speech pattern because of my cerebral palsy, and I’ve always been concerned that potential employers might hold that against me. However, I knew that Google had an inclusive work environment. And I was confident in my ability to conduct good UX research.

The interview process actually assured me that I was a strong candidate. My interview committee asked tough questions, but they were extremely thoughtful and kind. One of them told me to think of the interview more as a conversation, while another complimented me on my presentation. I felt a level of respect from the very beginning that put me at ease and made me more certain that I wanted to work here.

Jerry in a Google office micro-kitchen.

What inspires you to come in (or log on) every day?

I’m inspired by all the Google UXers I work with who are passionate about designing for everyone. Google, and the tech industry overall, needs people who are dedicated to making accessible design the norm rather than an afterthought.

What resources did you use to prepare for your interview?

My recruiter was incredibly helpful. He gave me tips about what to communicate during each interview round, including how best to present the scope, complexity and impact of my work. I practiced my final presentation several times before my last round of interviews to build up my confidence. And I went to bed early the night before to make sure I felt rested.

Any tips to share with aspiring Googlers?

Do all that you can to prepare, but also be confident in what you bring to the table. Know that you’re going through the process because you’re already a qualified candidate. Remind yourself that as often as you need to.

Expanding accessible learning with Google for Education

The need for accessible tools and equitable learning environments is more critical than ever, as the number of students with disabilities, including those with specific learning disabilities, continues to rise.

Google for Education builds accessibility features directly into its products to support the needs of all students and foster inclusive learning environments. These features provide individualized support while giving students the resources they need to learn collaboratively.

Captions as a tool across Google products

We aim to build helpful features across all our products. One of those features is captions, which are useful not only for people who are deaf or hard of hearing, but also when a room is loud, when a student needs extra help focusing, or when someone needs support in a different language. Captions are available in Google Meet in six languages, and you can change their font size and placement on the screen. You can also access and add captions on videos in YouTube, Google Drive and Chrome. For Android users, captions are also available through Live Transcribe.

Gif of multiple types of captions across Google products

Building accessibility tools into Chromebooks

Accessibility features like Select-to-Speak, the ChromeVox screen reader and magnification are easy to use and built directly into Chromebooks, enabling every individual to do their best work. Now we’ve added more dictation improvements, like the ability to speak into any text field on a Chromebook simply by clicking the mic icon in the status area. You can also press Search + d to dictate, and you can now edit using just your voice.

More customization options in Google Workspace for Education

We recently announced more customization for accessibility settings in Google Docs, Sheets, Slides and Drawings, so users can configure accessibility settings for each product individually. And soon, we’ll build on that by consolidating the Docs settings for screen reader and braille support into a single checkbox. We’ll also soon be adding improvements to voice typing in Google Docs, voice typing for speaker notes in Google Slides, and captions in Google Slides, including automatic punctuation and support for all browsers.

For people who are blind or have low vision and use screen readers, there’s now a keyboard shortcut (Alt + 1 through 7) that verbalizes the content of a Calendar event. This way, Calendar details can be heard on demand, instead of through time-consuming navigation.

Working with partners to expand accessible tools

We’re supporting teachers through our own tools and partnerships with organizations that share our mission. Many of these apps and extensions integrate with Google tools like Classroom, Google Workspace for Education, and Chromebooks.

This includes Texthelp, a company that makes extensions and tools to help people learn, understand and communicate. Students can use Read&Write for reading support and Equatio for creating math equations digitally. SnapType, created by an occupational therapist, breaks down barriers to education by helping students feel more confident and independent: students challenged by handwriting or visual impairments can keep up with their peers by using SnapType to take a picture of an assignment and then type or dictate their schoolwork.

We’re also continually working to update our Help Center articles for screen reader users, including how to use a screen reader with Google Calendar, how to use a screen reader with Google Drive, and how to make your document or presentation more accessible. Stay up to date on the latest accessibility features from Google for Education.

New ways we’re making speech recognition work for everyone

Voice-activated technologies, like Google Home or the Google Assistant, can help people call someone, adjust the lighting in their house, or play a favorite song, all with the sound of their voice. But these technologies may not work as well for the millions of people around the world who have non-standard speech. In 2019, we launched our research initiative Project Euphonia with the aim of using AI to make speech recognition technology more accessible.

Today, we’re expanding this commitment to accessibility through our involvement in the Speech Accessibility Project, a collaboration between researchers at the University of Illinois Urbana-Champaign and five technology companies, including Google. The university is working with advocacy groups, like Team Gleason and the Davis Phinney Foundation, to create datasets of impaired speech that can help accelerate improvements to automated speech recognition (ASR) for the communities these organizations support.

Since the launch of Project Euphonia, we’ve had the opportunity to work with community organizations to compile a collection of speech samples from over 2,000 people. This collection of utterances has allowed Project Euphonia researchers to adapt standard speech recognition systems to understand non-standard speech more accurately, and ultimately reduce median word error rates by an average of more than 80%. These promising results created the foundation for Project Relate, an Android app that allows people to submit samples of their voice and receive a personalized speech recognition model that more accurately understands their speech. It also encouraged the expansion of Project Euphonia to include additional languages like French, Japanese, and Spanish.
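To give a sense of the word error rate (WER) metric cited above: WER is conventionally the word-level edit distance between a reference transcript and the recognizer's hypothesis, divided by the number of reference words. The sketch below is a minimal illustration of that calculation, not Project Euphonia's actual evaluation code, and the example sentences and the assumed 80% relative reduction are hypothetical.

```python
def wer(reference: str, hypothesis: str) -> float:
    """Word error rate: word-level edit distance divided by reference length."""
    ref, hyp = reference.split(), hypothesis.split()
    # Dynamic-programming table for Levenshtein distance over words.
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i  # deleting all remaining reference words
    for j in range(len(hyp) + 1):
        d[0][j] = j  # inserting all remaining hypothesis words
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            sub = 0 if ref[i - 1] == hyp[j - 1] else 1
            d[i][j] = min(
                d[i - 1][j] + 1,        # deletion
                d[i][j - 1] + 1,        # insertion
                d[i - 1][j - 1] + sub,  # substitution (or match)
            )
    return d[len(ref)][len(hyp)] / len(ref)

# Hypothetical example: 3 of 5 reference words are misrecognized.
baseline = wer("turn on the kitchen lights", "turn on a kitten light")
# A relative reduction of 80% means the adapted model's WER is one fifth of baseline.
adapted = baseline * (1 - 0.8)
print(round(baseline, 2), round(adapted, 2))  # prints: 0.6 0.12
```

A "median WER reduced by more than 80%" therefore means that, for a typical speaker in the study, roughly four out of five recognition errors made by the standard model were eliminated by the adapted one.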

There’s still a lot to be done to develop ASR systems that can understand everyone’s voice — regardless of speech pattern. However, it’s clear that larger, more diverse datasets and collaboration with the communities we want to reach will help get us to where we want to go. That is why we’re making it easy for Project Euphonia participants to share copies of their recordings to the Speech Accessibility Project. Our hope is that by making these datasets available to research and development teams, we can help improve communication systems for everyone, including people with disabilities.