
What drives Nithya Sambasivan’s fight for fairness

When Nithya Sambasivan was finishing her undergraduate degree in engineering, she felt slightly unsatisfied. “I wanted to know, ‘how will the technology I build impact people?’” she says. Luckily, she would soon discover the field of Human-Computer Interaction (HCI) and pursue her graduate degrees.

She completed her master’s and PhD in HCI focusing on technology design for low-income communities in India. “I worked with sex workers, slum communities, microentrepreneurs, fruit and vegetable sellers on the streetside...” she says. “I wanted to understand what their values, aspirations and struggles are, and how we can build with them in mind.”

Today, Nithya is the founder of the HCI group at the Google Research India lab and an HCI researcher at PAIR, a multidisciplinary team at Google that explores the human side of AI by doing fundamental research, building tools, creating design frameworks, and working with diverse communities. She recently sat down to answer some of our questions about her journey to researching responsible AI and fairness, and to championing historically underrepresented technology users.

How would you explain your job to someone who isn't in tech?

I’m a human-computer interaction (HCI) researcher, which means I study people to better understand how to build technology that works for them. There’s been a lot of focus in the research community on building AI systems and the possibility of positively impacting the lives of billions of people. I focus on human-centered, responsible AI; specifically looking for ways it can empower communities in the Global South, where over 80% of the world’s population lives. Today, my research outlines a road map for fairness research in India, calling for re-contextualizing datasets and models while empowering communities and enabling an entire fairness ecosystem.

What originally inspired your interest in technology? 

I grew up in a middle class family, the younger of two daughters from the South of India. My parents have very progressive views about gender roles and independence, especially in a conservative society — this definitely influenced what and how I research: things like gender, caste and poverty. In school, I started off studying engineering, which is a conventional path in India. Then, I went on to focus on HCI and designing with my own and other under-represented communities around the world.

Nithya smiling at a small child while working in the field.

How do Google’s AI Principles inform your research? And how do you approach your research in general?

Context matters. A general theory of algorithmic fairness cannot be based on “Western” populations alone. My general approach is to research an important long-term, foundational problem. For example, our research on algorithmic fairness reframes the conversation on ethical AI away from focusing mainly on Western, meaning largely European or North American, perspectives. Another project revealed that AI developers have historically focused more on the model — or algorithm — instead of the data. Both deeply affect the eventual AI performance, so being so focused on only one aspect creates downstream problems. For example, data sets may entirely miss sub-populations, so models trained on them may have much higher error rates for those groups, or be unusable. Or they could make outcomes worse for certain groups, by misidentifying them as suspects for crimes or erroneously denying them bank loans they should receive.
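
To make that concrete, here is a minimal, hypothetical sketch (mine, not from the interview and not Google code) of the kind of disaggregated evaluation that surfaces such gaps: computing a model’s error rate per sub-population instead of only in aggregate. The group names and numbers are invented.

from collections import defaultdict

def error_rate_by_group(labels, predictions, groups):
    """Error rate per sub-population; an aggregate metric can hide a group that fares far worse."""
    errors = defaultdict(int)
    counts = defaultdict(int)
    for label, pred, group in zip(labels, predictions, groups):
        counts[group] += 1
        errors[group] += int(label != pred)
    return {g: errors[g] / counts[g] for g in counts}

# Hypothetical loan-decision outcomes: 1 = should be approved, 0 = should be denied.
labels      = [1, 1, 0, 1, 1, 0, 1, 1]
predictions = [1, 1, 0, 1, 0, 1, 0, 1]
groups      = ["urban", "urban", "urban", "urban",
               "rural", "rural", "rural", "rural"]

print(error_rate_by_group(labels, predictions, groups))
# {'urban': 0.0, 'rural': 0.75}: a gap the 37.5% aggregate error rate hides.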

These insights not only enable AI systems to be better designed for under-represented communities; they also generate new considerations in the field of computing for humane and inclusive data collection, gender and social status representation, and privacy and safety needs of the most vulnerable. They are then incorporated into Google products that millions of people use, such as Safe Folder on Files Go, Google Go’s incognito mode, Neighbourly’s privacy features, Safer by Google Maps and Women in STEM videos.

What are some of the questions you’re seeking to answer with your work?

How do we challenge inherent “West”-centric assumptions about algorithmic fairness and tech norms, and make AI work better for people around the world?

For example, there’s an assumption that algorithmic biases can be fixed by adding more data from different groups. But in India, we've found that data can't always represent individuals or events for many different reasons like economics and access to devices. The data could come mostly from middle class Indian men, since they’re more likely to have internet access. This means algorithms will work well for them. Yet over half the population — primarily women and rural and tribal communities — lacks access to the internet and is left out. Caste, religion and other factors can also contribute to new biases for AI models.

How should aspiring AI thinkers and future technologists prepare for a career in this field? 

It’s really important that Brown and Black people enter this field. We not only bring technical skills but also lived experiences and values that are so critical to the field of computing. Our communities are the most vulnerable to AI interventions, so it’s important we shape and build these systems. To members of this community: Never play small or let someone make you feel small. Involve yourself in the political, social and ecological aspects of the invisible, not in tech innovation alone. We can’t afford not to.

Chrome OS’s Jenn Chen on a decade of design

Ten years ago, Chrome OS principal designer Jenn Chen was hardly what you’d call a techie. “I was the last person I knew who got a smartphone,” she says, laughing. “I was a total Luddite! I didn’t want to do it!” But today, things are different — and not just for Jenn. The devices we use and how we use them have both changed dramatically over the years. “Technology plays a bigger part in our day to day,” she says. “So it’s increasingly important that we have a human, respectful approach in how we design and build products.”

Chrome OS embraced that change, and Jenn’s seen the evolution from the inside. Originally, she was the only person on the team dedicated to Chrome OS user experience (UX) — now, she leads an entire team. We recently had the chance to talk to Jenn about a decade of Chrome OS, and what her path to design work was like. 

What kickstarted your interest in working in UX and design?

Growing up, I had a lot of different interests but never felt like they quite added up to a clear career path. I dabbled in biology because I loved marine life, read up on cognition because I was fascinated by how minds worked and even explored being a full-time pianist. One day in college, I tagged along with a friend who organized a visit to a design agency and I found it absolutely riveting. Here were different people with different professions — anthropologists, surgeons, engineers — all working together to solve a problem through a multifaceted, human-centered approach which I learned was called “design thinking.” This really sparked my interest in learning more about product design and building creative solutions to serve real user needs, which led to studying HCI (human-computer interaction) and user experience.

What’s the “movie version” of your job? How is it portrayed in pop culture, and how does that compare to reality? 

The perception is that UXers are in the lab all day, and that every user insight we learn immediately leads to a light bulb moment and design solution! There’s so much testing out ideas, learning that they won’t work and moving on — or years later, bringing that thing back and seeing there is something there, but the timing wasn't right or the tech wasn’t ready before. There’s a lot of constant failure. We designers call it “iteration,” but I think people forget that also means being wrong a lot — and being OK with being wrong, because it helps us learn. The movie version of my job glosses over all that.

Chrome OS was such a new idea. What were some of the early challenges of launching something so different?

Computers have been around much longer than Chromebooks, so people have established expectations and habits. The challenge is meaningfully rethinking what a computer can be while also meeting people where they are. I’ve been incredibly lucky to work with and learn from experts in this space as a part of the Chrome OS team and a part of the broader Google UX community.

One good example of this was that Chrome OS started out with a minimal approach when it came to task management: Users could only have full-screen windows with multiple tabs. We quickly learned that how people manage their tasks is personal, so flexibility is absolutely necessary. We introduced more window controls and tools over time. Today, we've expanded task management with Desks to help people organize their apps, windows and tabs across virtual workspaces, but still benefit from a simplified, more constrained model when they only have a touchscreen handy.

Early Chrome OS task management

Chrome OS desks in 2021

Jenn Chen’s survey from 10 years ago

What new launches are you excited about?

So many things! The team has been hard at work on a whole suite of features for Chrome OS’s 10th birthday. I’m really excited about the everyday efficiencies we’ve built, whether it’s helping you find that article you had open on your phone with Phone Hub or making screenshots and recordings more precise with Screen Capture — definitely things that I use daily as a designer. 

Ten years later, what keeps you interested in this work?

I came from the startup world, and to be totally honest I didn’t think I’d be at a larger company for this long. But one of the things I love about working on Chrome OS is that it’s kind of like a startup in a big company: We’ve come a long way after starting out as a little fish in this pond, there’s much more we aspire to do, and I get the huge privilege of being a part of the journey with an amazing team of people. 

What’s especially motivating for me is witnessing how computing impacts people’s economic and social mobility — whether it’s being part of the distance learning solution in a pandemic or supporting refugees in settling into their new communities. I’m excited to see how some of the bets we’ve made play out, and to be a part of shaping the future of computing.

Meet the researcher creating more access with language

When you’ve got your hands full and you use your voice to ask your phone to play your favorite song, it can feel like magic. In reality, it’s a more complicated combination of engineering, design and natural language processing at work, making it easier for many of us to use our smartphones. But what happens when this voice technology isn’t available in our own language?

This is something Google India researcher Shachi Dave considers as part of her day-to-day work. While English is the most widely spoken language globally, it ranks third as the most widely spoken native language (behind Mandarin and Spanish)—just ahead of Hindi, Bengali and a number of other languages that are official in India. Home to more than one billion people and an impressive number of official languages—22, to be exact—India is at the cutting edge of Google’s language localization, or L10n, efforts (the 10 represents the number of letters between the ‘l’ and the ‘n’).
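
As an aside, the abbreviation follows a simple mechanical rule: keep the first and last letters and count the letters in between. A quick illustrative sketch (mine, not from the article):

def numeronym(word):
    """Abbreviate a word as first letter + count of middle letters + last letter."""
    if len(word) <= 3:
        return word
    return f"{word[0]}{len(word) - 2}{word[-1]}"

print(numeronym("localization"))          # l10n
print(numeronym("internationalization"))  # i18n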

Shachi, who is a founding member of the Google India Research team, works on natural language understanding, a field of artificial intelligence (AI) which builds computer algorithms to understand our everyday speech and language. Working with Google’s AI principles, she aims to ensure teams build our products to be socially beneficial and inclusive. Born and raised in India, Shachi graduated with a master’s degree in computer science from the University of Southern California. After working at a few U.S. startups, she joined Google over 12 years ago and returned to India to take on more research and leadership responsibilities. Since she joined the company, she has worked closely with teams in Mountain View, New York, Zurich and Tel Aviv. She also actively contributes towards improving diversity and inclusion at Google through mentoring fellow female software engineers.

How would you explain your job to someone who isn't in tech?

My job is to make sure computers can understand and interact with humans naturally, a field of computer science we call natural language processing (NLP). Our research has found that many Indian users tend to use a mix of English and their native language when interacting with our technology, so that’s why understanding natural language is so important—it’s key to localization, our efforts to provide our services in every language and culture—while making sure our technology is fun to use and natural-sounding along the way.

What are some of the biggest challenges you’re tackling in your work now?


The biggest challenge is that India is a multilingual country, with 22 official languages. I have seen friends, family and even strangers struggle with technology that doesn’t work for them in their language, even though it can work so well in other languages. 

Let’s say one of our users is a shop owner and lives in a small village in the southern Indian state of Telangana. She goes online for the first time with her phone. But since she has never used a computer or smartphone before, using her voice is the most natural way for her to interact with her phone. While she knows some English, she is also more comfortable speaking in her native language, Telugu. Our job is to make sure that she has a positive experience and does not have to struggle to get the information she needs. Perhaps she’s able to order more goods for her shop through the web, or maybe she decides to have her services listed online to grow her business. 

So that’s part of my motivation to do my research, and that’s one of Google’s AI Principles, too—to make sure our technology is socially beneficial. 

Speaking of the AI Principles, what other principles help inform your research?

Another one of Google’s AI Principles is avoiding creating or reinforcing unfair bias. AI systems are good at recognizing patterns within data. Given that most data that we feed into training an AI system is generated by humans, it tends to have human biases and prejudices. I look for systematic ways to remove these biases. This requires constant awareness: being aware of how people have different languages, backgrounds and financial statuses. Our society has people from the entire financial spectrum, from super rich to low-income, so what works on the most expensive phones might not work on lower-cost devices. Also, some of our users might not be able to read or write, so we need to provide some audio and visual tools for them to have a better internet experience.

What led you to this career and inspired you to join Google?  

I took an Introduction to Artificial Intelligence course as an undergraduate, and it piqued my interest and curiosity. That ultimately led to research on machine translation at the Indian Institute of Technology Bombay and then an advanced degree at the University of Southern California. After that, I spent some time working at U.S. startups that were using NLP and machine learning. 

But I wanted more. I wanted to be intellectually challenged, solving hard problems. Since Google had the computing power and reputation for solving problems at scale, it became one of my top choices for places to work. 

Now you’ve been at Google for over 12 years. What are some of the most rewarding moments of your career?

Definitely when I saw the quality improvements I worked on go live on Google Search and Assistant, positively impacting millions of people. I remember I was able to help launch local features like getting the Assistant to play the songs people wanted to hear. Playing music upon request makes people happy, and it’s a feature that still works today. 

Over the years, I have gone through difficult situations as someone from an underrepresented group. I was fortunate to have a great support network—women peers as well as allies—who helped me. I try to pay it forward by being a mentor for underrepresented groups both within and outside Google.

How should aspiring AI researchers prepare for a career in this field? 

First, be a lifelong learner: The industry is moving at a fast pace. It’s important to carve out time to keep yourself well-read about the latest research in your field as well as related fields.

Second, know your motivation: When a problem is super challenging and super hard, you need to have that focus and belief that what you’re doing is going to contribute positively to our society.

Alana Karen’s new book shares stories from women in tech

When she read stories about women in tech, Alana Karen kept seeing the same theme, over and over. Generally, it seemed like they couldn’t find a sense of belonging in the industry, and as a result, would leave their jobs. But based on her 19 years at Google, and what she knew about her own colleagues, she suspected there was much more complexity and nuance to these women’s experiences. “I wanted to get beyond generic advice and get into the tough stories,” she says. And she wanted to focus on why women stay in tech, not just why they leave.

She captures a collection of those stories in her book, “The Adventures of Women in Tech: How We Got Here and Why We Stay.” I talked to Alana over Google Meet about how she approached writing the book while juggling a full-time job, managing a team and spending time with family—plus, what she’s learned as the Director of Special Projects for Search and from her own career in the tech industry.

The cover of "The Adventures of Women in Tech"

How do you explain your job to people who don’t work in tech?

I work on the Search team, focusing on the infrastructure behind the product, and I help engineers make things happen. That means helping the team set goals, track against those goals, share status updates and communicate with others.

What’s one habit that makes you successful?

I think of my work as tending to a garden: I’m the farmer who focuses on fertilizing the soil so that all of the flowers can grow. As a people-focused leader, I’m constantly thinking about how to motivate people and set them up to do their best work. 

Your book aims to represent a variety of stories of women who work in tech—and stay. Why do you think that's so important?

There’s a common narrative that women are having trouble finding a sense of belonging in the tech industry. And there I was, among so many powerful, dynamic, interesting women. It wasn’t that we hadn’t had struggles along the way (we had!), but we navigated them and we were still here.

It was important for me to show a breadth of these women’s stories for two reasons: one, to show women thinking about getting into tech—in any type of role, with any type of experience—that they belong, and this is doable. Two, I was curious if there was this silent, quiet problem where a lot of women were planning to leave tech. Was I just assuming everyone is OK?

In your conversations with the many women you interviewed, what surprised you the most?

One thing that did surprise me was we all had similar themes in our answers as to why we wanted to work in tech. We were all interested in changing the world and the opportunities our careers afforded us, and we liked the open and accepting culture and the meaningful work.

You touch on the theme of inclusion and how essential it is for women to feel qualified and appreciated for their work. What advice would you give those grappling with feelings of self-doubt?

By publishing all of these different stories, I want to show women they aren’t alone. One piece of advice I’d give would be to lean on the people who have been touchstones in your career. That person doesn’t have to be a mentor or sponsor in the traditional sense, but can be something more informal. 

And second, remember you are worth that. It can take years for women to see that they deserve to pay as much attention to themselves, to set their own boundaries, and to invest in themselves as much as they invest in others. I hope the book can help instill in people that they’re worth that earlier on in their careers.

Who has been a strong female influence in your life?

My mom, who worked at Rutgers College and was the primary income-earner for my family. The year I was born, women made 58.9 cents to each dollar men earned. Growing up, I watched my mom navigate her career, find her voice and figure out how to be a strong career person and mother. She showed me ways I wanted to emulate her, as well as ways I wanted to do things differently. She gave me the perspective that careers are long, and you can have different phases of them along the way.

Fernanda Viégas puts people at the heart of AI

When Fernanda Viégas was in college, it took three years with three different majors before she decided she wanted to study graphic design and art history. And even then, she couldn’t have imagined the job she has today: building artificial intelligence and machine learning with fairness and transparency in mind to help people in their daily lives.  

Today Fernanda, who grew up in Rio de Janeiro, Brazil, is a senior researcher at Google. She’s based in London, where she co-leads the global People + AI Research (PAIR) Initiative, which she co-founded with fellow senior research scientist Martin M. Wattenberg and Senior UX Researcher Jess Holbrook, and the Big Picture team. She and her colleagues make sure people at Google think about fairness and values–and putting Google’s AI Principles into practice–when they work on artificial intelligence. Her team recently launched a series of “AI Explorables,” a collection of interactive articles to better explain machine learning to everyone.

When she’s not looking into the big questions around emerging technology, she’s also an artist, known for her artistic collaborations with Wattenberg. Their data visualization art is a part of the permanent collection of the Museum of Modern Art in New York.  

I recently sat down with Fernanda via Google Meet to talk about her role and the importance of putting people first when it comes to AI. 

How would you explain your job to someone who isn't in tech?

As a research scientist, I try to make sure that machine learning (ML) systems can be better understood by people, to help people have the right level of trust in these systems. One of the main ways in which our work makes its way to the public is through the People + AI Guidebook, a set of principles and guidelines for user experience (UX) designers, product managers and engineering teams to create products that are easier to understand from a user’s perspective.

What is a key challenge that you’re focusing on in your research? 

My team builds data visualization tools that help people building AI systems to consider issues like fairness proactively, so that their products can work better for more people. Here’s a generic example: Let’s imagine it's time for your coffee break and you use an app that uses machine learning for recommendations of coffee places near you at that moment. Your coffee app provides 10 recommendations for cafes in your area, and they’re all well-rated. From an accuracy perspective, the app performed its job: It offered information on a certain number of cafes near you. But it didn’t account for unintended unfair bias. For example: Did you get recommendations only for large businesses? Did the recommendations include only chain coffee shops? Or did they also include small, locally owned shops? How about places with international styles of coffee that might be nearby? 

The tools our team makes help ensure that the recommendations people get aren’t unfairly biased. By making these biases easy to spot with engaging visualizations of the data, we can help identify what might be improved. 
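
As a rough illustration of what making such biases “easy to spot” can involve (a hypothetical sketch of mine, not PAIR’s actual tooling), one could compare each category’s share of the recommendations against its share of the full pool of nearby cafes. The categories and counts below are invented.

from collections import Counter

def representation_gap(recommended, pool):
    """Share of each category among recommendations minus its share of the candidate pool.

    A large negative gap flags a category that exists nearby but is
    squeezed out of the top results, the kind of skew worth visualizing.
    """
    rec = Counter(recommended)
    full = Counter(pool)
    return {c: rec[c] / len(recommended) - full[c] / len(pool) for c in full}

# Hypothetical nearby cafes (the candidate pool) and the app's ten recommendations.
pool = ["chain"] * 12 + ["local"] * 10 + ["international"] * 3
recommended = ["chain"] * 8 + ["local"] * 2

print(representation_gap(recommended, pool))
# roughly {'chain': 0.32, 'local': -0.2, 'international': -0.12}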

What inspired you to join Google? 

It’s so interesting to consider this because my story comes out of repeated failures, actually! When I was a student in Brazil, where I was born and grew up, I failed repeatedly in figuring out what I wanted to do. After I had spent three years studying different things—chemical engineering, linguistics, education—someone said to me, “You should try to get a scholarship to go to the U.S.” I asked them why I should leave my country to study somewhere when I wasn’t even sure of my major. “That's the thing,” they said. “In the U.S. you can be undecided and change majors.” I loved it!

So I went to the U.S. and by the time I was graduating, I decided I loved design but I didn't want to be a traditional graphic designer for the rest of my life. That’s when I heard about the Media Lab at MIT and ended up doing a master's degree and PhD in data visualization there. That’s what led me to IBM, where I met Martin M. Wattenberg. Martin has been my working partner for 15 years now; we created a startup after IBM and then Google hired us. In joining, I knew it was our chance to work on products that have the possibility of affecting the world and regular people at scale. 

Two years ago, we shared our seven AI Principles to guide our work. How do you apply them to your everyday research?

One recent example is from our work with the Google Flights team. They offered users alerts about the “right time to buy tickets,” but users were asking themselves, Hmm, how do I trust this alert?  So the designers used our PAIR Guidebook to underscore the importance of AI explainability in their discussions with the engineering team. Together, they redesigned the feature to show users how the price for a flight has changed over the past few months and notify them when prices may go up or won’t get any lower. When it launched, people saw our price history graph and responded very well to it. By using our PAIR Guidebook, the team learned that how you explain your technology can significantly shape the user’s trust in your system. 

Historically, ML has been evaluated along the lines of mathematical metrics for accuracy—but that’s not enough. Once systems touch real lives, there’s so much more you have to think about, such as fairness, transparency, bias and explainability—making sure people understand why an algorithm does what it does. These are the challenges that inspire me to stay at Google after more than 10 years. 

What’s been one of the most rewarding moments of your career?

Whenever we talk to students and there are women and minorities who are excited about working in tech, that’s incredibly inspiring to me. I want them to know they belong in tech, they have a place here. 

Also, working with my team on a Google Doodle about the composer Johann Sebastian Bach last year was so rewarding. It was the very first time Google used AI for a Doodle and it was thrilling to tell my family in Brazil, look, there’s an AI Doodle that uses our tech! 

How should aspiring AI thinkers and future technologists prepare for a career in this field? 

Try to be deep in your field of interest. If it’s AI, there are so many different aspects to this technology, so try to make sure you learn about them. AI isn’t just about technology. It’s always useful to be looking at the applications of the technology, how it impacts real people in real situations.

Maab Ibrahim works each day to fight for racial justice

In her role at Google, Maab Ibrahim works to guide the company on the path toward creating a more just and equitable future. And she draws from her personal experience to guide her work.


Growing up in Richmond, Virginia, Maab reckoned with the city’s painful history and observed everyday injustices like racism and economic inequality. As the child of Black immigrants, she noticed racial inequity in her own backyard as her family and community navigated structural barriers. Today, as a grant portfolio manager for Google.org, Maab has spent the last four years working alongside nonprofit leaders to find solutions to address racial injustice—from centering community-led voices in the movement to using data to identify and analyze bias in the criminal justice system. 

How do you describe your job at a dinner party?

I’m a philanthropic portfolio manager for Google.org. My core focus is to provide support, such as grant funding and technical expertise, to nonprofits that are working to advance racial justice across a number of issue areas, including criminal justice reform, education, and economic opportunity. 

What inspired you to pursue racial justice work?

I draw a lot of inspiration for this work from Black and Latino communities where I grew up in Richmond, Virginia. Virginia has a deep and pained racial history. Our state welcomed the very first slave ships to America, housed the Confederacy, and was a battleground for historic civil rights cases such as Loving v. Virginia. As I learned about this history, I couldn’t help but notice the racial disparities that continue to persist in Black and Brown neighborhoods. Children in our classrooms were over-suspended and over-punished. Parents were attempting to be present in their kids’ educations, while overcoming language barriers or managing two or three jobs. Many families had a loved one behind bars due to biased policing and harsh sentencing.  

Over the years, I’ve learned strategies and tools, but my personal experience continues to deeply influence my approach to the work. As one of very few Black women in philanthropy, I believe in trusting and supporting community leaders who are most proximate to racial injustice.


Can you describe Google.org’s approach to racial justice grantmaking and racial justice work? 

We've primarily directed our grant funding to criminal justice work over the last five years, making more than $44 million in grants and giving more than 15,000 hours in pro bono services to nonprofits working in that space. Google has a deep appreciation for data science; it's a part of our DNA. So our largest grants in this space have been to nonprofits working to close data gaps across the criminal justice system. For example, we’re funding work to identify bias in policing practices and jail population trends in rural communities. 

Alongside the criminal justice data work, we’re also funding community-led solutions. We take to heart the importance of centering on the dignity of marginalized communities and affirming the flourishing of Black and Brown lives. What that means in practice is funding organizations that are led by and advocating on behalf of Black or Latino communities, such as the Black Lives Matter Movement.

What have been some of the biggest challenges this year?

We saw the most recent racial justice uprising happen in the wake of COVID-19. People from all kinds of communities came out on the streets in response to the death of George Floyd and demanded change in our justice system. As a result, reforms kicked off across cities in America. But behind the mobilization, the Black community continues to feel the loss of many loved ones due to COVID-19. The pandemic has exacerbated systemic inequities in healthcare and in our economic system that leave Black communities most vulnerable.

From the grantee perspective, that means organizations and their staff are dealing with two crises at one time. It's been very challenging but I’m proud that we’re able to support groups like The Satcher Health Institute at Morehouse School of Medicine that are working to address these disparities.

What keeps you motivated and positive? 

Racial justice work, at its core, requires a necessary discomfort that drives progress forward. But at this moment, I’m feeling energized by the catalytic shift the U.S. is experiencing in addressing systemic anti-Black racism. I am deeply inspired by the visionary leaders that drive community-led solutions. For me, it's a great honor to be in solidarity with their work.

What advice do you have for women starting out in their career? 

When you are early in your career, it can feel like there’s so much to learn from the people around you. I’d ask young women to consider just how much the world has to learn from them, too. Young people are the driving force behind social movements, the first adopters of new technologies and more willing to imagine the world differently. That perspective is invaluable to innovation and progress.


Maggie Stanphill is making more mindful tech

If you’ve seen your weekly screen time go up over the past few months, you’re hardly alone. Maggie Stanphill, Google’s director of user experience (UX), has seen her stats go up, too. Maggie leads UX efforts for Google’s Digital Wellbeing initiative, and she’s noticed that the current state of the world requires an evolution in our understanding of how we can mindfully use tech. 

“The emphasis on screen time feels really tone deaf right now, right?” Maggie says. “Almost everyone who has access to tech is spending more and more time online. So many of us are getting those weekly screen time reports that say it’s gone up by some percent, and that might actually be aggravating.” But, she explains, there are ways digital tools can be more helpful. “We’re really trying to create a more nuanced approach. Look at sleep: We know queries for insomnia have gone way up, and we’re working on refining our tools to support that.”

I recently sat down with Maggie via Google Meet to talk more about her work in UX, and how Google’s Digital Wellbeing features are pivoting to meet people where they are. 

How would you describe your job at a dinner party to someone who doesn’t work in tech?

I’d say we conduct research with people around the world to better understand their needs and bring that perspective into the product design process to make tech more helpful and less intrusive. I also like to say that you can think of a UXer as the voice of the user in the room where decisions are being made. That’s where we play a role to advocate for people’s needs. 

What was your career path to UX like? 

I started as a journalist, actually; I got a degree in English. I loved storytelling, and I really found there was a natural transition when I moved into this field—storytelling was part of the design process. Narrative showed up in a variety of ways, like conducting ethnographic research and sharing people’s perspectives through tools like personas, which are character sketches that help UXers understand their core audience. Having that people-first lens is what’s really driven my career path. There was no “UX degree” when I was going through school, but I’ve found that focusing on understanding people’s needs and their goals translated really well in terms of what I needed to grow and be effective at this work. 

Have you heard of the StrengthsFinder quiz?

I’ve heard of it but I haven’t taken it!

One of my primary strengths is “input,” information gathering and synthesizing. I’m organizing information every day, in my brain and in Google Docs! It’s part of my process. I have to internalize things to feel like I can be fluent and translate those concepts to make sure our products are building toward a shared strategy and are easy to use. 

What specifically are you working on at Google right now?

My focus is two-fold. I work in an advocate role for the company-wide digital wellbeing initiative, and I also manage our Fit UX team. My interest in people and human behavior is very much what drew me to digital wellbeing. For the past two years we’ve spent our time defining what “digital wellbeing” means. More recently, we’ve tried to pivot from a focus on screen time as the most important metric to really empowering people by default. What I mean by that is there are ways we as product designers can build wellbeing into our product experiences, so people don’t have so much to worry about when it comes to using tech. Because we’re a cross-Google team, we’re really focused on providing expertise from a research-based set of best practices. So we established a digital wellbeing toolkit, which includes four key tenets: empowerment, awareness, control and adaptability, and applied those in a variety of ways. 

How do those tenets show up in products?

Sleep is a great example of how people’s fundamental needs drive Google’s product priorities. Sleep is so critical to overall wellbeing, and we’ve heard from people all over the world that they’re not getting enough of it. We imagined we could help people get more sleep by making our products more adaptable. You look at Android, Google Assistant and YouTube and realize that if they were more coordinated they could work together to help people get more sleep. That’s when we came up with Bedtime mode, which uses Clock to set your preferred schedule. That schedule then activates features like Grayscale and Do Not Disturb to help you disconnect, and stay that way. 

The underlying tenet that makes this work is adaptability, where each experience takes the person’s preferences into account; in this case, that preference is their bedtime schedules. Then all the devices and apps adjust to support those needs: Your bedtime clock notifies you to wind down, apps turn to Grayscale, and when your phone docks, it goes into Do Not Disturb automatically. Google Assistant also has a Bedtime routine that follows this schedule. 
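
To sketch what that adaptability might look like in code (a simplified hypothetical illustration, not Android’s actual implementation), the core idea is a single user-set schedule that every surface consults:

from datetime import datetime, time

# Hypothetical user-set bedtime window; in the real feature this comes from the Clock app.
BEDTIME_START = time(22, 0)   # 10:00 p.m.
BEDTIME_END   = time(6, 30)   # 6:30 a.m.

def in_bedtime_window(now):
    # This window crosses midnight, so it is the union of two ranges.
    return now >= BEDTIME_START or now <= BEDTIME_END

def active_features(now):
    """Each surface checks the same schedule, so they stay coordinated."""
    return ["grayscale", "do_not_disturb"] if in_bedtime_window(now) else []

print(active_features(datetime.now().time()))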


In your digital wellbeing research, is there anything that’s really surprised you?

Maybe it’s not fully surprising in the canon of human behavior, but in our annual survey of user sentiment, it always strikes me that people have more concern about others’ use of tech than their own. There’s a higher percentage of care related to their loved ones’ tech use, but when it comes to reflecting on their own, that percentage of care is much lower. We have a hard time seeing and changing our own behavior.

Has quarantine changed your thinking about digital wellbeing at all, or made anything more clear to you?

Aside from shifting the focus from screen time to positive use of tech, the other thing that jumped off the page for me is the interplay between digital access and mental health. We’ve seen an increase in people’s feelings of disconnection from others due to social isolation, and therefore, the use of tech is seen as positive because it helps them feel more connected. On the other hand, we’ve seen early indicators that income and race may determine a person’s access to tech, and that access can play a role in wellbeing. For example, certain populations are suffering from shared grief, given recent health crises and other events, and they can benefit from more digital tools that help with communication, mental health and more. Yet this gap remains between access to information and tools to support wellbeing. And we’re looking into ways to bridge that divide.

Rachel Spivey helps Googlers find their “happy place”

Rachel Spivey has been at a crossroads more than once during her 10 years at Google. “It can be challenging for anyone to navigate a large company, but it can feel especially isolating for underrepresented employees who might not see representation in leadership, or have sponsorship or an existing support network to lean on,” she says. 

Today, Rachel leads a team of retention and progression consultants, part of a program she helped start two years ago that helps employees from underrepresented groups stay and thrive at Google. Since the program started, Rachel’s team has retained more than 84 percent of the program’s participants.

We sat down with Rachel to discuss her role and the importance of diversity in the workplace. 

How do you explain your job at a dinner party?

I help underrepresented Googlers find “their happy place” at Google. It can be difficult to know where to go for career support, especially when you’re talking about a company as big and complicated as Google. But to make our products reflect our users, our employees need to be representative of all our users. Our team’s goal is to help ensure that, once they’re here, underrepresented Googlers stay and thrive. Sometimes, that means working with them to navigate a challenge in their current role, and other times, it’s connecting them to a new internal opportunity. 

You started the retention team two years ago. What was your inspiration?

The program started after we learned that Black+, Latinx+ and Native American+ employees in the U.S., where Google is able to report across race, were leaving Google at faster rates than the average, and women were leaving at a faster rate globally. Prior to this role, I was the Global Community Inclusion Lead for the Black Googler Network and HOLA (now Familia), two of our employee resource groups. During that time, I saw the retention challenges firsthand, which inspired me to champion our retention efforts. Through all my time at Google, mentors and others guided me through career highs and lows, and I wanted to help others the way they helped me. 

Talk more about the “attrition gap.” What is it and what does it mean for Google?

Attrition refers to the number of employees who are leaving a company. We spend a lot of time hiring, and once employees are here, we want them to stay. In order to improve overall representation, we need to improve retention. It’s our job to make sure underrepresented employees find satisfaction in their role, feel included at work and have opportunities to develop and grow. Right now, we’re the only company reporting attrition data externally and we’re using this data to inform how we approach our diversity and inclusion efforts. 

What specific things has your team done to improve retention?

If an underrepresented Googler is looking for support, they are referred to our team by People Operations, Employee Resource Groups, word of mouth or direct outreach. Each Googler in the program is then matched one-on-one with a retention and progression consultant to advocate on their behalf. The consultant might serve as a mentor or coach, connect the Googler to other support options, or locate internal mobility opportunities. 

For example, a Googler came to us thinking about leaving Google for a competitor. She enjoyed her role and team but wanted a career change. We connected her to a sponsor through the Black Leadership Advisory Group (BLAG), as well as the Mobility Experience team, which helps Googlers transfer to new roles, to help her find a new opportunity. Through this process, the Googler decided to stay at Google and got more involved in the Black+ leadership community. Today, she loves her new role and mentors other underrepresented Googlers on her new team. 

What are you most proud of? 

There is nothing more rewarding than helping Googlers find deeper career fulfillment at Google—whether in their current role, through internal mobility or making sure their feedback is heard. My team gets flooded with “thank you” notes, “you’ve changed my life” notes … Googlers seriously make me cry every day.

What’s next for your team?

As part of Google’s ongoing commitments to racial equity, our team will double in size, and each product area or function at the company will have a designated consultant. We’re also expanding our focus beyond retention to helping Googlers progress their careers.

What advice do you have for people who are part of underrepresented groups and starting out in their careers?

Stay focused on your purpose. No matter who you are, obstacles will likely come your way, but staying focused on your north star will help you stay grounded.

Avni Shah wants to keep learning going for everyone

Growing up, Avni Shah’s father drove an hour and a half to work every day so she and her sister could enroll in a better school district in Alabama. She later watched her parents put away savings for years to be able to afford college tuition for their daughters. From a very young age, she came to understand the meaning and importance of a good education. 


Today, as the VP of Google for Education, Avni works every day to help build tools that make a high-quality education available to everyone. That mission is especially important now as widespread school closures from the COVID-19 pandemic have challenged schools and families to quickly adjust to distance learning.


Through it all, Avni remains optimistic about the future of education and the role technology can play in shaping it. She says that over the past few months the resiliency of teachers and students alike has inspired her, and that there have been “bright spots” of positivity.  


How do you explain your job at a dinner party?


My team builds tools for teachers, students, and education leaders to help improve teaching and learning at scale. One thing that’s great about working at Google is that describing my job is pretty easy, and always a great conversation starter—I get lots of feature requests (and bug reports) wherever I go.


The use of technology in education is especially important now. What are you most excited about?


It’s been amazing to see the role technology has played to keep learning happening, no matter what. As I look ahead to the next six to twelve months, I’m excited about working alongside teachers, education leaders and students to build tools that can really meet their needs, both now—in this ever-changing situation—and for the future.


What we’ve seen in the past few months is an unexpected acceleration towards the digitization of education and learning. As that shift continues to happen, I see an opportunity longer-term to unlock even more of the potential of technology and the role it can play in being helpful to teachers, students and families. While tech is only part of the solution and there’s still a lot of work left to do, it’s clear that technology will have a unique part in shaping the future of education.


What has surprised you the most over the past six months?


The adaptability and resiliency of everyone, especially teachers. Teachers, schools and entire governments across the world had to quickly adjust to huge changes when schools started closing, and in many cases, the shift to distance learning happened in a matter of days.


I saw it with my own daughter. Back in March, her school closed on a Thursday, and by Monday, the whole school was up and running with a full virtual curriculum. They literally went from zero to distance learning in seventy-two hours. And that story isn’t unique—we hear stories like this from our teachers and students all over the world.


What’s more is that my daughter’s class continued to adapt and adjust. I remember their first video call and hearing twenty second-graders talking all at once—it was definitely a bit chaotic. But the students, teachers and administrators quickly adapted. And now my daughter is teaching me things like what online classroom etiquette looks like.


How have you and your team stayed motivated? 

Over the past few months, my team started a weekly tradition called “bright spots” where we share inspiring stories about how teachers, students and families use our tools. 


We’ve heard creative ways teachers turn their homes into virtual classrooms—including one teacher who used their shower as a whiteboard surface. There was also a family in New Zealand who sent a photo of a distance learning classroom they built on top of a hill so they could have access to satellite Wi-Fi; it was made out of a generator-powered farm trailer! 


Who has been a strong female influence in your life?


My mom. She’s incredibly hardworking and approaches life with this calm, yet tireless, optimism. When she and my dad moved to the U.S. from Mumbai, she worked multiple jobs to help make ends meet and taught herself English by watching Nick at Nite on TV. Later on, she worked full time while she studied (and passed) the CPA exam, and she moved in to help me when my first child was born—while still working herself. 


I started keeping a list of all the positive things that wouldn’t have happened if COVID wasn’t here to remind myself that there are nuggets of good. Getting to see my mom every day is definitely on the list. She lives in Alabama and I live in California, but since the pandemic, my kids video call her every day.


What advice do you have for women starting out in their careers?


Careers are not linear, and things that feel like sideways or downward movement can still be progress. Be open to opportunities that might be surprising or sound scary because that's usually where learning and growth happen. 


I’ve been at Google for seventeen years and moved around the company quite a bit, working on Search, Maps, Chrome and now Google for Education. Some transitions were harder than others. For example, there was the time when I moved to Zurich, took on a new role, became a manager for the first time, and got married—all within the span of three weeks. Every time I made a transition, I had to learn about an entirely new product, industry and team, and where I could fit in. But every transition helped me get used to feeling uncomfortable and learning new things again. And, in hindsight, I can see how each of those moves was extremely valuable.

Elise Roy wants products to be accessible for all

Elise Roy has been a design thinker since she was young. When she started to lose her hearing at the age of 10, she got creative about how she could adapt her environment and tools to work for her. 

As an adult, she channeled that creativity to pave the way for others with disabilities—from fighting to keep live captioning available at her college to working on the United Nations International Disability Rights Treaty during and after law school. Today, as a UX Accessibility and Inclusion Lead at Google, she helps product teams think about inclusive design because, as she puts it, “when we design for disabilities, we all benefit.”

On the 30th anniversary of the Americans with Disabilities Act—landmark legislation that helped protect the rights of people with disabilities—we talked with Elise about how to design better products, the power of negotiation, what her stint as a furniture maker taught her and more. 

How do you explain your job at a dinner party?

I help teams look at problems and design from the perspective of people who are often the most excluded. Ultimately, that helps us make better products for everyone. 

What’s the general gist of inclusive design? 

Inclusive design is about looking at people with differences—whether it’s different abilities, races, cultures or sexualities—and then designing products for their experiences. Human diversity is our greatest source of innovation. When we solve for the most excluded groups, we often develop solutions that are better for everyone. Email and closed captioning are two great examples of this. Both were created to help people who were deaf; now they are both widely used by everyone. 

How have your own differences molded your experience?

When I fail, I see it as an opportunity for growth and learning. This goes back to my own hearing loss that started when I was ten. At first, I failed a lot. I remember a pivotal moment when I was in fifth grade and got a 28 percent on a test. My mom sat me down, made me study, and when I took the next test I got a 98 percent. It was a critical moment for me: I learned I can fail, but I can also work hard and get back up and succeed. This past Mother’s Day I finally wrote my mom to thank her for that valuable lesson.


You’ve worked in a lot of different roles—from law to tech to furniture making. What important lessons have you learned along the way?

As a lawyer, I learned a lot about negotiation and stopped seeing it as a “win or lose” outcome. Negotiation is about listening to people’s needs, understanding them and finding the common ground. And finding that common ground is wildly important when it comes to product development and inclusion.

You haven’t always worked in law or tech. What about the time you spent repurposing home furnishings for an architectural salvage company? 

I spent about two years as a furniture fabricator doing woodworking and metalworking. With each piece of furniture, you’d get to see product development through the ages and understand what parts of the design worked and what parts didn’t. It was a good lesson in building things that stand the test of time. I’m always asking myself how we can build that timelessness into our products at Google.

What’s something you want people to know about inclusive design?

There is immense business value in inclusive design. Too often people think that designing for excluded populations is simply “doing good.” Yet designing for disability has brought huge value to products, to business and to everyone. It’s one of our most valuable product design tools.

If we had designed products from the perspective of excluded populations to begin with, I think we would have been better suited to handle all of the changes and disruptions from COVID. The world had to quickly figure out how to get things done virtually and how to adapt to the fact that we were at risk of getting seriously ill. Yet a lot of the needs that we all suddenly experienced due to COVID were needs people with disabilities had experienced for years. If we had tried to address these then, we would have been able to respond and adapt to the new COVID reality a lot quicker.