Tag Archives: Health

Unlocking the "Chemome" with DNA-Encoded Chemistry and Machine Learning



Much of the development of therapeutics for human disease is built around understanding and modulating the function of proteins, the main workhorses of many biological activities. Small molecule drugs such as ibuprofen often work by inhibiting or promoting the function of proteins or their interactions with other biomolecules. Developing useful “virtual screening” methods, in which potential small molecules can be evaluated computationally rather than in a lab, has long been an area of research. The persistent challenge, however, is to build a method that works well enough across a wide range of chemical space to find small molecules with physically verified useful interactions with a protein of interest, i.e., “hits”.

In “Machine learning on DNA-encoded libraries: A new paradigm for hit-finding”, recently published in the Journal of Medicinal Chemistry, we worked in collaboration with X-Chem Pharmaceuticals to demonstrate an effective new method for finding biologically active molecules using a combination of physical screening with DNA-encoded small molecule libraries and virtual screening using a graph convolutional neural network (GCNN). This research has led to the creation of the Chemome initiative, a cooperative project between our Accelerated Science team and ZebiAI that will enable the discovery of many more small molecule chemical probes for biological research.

Background on Chemical Probes
Making sense of the biological networks that support life and produce disease is an immensely complex task. One approach to studying these processes is using chemical probes: small molecules that aren’t necessarily useful as drugs, but that selectively inhibit or promote the function of specific proteins. When you have a biological system to study (such as cancer cells growing in a dish), you can add the chemical probe at a specific time and observe how the biological system responds differently when the targeted protein has increased or decreased activity. But despite how useful chemical probes are for this kind of basic biomedical research, only 4% of human proteins have a known chemical probe available.

The process of finding chemical probes begins similarly to the earliest stages of small molecule drug discovery. Given a protein target of interest, the space of small molecules is scanned to find “hit” molecules that can be further tested. Robot-assisted high-throughput screening, in which hundreds of thousands or even millions of molecules are physically tested, is a cornerstone of modern drug research. However, the number of small molecules you can easily purchase (about 1.2×10⁹) is much larger than that, which in turn is much smaller than the number of small drug-like molecules (estimates range from 10²⁰ to 10⁶⁰). “Virtual screening” could quickly and efficiently search this vast space of potentially synthesizable molecules and greatly speed up the discovery of therapeutic compounds.

DNA-Encoded Small Molecule Library Screening
The physical part of the screening process uses DNA-encoded small molecule libraries (DELs), which contain many distinct small molecules in one pool, each of which is attached to a fragment of DNA serving as a unique barcode for that molecule. While this basic technique has been around for several decades, the quality of the library and screening process is key to producing meaningful results.

DELs are a very clever solution to a biochemical challenge: how to collect many small molecules in one place with an easy way to identify each. The key is to use DNA as a barcode to identify each molecule, similar to the Nobel Prize-winning phage display technology. First, one generates many chemical fragments, each with a unique DNA barcode attached, along with a common chemical handle (the NH2 in this case). The results are then pooled and split into separate reactions, where a set of distinct chemical fragments with another common chemical handle (e.g., OH) are added. The chemical fragments from the two steps react and fuse together at the common chemical handles. The DNA fragments are also connected to build one continuous barcode for each molecule. The net result is that by performing 2N operations, one gets N² unique molecules, each identified by its own unique DNA barcode. By using more fragments or more cycles, it’s relatively easy to make libraries with millions or even billions of distinct molecules.
An overview of the process of creating a DNA-encoded small molecule library. First, DNA “barcodes” (represented here with numbered helices) are attached to small chemical fragments (the blue shapes), which expose a common chemical “handle” (e.g., the NH2 shown here). When these are mixed with other chemical fragments (the orange shapes), each with its own attached DNA fragment and another exposed chemical “handle” (the OH), the reactions merge the sets of chemical and DNA fragments, resulting in a large library of small molecules of interest, each with a unique DNA “barcode”.
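The split-and-pool arithmetic above (2N synthesis steps yielding N² barcoded molecules over two cycles) can be sketched in a few lines of Python; the fragment names and barcode sequences below are made up purely for illustration:

```python
from itertools import product

# Hypothetical fragment sets for a two-cycle split-and-pool library.
# Each fragment carries its own short DNA barcode.
cycle1 = {"frag_A": "ACGT", "frag_B": "TTGC", "frag_C": "GGAA"}  # NH2-handle fragments
cycle2 = {"frag_X": "CCTA", "frag_Y": "AGGT", "frag_Z": "TCAC"}  # OH-handle fragments

def build_library(cycle1, cycle2):
    """Enumerate the combinatorial library: N fragments per cycle over
    2 cycles gives N*N distinct molecules, each tagged by the
    concatenation of its fragments' DNA barcodes."""
    library = {}
    for (f1, bc1), (f2, bc2) in product(cycle1.items(), cycle2.items()):
        library[f1 + "+" + f2] = bc1 + bc2
    return library

lib = build_library(cycle1, cycle2)
print(len(lib))  # 2 cycles x 3 steps each -> 9 unique barcoded molecules
```

With more cycles the product grows multiplicatively, which is why real libraries reach millions or billions of members from a modest number of synthesis steps.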
Once the library has been generated, it can be used to find the small molecules that bind to the protein of interest by mixing the DEL together with the protein and washing away the small molecules that do not attach. Sequencing the remaining DNA barcodes produces millions of individual reads of DNA fragments, which can then be carefully processed to estimate which of the billions of molecules in the original DEL interact with the protein.
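A toy version of that barcode-counting step might look like the following. Real DEL pipelines model sequencing noise, synthesis yields, and control experiments far more carefully, so this is only a sketch of the core idea: barcodes enriched after the protein wash, relative to a no-protein control, point to likely binders.

```python
from collections import Counter

def enrichment(selection_reads, control_reads, pseudocount=1.0):
    """Crude per-barcode enrichment estimate: ratio of normalized read
    counts in the protein selection vs. a no-protein control.
    A pseudocount avoids division issues for unseen barcodes."""
    sel = Counter(selection_reads)
    ctl = Counter(control_reads)
    n_sel, n_ctl = len(selection_reads), len(control_reads)
    barcodes = set(sel) | set(ctl)
    return {
        bc: ((sel[bc] + pseudocount) / n_sel) / ((ctl[bc] + pseudocount) / n_ctl)
        for bc in barcodes
    }

scores = enrichment(
    ["ACGTCCTA"] * 50 + ["TTGCAGGT"] * 2,  # reads surviving the protein wash
    ["ACGTCCTA"] * 5 + ["TTGCAGGT"] * 5,   # control reads (no protein)
)
# "ACGTCCTA" scores much higher than "TTGCAGGT" -> likely binder
```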

Machine Learning on DEL Data
Given the physical screening data returned for a particular protein, we build an ML model to predict whether an arbitrarily chosen small molecule will bind to that protein. The physical screening with the DEL provides positive and negative examples for an ML classifier. To simplify slightly, the small molecules that remain at the end of the screening process are positive examples and everything else are negative examples. We use a graph convolutional neural network, which is a type of neural network specially designed for small graph-like inputs, such as the small molecules in which we are interested.
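To make the graph-convolution mechanism concrete, here is a minimal from-scratch sketch in NumPy. It is emphatically not the architecture from the paper (which is much larger and trained on millions of examples); it only illustrates how a GCNN mixes information across a molecule's atom-adjacency graph and pools it into a single binding prediction:

```python
import numpy as np

def gcn_layer(adjacency, features, weights):
    """One graph-convolution step: each atom averages its neighbors'
    (and its own) feature vectors, then applies a learned linear map
    followed by a ReLU nonlinearity."""
    a_hat = adjacency + np.eye(adjacency.shape[0])  # add self-loops
    deg = a_hat.sum(axis=1, keepdims=True)
    h = (a_hat / deg) @ features @ weights          # normalize, mix, project
    return np.maximum(h, 0.0)

def predict_binding(adjacency, features, w1, w2, w_out):
    """Tiny GCNN: two graph convolutions, sum-pool over atoms to get a
    molecule-level vector, then a sigmoid for a binding probability."""
    h = gcn_layer(adjacency, features, w1)
    h = gcn_layer(adjacency, h, w2)
    graph_vec = h.sum(axis=0)                       # pool atoms -> molecule
    logit = graph_vec @ w_out
    return 1.0 / (1.0 + np.exp(-logit))

# Toy 3-atom "molecule" (a path graph) with 4 features per atom;
# weights are random stand-ins for trained parameters.
rng = np.random.default_rng(0)
adj = np.array([[0, 1, 0], [1, 0, 1], [0, 1, 0]], dtype=float)
x = rng.normal(size=(3, 4))
p = predict_binding(adj, x, rng.normal(size=(4, 8)),
                    rng.normal(size=(8, 8)), rng.normal(size=8))
print(p)  # a probability strictly between 0 and 1
```

Training such a model on the DEL-derived positives and negatives is then a standard binary-classification setup.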

Results
We physically screened three diverse proteins using DEL libraries: sEH (a hydrolase), ERα (a nuclear receptor), and c-KIT (a kinase). Using the DEL-trained models, we virtually screened large make-on-demand libraries from Mcule and an internal molecule library at X-Chem to identify a diverse set of molecules predicted to show affinity with each target. We compared the results of the GCNN models to a random forest (RF) model, a common method for virtual screening that uses standard chemical fingerprints, which we use as a baseline. We find that the GCNN model significantly outperforms the RF model at discovering more potent candidates.
Fraction of molecules (“hit rates”) from those tested showing various levels of activity, comparing predictions from two different machine learned models (a GCNN and random forests, RF) on three distinct protein targets. The color scale on the right uses a common metric IC50 for representing the potency of a molecule. nM means “nanomolar” and µM means “micromolar”. Smaller values / darker colors are generally better molecules. Note that typical virtual screening approaches not built with DEL data normally only reach a few percent on this scale.
Importantly, unlike many other uses of virtual screening, the process for selecting the molecules to test was automated or easily automatable given the model’s results; we did not rely on a trained chemist to review and select the most promising molecules. In addition, we tested almost 2,000 molecules across the three targets, the largest published prospective study of virtual screening of which we are aware. Besides providing high confidence in the hit rates above, this scale also allows one to carefully examine the diversity of hits and the usefulness of the model for molecules near and far from the training set.
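As an illustration of how hit rates like those in the figure are tabulated, one can bucket measured IC50 values by potency threshold. The thresholds and assay numbers below are hypothetical, chosen only to show the bookkeeping:

```python
def hit_rates(ic50_nm, thresholds_nm=(10_000, 1_000, 100)):
    """Fraction of tested molecules with measured IC50 at or below each
    potency threshold (smaller IC50 = more potent). Molecules with no
    measurable activity can be encoded as float('inf')."""
    n = len(ic50_nm)
    return {t: sum(v <= t for v in ic50_nm) / n for t in thresholds_nm}

# Hypothetical assay results for 8 tested molecules, in nanomolar.
measured = [35.0, 420.0, 8_800.0, float("inf"), 95.0, float("inf"), 2_300.0, 60.0]
print(hit_rates(measured))  # e.g. {10000: 0.75, 1000: 0.5, 100: 0.375}
```

Stricter (smaller) IC50 cutoffs give lower hit rates, which is why the figure reports a color scale of potency levels rather than a single number.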

The Chemome Initiative
ZebiAI Therapeutics was founded based on the results of this research and has partnered with our team and X-Chem Pharmaceuticals to apply these techniques to efficiently deliver new chemical probes to the research community for human proteins of interest, an effort called the Chemome Initiative.

As part of the Chemome Initiative, ZebiAI will work with researchers to identify proteins of interest and source screening data, which our team will use to build machine learning models and make predictions on commercially available libraries of small molecules. ZebiAI will provide the predicted molecules to researchers for activity testing and will collaborate with researchers to advance some programs through discovery. Participation in the program requires that the validated hits be published within a reasonable time frame so that the whole community can benefit. While more validation must be done before hit molecules are useful as chemical probes, especially regarding their specificity for the protein of interest and their ability to function correctly in common assays, having potent hits is a big step forward in the process.

We’re excited to be a part of the Chemome Initiative enabled by the effective ML techniques described here and look forward to its discovery of many new chemical probes. We expect the Chemome will spur significant new biological discoveries and ultimately accelerate new therapeutic discovery for the world.

Acknowledgements
This work represents a multi-year effort between the Accelerated Science team and X-Chem Pharmaceuticals with many people involved. This project would not have worked without the combined diverse skills of biologists, chemists, and ML researchers. We especially acknowledge Eric Sigel (of X-Chem, now at ZebiAI) and Kevin McCloskey (of Google), the first authors on the paper, and Steve Kearnes (of Google) for core modeling ideas and technical work.

Source: Google AI Blog


Learn more about anxiety with a self-assessment on Search

Editor’s note: This post is authored by Daniel H. Gillison, Jr., CEO of The National Alliance on Mental Illness.

Anxiety disorders affect 48 million adults in the U.S. Anxiety presents itself as a wide range of symptoms, and can be a result of biological factors or triggered by a change in environment or exposure to a stressful event. With COVID-19 introducing new points of stress, communities are seeing a rise in mental health issues and needs. New Census Bureau data released last week shows that a third of Americans are now showing signs of clinical anxiety or depression.

The National Alliance on Mental Illness (NAMI) is the nation’s largest grassroots mental health organization and we’re partnering with Google to provide access to mental health resources. Starting today when people in the U.S. search on Google for information about anxiety, we’ll provide access to a clinically-validated questionnaire called the GAD-7 (Generalized Anxiety Disorder-7). The GAD-7 will show up in the knowledge panel—the box of information that displays key facts when you search for something—and also has medically-validated information about anxiety, including symptoms and common treatments.

Anxiety self-assessment

This seven-question survey covers many of the same questions a health professional may ask, and your answers are private and secure (Google does not collect or share answers or results from the questionnaire). The GAD-7 helps people understand how their self-reported anxiety symptoms map to anxiety levels of people who completed the same questionnaire. The tool also provides access to resources developed by NAMI so people can learn more and seek help when needed. 
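The GAD-7’s published scoring is a simple sum of the seven item responses (each scored 0 to 3, from “not at all” to “nearly every day”), with standard severity cutoffs at 5, 10, and 15. The sketch below illustrates that arithmetic only; it is not a diagnostic tool and not the implementation used on Search:

```python
def score_gad7(answers):
    """Score the GAD-7: seven items, each answered 0-3.
    Total runs 0-21; standard cutoffs are 5 (mild),
    10 (moderate), and 15 (severe) anxiety symptoms."""
    if len(answers) != 7 or any(a not in (0, 1, 2, 3) for a in answers):
        raise ValueError("GAD-7 expects seven answers, each 0-3")
    total = sum(answers)
    if total >= 15:
        level = "severe"
    elif total >= 10:
        level = "moderate"
    elif total >= 5:
        level = "mild"
    else:
        level = "minimal"
    return total, level

print(score_gad7([2, 1, 2, 1, 1, 2, 2]))  # (11, 'moderate')
```

A score in the higher ranges is a prompt to talk with a health professional, not a diagnosis, which is why the Search experience pairs the result with NAMI resources.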

Anxiety self-assessment results

The GAD-7 is the third mental health screener available on Google Search. We’ve previously partnered with Google so that people who search for information on depression and PTSD can access relevant clinically-validated questionnaires that provide more information and links to resources about those conditions. The self-assessments are currently available in the U.S., and Google hopes to make them available in additional countries over time.

Anxiety can show up as a wide range of physical and emotional symptoms, and it can take decades for people who first experience symptoms to get treatment. By providing access to authoritative information, and the resources and tools to learn more about anxiety, we hope to empower more people to take action and seek help.

Source: Search



Exposure Notification API launches to support public health agencies

Note: The following is a joint statement from Apple and Google.

One of the most effective techniques that public health officials have used during outbreaks is called contact tracing. Through this approach, public health officials contact, test, treat and advise people who may have been exposed to an affected person. One new element of contact tracing is Exposure Notifications: using privacy-preserving digital technology to tell someone they may have been exposed to the virus. Exposure Notification has the specific goal of rapid notification, which is especially important to slowing the spread of the disease with a virus that can be spread asymptomatically.   

To help, Apple and Google cooperated to build Exposure Notifications technology that will enable apps created by public health agencies to work more accurately, reliably and effectively across both Android phones and iPhones. Over the last several weeks, our two companies have worked together, reaching out to public health officials, scientists, privacy groups and government leaders all over the world to get their input and guidance. 

Starting today, our Exposure Notifications technology is available to public health agencies on both iOS and Android. What we’ve built is not an app—rather, public health agencies will incorporate the API into their own apps that people install. Our technology is designed to make these apps work better. Each user gets to decide whether or not to opt in to Exposure Notifications; the system does not collect or use location from the device; and if a person is diagnosed with COVID-19, it is up to them whether or not to report that in the public health app. User adoption is key to success, and we believe that these strong privacy protections are also the best way to encourage use of these apps.
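The privacy design can be illustrated with a simplified sketch: each phone keeps a random daily key that never leaves the device, and broadcasts only short-lived identifiers derived from it over Bluetooth. The real specification derives identifiers with AES from HKDF-expanded keys and rotates them roughly every 10 to 15 minutes; the HMAC construction below is a stand-in for illustration, not the actual protocol code:

```python
import hashlib
import hmac
import os

def rolling_identifiers(daily_key, intervals_per_day=144):
    """Derive one short-lived broadcast identifier per ~10-minute
    interval from a secret daily key. Observers who overhear the
    identifiers cannot link them to a person or to each other, but
    anyone holding the daily key can re-derive the full day's set."""
    return [
        hmac.new(daily_key, interval.to_bytes(4, "little"),
                 hashlib.sha256).digest()[:16]
        for interval in range(intervals_per_day)
    ]

# If a user later tests positive and consents, their recent daily keys
# are published; other phones re-derive the identifiers and check them
# against what they overheard over Bluetooth.
patient_key = os.urandom(16)
broadcast = rolling_identifiers(patient_key)
heard = broadcast[42]  # an identifier my phone logged nearby
assert heard in set(rolling_identifiers(patient_key))  # exposure match
```

The key property is that matching happens entirely on the device: no location data and no identity ever needs to reach a central server.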

Today, this technology is in the hands of public health agencies across the world who will take the lead and we will continue to support their efforts. 

Source: Android



How AI could predict sight-threatening eye conditions

Age-related macular degeneration (AMD) is the biggest cause of sight loss in the UK and USA and is the third largest cause of blindness across the globe. The latest research collaboration between Google Health, DeepMind and Moorfields Eye Hospital is published in Nature Medicine today. It shows that artificial intelligence (AI) has the potential to not only spot the presence of AMD in scans, but also predict the disease’s progression. 

Vision loss and wet AMD

Around 75 percent of patients with AMD have an early form called “dry” AMD that usually has relatively mild impact on vision. A minority of patients, however, develop the more sight-threatening form of AMD called exudative, or “wet” AMD. This condition affects around 15 percent of patients, and occurs when abnormal blood vessels develop underneath the retina. These vessels can leak fluid, which can cause permanent loss of central vision if not treated early enough.

Macular degeneration mainly affects central vision, causing "blind spots" directly ahead (Macular Society).

Wet AMD often affects one eye first, so patients become heavily reliant on their unaffected eye to maintain normal day-to-day living. Unfortunately, 20 percent of these patients will go on to develop wet AMD in their other eye within two years. The condition often develops suddenly, but further vision loss can be slowed with treatment if wet AMD is recognized early enough. Ophthalmologists regularly monitor their patients for signs of wet AMD using 3D optical coherence tomography (OCT) images of the retina.

The period before wet AMD develops is a critical window for preventive treatment, which is why we set out to build a system that could predict whether a patient with wet AMD in one eye will go on to develop the condition in their second eye. This is a novel clinical challenge, since it’s not a task that is routinely performed.

How AI could predict the development of wet AMD

In collaboration with colleagues at DeepMind and Moorfields Eye Hospital NHS Foundation Trust, we’ve developed an artificial intelligence (AI) model that has the potential to predict whether a patient will develop wet AMD within six months. In the future, this system could potentially help doctors plan studies of earlier intervention, as well as contribute more broadly to clinical understanding of the disease and disease progression. 

We trained and tested our model using a retrospective, anonymized dataset of 2,795 patients. These patients had been diagnosed with wet AMD in one of their eyes, and were attending one of seven clinical sites for regular OCT imaging and treatment. For each patient, our researchers worked with retinal experts to review all prior scans for each eye and determine the scan when wet AMD was first evident. In collaboration with our colleagues at DeepMind we developed an AI system composed of two deep convolutional neural networks, one taking the raw 3D scan as input and the other, built on our previous work, taking a segmentation map outlining the types of tissue present in the retina. Our prediction system used the raw scan and tissue segmentations to estimate a patient’s risk of progressing to wet AMD within the next six months. 
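One concrete piece of such a pipeline is turning each eye's scan dates and its expert-determined conversion date into six-month prediction labels. The sketch below is our illustration of that bookkeeping, not the paper's labeling code; the 183-day window and the exclusion of post-conversion scans are stated assumptions:

```python
from datetime import date, timedelta

def label_scans(scan_dates, conversion_date, window_days=183):
    """Assign the prediction target for each scan of the fellow eye:
    positive if wet AMD first became evident within the next ~6 months,
    negative otherwise. Scans taken on or after the conversion date are
    excluded, since the eye has already converted."""
    window = timedelta(days=window_days)
    labels = {}
    for d in scan_dates:
        if conversion_date is not None and d >= conversion_date:
            continue  # already converted; not a prediction case
        labels[d] = (conversion_date is not None
                     and d < conversion_date <= d + window)
    return labels

# Hypothetical fellow eye: three scans, conversion on 2018-12-20.
scans = [date(2018, 1, 10), date(2018, 7, 2), date(2018, 12, 5)]
labels = label_scans(scans, conversion_date=date(2018, 12, 20))
# Jan 2018 scan: conversion >6 months away -> negative
# Jul and Dec 2018 scans: conversion falls inside the window -> positive
```

Eyes that never convert within the study period simply contribute negative examples (conversion_date=None), which is what lets the model learn what a low-risk scan looks like.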

To test the system, we presented the model with a single, de-identified scan and asked it to predict whether there were any signs that indicated the patient would develop wet AMD in the following six months. We also asked six clinical experts—three retinal specialists and three optometrists, each with at least ten years’ experience—to do the same. Predicting the possibility of a patient developing wet AMD is not a task that is usually performed in clinical practice so this is the first time, to our knowledge, that experts have been assessed on this ability. 

While clinical experts performed better than chance alone, there was substantial variability between their assessments. Our system performed as well as, and in certain cases better than, these clinicians in predicting wet AMD progression. This highlights its potential use for informing studies in the future to assess or help develop treatments to prevent wet AMD progression.

Future work could address several limitations of our research. The sample was representative of practice at multiple sites of the world’s largest eye hospital, but more work is needed to understand the model performance in different demographics and clinical settings. Such work should also understand the impact of unstudied factors—such as additional imaging tests—that might be important for prediction, but were beyond the scope of this work.

What’s next 

These findings demonstrate the potential for AI to help improve understanding of disease progression and predict the future risk of patients developing sight-threatening conditions. This, in turn, could help doctors study preventive treatments.

This is the latest stage in our partnership with Moorfields Eye Hospital NHS Foundation Trust, a long-standing relationship that transitioned from DeepMind to Google Health in September 2019. Our previous collaborations include using AI to quickly detect eye conditions, and showing how Google Cloud AutoML might eventually help clinicians without prior technical experience to accurately detect common diseases from medical images. 

This is early research, rather than a product that could be implemented in routine clinical practice. Any future product would need to go through rigorous prospective clinical trials and regulatory approvals before it could be used as a tool for doctors. This work joins a growing body of research in the area of developing predictive models that could inform clinical research and trials. In line with this, Moorfields will be making the dataset available through the Ryan Initiative for Macular Research. We hope that models like ours will be able to support this area of work to improve patient outcomes. 


Dr. Karen DeSalvo on “putting information first” during COVID-19

Dr. Karen DeSalvo knows how to deal with a crisis. She was New Orleans Health Commissioner following Hurricane Katrina and a senior official at the Department of Health and Human Services when Ebola broke out. And now, as Google’s Chief Health Officer, she’s become the company’s go-to medical expert, advising our leaders on how to react to the coronavirus. Dr. DeSalvo has been a voice of reassurance for Googlers, but her expertise is helpful outside of Google, too. I recently spoke to Dr. DeSalvo about how we’ll get through the crisis, what Google is doing to help and what makes her optimistic despite the challenges we face. 


How is the coronavirus different from other public health crises you’ve dealt with? 
In my work in New Orleans, whether it was a hurricane, a fire or a power outage, we drew resources from other parts of the country if we needed help. In this case, the entire world has been impacted. Everyone is living with uncertainty, disrupted supply chains, impacts on travel and social infrastructure. While this creates a sense of community that I hope will continue beyond the pandemic, the downside is that we have less opportunity to send assistance to other places. Where there is opportunity, we’ve seen people paying it forward, like when California deployed ventilators to the East Coast. The sense of community that grows out of any disaster is the bright spot, for me.


How are industries sharing ideas and research in this global crisis?
Physicians are using technology to talk to each other constantly about what they’re seeing and doing, and in prior outbreaks this real-time communication wasn’t possible. It makes a huge difference in clinical care. In the medical community, you sometimes have to pay for a journal article. But now if you want to read about COVID-19, it’s free for any researcher, scientist, clinician or layperson. That’s putting information first, putting knowledge and science above proprietary interest. 

It’s happening in science, too. For instance, there’s a collaboration between competitors in the private sector on designing trials and assessing the outcome of drugs and vaccines. At Google, our DeepMind colleagues were able to use deep learning to predict protein folding, helping advance the thinking about therapeutics and vaccines. I don’t think we’ve seen this spirit of collaboration in the history of science, and it’s one of the reasons I’m so optimistic. 


What is Google doing to help curb misinformation?
In this historic moment, access to the right information at the right time will save lives. Period. This is why our Search teams design our ranking systems to promote the most relevant and reliable information available. We build these protections in advance so they’re ready when a crisis hits, and this approach serves as a strong defense against misinformation.  


When COVID-19 began to escalate, we built features on top of those fundamental protections to help people find information from local health authorities. We initially launched an SOS alert with the World Health Organization to make resources about COVID-19 easily discoverable. This has evolved into an expanded Search experience, providing easy access to more authoritative information, alongside new data and visualizations. 


We’re surfacing content that’s accessible to a whole range of communities, and there’s constant vigilance to remove misinformation on platforms like YouTube—this includes videos or other information that could be harmful to people.

COVID-19 information on Search. 

What does it mean to be Google’s Chief Health Officer?
My role is to bring a holistic view of emotional, physical and social health and well-being to Google’s products and services, particularly under Google Health. During this pandemic, my team has also thought about how Google can assist public health efforts. This has meant anything from the Community Mobility Reports, a tool to help measure the impact of social distancing, to building playlists in partnership with YouTube geared towards clinicians, and showing testing sites for COVID-19 all over the world.


In the general public, what behaviors or mentalities have arisen that should continue in the future?
First, there are fundamental ways to reduce the transmission of communicable diseases like the flu or, in some communities, measles or tuberculosis. If you’re able to, it’s important to stay home if you’re sick, wash your hands, cough into your elbow—I call these the “Grandma rules.” Second, there are a lot of components to health: social health, emotional well-being, financial stability. Health is driven by more than just medical care, and this is a moment for us to remember that a holistic approach matters. 


What should business owners consider for when restrictions begin to lift?
They need to prepare for a world in which employees can work remotely as much as possible. Policies will still recommend social distancing, but we also need to create an environment where people who are sick feel comfortable staying home. That’s not realistic for every small business, so paying attention to the basic hygiene stuff—Do the Five—is also important. 


After Katrina, there was this time when the world was paying attention and trying to help, but the emotional and social impact on our community lasted for months. There will be some of that after this pandemic, because you can’t just flip a switch and have people go back to work. That’s the important thing—being patient as people put themselves back into a normal routine. 

Health is driven by more than just medical care, and this is a moment for us to remember that a holistic approach matters.

Taking off your Chief Health Officer hat, how do you reassure friends and family when they’re worried about this situation?
Medically, we need to be patient and let the scientists do their thing. It’s probably going to take until summer or early fall in the northern hemisphere to get clarity on what therapeutics work. The end game is to develop a vaccine so we can make sure everybody is protected. This is going to be a long journey with many months ahead, so we need to pace ourselves. 

Statistically, more people will have anxiety and depression from COVID-19 than will actually get COVID-19. To share tips on mental well-being, we recently launched the “Be Kind To Your Mind” PSA on Google Search.

Lastly, I remind those who are privileged to have a safe space to stay home when other people can’t. I think about my previous work with low income patients, and how this crisis impacts them as well as communities of color, non-native English speakers, and individuals with disabilities. Staying home is not safe, comfortable and financially feasible for everybody. We should all be doing what we can for our neighbors and our friends and the people who aren’t always seen.

Resources for mental health support during COVID-19

The coronavirus pandemic has disrupted lives around the world. Beyond the lives lost to the virus, as many communities enter their second and third month under stay-at-home orders, there is a rising mental health toll, too. In a national survey released by the American Psychiatric Association in March, 36 percent of respondents said that COVID-19 was seriously impacting their mental health; 48 percent were anxious about getting infected; and 57 percent reported concern that COVID-19 will seriously impact their finances.


As a trained psychiatrist, I know firsthand the importance of bringing the issue of mental health out into the open. While years can pass between the first onset of symptoms and the moment someone seeks help, the internet is often the first place people turn to find out more about mental disorders. To help address the emerging mental health crisis, we’re sharing “Be Kind to Your Mind,” which includes resources on mental wellbeing from the Centers for Disease Control and Prevention (CDC). Whenever people in the U.S. search for information about coping with the pandemic, or on COVID-19 and mental health, we’ll show a public service announcement with tips to cope with stress during COVID-19. To raise awareness of the importance of mental wellbeing during these times, we'll highlight these resources on Google's homepage tomorrow.


Whenever people in the U.S. search for information about coping with the pandemic, we’ll show a public service announcement with tips to cope with stress during COVID-19.

With May being Mental Health Awareness Month, we want to highlight a few other resources and tools across Google and YouTube that promote mental wellbeing.


Self-assessment questionnaires for depression and PTSD

When people search on Google for information about mental health conditions, we provide panels with information from authoritative sources like Mayo Clinic that detail symptoms and treatments, and provide an overview of the different types of specialists who can help. On the info panels for depression and post-traumatic stress disorder (PTSD), we provide direct access to clinically-validated self-assessment questionnaires that ask some of the same types of questions a mental health professional might ask. Based on a person’s answers, these self-assessment tools provide information on risk, along with links to more resources. Responses to these questionnaires are not logged. We hope they can provide insight and help people have a more informed conversation with their doctor. We will add more self-assessment questionnaires over time to cover more conditions.


Self-care content on YouTube

Over the last few months, YouTube has seen a 35 percent increase in views of meditation videos, and growing popularity of mindfulness and wellbeing content. YouTube is making videos like these and other mental health resources more widely available to anyone around the world, for free, by spotlighting channels and playlists that have wellbeing and mindfulness-focused content. Countless YouTube creators, like Dr. Mike and Kati Morton, educate their communities as they help reduce the stigma associated with mental health. YouTube is also launching relevant YouTube Originals, including a “BookTube” episode featuring top authors like Melinda Gates and Elizabeth Gilbert offering their best book recommendations.


Finding virtual care options, quickly

Because of stay-at-home orders and restrictions that limit in-person interactions, many mental health care providers (including therapists and psychiatrists) are now providing telehealth care, like conducting therapy sessions over video conference. To make these options easier to find, we now allow providers to highlight their virtual care services on their Google Business Profile. So now, when you search for a mental health provider in products like Search and Maps, you may see an “Online care” link that can take you to their virtual care page, or even schedule a virtual appointment.


While the stigma around mental health has lessened in recent years, many people still find it hard to reach out to get help. By providing access to mental health resources, services and information across our products, we hope to make it easier for people to seek help and receive proper care.


Healthcare AI systems that put people at the center

Over the past four years, Google has advanced its AI technologies to address critical problems in healthcare. We’ve developed tools to detect eye disease, AI systems to identify cardiovascular risk factors and signs of anemia, and models to improve breast cancer screening.

For these and other AI healthcare applications, the journey from initial research to useful product can take years. One part of that journey is conducting user-centered research. Applied to healthcare, this type of research means studying how care is delivered and how it benefits patients, so we can better understand how algorithms could help, or even inadvertently hinder, assessment and diagnosis.

Our research in practice

For our latest research paper, "A Human-Centered Evaluation of a Deep Learning System Deployed in Clinics for the Detection of Diabetic Retinopathy," we built on a partnership with the Ministry of Public Health in Thailand to conduct field research in clinics across the provinces of Pathum Thani and Chiang Mai. It’s one of the first published studies examining how a deep learning system is used in patient care, and it’s the first study of its kind that looks at how nurses use an AI system to screen patients for diabetic retinopathy. 

Over a period of eight months, we made regular visits to 11 clinics. At each clinic, we observed how diabetes nurses handle eye screenings, and we interviewed them to understand how to refine this technology. We did our field research alongside a study to evaluate the feasibility and performance of the deep learning system in the clinic, with patients who agreed to be carefully observed and medically supervised during the study. 

A nurse operates the fundus camera, taking images of a patient’s retina.


The observational process

In our research, we offer key recommendations for continued product development, along with guidance on deploying AI in real-world scenarios for other research projects.

Developing new products with a user-centered design process requires involving the people who would interact with the technology early in development. This means getting a deep understanding of people’s needs, expectations, values and preferences, and testing ideas and prototypes with them throughout the entire process. When it comes to AI systems in healthcare, we pay special attention to the healthcare environment, current workflows, system transparency, and trust.

The impact of environment on AI

In addition to these factors, our fieldwork found that we must also account for environmental differences like lighting, which vary among clinics and can impact the quality of images. Just as an experienced clinician learns to account for these variables when assessing an image, AI systems also need to be trained to handle these situations.

For instance, some images captured in screening might have issues like blurs or dark areas. An AI system might conservatively call some of these images “ungradable” because the issues might obscure critical anatomical features that are required to provide a definitive result. For clinicians, the gradability of an image may vary depending on one’s own clinical set-up or experience. Building an AI tool to accommodate this spectrum is a challenge, as any disagreements between the system and the clinician can lead to frustration. In response to our observations, we amended the research protocol to have eye specialists review such ungradable images alongside the patient’s medical records, instead of automatically referring patients with ungradable images to an ophthalmologist. This helped to ensure a referral was necessary, and reduced unnecessary travel, missed work, and anxiety about receiving a possible false positive result. 

Finally, alongside evaluating the performance, reliability, and clinical safety of an AI system, the study also accounts for the human impacts of integrating an AI system into patient care. For example, the study found that the AI system could empower nurses to confidently and immediately identify a positive screening, resulting in quicker referrals to an ophthalmologist.

So what does all of this mean? 

Deploying an AI system by considering a diverse set of perspectives in the design and development process is just one part of introducing new health technology that requires human interaction. It's important to also study and incorporate real-life evaluations in the clinic, and engage meaningfully with clinicians and patients, before the technology is widely deployed. That’s how we can best inform improvements to the technology, and how it is integrated into care, to meet the needs of clinicians and patients. 

Connecting people to virtual care options


To prevent the spread of COVID-19, many healthcare providers are reducing or stopping in-person visits for a variety of patient needs, from the treatment of chronic conditions to mental health services to evaluating cough and cold symptoms. Yet, people need a way to continue getting medical care from the comfort and safety of their own homes. Since the beginning of the pandemic, we’ve seen interest in virtual care and telehealth rise dramatically. Health consultations over the phone or by video conference not only help alleviate strain on doctors’ offices and emergency rooms but are also recommended as an important way to protect patients and staff against COVID-19. 

To help individuals and health care providers connect, we’re focused on providing individuals with access to high-quality and authoritative information and supporting them throughout their health journey. Over the coming week, we’re beginning to roll out two new features in Search and Maps that make it easier for people to connect to virtual healthcare options, whether it’s to a doctor’s office down the street, the hospital across town, or a national telehealth platform.

Local healthcare providers, now virtual

Healthcare providers like hospitals, doctors, and mental health professionals can now enter a virtual care offering in their Business Profile, so that people searching for their local provider, for instance, might see a “get online care” link on Search and Maps. Clicking this link will take people to that provider’s virtual care website where they can find more information, and in many cases, schedule a virtual healthcare visit with a provider. 

The pandemic has affected many healthcare providers’ operating hours and walk-in visit policies. To help communicate changes that might affect someone’s visit, we’re automatically surfacing a link directly to health providers’ COVID-19 information page on Search and Maps, and we've assembled best practices on how healthcare providers can update their websites and provide COVID-19 information on Google My Business.


With health providers limiting in-person visits due to COVID-19, we’re making it easier for people to discover virtual care options.

Find virtual healthcare, anywhere, anytime

Beginning as a pilot in the U.S., we’ll also begin showing widely-available virtual care platforms directly on Search so people can more easily access virtual visits. For example, when people search for “immediate care,” we’ll also present available virtual care options and related information, such as the out-of-pocket price charged for a visit (for those without insurance), along with an easy way for people to connect directly with the virtual care platform. The visit between the patient and provider will take place on the healthcare provider’s platform of choice.


People will be able to see widely-available virtual healthcare platforms directly on Search as well as the out-of-pocket cost for a visit.

For healthcare providers, helping you keep up with demand

In addition to helping healthcare providers with information about how to keep their online business information up-to-date, Google Cloud is supporting providers with technology infrastructure and solutions to assist with care delivery and operations. This includes helping doctors support patients remotely with HIPAA-compliant G Suite products (including using Google Meet for telehealth or virtual visits), deploying virtual agents to field questions related to COVID-19, and helping with capacity planning and demand forecasting of key medical supplies to better manage their supply chains.

The global COVID-19 pandemic and the response to prevent its spread has changed the way individuals access and connect to health care. Across our products, we’ll continue to ensure that healthcare providers have the tools to connect with patients, and that anyone can access the information and care they need to stay healthy.

Source: Search