2021 Year in Review: Google Quantum AI

Google’s Quantum AI team has had a productive 2021. Despite ongoing global challenges, we’ve made significant progress in our effort to build a fully error-corrected quantum computer, working towards our next hardware milestone: an error-corrected quantum bit (qubit) prototype. At the same time, we have continued our commitment to realizing the potential of quantum computers in a range of applications: we published results in top journals, collaborated with researchers across academia and industry, and expanded our team to bring on new talent and expertise.

An update on hardware

The Quantum AI team is determined to build an error-corrected quantum computer within the next decade, and to simultaneously use what we learn along the way to deliver helpful—and even transformational—quantum computing applications. This long-term commitment breaks down into three key questions for our quantum hardware:

  1. Can we demonstrate that quantum computers can outperform the classical supercomputers of today in a specific task? We demonstrated beyond-classical computation in 2019.
  2. Can we build a prototype of an error-corrected qubit? In order to use quantum computers to their full potential, we will need to realize quantum error correction to overcome the noise that is present during our computations. As a key step in this direction, we aim to realize the primitives of quantum error correction by redundantly encoding quantum information across several physical qubits, demonstrating that such redundancy leads to an improvement over using individual physical qubits. This is our current target.
  3. Can we build a logical qubit which does not have errors for an arbitrarily long time? Logical qubits encode information redundantly across several physical qubits, and are able to reduce the impact of noise on the overall quantum computation. Putting together a few thousand logical qubits would allow us to realize the full potential of quantum computers for various applications.

Progress toward building an error-corrected qubit prototype

The distance between the noisy quantum computers of today and the fully error-corrected quantum computers of the future is vast. In 2021, we made significant progress in closing this gap by working toward building a prototype logical qubit whose errors are smaller than those of the physical qubits on our chips.

This work requires improvements across the entire quantum computing stack. We have made chips with better qubits, improved the methods that we use to package these chips to better connect them with our control electronics, and developed techniques to calibrate large chips with several dozen qubits simultaneously.

These improvements culminated in two key results. First, we are now able to reset our qubits with high fidelity, allowing us to reuse qubits in quantum computations. Second, we have realized mid-circuit measurement that allows us to keep track of computation within quantum circuits. Together, the high-fidelity resets and mid-circuit measurements were used in our recent demonstration of exponential suppression of bit and phase flip errors using repetition codes, resulting in 100x suppression of these errors as the size of the code grows from 5 to 21 qubits.

Chart chronicling repetition code

Suppression of logical errors as the number of qubits in the repetition code is increased. As we increase the code size from 5 to 21 qubits, we see a 100x reduction in logical error. Image credit: Kevin Satzinger/Google Quantum AI

Repetition codes, an error correction tool, let us trade off resources (more qubits) for performance (lower error), a trade-off that will be central in guiding our hardware research and development going forward. This year we showed how the logical error decreases as we increase the number of qubits in a one-dimensional code. We are currently running experiments to extend these results to two-dimensional surface codes, which will correct errors more comprehensively.
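To make the repetition-code idea concrete, here is a minimal sketch in Cirq of a distance-3 bit-flip code that uses the mid-circuit measurements and resets described above. It is an illustrative toy rather than our experiment: the qubit layout, number of rounds, and noise level are assumptions chosen for readability.

```python
import cirq

# Three data qubits hold one logical bit redundantly; two measure qubits
# detect bit flips on neighboring data qubits (illustrative layout).
data = [cirq.LineQubit(i) for i in (0, 2, 4)]
meas = [cirq.LineQubit(i) for i in (1, 3)]

def stabilizer_round(r: int) -> cirq.Circuit:
    """One round of parity checks using mid-circuit measurement and reset."""
    ops = []
    for m, (a, b) in zip(meas, [(data[0], data[1]), (data[1], data[2])]):
        ops += [cirq.CNOT(a, m), cirq.CNOT(b, m)]
    ops.append(cirq.measure(*meas, key=f'round_{r}'))
    ops += [cirq.reset(m) for m in meas]  # reuse the measure qubits next round
    return cirq.Circuit(ops)

circuit = cirq.Circuit()
for r in range(5):  # five rounds of error detection
    circuit += stabilizer_round(r)
circuit += cirq.measure(*data, key='final')

# Add illustrative bit-flip noise and sample the outcomes.
noisy = circuit.with_noise(cirq.bit_flip(p=0.01))
results = cirq.Simulator().run(noisy, repetitions=1000)
print(results.histogram(key='final'))
```

Growing the code from 5 to 21 qubits while keeping per-operation error rates low is what produced the exponential suppression of logical errors shown above.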

Applications of quantum computation

In addition to building quantum hardware, our team is also looking for clear margins of quantum advantage in real-world applications. With our collaborators in academia and industry, we are exploring fields where quantum computers can provide significant speedups, with the realistic expectation that error-corrected quantum computers will likely need better-than-quadratic speedups to deliver meaningful improvements.

As always, our collaborations with academic and industry partners were invaluable in 2021. One notable collaboration with Caltech showed that, under certain conditions, quantum machines can learn about physical systems from exponentially fewer experiments than what is conventionally required. This novel method was validated experimentally using 40 qubits and 1300 quantum operations, demonstrating a substantial quantum advantage even with the noisy quantum processors we have today. This paves the way to more innovation in quantum machine learning and quantum sensing, with potential near-term use cases.

In collaboration with researchers at Columbia University, we combined one of the most powerful techniques for chemical simulation, Quantum Monte Carlo, with quantum computation. This approach surpasses previous methods as a promising quantum approach to ground state many-electron calculations, which are critical in creating new materials and understanding their chemical properties. When we run a component of this technique on a real quantum computer, we are able to double the size of prior calculations without sacrificing accuracy of the measurements, even in the presence of noise on a device with up to 16 qubits. The resilience of this method to noise is an indication of its potential for scalability even on today’s quantum computers.

We continue to study how quantum computers can be used to simulate quantum physical phenomena—as was most recently reflected in our experimental observation of a time crystal on a quantum processor (Ask a Techspert: What exactly is a time crystal?). This was a great moment for theorists, who’ve pondered the possibility of time crystals for nearly a century. In other work, we also explored the emergence of quantum chaotic dynamics by experimentally measuring out-of-time-ordered correlations on one of our quantum computers, which was done jointly with collaborators at the NASA Ames Research Center; and experimentally measuring the entanglement entropy of the ground state of the Toric code Hamiltonian by creating its eigenstates using shallow quantum circuits with collaborators at the Technical University of Munich.

Our collaborators contributed to, and even inspired, some of our most impactful research in 2021. Quantum AI remains committed to discovering and realizing meaningful quantum applications in collaboration with scientists and researchers from across the world in 2022 and beyond as we continue our focus on machine learning, chemistry, and many-body quantum physics.

You can find a list of all our publications here.

Continuing investment in the quantum computing ecosystem

This year, at Google’s annual developer conference, Google I/O, we reaffirmed our commitment to the roadmap and the investments required to build a useful quantum computer within the decade. While we were busy growing in Santa Barbara, we also continued to support researchers in the quantum community through our open source software. Our quantum programming framework, Cirq, continues to improve with contributions from the community. 2021 also saw the release of specialized tools in collaboration with partners in the ecosystem. Two examples of these are:

  • The release of a new Fermionic Quantum Simulator for quantum chemistry applications in collaboration with QSimulate, taking advantage of the symmetry in quantum chemistry problems to provide efficient simulations.
  • A significant upgrade to qsim which allows for simulation of noisy quantum circuits on high-performance processors such as GPUs via Google Cloud, and qsim integration with NVIDIA’s cuQuantum SDK to enable qsim users to make the most of NVIDIA GPUs when developing quantum algorithms and applications (a brief sketch follows this list).
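As an illustration of that workflow, here is a minimal sketch that builds a small noisy circuit in Cirq and runs it through the qsimcirq simulator. The circuit, qubit count, and noise level are arbitrary choices for the example; pointing the same code at a GPU-backed qsim build on Google Cloud is a configuration step not shown here.

```python
import cirq
import qsimcirq

# A small noisy circuit: a 4-qubit GHZ state with depolarizing noise.
qubits = cirq.LineQubit.range(4)
circuit = cirq.Circuit(
    cirq.H(qubits[0]),
    [cirq.CNOT(qubits[i], qubits[i + 1]) for i in range(3)],
    cirq.measure(*qubits, key='m'),
).with_noise(cirq.depolarize(p=0.005))

# qsimcirq provides a drop-in replacement for Cirq's simulator backed by qsim.
simulator = qsimcirq.QSimSimulator()
result = simulator.run(circuit, repetitions=1000)
print(result.histogram(key='m'))
```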

We also released an open-source tool called stim, which provides a 10000x speedup when simulating error correction circuits.
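To give a feel for that speed, here is a minimal sketch that uses Stim’s built-in circuit generator to sample detection events from a repetition-code memory experiment; the distance, number of rounds, and noise level are illustrative assumptions rather than parameters from our experiments.

```python
import stim

# Generate a noisy repetition-code memory circuit (illustrative parameters).
circuit = stim.Circuit.generated(
    "repetition_code:memory",
    distance=11,
    rounds=25,
    after_clifford_depolarization=0.001,
)

# Sample one million shots of detection events; stim is built to make
# sampling at this scale fast enough for decoder research.
sampler = circuit.compile_detector_sampler()
detection_events = sampler.sample(shots=1_000_000)
print(detection_events.shape)  # (shots, number of detectors)
```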

You can access our portfolio of open-source software here.

Looking toward 2022


Resident quantum scientist Qubit the Dog taking part in a holiday sing-along led by team members Jimmy Chen and Ofer Naaman.

Through teamwork, collaboration, and some innovative science, we are excited about the progress that we have seen in 2021. We have big expectations for 2022 as we focus on progressing through our hardware milestones, the discovery of new quantum algorithms, and the realization of quantum applications on the quantum processors of today. To tackle our difficult mission, we are growing our team, building on our existing network of collaborators, and expanding our Santa Barbara campus. Together with the broader quantum community, we are excited to see the progress that quantum computing makes in 2022 and beyond.

Tools to help you tackle your New Year’s resolution

You always hear the standard New Year’s resolutions: Work out more. Run a marathon. Learn a new language. For me this year, it’s to learn three new party tricks (I’m optimistically hoping for more social interaction in 2022!). No matter what the goal is, it often feels like by February I’ve lost some steam. Resolutions take time, and new habits and skills are (let’s admit it) hard to build.

So this year, my New Year’s resolution is to stick to a New Year’s resolution. I did a little digging and found a few tools already at my fingertips to help that resolution stick.

First things first: Write down your goal

Don’t just think about your resolution — write it down. If you live by your inbox, schedule send a January 1 New Year’s resolution email to yourself. What better way to kickstart the new year than with an email to your future self?

If you’re not into email, Google Keep is a great way to jot down resolution ideas. If you’re on the go when inspiration strikes, you can even create a Google Keep note with your voice.

And don’t forget good ol’ pen and paper. Recording something on paper is easy, and the physical movement of writing something down can make it stick in a certain way. So write it down, literally.

Next, create reminders

The hard part about keeping resolutions for me is changing my daily routine. So I decided to break down my resolution into smaller goals, and set up check-ins on Google Calendar. Twice a month, I put aside time to learn a party trick (my first one is going to be rolling a coin across my knuckles), and halfway through the year I set up a “dry run” performance with friends (whether that ends up being in-person or virtual) to keep myself accountable.

Aside from checkpoints, crossing items off a checklist also keeps me on track. So I further broke down my twice-a-month trick-learning efforts using Tasks. This means my smaller, bite-sized agenda items will show up everywhere, from Gmail to Google Slides (so I can’t ignore them!).

Screenshot of Google Slides with a right-hand side bar with the Tasks feature popped up.

If you wrote down your resolution on Google Keep, that’s also a good place to create a to-do list and hit your smaller target goals on your way to your resolution. You can even set up timed reminders for each of the items to make sure you hit your goals.

Build satisfaction by tracking your progress

You can track your progress anywhere, like Keep or even Google Docs, but if you’re looking for more, try AppSheet. With AppSheet, you can build custom apps without any coding required. Need a custom app to track your workout progress? Looking for a journaling app on the go? AppSheet has a few templates you can try — or you can build your own if you want to get hyper-specific.

Make sure to reward yourself along the way

New habits and skills are hard to build, especially when you don’t see immediate results. So celebrating mini-milestones along the way (practiced 10 sessions ☑...rehearsed for my dry run ☑) helps me stay motivated.

How you reward yourself is up to you — maybe it’s taking a day for self care, or simply exchanging words of encouragement with your friends and family – a little kudos goes a long way. And if at any point along the way toward your goal you begin to feel a little weary, try some of the advice from our resilience expert at Google, who talks about breaking tasks into smaller challenges that are easier to tackle.

Here’s to 2022 — and sticking with our New Year resolutions.

Chrome Beta for Android Update

Hi everyone! We've just released Chrome Beta 97 (97.0.4692.70) for Android: it's now available on Google Play.

You can see a partial list of the changes in the Git log. For details on new features, check out the Chromium blog, and for details on web platform updates, check here.

If you find a new issue, please let us know by filing a bug.

Krishna Govind
Google Chrome

Prediction Framework, a time saver for Data Science prediction projects

Posted by Álvaro Lamas, Héctor Parra, Jaime Martínez, Julia Hernández, Miguel Fernandes, Pablo Gil

Acquiring high-value customers using predicted lifetime value, taking specific actions on users with a high propensity to churn, generating and activating audiences based on machine-learning signals… all of these marketing scenarios require analyzing first-party data, running predictions on that data and activating the results in the different marketing platforms, like Google Ads, as frequently as possible to keep the data fresh.

Feeding marketing platforms like Google Ads on a regular, frequent basis requires a robust, report-oriented and cost-efficient ETL & prediction pipeline. These pipelines are very similar regardless of the use case, and it’s easy to end up reinventing the wheel every time, or manually copying and pasting structural code and increasing the risk of introducing errors.

Wouldn't it be great to have a common reusable structure and just add the specific code for each of the stages?

Here is where Prediction Framework plays a key role in helping you implement and accelerate your first-party data prediction projects by providing the backbone elements of the predictive process.

Prediction Framework is a fully customizable pipeline that allows you to simplify the implementation of prediction projects. You only need to have the input data source, the logic to extract and process the data and a Vertex AutoML model ready to use along with the right feature list, and the framework will be in charge of creating and deploying the required artifacts. With a simple configuration, all the common artifacts of the different stages of this type of projects will be created and deployed for you: data extraction, data preparation (aka feature engineering), filtering, prediction and post-processing, in addition to some other operational functionality including backfilling, throttling (for API limits), synchronization, storage and reporting.

The Prediction Framework was built to be hosted on Google Cloud Platform. It uses Cloud Functions for all the data processing (extraction, preparation, filtering and post-prediction processing); Firestore, Pub/Sub and Cloud Scheduler for the throttling system and for coordinating the different phases of the predictive process; Vertex AutoML to host your machine learning model; and BigQuery as the final storage for your predictions.

Prediction Framework Architecture

To get started with the Prediction Framework, you prepare a configuration file with some environment variables describing the Google Cloud project to be used, the data sources, the ML model that makes the predictions and the scheduler for the throttling system. In addition, custom queries for the data extraction, preparation, filtering and post-processing need to be added when customizing the deploy files. The deployment is then done automatically using a deployment script provided by the tool.

Once deployed, all the stages will be executed one after the other, storing the intermediate and final data in the BigQuery tables:

  • Extract: on a scheduled basis, this step queries the transactions from the data source corresponding to the run date (the scheduler or backfill run date) and stores them in a new table in the local project’s BigQuery.
  • Prepare: as soon as the extracted transactions for a specific date are available, the data is picked up from the local BigQuery and processed according to the specs of the model. Once processed, it is stored in a new table in the local project’s BigQuery.
  • Filter: this step queries the data stored by the prepare step, filters out the rows that are not needed and stores the result in the local project’s BigQuery (for example, only taking into consideration new customers’ transactions; what counts as a new customer is up to the instantiation of the framework for the specific use case and is covered later).
  • Predict: once the new customers are stored, this step reads them from BigQuery and calls the prediction using the Vertex API. A formula based on the result of the prediction can be applied to tune the value or to apply thresholds. Once the data is ready, it is stored in BigQuery within the target project (a minimal sketch of this stage follows the list).
  • Post_process: a formula can be applied to the AutoML batch results to tune the value or to apply thresholds. Once the data is ready, it is stored in BigQuery within the target project.
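To illustrate what one stage might look like under the hood, here is a minimal, hypothetical sketch of a Cloud Function for the predict stage that reads the filtered rows from BigQuery and launches a Vertex AI batch prediction job. The table names, model resource name, region and function signature are assumptions made for the example; they are not the framework’s actual code.

```python
from google.cloud import aiplatform, bigquery

# Hypothetical resource names, for illustration only.
PROJECT = "my-gcp-project"
FILTERED_TABLE = f"{PROJECT}.prediction_fw.filtered_customers"
PREDICTIONS_DATASET = f"bq://{PROJECT}.prediction_fw_output"
MODEL_NAME = "projects/1234567890/locations/europe-west4/models/0987654321"

def predict(event, context):
    """Pub/Sub-triggered Cloud Function running one pass of the predict stage."""
    bq = bigquery.Client(project=PROJECT)

    # Sanity check that the filter stage produced rows for this run date.
    rows = list(bq.query(f"SELECT COUNT(*) AS n FROM `{FILTERED_TABLE}`").result())
    if rows[0]["n"] == 0:
        return  # nothing to predict for this run

    # Launch a Vertex AI batch prediction job that reads from and writes back
    # to BigQuery; a later post-process stage would pick up the output table.
    aiplatform.init(project=PROJECT, location="europe-west4")
    model = aiplatform.Model(MODEL_NAME)
    model.batch_predict(
        job_display_name="prediction-framework-batch",
        bigquery_source=f"bq://{FILTERED_TABLE}",
        bigquery_destination_prefix=PREDICTIONS_DATASET,
    )
```

In the real framework these responsibilities are generated and wired together for you, with Pub/Sub, Firestore and the scheduler coordinating when each stage runs; the sketch only shows the general shape of one stage.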

One of the powerful features of the Prediction Framework is that it allows backfilling directly from the BigQuery user interface, so if you need to reprocess a whole period of time, it can be done in literally four clicks.

In summary: Prediction Framework simplifies the implementation of first-party data prediction projects, saving time and minimizing errors of manual deployments of recurrent architectures.

For additional information and to start experimenting, you can visit the Prediction Framework repository on Github.


“New normal” and other words we used a lot this year

There’s a lot to think about at the end of each year. What we accomplished, what we didn’t — what we made time for, or what we took a break from. At Google, the Search team looks at what sort of questions the world asked, and what answers we really needed. And of course, what momentary trends completely captivated us (looking at you, “tiktok pasta”).

As a writer, something I’ve been thinking about in the last few weeks of 2021 are the words we used this year. 2020 was the year of “now more than ever,” a phrase that began to feel meaningless as the “now more than ever”-worthy moments kept coming (and admittedly, as we all kept calling them that). If 2020 was the year of “now more than ever,” then what was 2021?

Once again, I turned to Ngrams, a Google tool launched in 2009 by part of the Google Books team. Ngrams shows how books and other pieces of literature have used certain words or phrases over time, and you can chart their popularity throughout the years. One caveat: Ngrams currently tracks data from 1800 to 2019 — prior to 2020, Ngrams’ data ranged from 1800 to 2012, but the team added a huge new dataset about two years ago. So while it remains to be seen how some sayings took over writing throughout 2020 and 2021, I wanted to see how the words we’re hearing and saying and writing today have shown up over time.

My first nomination: “new normal.” This is a phrase that I personally have heard…well, now more than ever, I suppose. This isn’t the first time “new normal” appeared in the lexicon, though: It saw small bursts of usage in literature and other writing in the mid-19th century — though if you use Ngrams to see some of the examples of how it showed up, “new normal” was often in reference to types of academic institutions. And then “new normal” just sort of faded away…until the aughts, when it rose dramatically. Michael Ballback, who works on Google Books, told me that a lot of the post-2000s data comes from e-books, whereas older data mostly came from libraries, so perhaps this could account for some of the jump. In any case, it now completely permeates our writing. (Which raises the question: Is there such a thing as normal if it’s constantly new?)

Google Books Ngrams Viewer chart showing the use over time of the phrase “new normal,” which hits a peak in the post-2000s.

Then of course, I thought of “vaccine,” which actually began its Ngrams debut on a high, falling sharply between 1800 and 1813…only to rise again in the early to mid 1900s, when many scholarly articles were published about things like typhoid, cholera and pertussis vaccinations. Then it goes up and down, up and down, to an all-time high in 2003. It’s since slightly fallen off — but remember, Ngrams’ data goes up until 2019, so I have my own assumptions about how it’s fared the past two years.

Google Books Ngrams Viewer chart showing the use over time of the phrase “vaccine,” which rises consistently beginning in 1900.


Lastly, I took a look at “hybrid.” Obviously it’s a word that’s been around for a while (according to Ngrams, it’s been in use since at least the year 1800, which is as far back as the tool’s data goes) and it has gently, steadily risen since. It spiked in the early ‘80s, but browsing snippets from Google Books from that time period, it was used similarly to how it is now. Later in the aughts, we start seeing it used to describe cars, and today…well, you probably already know.

Google Books Ngrams Viewer chart showing the use over time of the phrase “hybrid,” which gently rises over time to a high point in 2019.

What “hybrid” means hasn’t really changed, but it’s the situations we’re applying it to that have — there’s a much wider scope of daily life that falls under this category. “Hybrid” didn’t change, but how we live has. 2020 felt in many ways like a pause on life, and this year we began finding new, creative ways to adapt — a little of our old methods, mixed with the new. And that, to me, feels distinctly 2021.

Looking back on an interesting year

2021 is coming to a close, and what a year it has been. 


As the pandemic has continued to shape what normal looks like and we all figure out how to work, learn, connect, and be in the world now, quality internet and ensuring access for more people has become a central focus not just for Google Fiber, but for many of our communities across the country. That’s a huge opportunity and responsibility for us and we’re working to make 2022 and beyond even more connected.

Taking it farther faster

In 2021, we built to more households than in any other year. Many of the communities we announced this year around the country already have service, including South Salt Lake, Holladay, Taylorsville, Millcreek and North Salt Lake in Utah; Concord and Matthews in North Carolina; and Leon Valley in Texas. 

While we still have a lot of work to do in many of our communities to bring access to as many people as possible, we continue to make our build processes more efficient and less disruptive. This will be a focus area for our teams across the country leading into 2022, as we expect to expand even more next year.

More internet for everything

In 2021, the rollout of our new 2 Gig service demonstrated just how much demand for internet had increased. It wasn’t just those working at home and gamers opting to double their download speeds (although they liked it, too!); we saw households of all varieties taking advantage of the opportunity to get more out of their internet.

With increased demand across all our products, we worked to ensure our network was there when our customers needed it, increasing capacity across all points of the network right up to improving the in-home Wi-Fi experience. In 2022, we’ll continue to work to make our customers’ fast, reliable internet even better.

Helping communities thrive

While a lot of great things happened in 2021, the pandemic continued to pose challenges for many of our customers and our cities. With the internet’s increasingly central role in our daily lives, we saw many more organizations stepping into digital equity work. To meet that demand, we expanded our partnership with NTEN to support 11 fellows in eight Google Fiber cities. We’ve continued to work with partners across the country to help more people access the internet and develop the skills to take advantage of online opportunities. 

This year, thousands of people participated in Google Fiber-funded programs in our communities through over 170 different local organizations across the country, from trainings to device distributions to STEM events. 

We also provided gigabit internet at no cost to more than 440 organizations this year to allow them to meet the needs of their clients and their work in the community through our Community Connections program, and provided gigabit internet at no cost to over 3,500 households through the Gigabit Community program. 

Growing the Google Fiber team

This year, we’ve grown both our central and our local city teams to help keep up with our expanded efforts across the country. We recently launched a new Google Fiber careers site to help candidates find us, and we’re still hiring! We have hundreds of open roles, so if all this sounds like an interesting, rewarding way to make a difference, then maybe you should join us. One thing is certain, 2022 is not going to be dull around here. 

Posted by the Google Fiber Team


The year in review: Take a bite out of 2021

Last year, as we wrapped up 2020, so many of us looked around and wondered what the world would be like now. Would many people return to the office? Would kids return to classrooms? Would we return to restaurants, concerts, football games? The COVID-19 vaccine helped move many of us in that direction, but as 2021 ends, we’re still grappling with many of those same questions.

On The Keyword, and at Google as a whole, we focused on those questions, too. We shared updates around vaccine equity and the hybrid workplace, but also returned to hosting events like Google I/O — digitally. And we had some fun along the way, too. Here’s a look at what we were up to in 2021.

1. Through 2021, the world’s focus was still on COVID-19, and that was also the case at Google. We announced new tools to support vaccine access and distribution, ways we’re helping get vaccines to more people around the world and technology to improve searches for vaccine information. We also stayed committed to vaccine equity, and equity around health. Within Google, we gave a preview of our hybrid approach to work, and shared several updates about our approach to returning to the office.

2. Of course, hybrid work wasn’t just top of mind for Googlers. The nature of work changed for many people around the world in 2021, and we responded with new products and initiatives to prepare for our “new normal.” We opened up Workspace for everyone, and shared hybrid work tips from our own productivity expert. We provided resources for businesses on the road to recovery, from small businesses to LGBTQ+ spaces. And we expanded our Grow with Google Career Certificate program to help job seekers at community colleges and in the military community.

3. Teachers and students were particularly affected by the pandemic, and we were hard at work improving the virtual and hybrid learning experience. We launched more than 50 new education features for products like Classroom, Meet and Cloud, and launched Workspace for Education to give educators and administrators more flexibility. Throughout the year, we also took the time to give teachers the appreciation they deserve.

4. 2021 marked the return of Google I/O, which went fully digital this year — and was free for everyone. We previewed new software, including Android 12, and new technologies like LaMDA, MUM and Project Starline. I/O also included a glimpse into our new Quantum AI campus, and gave Googlers who attended a chance to finally meet one another in person.

An animation showing a woman using Project Starline, and how 3D imaging renders her image for the display.

The technology behind Project Starline

5. I/O wasn’t the only time we unveiled new products. This year marked the launch of the Pixel 6 and Pixel 6 Pro, which feature the Google Tensor chip and come in a fresh new set of colors. This was also the year Google acquired Fitbit, which launched the Charge 5 and Luxe and even partnered with Will Smith. We also introduced the new Nest Hub (which required some sleeping on the job), and a new set of Nest Cams and Doorbells.

6. We shared how AI is making information more useful in our second annual Search On event, and showed how AI is redefining what a map can be as well as helping map buildings in Africa. The Keyword spoke to Googler Marian Croak, who brought together our Responsible AI team, after she was honored by the National Inventors Hall of Fame for her work in advancing Voice over Internet Protocol (VoIP) technologies. And we spotlighted another Googler who created a crossword puzzle you can play to learn more about responsible AI.

7. Throughout the year, The Keyword highlighted Googlers who do fascinating things, both at work and during their free time. From Chrome OS design to children’s books, from interns to Olympians, the people behind our products stayed busy in 2021.

8. In one of The Keyword team’s favorite posts of the year, a Googler shared his story of communicating with his parents who are deaf using Google products. It was just one of many accessibility updates this year, including how we’re making Android more accessible and testing a new project to make communication more accessible for people with speech impairments. We spotlighted how an autistic Googler communicated with his manager, and a drummer who used AI to build a prosthetic arm.

9. As we make technological advances, we always kept our impact on the planet in mind. On The Keyword, we took a look at a water-saving entrepreneur and previewed our “dragonscale” solar panels. We announced new progress toward our 24/7 carbon-free energy goal, talked about how climate change was the next big moonshot — and used Google Earth timelapses to show just how much work there is left to do.

10. We continued to focus on diversity, equity and inclusion, posting updates on the racial equity commitments we announced in 2020 and launching new funds for Black and Latino entrepreneurs. We made our inclusive marketing toolkit available to everyone, partnered with HBCUs to address the diversity gap in technology and invested in Black-led startups and investment firms. And this focus extended to our products as well: The Pixel 6 comes with a more equitable camera that can better reflect all skin tones.

11. Google opened new offices around the world in 2021, from Taiwan to Ireland to New Zealand. We also launched new initiatives to promote the digital future across the globe, including through our Google for Africa event, our Digital Future Initiative in Australia and our partnership with Jio in India. Google News Showcase expanded around the world, and the Google News Initiative’s Innovation Challenge sparked new ideas in the global news industry. And Google.org’s Impact Challenge for Women and Girls, with help from none other than Shakira, backed 34 organizations around the world.

Two musicians play in front of a large screen, with the multi-colored animated “Blob Opera” singing on screen.

The Blob Opera performs with Tune-Yards at I/O.

12. When they weren’t going on a world tour with the Blob Opera, which made a splash performing at I/O, the team at Google Arts & Culture were helping us match our pets with works of art — oh, and reviving long-lost masterpieces by Gustav Klimt. In other artistic pursuits, one Doodle this year paid tribute to the late DJ Avicii — and another, created by Doodle for Google winner Milo, was all about finding hope and resilience.

13. As parts of the world began to open up again, that meant the return of sports, too. The 2020 Olympic and Paralympic games actually took place in the summer of 2021, and Google was along for the ride. And in the U.S. we announced partnerships with the NBA and the WNBA — and perhaps most importantly of all, looked at the Search trends for the most im-paw-tant sports event of the year: the Puppy Bowl.


This is just a short list of everything Google was up to in 2021. And in 2022, as things (hopefully) move closer toward whatever “normal” means now, we’re looking forward to sharing more new discoveries, updates and stories.


Beta Channel Update for Chrome OS

The Beta channel is being updated to 97.0.4692.63 (Platform version: 14324.49.0) for most Chrome OS devices.

If you find new issues, please let us know by visiting our forum or filing a bug. Interested in switching channels? Find out how. You can submit feedback using ‘Report an issue...’ in the Chrome menu (3 vertical dots in the upper right corner of the browser).


Cole Brown,
Google Chrome OS 

5 Google for Creators highlights in 2021

Before we jump into 2022, the Google for Creators team is looking back at some of our favorite moments from this year. Check out our top five highlights from 2021.

Photographs of two women and a man with an animal perched on his shoulders and floating colorful shapes surround a bubble with the words “Google for Creators.”

The Google for Creators website features guides, event listings, a blog and more to help creators learn and grow.

1. Launching Google for Creators

In October, we launched Google for Creators, a hub for content creators looking for information and inspiration. Along with guides for creating a content strategy, expanding your audience and choosing a monetization approach, you’ll find upcoming events, tips from seasoned creators and blog posts with the latest updates from the creator economy.

2. Celebrating women of color creators

On November 19, Women’s Entrepreneurship Day, we launched The Conversation, a new video series celebrating women of color creators. Guests, like beauty and lifestyle creator Tyla-Lauren Gilmore and fashion model and creative director Hannah Mussette, talk about how their backgrounds have shaped their brands and share their personal successes and struggles as full-time creators. Stay tuned for more episodes of The Conversation in 2022.

A screenshot of the homepage of Mochi Magazine shows a grid of images and headlines.

Google for Creators interviewed Giannina Ong, the Editor in Chief of Mochi Magazine, the longest-running online publication for Asian American women.

3. Interviewing inspiring creators

We’ve interviewed so many fascinating people on our blog, including the editor in chief of the longest-running Asian American women’s online magazine; a queer automotive educator, journalist and influencer who started an inclusive car blog; a former lawyer who became a full-time vegan food blogger; and a ceramicist whose content showcases her artistic process. Their stories show the endless topics, communities and niches that you can create content for and about on the web.

4. Tapping into expert insights

For our Creator Insights YouTube series, we asked some of our favorite creators — like Eden Hagos and Elle Asiedu of BLACK FOODIE, and beauty blogger Keiko Lynn — to share their top insights and strategies for creating content, building a brand and making money as a creator. Some topics included how to find your niche, avoid burnout and pitch yourself to brands.

A screenshot of an Instagram post from @googleforcreators displays a designed prompt that says, “Tag a woman identifying creator who is doing a great job.”

Follow Google for Creators on Instagram and Twitter, where we regularly connect with the creator community and post advice and insights.

5. Connecting with creators on social

We hope you’ve been following Google for Creators on Twitter and Instagram, where we share everything we’re up to. We also love connecting with the creator community on our social channels. If you’ve ever wanted to ask us a question, or answer one of ours with your own tips, that’s where to do it!

We had a blast sharing stories and insights from all the creators we spoke to in 2021, and we can’t wait to connect with even more next year. See you in 2022!