Increasing Diversity: Cloud Study Jam for Women Techmakers in Europe

Posted by Franziska Hauck, DevRel Ecosystem Regional Lead DACH

When we look at the programming community landscape in 2019, we find people of all backgrounds, with expertise as varied as the people themselves. There are developer groups for every imaginable interest out there. What becomes apparent, though, is that participation is not as evenly balanced as it could be. In Europe, we observe that women in programming tend to work in front-end development and to be active in the associated groups.

But what about cloud? Recently, Global Knowledge published a ranking that showed that Google Cloud certification is the most coveted achievement in the labor market. We knew the interest was there. How could we capture it and get more women and people from diverse backgrounds involved? We had already seen women succeed in this very field; it was time to contribute to seeing more success stories come our way.

Immediately the Cloud Study Jam came to mind. This campaign is a self-study, highly individualized study jam for Google Developer Groups (GDGs) and other tech meetups. Organizers get access to study materials to help them prepare for their event, register it on the global map and conduct the activity with their attendees in any location they choose. Attendees receive free Qwiklabs credits to complete a number of courses of their choice. The platform even offers a complete Google Cloud environment - the best training ground for aspiring and advanced programmers!

GDGs form one pillar of our community programs. Another cornerstone is the Women Techmakers program, through which we engage and involve organizers interested in increasing diversity worldwide. Cloud Study Jams in the local groups, with dedicated Women Techmakers, seemed like the most natural fit for us. And, as we soon realized, the organizers thought so too.

For us - Almo, Abdallah and Franziska - that was the start of a great initiative and an even bigger road trip. Together with local volunteers from Google and the groups, we held 11 Cloud Study Jams all over Europe in March and April.

Over 450 attendees, 80% of them women, learned about cloud technologies.

This was some of their feedback:

“This made me aim for the Cloud Certificate exam as my next goal in my career!”

“I found useful everything! The labs are interesting... and I would like to have more meetups like this.”

“The labs are interesting, at least both that we did. I would like to have more meetups like this!”

As surmised, many attendees were indeed front-end developers. It was amazing to see that, with the courses, they “converted” to Cloud and are now going forward as ambassadors. We also saw quite a big number of data scientists and back-end developers. All in all, it was a great mix of enthusiastic participants.

Cloud Study Jams are a great way to engage group members through guided materials. The way they are designed makes it easy for organizers to focus on the participants. Since attendees follow their chosen courses on their own, organizers act as facilitators, jumping in only when organizational questions arise.

If you would like to hold a Cloud Study Jam with your group or organization, you will find more information here. Register your event via the link to get access to the free Qwiklabs credits for your attendees.

We are very much looking forward to supporting you!

Almo, Abdallah, Franziska & the European DevRel Ecosystem

Google fosters the open source hardware community

Open source silicon promises new challenges and opportunities for both industry and the open source community. To take full advantage of open silicon, we will need new design methodologies, new governance models, and increased collaboration between industry, academia, and nonprofits. A vibrant free and open source software community has been vital to the success of both Google and our customers. We look forward to supporting the new domain of open source silicon so it can similarly benefit all participants.

Working through its Open Source Programs Office (OSPO), Google is actively engaged in helping seed the open silicon space, specifically by providing funding and strategic and legal support to key open hardware efforts, including lowRISC and the CHIPS Alliance.

lowRISC

lowRISC is a leader in open silicon community outreach, technical documentation, and advancing the goal of a truly open source system on a chip. We have long supported lowRISC’s mission of transparently implemented silicon and robust engagement of the open source silicon community, providing funding, advice, and recognizing their open source community leadership by selecting them as a Google Summer of Code mentoring organization.

Similar to the benefits of open source software, we believe our users will derive great outcomes from open source silicon. Besides enabling and encouraging innovation, chip designs derived from a common, open baseline will provide the benefit of implementation choice while still guaranteeing software compatibility and a set of common interfaces. With regard to security, the transparency of an open source approach is critical to both bug finding and establishing implementation trustworthiness.

"Google has encouraged and supported lowRISC since the very start. They clearly share our optimism for what open source hardware can offer and our community-driven vision of the future. We are excited by the expanding open source RISC-V ecosystem and look forward to lowRISC community IP being deployed in the real world,” said Alex Bradbury, Co-founder and Director. “We believe lowRISC can act as an important catalyst for open source silicon, providing a shared engineering resource to ensure quality, provide support and help to maintain IP from a range of partners.”
lowRISC board members (L to R): Dominic Rizzo (Google), Alex Bradbury (lowRISC), Gavin Ferris (lowRISC), Dr Robert Mullins (University of Cambridge), Prof. Luca Benini (ETH Zürich), and Ron Minnich (Google, not pictured).
A first example of Google’s ongoing collaboration with ETH Zürich and lowRISC is the recently released “Ibex” RISC-V core. ETH Zürich donated their Zero-riscy core as a starting point and technical work to extend the core was done across all three organizations. You can learn more about Google’s collaboration with lowRISC on the RISC-V core here.

Furthermore, Google is excited to announce that it is joining the board of lowRISC, with the appointment of Dominic Rizzo and Ronald Minnich as corporate directors.

CHIPS Alliance 

Along with our increased funding, support and collaboration with lowRISC, we are also happy to announce our status as a founding member of the Linux Foundation’s CHIPS Alliance project. CHIPS Alliance features an industry-driven, collaborative model to release high-quality silicon IP and supporting technical collateral. Most recently, in collaboration with CHIPS Alliance, we released a Universal Verification Methodology (UVM) instruction stream generator to aid in the verification of RISC-V cores. We believe such open sourcing of verification tools will prove critical to the long-term success of the open source silicon community.
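The released generator itself is a UVM/SystemVerilog tool, but the core idea behind random instruction stream generation is easy to illustrate. The following is a minimal, hypothetical Python sketch, not the real tool's API: it emits random but well-formed RV32I ALU instructions that a testbench could run on a core under test and compare against a reference model. The register names and the small instruction subset here are assumptions made for the sketch.

```python
import random

REGS = [f"x{i}" for i in range(32)]
ALU_OPS = ["add", "sub", "and", "or", "xor", "sll", "srl"]

def random_alu_instr(rng):
    """One random register-register ALU instruction in assembly form."""
    op = rng.choice(ALU_OPS)
    rd = rng.choice(REGS[1:])            # never write x0 (hardwired zero)
    rs1, rs2 = rng.choice(REGS), rng.choice(REGS)
    return f"{op} {rd}, {rs1}, {rs2}"

def generate_stream(n, seed=0):
    """n random instructions; a fixed seed keeps runs reproducible."""
    rng = random.Random(seed)
    return [random_alu_instr(rng) for _ in range(n)]

for line in generate_stream(5):
    print(line)
```

A real constrained-random generator adds loads, stores, branches, and exception cases, and weights the mix to hit corner cases; the reproducible seed is what lets a failing stream be replayed during debug.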

Google has been an early, strong supporter of the open silicon community. We believe deeply in a future where transparent, trustworthy open source chip designs are commonplace. To get there, we are committed to establishing a collaborative, community-focused, open source basis for free and open silicon design.

By Parthasarathy Ranganathan, Distinguished Engineer, Google and Dominic Rizzo, Open Silicon Tech Lead, Google 

Breaking down barriers to VR

YouTube is where people go to experience VR videos. With over one million VR videos and experiences, YouTube VR offers a diverse library of immersive content for everyone to enjoy and explore the world from a new perspective.

But to make VR for everyone, we have to continue breaking down barriers on how people create and watch VR content on YouTube. To do this, we’re focused on offering YouTube VR on even more platforms, celebrating award-winning VR content and improving creator education programs.

Offering YouTube VR on even more platforms


Since the initial launch of the YouTube VR app in November 2016, we’ve been focused on bringing the app to as many people with a VR headset as possible. It’s already available on Daydream View, HTC Vive, PlayStation VR, Samsung Gear VR, Oculus Go and Oculus Rift. And when Oculus Quest becomes available on Tuesday, May 21, the YouTube VR app will be available as a launch title.

Celebrating award-winning VR content on YouTube


VR allows creators to transport their audiences to new, amazing and even impossible places. We’ve partnered with creators to bring immersive experiences to YouTube. And, over the last six months, these VR videos have been recognized with a number of standout awards, including Emmy®, Webby and Streamy awards.



Baobab Studios recently nabbed multiple Emmy® awards for the animated short film, “Crow: The Legend VR.” With a star-studded cast, including John Legend, Oprah, Liza Koshy and Constance Wu, this immersive short film is animated VR content at its best.



But the Emmy® awards didn’t stop there. NASA Jet Propulsion Laboratory won an Emmy® for their “Cassini's Grand Finale 360°” videos and NASA's first 360° livestream. These 360-degree videos transport viewers to space, unlocking an out-of-this-world experience.



“Isle of Dogs: Behind the Scenes (in Virtual Reality)” won two Webby Awards and the Clio Entertainment Gold award. The immersive video takes the audience behind the scenes of the film, featuring on-set interviews with the cast and an inside look at the unique craft of stop-motion animation.

Improving creator education through the YouTube VR Creator Lab


As part of our efforts to continue democratizing VR content creation, we’re currently accepting applications for the European edition of the YouTube VR Creator Lab. This three-month learning and production intensive helps creators embrace YouTube’s VR180 format.

Selected participants get to attend a three-day boot camp at a YouTube Space and receive advanced education from leading VR instructors and filmmakers, ongoing mentoring, a shiny VR180 camera to keep, and $20,000 USD in funding toward the production of their dream projects.



Since the program launched in 2017, we’ve hosted six YouTube VR Creator Labs with over 60 creators across the globe in Los Angeles, London and Tokyo. Participants have gone on to win Emmy and Streamy awards for their VR content created during the lab.

We’re excited to see where VR will bring us next!

Posted by Julia Hamilton Trost, Head of VR/AR Content & Partnerships, who recently watched “Cirque du Soleil's VOLTA Hair Suspension in VR180,” and Kurt Wilms, Product Lead, VR, who recently watched “Engineering for Mars: Building the Mars 2020 Mission (360 video)”



Three new machine learning courses

Many years ago, I took a dance lesson in Budapest to learn the csárdás, a Hungarian folk dance. The instructor shouted directions to me in enthusiastic Hungarian, a language I didn't understand, yet I still learned the dance by mimicking the instructor and the expert students. Now, I do love clear directions in a lesson—I am a technical writer, after all—but it’s remarkable what a person can learn by emulating the experts.  


In fact, you can learn a lot about machine learning by emulating the experts. That’s why we’ve teamed with ML experts to create online courses to help researchers, developers, and students. Here are three new courses:

  • Clustering: Introduces clustering techniques, which help find patterns and related groups in complex data. This course focuses on k-means, which is the most popular clustering algorithm. Although k-means is relatively easy to understand, defining similarity measures for k-means is challenging and fascinating.
  • Recommendation Systems: Teaches you how to create ML models that suggest relevant content to users, leveraging the experiences of Google's recommendation system experts. You'll discover both content-based and collaborative filtering, and uncover the mathematical alchemy of matrix factorization. To get the most out of this course, you'll need at least a little background in linear algebra.
  • Testing and Debugging: Explains the tricks that Google's ML experts use to test and debug ML models. Google's ML experts have spent thousands of hours deciphering the signals that faulty ML models emit. Learn from their mistakes.    
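As a rough sketch of the algorithm the clustering course centers on, plain k-means fits in a few lines of Python. This is an illustrative toy, not course material: the points below are a made-up dataset, and a naive deterministic initialization stands in for the random initialization typically used.

```python
def kmeans(points, k, iters=20):
    """Plain k-means: repeatedly assign each point to its nearest
    centroid, then move each centroid to the mean of its cluster."""
    centroids = list(points[:k])  # naive init; random init is more typical
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k),
                          key=lambda i: sum((a - b) ** 2
                                            for a, b in zip(p, centroids[i])))
            clusters[nearest].append(p)
        for i, cluster in enumerate(clusters):
            if cluster:  # keep the old centroid if its cluster emptied
                centroids[i] = tuple(sum(dim) / len(cluster)
                                     for dim in zip(*cluster))
    return centroids, clusters

# Two well-separated blobs: expect one centroid near (0.1, 0.1)
# and one near (5.0, 5.0).
points = [(0.0, 0.1), (0.2, 0.0), (0.1, 0.2),
          (5.0, 5.1), (5.2, 4.9), (4.9, 5.0)]
centroids, clusters = kmeans(points, k=2)
print(sorted(centroids))
```

The hard part the course highlights is not this loop but the similarity measure: squared Euclidean distance is used here, and choosing a measure that actually reflects similarity in your data is where the challenge lies.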
These new courses are engaging, practical, and helpful. They build on a series of courses we released last year, starting with Machine Learning Crash Course (MLCC), which teaches the fundamentals of ML. If you enjoyed MLCC, you're ready for these new courses. They will push you to think differently about the way you approach your work. Take these courses to copy the moves of the world's best ML experts.


Make your smart home more accessible with new tutorials

I’m legally blind, so from the moment I pop out of bed each morning, I use technology to help me go about my day. When I wake up, I ask my Google Assistant for my custom-made morning Routine, which turns on my lights, reads my calendar and plays the news. I use other products as well, like screen readers and a refreshable braille display, to help me be as productive as possible.

I bring my understanding of what it's like to have a disability to work with me, where I lead accessibility for Google Search, Google News and the Google Assistant. I work with cross-functional teams to help fulfill Google’s mission of building products for everyone—including those of us in the disabled community.

The Assistant can be particularly useful for helping people with disabilities get things done. So today, Global Accessibility Awareness Day, we’re releasing a series of how-to videos with visual and audible directions, designed to help the accessibility community set up and get the most out of their Assistant-enabled smart devices.

You can find step-by-step tutorials to learn how to interact with your Assistant, from setting up your Assistant-enabled device to using your voice to control your home appliances, at our YouTube playlist which we’ll continue to update throughout the year.

Intro to Assistant Accessibility Videos

This playlist came out of conversations within the team about how we can use our products to make life a little easier. Many of us have some form of disability, or have a friend, co-worker or family member who does. For example, Stephanie Wilson, an engineer on the Google Home team, helped set up her parents’ smart home after her dad was diagnosed with Parkinson’s disease.

In addition to our own teammates, we're always listening to suggestions from the broader community on how we can make our products more accessible. Last week at I/O, we showed how we’re making the Google Assistant more accessible, using AI to improve products for people with a speech impairment, and added Live Caption in Android Q to give the Deaf community automatic captions for media that’s playing audio on your phone. All these changes were made after receiving feedback from people like you.

Head over to our Accessibility website to learn more, and if you have questions or feedback on accessibility within Google products, please share your feedback with us via our dedicated Disability Support team.

Building for all learners with new apps, tools, and resources

Everyone deserves access to a quality education—no matter your background, where you live, or your abilities. We’re recognizing this on Global Accessibility Awareness Day, an effort to promote digital accessibility and inclusion for people with disabilities, by sharing new features, training, and partners, along with the many new products announced at Google I/O.

Since everyone learns in different ways, we design technology that can adapt to a broad range of needs and learning styles. For example, you can now add captions in Slides and turn on live captions in Hangouts Meet, and we’ve improved discoverability in the G Suite toolbar. By making these features available—with even more in the works—teachers can help students learn in ways that work best for them.

Working with our partners to expand access

We’re not the only ones trying to make learning more accessible, so we’ve partnered with companies who are building apps to make it easier for teachers to communicate with all students.

One of our partners, Crick Software, just launched Clicker Communicator, a child-friendly communication tool for the classroom: bridging the gap between needs/wants and curriculum access, empowering non-verbal students with the tools to initiate and lead conversations, and enabling proactive participation in the classroom. It’s one of the first augmentative and alternative communication (AAC) apps specifically created for Chromebook users.

Learn more about Clicker Communicator, an AAC app for Chromebooks.

Assessing with accessibility in mind

Teachers use locked mode, available only on managed Chromebooks, when giving Quizzes in Google Forms to eliminate distractions. Locked mode is now used millions of times per month, and many students use additional apps for accommodations when taking quizzes. We’ve been working with many developers to make sure their tools work with locked mode. One of those developers is our partner Texthelp®. Coming soon, when you enable locked mode in Quizzes in Google Forms, your students will be able to access the Read&Write for Google Chrome and EquatIO® for Google tools they rely on daily.

Another partner, Don Johnston, supports students with apps including Co:Writer (word prediction, translation, and speech recognition) and Snap&Read (read aloud, highlighting, and note-taking). Students signed into these extensions can use them on the quiz—even in locked mode. This integration will be rolling out over the next couple of weeks.

Learn more about the accessibility features available in locked mode, including ChromeVox, select-to-speak, and visual aids including high contrast mode and magnifiers.

Tools, training, and more resources

Assistive technology has the power to transform learning for more students, but educators need training, support, and tutorials to help their students get the most from the technology.

The new Accessibility section on our Google for Education website has information on Chromebooks and G Suite for Education, a module on our Teacher Center and printable flashcards, and EDU in 90 YouTube videos on G Suite and Chromebook accessibility features. Check out our accessibility tools and find training on how to use them to create more engaging, accessible learning experiences.

EDU in 90 video of Chromebook accessibility features

Watch the EDU in 90 on Chrome accessibility features.

We love hearing stories of how technology is making learning more accessible for more learners, so please share how you're using accessibility tools to support all types of learners, and requests for how we can continue to improve to meet the needs of more learners.

Search at Google I/O 2019

Google I/O is our yearly developer conference where we have the pleasure of announcing some exciting new Search-related features and capabilities. A good place to start is Google Search: State of the Union, which explains how to take advantage of the latest capabilities in Google Search:

We also gave more details on how JavaScript and Google Search work together and what you can do to make sure your JavaScript site performs well in Search.

Try out new features today

Here are some of the new features, codelabs, and documentation that you can try out today:
The Google I/O sign at Shoreline Amphitheatre at Mountain View, CA

Be among the first to test new features

Your help is invaluable to making sure our products work for everyone. We shared some new features that we're still testing and would love your feedback and participation.
A large crowd at Google I/O

Learn more about what's coming soon

I/O is a place where we get to showcase new Search features, so we're excited to give you a heads up on what's next on the horizon:
Two people posing for a photo at Google I/O, forming a heart with their arms

We hope these cool announcements help & inspire you to create even better websites that work well in Search. Should you have any questions, feel free to post in our webmaster help forums, contact us on Twitter, or reach out to us at any of the next events we're at.

We hear you: updates to Works with Nest

Last week we announced that we would stop supporting the Works with Nest (WWN) program on August 31, 2019 and transition to the Works with Google Assistant platform (WWGA). The decision to retire WWN was made to unify our efforts around third-party connected home devices under a single platform for developers to build features for a more helpful home. The goal is to simplify the experience for developers and to give you more control over how your data is shared. Since the announcement, we’ve received a lot of questions about this transition. Today we wanted to share our updated plan and clarify our approach.


First, we’re committed to supporting the integrations you value and minimizing disruptions during this transition, so here’s our updated plan for retiring WWN:

  • Your existing devices and integrations will continue working with your Nest Account, however you won’t have access to new features that will be available with a Google Account. If we make changes to the existing WWN connections available to you with your Nest Account, we will make sure to keep you informed.

  • We’ll stop accepting new WWN connections on August 31, 2019. Once your WWN functionality is available on the WWGA platform, you can migrate from a Nest Account to a Google Account with minimal disruption.

Second, we want to clarify how this transition will work for you. Moving forward, we’ll deliver a single consumer and developer experience through the Google Assistant. WWGA already works with over 3,500 partners and 30,000 devices, and integrates seamlessly with Assistant Routines. Routines allow anyone to quickly customize how their smart devices work together based on simple triggers—whether you’re leaving home or going to bed.


One of the most popular WWN features is automatically triggering routines based on Home/Away status. Later this year, we'll bring that same functionality to the Google Assistant and provide more device options for you to choose from. For example, you’ll be able to have your smart light bulbs automatically turn off when you leave your home. Routines can be created from the Google Home or Assistant apps using the hardware you already own. Plus, we’re making lots of improvements to setting up and managing Routines to make them even easier to use.

We recognize you may want your Nest devices to work with other connected ecosystems. We’re working with Amazon to migrate the Nest skill that lets you control your Nest thermostat and view your Nest camera livestream via Amazon Alexa. Additionally, we’re working with other partners to offer connected experiences that deliver more custom integrations.

For these custom integrations, partners will undergo security audits and we’ll control what data is shared and how it can be used. You’ll also have more control over which devices these partners will see by choosing the specific devices you want to share. For example, you’ll be able to share your outdoor cameras, but not the camera in your nursery, with a security partner.

We know we can't build a one-size-fits-all solution, so we're moving quickly to work with our most popular developers to create and support helpful interactions that give you the best of Google Nest. Our goal remains to give you the tools you need to make your home, and those of other Nest users, helpful in the ways that matter most to you.


Affirming the identities of teachers and students in the classroom through #ISeeMe

Editor’s note: We’re thrilled to have Kristina Joye Lyles from DonorsChoose.org as a guest author, sharing about teaming up with Google.org to launch the #ISeeMe campaign.

I joined DonorsChoose.org in 2013 and have long been working with organizations like Google.org who share our belief in the power of teachers. To date, Google.org has provided over $25 million to support classrooms on DonorsChoose.org, and last week, they committed an additional $5 million to teachers, with a focus on supporting diverse and inclusive classrooms. Together, we’re kicking off #ISeeMe, a new effort to enable teachers and students across the country to celebrate their identities in their classrooms.

As a military brat, I attended many public schools across the U.S. but only had two teachers of color from kindergarten through twelfth grade. My teachers and professors of color had a particularly strong impact on me as mentors and role models; I was encouraged to see them as leaders in our school community, and their presence alone showed me that diversity and representation matter.

My story is like those of so many others. Research shows that students benefit from seeing themselves in their teachers and learning resources. For example, black students who have just one black teacher between third and fifth grade are 33 percent more likely to stay in school. Girls who attend high schools with a higher proportion of female STEM teachers are 19 percent more likely to graduate from college with a science or math major.

With this support from Google.org, teachers who are underrepresented in today’s public school classrooms--like teachers of color and female math and science teachers-- as well as all teachers looking to create more inclusive classrooms will get the support they need and deserve. Teachers from all backgrounds can take steps toward creating classrooms that reflect their students, whether they’re selecting novels with diverse characters to discuss or taking trainings to learn more about meeting the needs of students from culturally diverse backgrounds. And we’re eager to help them bring their ideas to life so that more students can see themselves reflected in their classrooms.

I’m thrilled that many teachers on DonorsChoose.org are already coming up with inspiring ways to foster classroom environments where every student can feel important and included. Mr. Yung sees the power of food to bring his students together across different cultural backgrounds. Ms. McLeod is determined to bring her students from Lumberton, North Carolina, to the National Museum of African-American History and Culture in Washington, D.C. Mrs. Toro-Mays aspires to bring her bilingual students books with culturally relevant heroes and heroines.

We hope you’ll join us and the philanthropists of various backgrounds who have lit the torch for #ISeeMe today. If you are a public school teacher, you can set up an #ISeeMe classroom project right now at DonorsChoose.org. You can also access free inclusive classroom resources and ideas created for educators, by educators at any time in Google’s Teacher Center. And for those of you who have been inspired by a teacher, we invite you to explore classroom projects that are eligible for Google.org’s #ISeeMe donation matching—we would love to have your support for these teachers and classrooms.

New features to make audio more accessible on your phone

Smartphones are key to helping all of us get through our days, from getting directions to translating a word. But for people with disabilities, phones have the potential to do even more to connect people to information and help them perform everyday tasks. We want Android to work for all users, no matter their abilities. And on Global Accessibility Awareness Day, we’re taking another step toward this aim with updates to Live Transcribe, coming next month.


Available on 1.8 billion Android devices, Live Transcribe helps bridge the connection between the deaf and the hearing via real-time, real-world transcriptions for everyday conversations. With this update, we’re building on our machine learning and speech recognition technology to add new capabilities.


First, Live Transcribe will now show you sound events in addition to transcribing speech. You can see, for example, when a dog is barking or when someone is knocking on your door.  Seeing sound events allows you to be more immersed in the non-conversation realm of audio and helps you understand what is happening in the world. This is important to those who may not be able to hear non-speech audio cues such as clapping, laughter, music, applause, or the sound of a speeding vehicle whizzing by.


Second, you’ll now be able to copy and save transcripts, stored locally on your device for three days. This is useful not only for those with deafness or hearing loss—it also helps those who might be using real-time transcriptions in other ways, such as those learning a language for the first time or even, secondarily, journalists capturing interviews or students taking lecture notes. We’ve also made the audio visualization indicator bigger, so that users can more easily see the background audio around them.

New features of Live Transcribe

Caption: See sound events, like whistling or a dog barking, in the bottom left corner of the updated Live Transcribe.

With billions of active devices powered by Android, we’re humbled by the opportunity to build helpful tools that make the world’s information more accessible in the palm of everyone’s hand. As long as there are barriers for some people, we still have work to do. We’ll continue to release more features to enrich the lives of our accessibility community and the people around them.