
Fuzzing Java in OSS-Fuzz

Posted by Jonathan Metzman, Google Open Source Security Team

OSS-Fuzz, Google’s open source fuzzing service, now supports fuzzing applications written in Java and other Java Virtual Machine (JVM) based languages (e.g., Kotlin and Scala). Open source projects written in JVM-based languages can add their project to OSS-Fuzz by following our documentation.

The Google Open Source Security team partnered with Code Intelligence to integrate their Jazzer fuzzer with OSS-Fuzz. Thanks to their integration, open source projects written in JVM-based languages can now use OSS-Fuzz for continuous fuzzing.

OSS-Fuzz has found more than 25,000 bugs in open source projects using fuzzing. We look forward to seeing how this technique can help secure and improve code written in JVM-based languages.

What can Jazzer do?

Jazzer allows users to fuzz code written in JVM-based languages with libFuzzer, as they already can for code written in C/C++. It does this by providing code coverage feedback from JVM bytecode to libFuzzer. Jazzer already supports important libFuzzer features such as:

  • FuzzedDataProvider for fuzzing code that doesn’t accept an array of bytes.
  • Evaluation of code coverage based on 8-bit edge counters.
  • Value profile.
  • Minimization of crashing inputs.
Jazzer is intended to eventually support all libFuzzer features. A minimal fuzz target built on FuzzedDataProvider is sketched below.
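
As a rough illustration, the sketch below shows what a minimal Jazzer fuzz target can look like, assuming the jazzer-api dependency is on the classpath and using the JDK's Integer.parseInt purely as a stand-in for the code under test (the class name is made up for the example). Jazzer looks for a static fuzzerTestOneInput method and calls it repeatedly with mutated data; the FuzzedDataProvider overload splits the raw bytes into typed values for code that doesn't accept a byte array.

    import com.code_intelligence.jazzer.api.FuzzedDataProvider;

    public class ParseIntFuzzer {
      // Jazzer invokes this method repeatedly with mutated data, guided by
      // coverage feedback collected from the instrumented JVM bytecode.
      public static void fuzzerTestOneInput(FuzzedDataProvider data) {
        // FuzzedDataProvider turns the raw fuzzer bytes into typed values.
        int radix = data.consumeInt(Character.MIN_RADIX, Character.MAX_RADIX);
        String candidate = data.consumeRemainingAsString();
        try {
          Integer.parseInt(candidate, radix);
        } catch (NumberFormatException expected) {
          // Malformed numbers are expected; any other uncaught exception
          // would be reported by Jazzer as a finding.
        }
      }
    }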

What does Jazzer support?

Jazzer supports all languages that compile to JVM bytecode, since instrumentation is done on the bytecode level. This includes:
  • Java
  • Kotlin
  • Scala
  • Clojure
Jazzer can also provide coverage feedback from native code that is executed through JNI. This can uncover interesting memory corruption vulnerabilities in memory unsafe native code.

Why Fuzz Java/JVM-based Code?

As discussed in our post on Atheris, fuzzing code written in memory safe languages, such as JVM-based languages, is useful for finding bugs where code behaves incorrectly or crashes. Incorrect behavior can be just as dangerous as memory corruption. For example, Jazzer was used to find CVE-2021-23899 in json-sanitizer which could be exploited for cross-site scripting (XSS). Bugs causing crashes or incorrect exceptions can sometimes be used for denial of service. For example, OSS-Fuzz recently found a denial of service issue that could have been used to take “a major part of the ethereum network offline”.

When fuzzing memory safe code, you can use the same classic approach as for memory unsafe code: passing mutated input to the code and waiting for crashes. Or you can take a more unit-test-like approach, where your fuzzer verifies that the code is behaving correctly (example), as in the sketch below.
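
As a sketch of that unit-test-like style, the target below checks a round-trip property rather than only waiting for a crash. It uses the JDK's Base64 codec purely as an illustrative system under test; the class name is made up for the example.

    import com.code_intelligence.jazzer.api.FuzzedDataProvider;
    import java.util.Arrays;
    import java.util.Base64;

    public class Base64RoundTripFuzzer {
      public static void fuzzerTestOneInput(FuzzedDataProvider data) {
        byte[] original = data.consumeRemainingAsBytes();

        // Property under test: decoding an encoded value must give back
        // exactly the original bytes.
        String encoded = Base64.getEncoder().encodeToString(original);
        byte[] roundTripped = Base64.getDecoder().decode(encoded);

        if (!Arrays.equals(original, roundTripped)) {
          // An uncaught exception is how a Jazzer target reports a finding.
          throw new IllegalStateException("Base64 round trip changed the data");
        }
      }
    }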

Another way fuzzing can find interesting bugs in JVM-based code is through differential fuzzing. With differential fuzzing, your fuzz target passes the same mutated input to multiple library implementations that should have the same functionality, then compares the results from each library to find differences, as in the sketch below.
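
The sketch below outlines that pattern. The two implementations are hypothetical placeholders standing in for real libraries that should behave identically (for example, two JSON sanitizers); any divergence is reported by throwing an uncaught exception, which Jazzer surfaces as a finding.

    import com.code_intelligence.jazzer.api.FuzzedDataProvider;
    import java.util.Objects;
    import java.util.function.UnaryOperator;

    public class DifferentialFuzzer {
      // Hypothetical stand-ins for two libraries with supposedly identical behavior.
      private static final UnaryOperator<String> IMPL_A = input -> input; // placeholder
      private static final UnaryOperator<String> IMPL_B = input -> input; // placeholder

      public static void fuzzerTestOneInput(FuzzedDataProvider data) {
        String input = data.consumeRemainingAsString();

        String resultA = run(IMPL_A, input);
        String resultB = run(IMPL_B, input);

        if (!Objects.equals(resultA, resultB)) {
          throw new IllegalStateException("Implementations disagree on: " + input);
        }
      }

      private static String run(UnaryOperator<String> impl, String input) {
        try {
          return impl.apply(input);
        } catch (RuntimeException e) {
          // Treat "rejected the input" as a comparable outcome rather than a crash.
          return "rejected: " + e.getClass().getName();
        }
      }
    }
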
Check out our documentation to get started. We will explore this more during our OSS-Fuzz talk at FuzzCon Europe.

Introducing sigstore: Easy Code Signing & Verification for Supply Chain Integrity



One of the fundamental security issues with open source is that it’s difficult to know where the software comes from or how it was built, making it susceptible to supply chain attacks. A few recent examples of this include the dependency confusion attack and a malicious RubyGems package used to steal cryptocurrency.

Today we welcome the announcement of sigstore, a new project in the Linux Foundation that aims to solve this issue by improving software supply chain integrity and verification.

Installing most open source software today is equivalent to picking up a random thumb drive off the sidewalk and plugging it into your machine. To address this, we need to make it possible to verify the provenance of all software, including open source packages. We talked about the importance of this in our recent Know, Prevent, Fix post.

The mission of sigstore is to make it easy for developers to sign releases and for users to verify them. You can think of it as Let’s Encrypt for code signing. Just as Let’s Encrypt provides free certificates and automation tooling for HTTPS, sigstore provides free certificates and tooling to automate and verify signatures of source code. Sigstore also has the added benefit of being backed by transparency logs, which means that all the certificates and attestations are globally visible, discoverable, and auditable.

Sigstore is designed with open source maintainers, for open source maintainers. We understand long-term key management is hard, so we've taken a unique approach of issuing short-lived certificates based on OpenID Connect grants. Sigstore also stores all activity in transparency logs, backed by Trillian, so that we can more easily detect compromises and recover from them when they do occur. Key distribution is notoriously difficult, so we've designed away the need for long-lived keys by building a special Root CA just for code signing, which will be made available for free.

We have a working prototype and proofs of concept that we're excited to share for feedback. Our goal is to make it seamless and easy to sign and verify code.


It has been fun collaborating with the folks from Red Hat and the open source community on this project. Luke Hinds, one of the lead developers on sigstore and Security Engineering Lead at Red Hat, says, "I am very excited about sigstore and what this means for improving the security of software supply chains. sigstore is an excellent example of an open source community coming together to collaborate and develop a solution to ease the adoption of software signing in a transparent manner." We couldn’t agree more.

Mike Malone, the CEO of Smallstep, helped with the overall design of sigstore. He adds, “In less than a generation, open source has grown from a niche community to a critical ecosystem that powers our global economy and institutions of society and culture. We must ensure the security of this ecosystem without undermining the open, decentralized collaboration that makes it work. By building on a clever composition of existing technologies that respect privacy and work at scale, sigstore is the core infrastructure we need to solve this fundamental problem. It’s an ambitious project with potential for global impact. I’m impressed by the rapid progress that’s been made by Google, Red Hat, and Linux Foundation over the past few months, and I’m excited to hear feedback from the broader community.”

While we are happy with the progress that has been made, we know there is still work to be done before this can be widely relied upon. Upcoming plans for sigstore include: hardening the system, adding support for other OpenID Connect providers, updating documentation and responding to community feedback.

Sigstore is in its early days, but we're really excited about its future. Now is a great time to provide feedback, try out the tooling and get involved with the project as design details are still being refined.

#ShareTheMicInCyber: Rob Duhart

Posted by Matt Levine, Director, Risk Management

In an effort to showcase the breadth and depth of Black+ contributions to security and privacy fields, we’ve launched a series in support of #ShareTheMicInCyber that aims to elevate and celebrate the Black+ voices in security and privacy we have here at Google.

Today, we will hear from Rob Duhart, who leads a cross-functional team at Google that aims to enable and empower all of our products, like Chrome, Android, and Maps, to mature their security risk journey.

Rob’s commitment to making the internet a safer place extends far beyond his work at Google. He is a member of the Cyber Security Executive Education Advisory Board of Directors at Washington University in St. Louis, where he helps craft the future of cyber security executive education globally. Rob also sits on the board of the EC-Council and has founded chapters of the International Consortium of Minority Cybersecurity Professionals (ICMCP) across the country.

Rob is passionate about securing the digital world and supporting Black+, women, and underrepresented minorities across the technology landscape.


Why do you work in security or privacy?

I have been in the cyber world long enough to know how important it is for security and privacy to be top of mind and focus for organizations of all shapes and sizes. My passion lies in keeping users and Googlers safe. One of the main reasons I joined Google is its commitment to security and privacy.


Tell us a little bit about your career journey to Google...

I was fortunate to begin my cybersecurity career in the United States Government working at the Department of Energy, FBI, and the Intelligence Community. I transitioned to the private sector in 2017 and have been fortunate to lead talented security teams at Cardinal Health and Ford Motor Company.

My journey into cybersecurity was not traditional. I studied Political Science at Washington University in St. Louis and completed graduate education at George Mason University and Carnegie Mellon University. I honed my skills and expertise in this space through hands-on experience and with the support of many amazing mentors. It has been the ride of a lifetime, and I look forward to what is next.

To those thinking about making a career change or just starting to get into security, my advice is: don’t be afraid to ask for help.


What is your security or privacy "soapbox"?

At Google, we implement a model known as Federated Security, where our security teams partner across our Product Areas to enable security program maturity Google wide. Our Federated Security team believes in harnessing the power of relationship, engagement, and community to drive maturity into every product. Security and privacy are team sports – it takes business leaders and security leaders working together to secure and protect our digital and physical worlds.

If you are interested in following Rob’s work here at Google and beyond, please follow him on Twitter @RobDuhart. We will be bringing you more profiles over the coming weeks and we hope you will engage with and share these with your network.

If you are interested in participating or learning more about #ShareTheMicInCyber, click here.

Celebrating the influence and contributions of Black+ Security & Privacy Googlers

Posted by Royal Hansen, Vice President, Security

Black History Month may be coming to a close, but our work to build sustainable equity for Google’s Black+ community, both internally and externally, is ongoing. Currently, Black Americans make up less than 12% of information security analysts in the U.S. In an industry that consistently requires new ideas to spark positive change and stand out against the status quo, it is necessary to have individuals who think, speak, and act in diverse ways. Diverse security teams are more innovative, produce better products, and enhance an organization's ability to defend against cyber threats.

In an effort to amplify the contributions of the Black+ community to security and privacy fields, we’ll be sharing profiles of Black+ Googlers working on innovative privacy and security solutions over the coming weeks, starting with Camille Stewart, Google’s Head of Security Policy for Google Play and Android.

Camille co-founded #ShareTheMicInCyber, an initiative that pairs Black security practitioners with prominent allies, lending their social media platforms to the practitioners for the day. The goal is to break down barriers, engage the security community, and promote sustained action. The #ShareTheMicInCyber campaign will highlight Black women in the security and privacy sector on LinkedIn and Twitter on March 19, 2021, and throughout March 2021 in celebration of Women's History Month. Follow #ShareTheMicInCyber on March 19th to support and amplify Black women in security and privacy.

Read more about Camille’s story below 

#ShareTheMicInCyber: Camille Stewart


Today, we will hear from Camille Stewart, who leads security, privacy, election integrity, and dis/misinformation policy efforts for Google's mobile business. She also spearheads a cross-Google security initiative that sets the strategic vision and objectives for Google’s engagement on security and privacy issues.

In her (not so) spare time, Camille is co-founder of the #ShareTheMicInCyber initiative – which aims to elevate the profiles, work, and lived experiences of Black cyber practitioners. This initiative has garnered national and international attention and has been a force for educating and bringing awareness to the challenges Black security practitioners face in industry. Camille is also a cybersecurity fellow at Harvard University, New America and Truman National Security Project. She sits on the board of the International Foundation for Electoral Systems and of Girl Security, an organization that is working to close the gender gap in national security through learning, training, and mentoring support for girls.

Why do you work in security or privacy?

I work in this space to empower people in and through technology by translating and solving the complex challenges that lie at the intersection of technology, security, society, and the law.

Tell us a little bit about your career journey to Google

Before life at Google, I managed cybersecurity, election security, tech innovation, and risk issues at Deloitte. Prior to that, I was appointed by President Barack Obama to be the Senior Policy Advisor for Cyber Infrastructure & Resilience Policy at the Department of Homeland Security. I was the Senior Manager of Legal Affairs at Cyveillance, a cybersecurity company, after working on Capitol Hill.

What is your security or privacy "soapbox"?

Right now, I have a few. First, users should be as intentional about their digital security as they are about their physical security, especially with their mobile devices and apps. As creators of technology, we need to be more intentional about how we educate our users on safety and security. At Google, security is core to everything we do and build; it has to be. We recently launched our Safer With Google campaign, which I believe is a great resource for helping users better understand their security and privacy journey.

As an industry, we need to make meaningful national and international progress on digital supply chain transparency and security.

Lastly, the fact that systemic racism is a cybersecurity threat. I recently penned a piece for the Council on Foreign Relations that explores how racism influences cybersecurity and what we must do as an industry to address it.

If you are interested in following Camille’s work here at Google and beyond, please follow her on Twitter @CamilleEsq. We will be bringing you more profiles over the coming weeks and we hope you will engage with and share these with your network. 

If you are interested in participating or learning more about #ShareTheMicInCyber, click here.

Mitigating Memory Safety Issues in Open Source Software


Memory-safety vulnerabilities have dominated the security field for years and often lead to issues that can be exploited to take over entire systems. 


A recent study found that “~70% of the vulnerabilities addressed through a security update each year continue to be memory safety issues.” Another analysis of security issues in the ubiquitous curl command line tool showed that 53 out of 95 bugs would have been completely prevented by using a memory-safe language.


Software written in unsafe languages often contains hard-to-catch bugs that can result in severe security vulnerabilities, and we take these issues seriously at Google. That’s why we’re expanding our collaboration with the Internet Security Research Group to support the reimplementation of critical open-source software in memory-safe languages. We previously worked with the ISRG to help secure the Internet by making TLS certificates available to everyone for free, and we're looking forward to continuing to work together on this new initiative.


It's time to start taking advantage of memory-safe programming languages that prevent these errors from being introduced. At Google, we understand the value of the open source community and in giving back to support a strong ecosystem. 


To date, our free OSS-Fuzz service has found over 5,500 vulnerabilities across 375 open source projects caused by memory safety errors, and our Rewards Program helps encourage adoption of fuzzing through financial incentives. We've also released other projects like Syzkaller to detect bugs in operating system kernels, and sandboxes like gVisor to reduce the impact of bugs when they are found.


The ISRG's approach of working directly with maintainers to support rewriting tools and libraries incrementally falls directly in line with our perspective here at Google. 


The new Rust-based HTTP and TLS backends for curl and now this new TLS library for Apache httpd are an important starting point in this overall effort. These codebases sit at the gateway to the internet and their security is critical in the protection of data for millions of users worldwide. 


We'd like to thank the maintainers of these projects for working on such widely-used and important infrastructure, and for participating in this effort.


We're happy to be able to support these communities and the ISRG to make the Internet a safer place. We appreciate their leadership in this area and we look forward to expanding this program in 2021.


Open source security is a collaborative effort. If you're interested in learning more about our efforts, please join us in the Securing Critical Projects Working Group of the Open Source Security Foundation.