Category Archives: Open Source Blog

News about Google’s open source projects and programs

Google Summer of Code 2021 Mentoring Orgs announced!


Today, we are pleased to welcome 202 open source projects as our Google Summer of Code (GSoC) 2021 mentoring organizations. While many of the organizations have participated in GSoC in previous years, we are excited to welcome 31 organizations for their first summer mentoring GSoC students.

For a complete list of the accepted organizations, visit the program website, where each organization has its own page with details about the organization and, most importantly, the list of Project Ideas they would like students to work on this summer.

Are you a student interested in participating in GSoC this year?
Student applications will open on Monday, March 29, 2021 at 19:00 UTC and the deadline to submit your application is Tuesday, April 13, 2021 at 19:00 UTC.

The most successful applications come from students who start preparing now, so remember to prepare early! Here are some proactive steps students can take before the application period begins:
  • Watch our short videos: What is GSoC? and Being a GSoC Student
  • Check out the Student Guide and Student Advice doc
  • Review the list of accepted organizations and reach out to the two or three that interest you the most now. All contact information for orgs is available on their organization page on the program site.
  • Now is the perfect time to read the Project Ideas of the orgs you are interested in and start asking questions of the mentors so you can understand the project and write a quality proposal as part of your application.
You can find more information on our website, which includes a full timeline of important dates. We also highly recommend perusing the FAQ and Program Rules and watching some of our other videos with more details about GSoC for students and mentors.

A big congratulations—and thank you—to all of our mentor organizations! We look forward to working with all of you during Google Summer of Code 2021.

By Stephanie Taylor, Google Open Source

Season of Docs 2020: 5 Technical communication learnings as an open source contributor

Open source contributions have always intrigued me as a good way to develop skills needed in the real world. When I stumbled upon Season of Docs (SoD) 2020 while watching Amruta Ranade's technical writing videos, I was thrilled to find an opportunity that serves as a bridge between technical writers and different open source organizations. I was intrigued by how there is open source software or a tool addressing nearly every industry need (e.g., HR, video editing, education, robotics), and how the lack of good documentation holds back user adoption.

Figure 1: Open source projects are a rich resource for developing new skills and building new industry connections

This blog post summarizes my technical communication learnings while working as an open source contributor with CircuitVerse.

Documentation audit is key: To prepare my technical writer application, I audited the available documentation of five organizations, asking the following questions:

  • What documentation is available?
  • Who is the target audience?
  • Does it cover all the functionality?
  • Does it cover end-user needs?
  • Is the documentation any good?
Based on my findings, I narrowed my choice down to two organizations. While preparing the SoD proposal for CircuitVerse (CV), I drafted a content plan that mixed video prototypes, tutorials, and improvement and remapping of existing content, to demonstrate my ability to understand real-world problems and integrate with the technology. You can find my final project proposal, which got me selected as a Season of Docs 2020 participant with CircuitVerse, here.*

*A special shout out to Audrey Tavares (a past participant of SoD 2019, Oppia) for answering my queries and guiding me through the process.

Know your audience: When SoD concluded in December 2020, I had produced a series of video tutorials and rewritten the complete documentation for the CV simulator. You can find the complete project report here.

Audience analysis is key to the success of a documentation project. Do your research and ask enough questions to understand your audience and discover vital facts.

In my case, my initial findings led me to conclude that the primary audience was students, but the mentors clarified that the primary audience is educators. This was a cue for the team that the message was not coming across clearly, and we revised the content layout to cater to the primary audience.

Secondly, avoid assumptions and be prepared to agree to disagree: conflicts can be healthy!

Write documentation for an evolving platform: Documentation makes users feel confident about the product and builds trust. One of the key pain points of working on open source documentation is that the platform is continuously upgraded with new features and functionality. So how do you strike a balance?

While the CV videos had some UI discrepancies, I focused on making sure that the live user guide content was detailed enough and gave users clear instructions on how to accomplish a task. I learned that videos play a key role in demonstrating a workflow, while the text documentation must be detailed and updated frequently.

Build up developer and documentation tools proficiency: Contributing to open source projects expands one's familiarity with real-world practices, including working with different tools like Adobe Camtasia, GitHub, and Markdown. While my comfort level with GitHub grew, I learned better practices for working with Markdown on a large content set. I used the Docs to Markdown add-on for Google Docs to convert the content to Markdown before uploading it to GitHub.

Focus on fluid communication skills while working with subject matter experts: The SoD opportunity allowed me to experience working in a distributed, collaborative environment across borders and geographies, much as in the traditional corporate world.

While my mentors were receptive to my suggestions, I made an effort to keep them apprised of the project's progress and any missed deadlines. For instance, I revised the documentation deliverables midway through the project with their consent. I realized that it was important to have good, clear documentation for the existing popular topics before adding new content.

When my mentors and I were in doubt, we reached out to the CV Slack community for user feedback on different aspects.

Warming up as an Open Source Contributor
Although my project with CircuitVerse has been successfully completed, I look forward to continuing my journey with CircuitVerse, and to further open source contributions with other organizations in 2021. If this is your first time applying for Season of Docs, refer to the FAQ for technical writers to gain more insight into the program. You can also give a shout out to the extremely helpful program admins at [email protected] or post your queries on the Season of Docs Slack channel.

Guest Post by Pragati Chaplot Jain – Season of Docs Participant

SoD and technical documentation in an open source organization



Documentation in open source organizations is a complicated job: so many new edits and issues occur daily that, without a dedicated team, they become challenging to manage. Since open source organizations rely mostly on volunteers, it is not unusual for a small task to take longer than it would with full-time team members dedicated to it. Time is of the essence when improving documentation, because as contributors keep adding value to the organization, there will be ever more content to work through. Season of Docs (SoD) aims to help with documentation in an effective way.

SoD creates an environment where freelance technical writers can work with an open source organization for 3–5 months. The technical writers get paid and the organizations get a dedicated individual to take care of their documentation: a win-win for everyone.

I had the opportunity to work with ESLint under SoD 2020, with the aim of improving and organizing ESLint's configuration documentation, and I learned quite a lot. From understanding the work of ESLint and the structure of the existing documentation to managing a short-term project and collaborating with other volunteers, the project was filled with learning experiences. The best part was that I realized the worth of my contribution and felt appreciated all along. Technical documentation and communication often don't get much attention, but with SoD it was different.

The Positives

A different perspective

A freelance technical writer is, in most cases, not part of the organization. An external perspective on the existing documentation can surface issues that might otherwise go unnoticed. Additionally, since the freelance writer is entirely dedicated to the task, they are able to focus solely on it.

Collaborative environment

One of the best things about open source organizations is the level of collaboration. While working in such an environment, where everyone is so willing to help and to give valuable input, a freelance contributor does not feel alienated at all. There is a lot of valuable feedback and the work of a technical writer is both respected and appreciated.

Some Challenges

As in any other project, documentation in an open source organization is not free of some hiccups.

Understanding the content

Freelance technical writers have limited time to get acquainted with the objectives and the content of the open source organization, making things a little hard if the writers have not previously interacted with (or heard of) the organizations they are working with. Reflecting on my own experience, I feel that this was a major concern for me since I had no previous experience with linting software.

Thanks to the 'community bonding period', however, which lasts almost a month before the project officially begins, freelance writers can gain some understanding of the organization and its content.

Time

Since most contributors work on a volunteer basis, their other engagements can prolong the review and feedback process, which can make meeting the project deadline feel challenging at times.

Overcoming the Challenges

Whether you're working under the SoD umbrella, contributing to strengthen your portfolio, or trying to gain more practical experience, the following tips can be helpful.
  • Communication is key. It is important to convey your concerns regarding time, commitments, and other engagements so that the expectations are met.
  • Ask questions! You won’t know everything about the project.
  • Be flexible. Your project might change after you start working on it, and things don't always go as you planned.
  • Use the 'community bonding period' to interact with your mentor and other collaborators, indulge in small tasks, and get to know the people and the organization.
  • Value the work and feedback of others. Everyone who is a part of the community is trying to add some value to the organization.
SoD serves as an excellent platform for bringing technical writers and open source organizations closer together.

Guest Post by Khawar Latif Khan – Season of Docs Participant

Basis Universal Textures – Khronos Ratification and Support

In 2019, Google partnered with Binomial to open source the Basis Universal texture codec with the goal of making high-quality textures more efficient for network transmission and graphics processing unit (GPU) memory usage. The Basis Universal texture format is 6–8 times smaller than JPEG on the GPU, yet has a similar storage size to JPEG, making it a great alternative to current GPU compression methods, which are inefficient and don't operate cross-platform. The format is intended for a variety of use cases: games, virtual and augmented reality, maps, photos, small videos, and more.

Over the past year, several exciting developments have made Basis Universal more useful. A new high-quality mode was introduced, allowing the codec to use the highest-quality formats modern GPUs support, finally bringing the web up to modern GPU texture standards, with cross-platform support. Additionally, the Basis encoder now has an option to build a WebAssembly version, allowing innovative web applications to take advantage of outputting to the super-compressed format. Lastly, the Khronos Group has announced and ratified the Basis Universal texture extension to the glTF format, allowing compressed assets to be shipped and displayed everywhere in a KTX 2.0 container. This will have a profound impact on how models are distributed via the web and will advance applications like eCommerce, making it easy to take advantage of 3D content on any platform.

In addition to these new features, developers worldwide have been making it easier to take advantage of Basis Universal. <model-viewer> has just added support for glTF files with universal textures, making it as easy as two lines of JavaScript to have beautiful, interactive 3D models on your page; in the coming months, the <model-viewer> editor will add support for encoding to universal textures. Additionally, 3D engines like Three.js, Babylon.js, Godot, Archilogic, and PlayCanvas have added support for Basis Universal, with more engine support coming. Basis Universal is already in applications many use every day.

We look forward to seeing Basis Universal adoption soar as it has never been easier to distribute 3D assets. Check out the code and demo on GitHub, let us know what you think, and how you plan to use it!

By Stephanie Hurlburt, Binomial and Jamieson Brettle, Chrome Media

A new resource for coordinated vulnerability disclosure in open source projects

One of the joys of open source is the freedom it gives you to create: contributors get to build the projects they want how they want; it’s up to them. Of course, blank slates don’t come with directions, which makes more niche areas of software development and management a challenge for contributors. Vulnerability disclosure is one of those areas.

Google doesn’t restrict its open source work to one team; instead, we teach any and all Googlers about open source: how to release, how to contribute, how to use, and, in general, how to be a good open source citizen. This approach scales well and gives people the knowledge to be lifelong open source community members. This includes sharing knowledge about open source security, a topic that isn’t new but is finally getting the industry attention it deserves.

The intimidating blank slate and a lack of time for contributors to develop policies mean that many open source projects have no documented vulnerability reporting information, much less a plan for how to handle and disclose a reported vulnerability. We recently updated our guidance for coordinated vulnerability disclosure in open source projects that come out of Google and have published it in the hope that other projects will find it helpful for their own security practices.

The new guide has three sections:
It’s a myth that if a project hasn’t received a vulnerability report yet, it doesn’t need a disclosure policy. It’s also a myth that you need to be “a security person” to implement a vulnerability disclosure policy. A successful coordinated vulnerability disclosure frequently comes down to good process management and clear, thoughtful communication. You don’t have to be an expert in operating system capabilities to understand how a reporter manipulated your project to cause an account privilege escalation. A predetermined policy, some templates, and a well-executed runbook will take you through discovering, patching, and disclosing most kinds of vulnerabilities.

Coordinated Vulnerability Disclosure in Open Source Projects

Vulnerability disclosure is part of the “Fix” stage of the Know, Prevent, Fix framework we recently proposed for open source vulnerability management. In today’s industry, with all of our supply chain dependencies, improving the security of even one open source project can have a multiplying effect. Vulnerability disclosure is a key aspect of that overall security posture. Our hope is that projects will take this guide, remix and adapt it to their projects, and share their changes with others so we can collectively increase our open source security.

By Anne Bertucio, Google Open Source

Updates on the Tsunami Security Scanning Engine


Several months ago, we open sourced the Tsunami security scanner: a false-positive-free infrastructure scanning engine focusing on high-severity, actively exploited vulnerabilities. Today, we are releasing the first major update for Tsunami.

In the last few months, we have done a lot of work in the background to prepare Tsunami for the next step and focused on the following:
  • Vulnerability research: In order to keep Tsunami's detection capabilities up-to-date, we kicked off various projects to research the exploitation of vulnerabilities in the wild. We will soon publish more information about our initiatives in this space, so stay tuned.
  • New detection capabilities: Based on our research, we have added 15 new detector plugins to Tsunami for actively exploited vulnerabilities.
  • Continuous Integration pipeline for our open-source builds: We set up a CI/CD pipeline that automatically mirrors and tests changes between our internal version management system and the open source repository. This will enable us to easily merge internal and external contributions.
  • Test bed for end-to-end testing: This summer we hosted an intern (Yuxin Wu), who built and open sourced a test bed for Tsunami. The test bed can automatically deploy arbitrary versions of off-the-shelf software based on Docker images. We are using the test bed to automatically check whether a Tsunami detector works for all vulnerable versions of a piece of software and keeps functioning for future versions.
  • Web application fingerprinting: We added web application fingerprinting capabilities to Tsunami. Tsunami now detects popular off-the-shelf web applications. This information can be used by Tsunami for more precise and less intrusive vulnerability verification. Furthermore, it enables security teams to build a software inventory based on Tsunami scans. We'll keep working on refining our fingerprinting approach and extending our fingerprinting database.

Today, we are releasing the new detectors and the fingerprinting capabilities. You can find the new detectors and the web fingerprinter in our plugin repository.

If you are adopting Tsunami within your organization and if you have questions or would like to contribute, feel free to contact us at any time at [email protected].

By Guoli Ma, Claudio Criscione & Sebastian Lekies, Vulnerability Management Team

The 2021 Season of Docs application for organizations is open!


Google Open Source is delighted to announce Season of Docs 2021!

The 2019 Season of Docs brought together open source organizations and technical writers to create 44 successful documentation projects. In 2020, we had 64 successful standard-length technical writing projects and are still awaiting long-running project results.

In 2021, the Season of Docs program will continue to support better documentation in open source and provide opportunities for skilled technical writers to gain open source experience. In addition, building on what we’ve learned from the successful 2019 and 2020 projects, we’re expanding our focus to include learning about effective metrics for evaluating open source documentation.

What are the 2021 program changes?

Season of Docs 2021 will allow open source organizations to apply for a grant based on their documentation needs. If selected, open source organizations will use their grant to hire a technical writer directly to complete their documentation project. Organizations will have up to six months to complete their documentation project. Keep reading for more information about the organization application or visit the Season of Docs site.

Technical writers interested in working with accepted open source organizations will be able to share their contact information via the Season of Docs GitHub repository or submit proposals directly to the organizations; they will not need to submit a formal application through Season of Docs.

Participating organizations will help broaden our understanding of effective documentation practices and metrics in open source by submitting a final case study upon completion of the program. The project case study will outline the problem the documentation project was intended to solve, what metrics were used to judge the effectiveness of the documentation, and what the organization learned for the future. All the project case studies will be published on the Season of Docs site at the end of the program.

How does it work?

  • February 9 – March 26: Open source organizations apply to take part in Season of Docs.
  • April 16: Google publishes the list of accepted organizations along with their project proposals, and doc development can begin.
  • June 16: Organization administrators begin to submit monthly evaluations to report on the status of their project.
  • November 30: Organization administrators submit their case study and final project evaluation.
  • December 14: Google publishes the 2021 case studies and aggregate project data.
  • May 2, 2022: Organizations begin to participate in post-program follow-up surveys.

See the timeline for details.

Organization applications

Organization applications are now open! The deadline to apply is March 26, 2021 at 18:00 UTC.

To apply, first read the guidelines for creating an organization application on the Season of Docs website.

Take a look at the examples of project ideas, then create a project proposal based on your open source project’s actual documentation needs. Your goal is to attract technical writers to your organization, making them feel comfortable about approaching the organization and excited about what they can achieve.

Organizations can submit their applications here: http://goo.gle/3qVxArQ. Organization applications close on March 26th at 18:00 UTC.

Technical writers interested in participating in the 2021 Season of Docs should read our guide for technical writers on the Season of Docs website.

If you have any questions about the program, please email us at [email protected].

Join us

Explore the Season of Docs website at g.co/seasonofdocs to learn more about participating in the program. Use our logo and other promotional resources to spread the word. Check out the timeline and FAQ, and get ready to apply!

By Kassandra Dhillon and Erin McKean, Google Open Source Programs Office

Google joins the Rust Foundation

Droidstacean: Rust mascot Ferris, with Android mascot color/aspects
Droidstacean by Ivan Lozano, based on a design by Karen Rustad Tölva.
Rust is a systems programming language that combines low-level control over performance with modern language features and a focus on memory safety. Memory safety has been an enduring challenge for software developers, particularly those working on systems programs. Google has begun using Rust in settings where memory safety and performance are key considerations, including in key Android systems.

The Rust Core Team recently completed its work to build a new home for Rust: the Rust Foundation. Building on Google’s longstanding investments in C/C++ and its compilers and toolchains, we are delighted to announce our membership in the Rust Foundation. We look forward to participating more in the Rust community, in particular working across the industry on key issues including interoperability with C++, coordinating security reviews, decreasing the costs of crate updates, and continuing to grow our investments in existing Rust projects.

Memory safety security defects frequently threaten device safety, especially for applications and operating systems. For example, on Android, we’ve found that more than half of the security vulnerabilities we addressed in 2019 resulted from memory safety bugs. And this is despite significant efforts from Google and other contributors to the Android Open Source Project to either invest in or invent a variety of technologies, including AddressSanitizer, improved memory allocators, and numerous fuzzers and other code checking tools. Rust has proven effective at providing an additional layer of protection beyond even these tools in many other settings, including browsers, games, and even key libraries. We are excited to expand both our usage of Rust at Google and our contributions to the Rust Foundation and Rust ecosystem.
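To illustrate the class of bug involved, here is a small, hypothetical C++ sketch (not taken from any of the projects above) of a use-after-free caused by a reference that outlives the storage it points into. It compiles cleanly and may even appear to work, and is typically caught only at runtime by tools such as AddressSanitizer, whereas the equivalent Rust code is rejected at compile time because the borrow checker forbids mutating the vector while the reference is still live.

#include <iostream>
#include <string>
#include <vector>

int main() {
  // One entry, long enough that the string data lives on the heap.
  std::vector<std::string> names = {"a string long enough to be heap-allocated"};
  const std::string& first = names[0];  // reference into the vector's current storage
  names.push_back("another entry");     // may reallocate and free that storage
  std::cout << first << std::endl;      // use-after-free: undefined behavior
  return 0;
}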

Today, some examples of projects where Google is either already using Rust or contributing to the Rust ecosystem include:
  • Operating system modules in Android, including Bluetooth and Keystore 2.0
  • Low-level projects, such as the crosvm virtual machine monitor (an alternative to QEMU) and drivers used in ChromeOS
  • Contributing to open source projects that we use and that use Rust, such as the Mercurial source code control system
  • Firmware for FIDO security key support
And, there are many additional projects that are evaluating the use of Rust for new libraries or products. Some examples include:
We are also excited to support key Rust projects and their maintainers, such as:
  • Adding Rust code to curl
  • Working with ISRG to add a Rust TLS module to the Apache HTTP Server Project
We can’t wait to work across the industry to contribute to and support existing projects and libraries as well as help build out key areas such as C++ interoperability and security review.

By Lars Bergstrom, Director of Engineering, Android Platform Programming Languages

Writing fuzz tests with ease using Bazel

We are announcing Bazel support for developing and testing fuzz tests, with OSS-Fuzz integration, through the new rules_fuzzing Bazel library.

Fuzzing is an effective, well-known testing technique for finding security and stability bugs in software. But writing and testing fuzz tests can be tedious. Developers typically need to:
  • Implement a fuzz driver function, which exercises the API under test;
  • Build the code with the proper instrumentation (such as Address Sanitizer);
  • Link it with one of the available fuzzing engine libraries (libFuzzer, AFL++, Honggfuzz, etc.) that provide the core test generation logic;
  • Run the fuzz test binary with the right set of flags (e.g., to specify corpora or dictionaries);
  • Package the fuzz test and its resources for consumption by fuzzing infrastructures, such as OSS-Fuzz.
Unfortunately, build systems don't traditionally offer any support beyond the core primitives of producing executables, so projects adopting fuzzing often end up reimplementing fuzz test recipes.

Bazel is a versatile and extensible build system, focused on scalable, reliable, and reproducible builds. Originally designed to scale to Google's entire monolithic repository, it now underpins large enterprises and key open source Internet infrastructure projects.

We are pleased to announce that projects using Bazel can get advanced fuzzing support through the new rules_fuzzing extension library. The new fuzzing rules take care of all the boilerplate needed to build and run fuzz tests. Developers simply write the fuzz driver code and define a build target for it (example driver and target for RE2). Fuzz tests can be built and run using a number of fuzzing engines provided out-of-the-box, such as libFuzzer and Honggfuzz, as well as sanitizers. The rule library also provides the ability to define additional fuzzing engines.

You can integrate the fuzzing library with around 10 LOC in your Bazel WORKSPACE file. Defining a fuzz test in Bazel is as easy as writing the following in your BUILD file:

load("@rules_fuzzing//fuzzing:cc_deps.bzl, "cc_fuzz_test")
cc_fuzz_test(
   name = "my_fuzz_test",
   srcs = ["my_fuzz_test.cc"],
   deps = [":my_library"],
)
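For reference, here is a minimal sketch of what my_fuzz_test.cc could contain, assuming a hypothetical MyLibrary::Parse function exposed by :my_library as the API under test. The entry point below is the standard libFuzzer-style driver signature that the fuzzing rules build and link against the selected engine:

#include <cstddef>
#include <cstdint>
#include <string>

#include "my_library.h"  // hypothetical header exported by :my_library

// The fuzzing engine calls this function repeatedly with generated inputs;
// the test passes as long as no input crashes or trips a sanitizer.
extern "C" int LLVMFuzzerTestOneInput(const uint8_t* data, size_t size) {
  const std::string input(reinterpret_cast<const char*>(data), size);
  MyLibrary::Parse(input);
  return 0;
}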


You can easily test the fuzzer locally by invoking its launcher:

$ bazel run --config=asan-libfuzzer //:my_fuzz_test_run

To improve the effectiveness of test case generation, fuzz tests also support seed corpora and dictionaries, through additional rule attributes. They will automatically be validated and included in fuzz test runs. Fuzz tests also serve as regression tests on the seed corpus. For example, you can add previously found and fixed crashes to the corpus and have them tested in your CI workflows:

$ bazel test --config=asan-replay //:my_fuzz_test

The fuzzing rules provide built-in support for OSS-Fuzz, our continuous fuzzing service for open source projects. The OSS-Fuzz support drastically simplifies writing the build scripts needed for project integration by automatically packaging the fuzz test and its dependencies using the expected OSS-Fuzz structure.

The Envoy Proxy project is one of the early adopters of the fuzzing rules library. As a large, mature C++ codebase, Envoy has maintained its own custom implementation of fuzzing support for its over 50 fuzz targets written so far. By switching to the new Bazel fuzzing rules, Envoy's fuzz targets automatically gained new features, such as local running and testing tools and support for multiple fuzzing engines. At the same time, Envoy simplified its OSS-Fuzz integration scripts. Moreover, it will automatically gain future functionality (e.g., more effective fuzzing engines, better coverage tracking, improved corpus management) as the Bazel fuzzing rules library evolves.

The Bazel rules for fuzzing draw from Google's experience providing effective fuzzing tools to our internal developers. We hope the new Bazel support for fuzzing will lower the barrier to fuzzing adoption in open source communities, further increasing the security and reliability of many projects. To learn more about integrating the fuzzing rules into your project, take a look at the Getting Started section in the documentation.

By Stefan Bucur, Software Analysis, Asra Ali, Envoy, and Abhishek Arya, OSS-Fuzz – Google

Launching OSV – Better vulnerability triage for open source



We are excited to launch OSV (Open Source Vulnerabilities), our first step towards improving vulnerability triage for developers and consumers of open source software. The goal of OSV is to provide precise data on where a vulnerability was introduced and where it got fixed, thereby helping consumers of open source software accurately identify whether they are impacted and then make security fixes as quickly as possible. We have started OSV with a data set of fuzzing vulnerabilities found by the OSS-Fuzz service. The OSV project evolved from our recent efforts to improve vulnerability management in open source (the "Know, Prevent, Fix" framework).

Vulnerability management can be painful for both consumers and maintainers of open source software, with tedious manual work involved in many cases.

For consumers of open source software, it is often difficult to map a vulnerability, such as a Common Vulnerabilities and Exposures (CVE) entry, to the package versions they are using. This comes from the fact that versioning schemes in existing vulnerability standards (such as Common Platform Enumeration (CPE)) do not map well to actual open source versioning schemes, which are typically version tags and commit hashes. The result is missed vulnerabilities that affect downstream consumers.

Similarly, it is time consuming for maintainers to determine an accurate list of affected versions or commits across all their branches for downstream consumers after a vulnerability is fixed, in addition to the process required for publication. Unfortunately, many open source projects, including ones that are critical to modern infrastructure, are under resourced and overworked. Maintainers don't always have the bandwidth to create and publish thorough, accurate information about their vulnerabilities even if they want to.

These challenges result in open source consumers not incorporating important security fixes promptly. OSV aims to:
  1. Reduce the work required by maintainers to publish vulnerabilities, and
  2. Improve the accuracy of vulnerability queries for downstream consumers by providing precise vulnerability metadata in an easy-to-query database (complementing existing vulnerability databases).

Automation

OSV aims to simplify the vulnerability reporting process for an open source package maintainer by accurately determining the list of affected versions and commits. This requires providing both the commits that introduce and fix the bugs. If that information is not available, OSV requires providing a reproduction test case and steps to generate an application build, and then it performs bisection to find these commits in an automated fashion. OSV takes care of the rest of the analysis to figure out impacted commit ranges (accounting for cherry picks) and versions/tags.
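To make the bisection step concrete, here is a small, hypothetical sketch of the core search: given commits in chronological order and a predicate that builds the project at a commit and runs the reproduction test case, a binary search locates the first commit at which the crash no longer reproduces (the likely fix commit). The real OSV infrastructure additionally handles builds, branches, and cherry-picks; this only shows the search loop, and the function and parameter names are placeholders.

#include <functional>
#include <string>
#include <vector>

// Returns the first commit (chronologically) at which the reproduction test
// case no longer crashes, assuming commits.front() reproduces the crash and
// commits.back() does not. 'reproduces' stands in for "build the project at
// this commit and run the test case".
std::string FindFixCommit(
    const std::vector<std::string>& commits,
    const std::function<bool(const std::string&)>& reproduces) {
  size_t lo = 0;                   // known to reproduce the crash
  size_t hi = commits.size() - 1;  // known to be fixed
  while (lo + 1 < hi) {
    const size_t mid = lo + (hi - lo) / 2;
    if (reproduces(commits[mid])) {
      lo = mid;  // still vulnerable at mid; the fix is later
    } else {
      hi = mid;  // already fixed at mid; the fix is at or before mid
    }
  }
  return commits[hi];
}

The same idea, run over an earlier commit range, can locate the commit that introduced the bug.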

How OSV works


OSV automates the triage workflow for an open source package consumer by providing an API to query for vulnerabilities. A typical OSV workflow for a package consumer looks like this:
  1. A package consumer sends a query to OSV with a package version or commit hash as input (see the sketch after this list for a programmatic version of the same query).
    curl -X POST -d \
    '{"commit": "6879efc2c1596d11a6a6ad296f80063b558d5e0f"}' \
    'https://api.osv.dev/v1/query?key=$API_KEY'

    curl -X POST -d \
    '{"version": "1.0.0", "package": {"name": "pkg", "ecosystem": "pypi"}}' \
    'https://api.osv.dev/v1/query?key=$API_KEY'
  2. OSV looks up the set of vulnerabilities affecting that particular version and returns the list of vulnerabilities impacting the package. The vulnerability metadata is returned in a machine-readable JSON format.
  3. The package consumer uses this information to either cherry-pick security fixes (based on precise fix metadata) or update to a later version.
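To make the workflow concrete, here is a minimal, hypothetical sketch of step 1 as it might appear inside a consumer's own tooling, issuing the same commit query with libcurl. The commit hash is the placeholder from the example above, API key handling is omitted, and the response is the machine-readable JSON vulnerability list described in step 2.

#include <curl/curl.h>

#include <iostream>
#include <string>

// Appends each chunk of the HTTP response body to a std::string.
static size_t AppendToString(char* data, size_t size, size_t nmemb, void* out) {
  static_cast<std::string*>(out)->append(data, size * nmemb);
  return size * nmemb;
}

int main() {
  curl_global_init(CURL_GLOBAL_DEFAULT);
  CURL* curl = curl_easy_init();
  if (!curl) return 1;

  // Same query body as the first curl example above.
  const std::string body =
      R"({"commit": "6879efc2c1596d11a6a6ad296f80063b558d5e0f"})";
  std::string response;

  struct curl_slist* headers = nullptr;
  headers = curl_slist_append(headers, "Content-Type: application/json");

  curl_easy_setopt(curl, CURLOPT_URL, "https://api.osv.dev/v1/query");
  curl_easy_setopt(curl, CURLOPT_HTTPHEADER, headers);
  curl_easy_setopt(curl, CURLOPT_POSTFIELDS, body.c_str());
  curl_easy_setopt(curl, CURLOPT_WRITEFUNCTION, AppendToString);
  curl_easy_setopt(curl, CURLOPT_WRITEDATA, &response);

  const CURLcode result = curl_easy_perform(curl);
  if (result == CURLE_OK) {
    // 'response' now holds the JSON list of vulnerabilities affecting the
    // queried commit, ready to feed into patching or upgrade tooling.
    std::cout << response << std::endl;
  }

  curl_slist_free_all(headers);
  curl_easy_cleanup(curl);
  curl_global_cleanup();
  return result == CURLE_OK ? 0 : 1;
}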

Ongoing work

OSV currently provides access to thousands of vulnerabilities from 380+ critical OSS projects integrated with OSS-Fuzz. We are planning to work with open source communities to extend OSV with data from various language ecosystems (e.g., NPM, PyPI) and to work out a pipeline for package maintainers to submit vulnerabilities with minimal effort.

Our goal with OSV is to rethink and promote better, scalable vulnerability tracking for open source. In an ideal world, vulnerability management should be done closer to the actual open source development process, aided by automated infrastructure. Projects that depend on open source should be promptly notified when a vulnerability is reported, and fixes should be taken up quickly.

You can access the OSV website and documentation at https://osv.dev. You can explore the open source repo or contribute to the project on GitHub, and join the mailing list to stay up to date with OSV and share your thoughts on vulnerability tracking. 

By Oliver Chang and Kim Lewandowski, Google Security Team