
Summer updates from Coral

Posted by the Coral Team

Summer has arrived along with a number of Coral updates. We're happy to announce a new partnership with balena that helps customers build, manage, and deploy IoT applications at scale on Coral devices. In addition, we've released a series of updates to expand platform compatibility, make development easier, and improve the ML capabilities of our devices.

Open-source Edge TPU runtime now available on GitHub

First up, our Edge TPU runtime is now open-source and available on GitHub, including scripts and instructions for building the library for Linux and Windows. Customers running a platform that is not officially supported by Coral, including ARMv7 and RISC-V, can now compile the Edge TPU runtime themselves and start experimenting. An open-source runtime is easier to integrate into your customized build pipeline, enabling support for creating Yocto-based images as well as other distributions.

Windows drivers now available for the Mini PCIe and M.2 accelerators

Coral customers can now also use the Mini PCIe and M.2 accelerators on the Microsoft Windows platform. New Windows drivers for these products complement the previously released Windows drivers for the USB accelerator and make it possible to start prototyping with the Coral USB Accelerator on Windows and then to move into production with our Mini PCIe and M.2 products.

Fresh bits on the Coral ML software stack

We’ve also made a number of new updates to our ML tools:

  • The Edge TPU compiler is now version 14.1. It can be updated by running sudo apt-get update && sudo apt-get install edgetpu, or by following the instructions here.
  • Our new Model Pipelining API allows you to divide your model across multiple Edge TPUs. The C++ version is currently in beta and the source is on GitHub.
  • New embedding extractor models for EfficientNet, for use with on-device backpropagation. Embedding extractor models are compiled with the last fully-connected layer removed, allowing you to retrain for classification. Previously, only Inception and MobileNet were available; now retraining can also be done on EfficientNet (a rough sketch of the workflow follows this list).
  • New Colab notebooks to retrain a classification model with TensorFlow 2.0 and build C++ examples.
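
To give a feel for the embedding-extractor workflow, here is a minimal Python sketch that pairs the generic TensorFlow Lite runtime with an imprinting-style classification head built in NumPy. The model filename and the libedgetpu delegate name are assumptions for illustration; the Coral retraining APIs and the Colab notebooks mentioned above cover the full, supported workflow.

```python
import numpy as np
from tflite_runtime.interpreter import Interpreter, load_delegate

# Hypothetical embedding-extractor model compiled for the Edge TPU.
interpreter = Interpreter(
    model_path="efficientnet_embedding_extractor_edgetpu.tflite",
    experimental_delegates=[load_delegate("libedgetpu.so.1")])  # Linux delegate name
interpreter.allocate_tensors()
inp = interpreter.get_input_details()[0]
out = interpreter.get_output_details()[0]

def embed(image):
    """Return the embedding vector for one uint8 HxWxC image."""
    interpreter.set_tensor(inp["index"], image[np.newaxis, ...])
    interpreter.invoke()
    return interpreter.get_tensor(out["index"])[0].astype(np.float32)

def build_centroids(labeled_images):
    """'Retrain' by averaging embeddings per class (an imprinting-style head)."""
    return {label: np.mean([embed(img) for img in images], axis=0)
            for label, images in labeled_images.items()}

def classify(image, centroids):
    """Pick the class whose centroid is most similar to the image embedding."""
    v = embed(image)
    sims = {label: float(np.dot(v, c) / (np.linalg.norm(v) * np.linalg.norm(c) + 1e-9))
            for label, c in centroids.items()}
    return max(sims, key=sims.get)
```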

Balena partners with Coral to enable AI at the edge

We are excited to share that the Balena fleet management platform now supports Coral products!

Companies running a fleet of ML-enabled devices on the edge need to keep their systems up-to-date with the latest security patches in order to protect data, model IP and hardware from being compromised. Additionally, ML applications benefit from being consistently retrained to recognize new use cases with maximum accuracy. Together, Coral and balena bring simplicity and ease to the provisioning, deployment, updating, and monitoring of your ML project at the edge, moving early prototyping seamlessly toward production environments with many thousands of devices.

Read more about all the benefits of Coral devices combined with balena container technology or get started deploying container images to your Coral fleet with this demo project.

New version of Mendel Linux

Mendel Linux (5.0 release Eagle) is now available for the Coral Dev Board and SoM and includes a more stable package repository that provides a smoother updating experience. It also brings compatibility improvements and a new version of the GPU driver.

New models

Last but not least, we’ve recently released BodyPix, a Google person-segmentation model that was previously only available for TensorFlow.js, as a Coral model. This enables real-time, privacy-preserving understanding of where people (and body parts) are on a camera frame. We first demoed this at CES 2020 and it was one of our most popular demos. Using BodyPix, we can remove people from the frame, display only their outlines, and aggregate over time to see heat maps of population flow.

Here are two possible applications of BodyPix: body-part segmentation and anonymous population flow, both running on the Dev Board.
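
As a rough illustration of the kind of post-processing involved, here is a short NumPy sketch that removes people from a frame and accumulates a presence heat map, assuming the model has already produced a per-pixel person mask. The mask format and threshold are assumptions, not BodyPix's actual output layout.

```python
import numpy as np

def remove_people(frame, person_mask, background, threshold=0.5):
    """Replace pixels the mask marks as 'person' with a static background image."""
    person = (person_mask > threshold)[..., np.newaxis]   # HxWx1 boolean mask
    return np.where(person, background, frame)

def update_heatmap(heatmap, person_mask, threshold=0.5):
    """Accumulate person presence over time to visualize population flow."""
    return heatmap + (person_mask > threshold).astype(np.float32)
```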

We’re excited to add BodyPix to the portfolio of projects the community is using to extend our models far beyond our demos, including tackling today’s biggest challenges. For example, Neuralet has taken our MobileNet V2 SSD Detection model and used it to implement Smart Social Distancing. Using the bounding boxes from person detection, they can compute a region for safe distancing and let a user know if social distance isn’t being maintained. The best part is that this is done without any sort of facial recognition or tracking; with Coral we can accomplish this in real time in a privacy-preserving manner.
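
For a sense of how simple the distance check itself can be, here is a hedged sketch that flags detection pairs whose centroids fall closer than a pixel threshold. The box format and threshold are illustrative assumptions; a production system like Neuralet's also calibrates pixel distances to real-world distances.

```python
import itertools
import math

def distance_violations(boxes, min_distance_px=150):
    """Return index pairs of person detections that are closer than the threshold.

    Each box is assumed to be (xmin, ymin, xmax, ymax) in pixels.
    """
    centroids = [((x0 + x1) / 2.0, (y0 + y1) / 2.0) for x0, y0, x1, y1 in boxes]
    pairs = []
    for (i, a), (j, b) in itertools.combinations(enumerate(centroids), 2):
        if math.hypot(a[0] - b[0], a[1] - b[1]) < min_distance_px:
            pairs.append((i, j))
    return pairs
```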

We can’t wait to see more projects that the community will make with BodyPix. Beyond anonymous population flow, there are endless possibilities with background and body-part manipulation. Let us know what you come up with at our community channels, including GitHub and StackOverflow.

________________________

We are excited to share all that Coral has to offer as we continue to evolve our platform. For a list of worldwide distributors, system integrators and partners, including balena, visit the Coral partnerships page. Please visit Coral.ai to discover more about our edge ML platform and share your feedback at [email protected].

Bringing internet access to millions more Indians with Jio



Today we signed an agreement to invest $4.5 billion (INR 33,737 crore) in Jio Platforms Ltd, taking a 7.73 percent stake in the company, pending regulatory review in India. This is the first investment from the Google For India Digitization Fund announced earlier this week, which aims to accelerate India’s digital economy over the next five to seven years through a mix of equity investments, partnerships, and operational, infrastructure and ecosystem investments. 


Google and Jio Platforms have entered into a commercial agreement to jointly develop an entry-level affordable smartphone with optimizations to the Android operating system and the Play Store. Together we are excited to rethink, from the ground up, how millions of users in India can become owners of smartphones. This effort will unlock new opportunities, further power the vibrant ecosystem of applications and push innovation to drive growth for the new Indian economy.


This partnership comes at an exciting but critical stage in India’s digitization. It’s been amazing to see the changes in technology and network plans that have enabled more than half a billion Indians to get online. At the same time, the majority of people in India still don’t have access to the internet, and fewer still own a smartphone—so there’s much more work ahead. 


Our mission with Android has always been to bring the power of computing to everyone, and we’ve been humbled by the way Indians have embraced Android over recent years. We think the time is right to increase our commitment to India significantly, in collaboration with local companies, and this partnership with Jio is the first step. We want to work with Jio and other leaders in the local ecosystem to ensure that smartphones—together with the apps and services in the Play Store—are within reach for many more Indians across the country. And we believe the pace of Indian innovation means that the experiences we create for India can ultimately be expanded to the rest of the world.  


For Google, our work in India goes to the heart of our efforts to organize the world’s information and make it universally accessible. We opened our first Indian campuses in Bangalore and Hyderabad in 2004. Since then, we’ve made India central to our Next Billion Users initiative—designed to ensure the internet is useful for people coming online for the first time. We’ve improved our apps and services so they’re relevant in more Indian languages and created offline versions for those facing network constraints. We’ve extended our tools to small businesses, sought to close digital divides with initiatives like Internet Saathi, and we’re increasingly focused on helping India harness AI. More and more, apps we create for India—like Google Pay or our Read Along language-learning app—influence what we do globally. 


Jio, for its part, has made an extraordinary contribution to India’s technological progress over the past decade. Its investments to expand telecommunications infrastructure, low-cost phones and affordable internet have changed the way its hundreds of millions of subscribers find news and information, communicate with one another, use services and run businesses. Today, Jio is increasing its focus on the development of areas like digital services, education, healthcare and entertainment that can support economic growth and social inclusion at a critical time in the country’s history. 


In partnership, we can draw on each other’s strengths. We look forward to bringing smartphone access to more Indians—and exploring the many ways we can work together to improve Indians’ lives and advance India’s digital economy.

Posted by Sanjay Gupta, Country Head & VP, India, and Sameer Samat, VP, Product Management

Supporting local communities for Pride 2020

In August 1966, trans women, drag queens, and other members of the LGBTQ+ community fought for their rights and fair treatment outside Compton’s Cafeteria in San Francisco's Tenderloin neighborhood. Three years later on June 28, 1969, the LGBTQ+ community, once again, rose up against inequitable treatment and police misconduct at the Stonewall Inn. For both of these historic moments, LGBTQ+ people of colour—and in particular Black trans women and trans women of colour—helped lead the fight against hate and injustice. In many respects, the modern day LGBTQ+ movement for equality was born from these rebellious acts and the many events preceding them.

Pride should still be a protest. For those within the BIPOC and LGBTQ+ community—especially Black+ trans women—the injustices we're seeing today are a reminder of past and present struggles for equity, justice, and equality under the law. We believe communities must show up for one another, and we stand in solidarity with the Black+ community across the world, honoring the longstanding Pride tradition of unity.

This year for Pride, we’re focusing on helping local organizations in our community that are creating change for LGBTQ+ people of colour, trans and non-binary communities, LGBTQ+ families, and many more.

Supporting local organizations

Local LGBTQ+ organizations are providing critical services for those in need, whether they're helping someone find a bed in a shelter, offering skills and training services, or advocating for more inclusive and equitable policies. Lives depend on these organizations.

One of these organizations is Pride Toronto. Founded in 1981, Pride Toronto has a legacy of purposeful activism for equality and sharing the diverse stories and perspectives of the LGBTQ+ community. This year, we’re proud to be a Gold sponsor for Pride Toronto and support their work to bring the annual pride celebrations online. The diverse programming has everything from trivia and workouts with Olympians to club nights and an online Pride parade on June 28.

Digital skills training for local businesses

To support LGBTQ+ small businesses and professionals, we’re partnering with Venture Out and Tech Proud on Digital Skills Pride Week from June 22 to 26. In collaboration with other tech companies, we’ll host sessions for small businesses around maintaining productivity, best practices for remote working, creating a website, and building an online presence. Businesses can learn more and register here.

Together, virtually

This year, Pride will feel different for many of us. We’re finding ways to bring people together virtually, including a toolkit that helps organizations host remote Pride events.

We’ve also launched a collection of videos on our YouTube Canada channel to elevate LGBTQ+ voices and share historic Canadian civil rights moments. Dive into interviews from The Queer Network, and listen to personal stories from a collection of Canadian creators, including Julie Vu, GigiGorgeous and AsapSCIENCE.


While Pride is usually marked by jubilant marches and beautiful parade floats, it’s much more than that. For us, Pride is about the ongoing struggle for equity, visibility, and acceptance. We’ll be spending Pride as allies to our Black+ community members, reflecting on the many LGBTQ+ people of colour who started our liberation movement decades ago, and finding ways to remedy systemic injustices.

Supporting educators as they teach from home

Editor’s note: This guest post is authored by Michelle Armstrong, Director of EdTechTeam Canada.

“Post your questions in the Chat. We’re here to help.” This is a phrase we’ve gotten used to saying several times a day as our team supports teachers across the country through virtual learning.

A few months ago, schools, universities and colleges across Canada closed down because of concerns over the transmission of COVID-19. The entire Canadian education system had to quickly address the logistical challenges brought about by not being together in a physical classroom. Our facilitators at EdTechTeam Canada geared up immediately, and worked with Google to help parents, teachers and students make the most out of the digital resources available to them.

Since then, we’ve had the opportunity to connect with teachers from all across the country through our live virtual training sessions. With significant funding from our friends at Google Canada, these interactive workshops cover the basics of fundamental learning tools, including Google Classroom, Google Meet, Docs and student engagement. By offering up to ten sessions a day along with personal follow-ups, we’ve now hosted hundreds of live workshops and reached tens of thousands of educators across the country, representing over 350 Divisions, Districts and School Boards. We’ve always prided ourselves on delivering engaging, interactive Professional Development. Thankfully, we are still able to achieve that; we just happen to be joining face-to-face from our own living rooms.

Through these interactive discussions with educators, we've seen incredible resilience and tenacity, and a desire to accommodate the needs of learners during this unprecedented time. Here are a few of our key takeaways for educators to consider when it comes to virtual learning.

Learn now, thrive long-term

While some may be scrambling now, being thrust into virtual teaching has created an opportunity for educators to learn new digital skills that will help both inside and outside the classroom. We’ve seen teachers sign up for introductory sessions, as well as the more advanced sessions like using Jamboard, creating quizzes and more. Teachers can check the EdTechTeam Canada website to sign up for live workshops, or review recorded sessions.

Explore new ways to engage students

When you aren’t physically in the classroom with students, it can be challenging to measure student engagement. Simple tricks like using comments within Google Docs and Classroom are a great way to have a two-way dialogue with students as you share feedback. Using Google Sites can also help keep students and parents up-to-date with important reminders. You can take a look at more resources, tools and tips here.

Supporting learners is more important than ever

With parents now wearing multiple hats from parent to teacher and everything in between, we understand how important it is to support families that are learning at home. Share resources with parents and students who need a bit of practice with digital learning. Some helpful workshops include Get Started with Google Meet, and Google Classroom for Parents.

Our team has been inspired by the remarkable work of Canadian educators and school leaders who continue to adapt and innovate their processes through remote learning. To our educators: we thank you.


For any educators looking to join one of our live workshops, sign up here. We’ll continue to offer these workshops for the next few weeks, as the 2020 school year comes to an end. For more ideas to support educators during this time, try Teach from Home, a central hub of Google for Education tips and tools to help educators keep teaching, even when they aren’t in the classroom.

Building a more resilient world together

Posted by Billy Rutledge, Director of the Coral team


Recently, we’ve seen communities respond to the challenges of the coronavirus pandemic by using technology in new ways to effect positive change. It’s increasingly important that our systems are able to adapt to new contexts, handle disruptions, and remain efficient.

At Coral, we believe intelligence at the edge is a key ingredient towards building a more resilient future. By making the latest machine learning tools easy-to-use and accessible, innovators can collaborate to create solutions that are most needed in their communities. Developers are already using Coral to build solutions that can understand and react in real-time, while maintaining privacy for everyone present.

Helping our communities stay safe, together

As mandatory isolation measures begin to relax, compliance with safe social distancing protocol has become a topic of primary concern for experts across the globe. Businesses and individuals have been stepping up to find ways to use technology to help reduce the risk and spread. Many efforts are employing the benefits of edge AI—here are a few early stage examples that have inspired us.


In Belgium, engineers at Edgise recently used Coral to develop an occupancy monitor to aid businesses in managing capacity. With the privacy preserving properties of edge AI, businesses can anonymously count how many customers enter and exit a space, signaling when the area is too full.

A research group at the Sathyabama Institute of Science and Technology in India is using Coral to develop a wearable device to serve as a COVID-19 cough counter and health monitor, allowing medical professionals to better care for low-risk patients in an outpatient capacity. Coral's Edge TPU enables biometric data to be processed efficiently, without draining the limited power resources available in wearable devices.

All across the US, hospitals are seeking solutions to ensure adherence to hygiene policy amongst hospital staff. In one example, a device incorporates the compact, affordable and offline benefits of the Coral modules to aid in handwashing practices at numerous stations throughout a facility.

And around the world, members of the PyImageSearch community are exploring how to train a "COVID-19: Face Mask Detector" model using TensorFlow that can be used to identify whether people are wearing a mask. Open source frameworks can empower anyone to develop solutions, and with Coral components we can help bring those benefits to everyone.
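
As a rough sketch of the transfer-learning approach such projects typically take (the dataset layout, image size, and training settings below are assumptions for illustration, not the PyImageSearch recipe), a small Keras head can be trained on top of a frozen MobileNetV2 base:

```python
import tensorflow as tf

# Assumes images arranged as data/with_mask/*.jpg and data/without_mask/*.jpg (hypothetical layout).
train_ds = tf.keras.preprocessing.image_dataset_from_directory(
    "data", image_size=(224, 224), batch_size=32)
train_ds = train_ds.map(
    lambda x, y: (tf.keras.applications.mobilenet_v2.preprocess_input(x), y))

base = tf.keras.applications.MobileNetV2(
    input_shape=(224, 224, 3), include_top=False, weights="imagenet")
base.trainable = False  # train only the new classification head

model = tf.keras.Sequential([
    base,
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dropout(0.2),
    tf.keras.layers.Dense(1, activation="sigmoid"),  # mask vs. no mask
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(train_ds, epochs=5)
```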

Eliciting a global response

In an effort to rally greater community involvement, Coral has joined the United Nations Development Programme and Hackster.io as a sponsor of the COVID-19 Detect and Protect Challenge. The initiative calls on developers to build affordable and reproducible solutions that support response efforts in developing countries. All ideas are welcome, whether they use ML or not, and we encourage you to participate.

To make edge ML capabilities even easier to integrate, we’re also announcing a price reduction for the Coral products widely used for experimentation and prototyping. Our Dev Board will now be offered at $129.99, the USB Accelerator at $59.99, the Camera Module at $19.99, and the Enviro Board at $14.99. Additionally, we are introducing the USB Accelerator into 10 new markets: Ghana, Thailand, Singapore, Oman, Philippines, Indonesia, Kenya, Malaysia, Israel, and Vietnam. For more details, visit Coral.ai/products.

We’re excited to see the solutions developers will bring forward with Coral. And as always, please keep sending us feedback at [email protected].

Code Search with Cross References for the Android Open Source Project


Posted by Jeff Bailey, AOSP Engineering Manager; Ally Sillins, AOSP Program Manager; Kris Hildrum, Open Source Code Search Tech Lead; Jay Sachs, Kythe Tech Lead/Manager
Searching for "it's all about the code" open source on Google returns more than a million hits. Today we’re introducing a public code search tool for the Android Open Source Project (AOSP).
Link: https://cs.android.com
The Android repository is made up of a collection of git repositories which are managed together using our ‘repo’ tool. Because of this, most tools (such as GitHub, gitweb, etc.) can’t see the source code the way that it’s laid out when it’s checked out on the system. In partnership with our colleagues who run Google’s internal Code Search and Kythe, we’re pleased to present a code search tool that presents a view of all of the Android source code as you actually use it.
Here are some features you can take advantage of starting today:
  • View the source code
  • Navigate cross-references across the entire code base that allow you to click through one part of the source code to another
  • Switch between Android’s open source branches (not all branches will have cross-reference information)
  • View tool documentation on https://source.android.com/setup/contribute/code-search
This is the beginning of our journey, and while today not all parts of the Android code base are cross-referenced, you can expect to see this grow over time.
We hope this makes it easier to engage with the Android code base!

Updates from Coral: Mendel Linux 4.0 and much more!

Posted by Carlos Mendonça (Product Manager), Coral Team

Last month, we announced that Coral graduated out of beta, into a wider, global release. Today, we're announcing the next version of Mendel Linux (4.0 release Day) for the Coral Dev Board and SoM, as well as a number of other exciting updates.

We have made significant updates to improve performance and stability. Mendel Linux 4.0 release Day is based on Debian 10 Buster and includes upgraded GStreamer pipelines and support for Python 3.7, OpenCV, and OpenCL. The Linux kernel has also been updated to version 4.14 and U-Boot to version 2017.03.3.

We’ve also made it possible to use the Dev Board's GPU to convert YUV to RGB pixel data at up to 130 frames per second at 1080p resolution, which is one to two orders of magnitude faster than on Mendel Linux 3.0 release Chef. These changes make it possible to run inferences with YUV-producing sources such as cameras and hardware video decoders.

To upgrade your Dev Board or SoM, follow our guide to flash a new system image.

MediaPipe on Coral

MediaPipe is an open-source, cross-platform framework for building multi-modal machine learning perception pipelines that can process streaming data like video and audio. For example, you can use MediaPipe to run on-device machine learning models and process video from a camera to detect, track and visualize hand landmarks in real-time.
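
MediaPipe graphs are normally authored as protobuf text configs and driven from C++; purely to give a flavor of the hand-landmark use case described above, here is a hedged sketch using MediaPipe's desktop Python solutions API (its availability in your environment is an assumption, and it is separate from the Coral samples discussed below).

```python
import cv2
import mediapipe as mp

hands = mp.solutions.hands.Hands(max_num_hands=2)
drawing = mp.solutions.drawing_utils

cap = cv2.VideoCapture(0)  # default camera; device index is an assumption
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    # MediaPipe expects RGB input; OpenCV captures BGR frames.
    results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    if results.multi_hand_landmarks:
        for hand in results.multi_hand_landmarks:
            drawing.draw_landmarks(frame, hand, mp.solutions.hands.HAND_CONNECTIONS)
    cv2.imshow("hands", frame)
    if cv2.waitKey(1) & 0xFF == 27:  # Esc to quit
        break
cap.release()
```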

Developers and researchers can prototype their real-time perception use cases starting with the creation of the MediaPipe graph on desktop. Then they can quickly convert and deploy that same graph to the Coral Dev Board, where the quantized TensorFlow Lite model will be accelerated by the Edge TPU.

As part of this first release, MediaPipe is making available new experimental samples for both object and face detection, with support for the Coral Dev Board and SoM. The source code and instructions for compiling and running each sample are available on GitHub and on the MediaPipe documentation site.

New Teachable Sorter project tutorial


A new Teachable Sorter tutorial is now available. The Teachable Sorter is a physical sorting machine that combines the Coral USB Accelerator's ability to perform very low latency inference with an ML model that can be trained to rapidly recognize and sort different objects as they fall through the air. It leverages Google’s new Teachable Machine 2.0, a web application that makes it easy for anyone to quickly train a model in a fun, hands-on way.

The tutorial walks through how to build the free-fall sorter, which separates marshmallows from cereal and can be trained using Teachable Machine.

Coral is now on TensorFlow Hub

Earlier this month, the TensorFlow team announced a new version of TensorFlow Hub, a central repository of pre-trained models. With this update, the interface has been improved with a fresh landing page and search experience. Pre-trained Coral models compiled for the Edge TPU continue to be available on our Coral site, but a select few are also now available from TensorFlow Hub. On the TensorFlow Hub site, you can find models featuring an Overlay interface, allowing you to test the model's performance against a custom set of images right from the browser. Check out the experience for MobileNet v1 and MobileNet v2.
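
For reference, loading a hub-hosted MobileNet v2 into a Keras model takes only a few lines; the model handle below is indicative, so check tfhub.dev for the exact, current URL before using it.

```python
import tensorflow as tf
import tensorflow_hub as hub

# Indicative handle; verify the exact model URL and version on tfhub.dev.
HANDLE = "https://tfhub.dev/google/imagenet/mobilenet_v2_100_224/classification/4"

model = tf.keras.Sequential([hub.KerasLayer(HANDLE, input_shape=(224, 224, 3))])
logits = model.predict(tf.zeros([1, 224, 224, 3]))  # dummy image just to exercise the call
print(logits.shape)                                  # (1, number_of_classes)
```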

We are excited to share all that Coral has to offer as we continue to evolve our platform. For a list of worldwide distributors, system integrators and partners, visit the new Coral partnerships page. We hope you’ll use the new features offered on Coral.ai as a resource and encourage you to keep sending us feedback at [email protected].

Why Diversity is Important in Open Source: Google’s Sponsorship of OSSEU

The Open Source Summit + Embedded Linux Conference (OSSEU), which the Google Open Source Programs Office is sponsoring, is taking place in Lyon, France. The Linux Foundation supports shared technology through open source, while the conference provides a space for developers and technologists in open source to meet, network, and share knowledge with one another in order to advance the community. Why is this of utmost importance to Google's open source efforts? Google has been rooted in the open source community for many years, supporting programs, projects, and organizations to help advance open source software and technology; we understand the necessity of sustaining open source and the developer community in order to advance technology as a whole.

Sponsoring OSSEU is about more than just providing funds; it's about pushing the diversity initiative in open source forward. We need diversity across all levels in open source, whether it's contributors, maintainers, doc writers, or anyone else supporting the project. As the Open Source Initiative said recently, "Many perspectives makes better software." Having previously funded diversity initiatives such as scholarships or lunches at OSS conferences, Google continues to support this cause by sponsoring the diversity lunch at OSSEU.
In particular, sessions and events that Google will be hosting while at OSSEU include a keynote on Documentation by Megan Byrd-Sanicki and the Women in Open Source Lunch, both on Tuesday, October 29, 2019. The keynote on docs highlights the importance of doc stars and why their contributions are essential to the growth of the open source community. Our support of the Women in Open Source Lunch is especially important as we look to increase the diversity of the open source community by supporting women and non-binary people in getting more involved and giving them the opportunity to connect with each other at an event of this scale.

If you’re attending OSSEU, stop by the keynote, and we hope to see you at the lunch as well. If you aren’t attending this year, and are interested in getting more involved in the open source community, the summits hosted by the Linux Foundation are one of the best ways to learn more about OSS and meet passionate people involved in different OSS projects and organizations.

By Radha Jhatakia, Google OSPO

Coral moves out of beta

Posted by Vikram Tank (Product Manager), Coral Team


Last March, we launched Coral beta from Google Research. Coral helps engineers and researchers bring new models out of the data center and onto devices, running TensorFlow models efficiently at the edge. Coral is also at the core of new applications of local AI in industries ranging from agriculture to healthcare to manufacturing. We've received a lot of feedback over the past six months and used it to improve our platform. Today we’re thrilled to graduate Coral out of beta, into a wider, global release.

Coral is already delivering impact across industries, and several of our partners are including Coral in products that require fast ML inferencing at the edge.

In healthcare, Care.ai is using Coral to build a device that enables hospitals and care centers to respond quickly to falls, prevent bed sores, improve patient care, and reduce costs. Virgo SVS is also using Coral as the basis of a polyp detection system that helps doctors improve the accuracy of endoscopies.

In a very different use case, Olea Edge employs Coral to help municipal water utilities accurately measure the amount of water used by their commercial customers. Their Meter Health Analytics solution uses local AI to reduce waste and predict equipment failure in industrial water meters.

Nexcom is using Coral to build gateways with local AI and provide a platform for next-gen, AI-enabled IoT applications. By moving AI processing to the gateway, existing sensor networks can stay in service without the need to add AI processing to each node.

From prototype to production


Coral’s Dev Board is designed as an integrated prototyping solution for new product development. Under the heatsink is the detachable Coral SoM, which combines Google’s Edge TPU with the NXP IMX8M SoC, Wi-Fi and Bluetooth connectivity, memory, and storage. We’re happy to announce that you can now purchase the Coral SoM standalone. We’ve also created a baseboard developer guide to help integrate it into your own production design.

Our Coral USB Accelerator allows users with existing system designs to add local AI inferencing via USB 2/3. For production workloads, we now offer three new Accelerators that feature the Edge TPU and connect via PCIe interfaces: Mini PCIe, M.2 A+E key, and M.2 B+M key. You can easily integrate these Accelerators into new products or upgrade existing devices that have an available PCIe slot.

The new Coral products are available globally and for sale at Mouser; for large volume sales, contact our sales team. Through the end of 2019, we'll continue to expand our distribution of the Coral Dev Board and SoM into new markets, including Taiwan, Australia, New Zealand, India, Thailand, Singapore, Oman, Ghana and the Philippines.

Better resources

We’ve also revamped the Coral site with better organization for our docs and tools, a set of success stories, and industry-focused pages. All of it can be found at a new, easier-to-remember URL: Coral.ai.

To help you get the most out of the hardware, we’re also publishing a new set of examples. The included models and code can provide solutions to the most common on-device ML problems, such as image classification, object detection, pose estimation, and keyword spotting.

For those looking for a more in-depth application—and a way to solve the eternal problem of squirrels plundering your bird feeder—the Smart Bird Feeder project shows you how to perform classification with a custom dataset on the Coral Dev Board.

Finally, we’ll soon release a new version of the Mendel OS that updates the system to Debian Buster, and we're hard at work on more improvements to the Edge TPU compiler and runtime that will improve the model development workflow.

The official launch of Coral is, of course, just the beginning, and we’ll continue to evolve the platform. Please keep sending us feedback at [email protected].

Bazel Reaches 1.0 Milestone!

We're excited to announce the first General Availability release of Bazel, an open source build system designed to support a wide variety of programming languages and platforms.

Bazel was born of Google's own needs for highly scalable builds. When we open sourced Bazel back in 2015, we hoped that Bazel could fulfill similar needs in the software development industry. A growing list of Bazel users attests to the widespread demand for scalable, reproducible, and multi-lingual builds. Bazel helps Google be more open too: several large Google open source projects, such as Angular and TensorFlow, use Bazel. Users have reported 3x test time reductions and 10x faster build speeds after switching to Bazel.
With the 1.0 release we’re continuing to implement Bazel's vision:
  • Bazel builds are fast and correct. Every build and test run is incremental, on your developers’ machines and on your CI test system.
  • Bazel supports multi-language, multi-platform builds and tests. You can run a single command to build and test your entire source tree, no matter which combination of languages and platforms you target.
  • Bazel provides a uniform extension language, Starlark, to define builds for any language or platform.
  • Bazel works across all major development platforms (Linux, macOS, and Windows).
  • Bazel allows your builds to scale—it connects to distributed remote execution and caching services.
The key features of the 1.0 GA release are:
  • Semantic Versioning: Starting with Bazel 1.0, we will use semantic versioning for all Bazel releases. For example, all 1.x releases will be backwards-compatible with Bazel 1.0. We will have a window of at least three months between major (breaking) releases. We'll continue to publish minor releases of Bazel every month, cutting from GitHub HEAD.
  • Long-Term Support: Long-Term Support (LTS) releases give users confidence that the Bazel team has the capacity and the process to quickly and safely deliver fixes for critical bugs, including vulnerabilities.
  • Well-rounded features for Angular, Android, Java, and C++: The new features include end-to-end support for remote execution and caching, and support for standard package managers and third-party dependencies.
New to Bazel? Try the tutorial for your favorite language to get started.
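
To make the multi-language, single-command idea concrete, here is a minimal, hypothetical BUILD file written in Starlark (the target and file names are made up, and a real workspace also needs a WORKSPACE file at its root):

```python
# BUILD -- Starlark build definitions for a tiny, hypothetical Python package.

py_library(
    name = "greeting",
    srcs = ["greeting.py"],
)

py_binary(
    name = "hello",
    srcs = ["hello.py"],
    deps = [":greeting"],
)

py_test(
    name = "greeting_test",
    srcs = ["greeting_test.py"],
    deps = [":greeting"],
)

# One command builds and tests everything in the tree:
#   bazel test //...
```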

With the 1.0 release we still have many exciting developments ahead of us. Follow our blog or Twitter account for regular updates. Feel free to contact us with questions or feedback on the mailing list, submit feature requests (and report bugs) in our GitHub issue tracker, and join our Slack channel. Finally, join us at the largest-ever BazelCon conference in December 2019 for an opportunity to meet other Bazel users and the Bazel team at Google, see demos and tech talks, and learn more about fast, correct, and large-scale builds.

Last but not least, we wouldn't have gotten here without the continued trust, support, encouragement, and feedback from the community of Bazel users and contributors. Heartfelt thanks to all of you from the Bazel team!

By Dmitry Lomov, Bazel Team