Tag Archives: Google I/O 2023

Media transcoding and editing, transform and roll out!

Posted by Andrew Lewis - Software Engineer, Android Media Solutions

The creation of user-generated content is on the rise, and users are looking for more ways to personalize and add uniqueness to their creations. These creations are then shared to a vast network of devices, each with its own capabilities. The Jetpack Media3 1.0 release includes new functionality in the Transformer module for converting media files between formats, or transcoding, and applying editing operations. For example, you can trim a clip from a longer piece of media and apply effects to the video track to share over social media, or transcode media into a more efficient codec for upload to a server.

The overall goal of Transformer is to provide an easy-to-use, reliable, and performant API for transcoding and editing media, including support for customizing functionality, following the same API design principles as ExoPlayer. The library is supported on devices running Android 5.0 Lollipop (API 21) onwards and includes device-specific optimizations, giving developers a strong foundation to build on. This post gives an introduction to the new functionality and describes some of the many features we're planning for upcoming releases!


Getting Started

Most operations with Transformer will follow the same general pattern:

  1. Configure a TransformationRequest with settings like your desired output format
  2. Create a Transformer and pass it your TransformationRequest
  3. Apply additional effects and edits
  4. Attach a listener to react to completion events
  5. Start the transformation

Of course, depending on your desired transformations, you may not need every step. Here's an example of transcoding an input video to the H.265/HEVC video format and removing the audio track.

// Create a TransformationRequest and set the output format to H.265
val transformationRequest = TransformationRequest.Builder()
    .setVideoMimeType(MimeTypes.VIDEO_H265)
    .build()

// Create a Transformer
val transformer = Transformer.Builder(context)
    .setTransformationRequest(transformationRequest) // Pass in TransformationRequest
    .setRemoveAudio(true) // Remove audio track
    .addListener(transformerListener) // transformerListener is an implementation of Transformer.Listener
    .build()

// Start the transformation
val inputMediaItem = MediaItem.fromUri("path_to_input_file")
transformer.startTransformation(inputMediaItem, outputPath)

During transformation you can get progress updates with Transformer.getProgress. When the transformation completes, the listener is notified in its onTransformationCompleted or onTransformationError callback, and you can process the output media as needed.
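As a sketch of how these pieces fit together (the exact callback signatures may vary slightly between Media3 versions, and showProgressPercent stands in for your own UI code):

```kotlin
// Sketch only: assumes androidx.media3 Transformer imports are in place.
val transformerListener = object : Transformer.Listener {
    override fun onTransformationCompleted(inputMediaItem: MediaItem) {
        // The file at outputPath is ready; share it, upload it, or play it back.
    }

    override fun onTransformationError(
        inputMediaItem: MediaItem,
        exception: TransformationException
    ) {
        // Surface the failure to the user and clean up any partial output.
    }
}

// Poll for progress periodically (e.g. from a Handler) while the
// transformation is running.
val progressHolder = ProgressHolder()
if (transformer.getProgress(progressHolder) == Transformer.PROGRESS_STATE_AVAILABLE) {
    showProgressPercent(progressHolder.progress) // showProgressPercent is hypothetical UI code
}
```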

Check out our documentation to learn about further capabilities in the Transformer APIs. You can also find details about using Transformer to accurately convert 10-bit HDR content to 8-bit SDR in the "Dealing with color washout" blog post, which helps keep your video's colors as vibrant as possible when your app or the device doesn't support HDR content.


Edits, effects, and extensions

Media3 includes a set of core video effects for simple edits, such as scaling, cropping, and color filters, which you can use with Transformer. For example, you can create a Presentation effect to scale the input to 480p resolution while maintaining the original aspect ratio, and apply it with setVideoEffects:

Transformer.Builder(context)
    .setVideoEffects(listOf(Presentation.createForHeight(480)))
    .build()

You can also chain multiple effects to create more complex results. This example converts the input video to grayscale and rotates it by 30 degrees:

Transformer.Builder(context)
    .setVideoEffects(listOf(
        RgbFilter.createGrayscaleFilter(),
        ScaleToFitTransformation.Builder()
            .setRotationDegrees(30f)
            .build()))
    .build()

It's also possible to extend Transformer’s functionality by implementing custom effects that build on existing ones. Here is an example of subclassing MatrixTransformation, where we start zoomed in by 2 times, then zoom out gradually as the frame presentation time increases:

val zoomOutEffect = MatrixTransformation { presentationTimeUs ->
    val transformationMatrix = Matrix()
    val scale = 2 - min(1f, presentationTimeUs / 1_000_000f) // Video will zoom from 2x to 1x in the first second
    transformationMatrix.postScale(/* sx= */ scale, /* sy= */ scale)
    transformationMatrix // The calculated transformations will be applied each frame in turn
}

Transformer.Builder(context)
    .setVideoEffects(listOf(zoomOutEffect))
    .build()

Here's a screen recording that shows this effect being applied in the Transformer demo app:

moving image showing what subclassing matrix transformation looks like in the Transformer demo app

For even more advanced use cases, you can wrap your own OpenGL code or other processing libraries in a custom GL texture processor and plug those into Transformer as custom effects. See the demo app for some examples of custom effects. The README also has instructions for trying a demo of MediaPipe integration with Transformer.


Coming soon

Transformer is actively under development but ready to use, so please give it a try and share your feedback! The Media3 development branch includes a sneak peek into several new features building on the 1.0 release described here, including support for tone-mapping HDR videos to SDR using OpenGL, previewing video effects using ExoPlayer.setVideoEffects, and custom audio processing. We are also working on support for editing multiple videos in more flexible compositions, with export from Transformer and playback through ExoPlayer, making Media3 an end-to-end solution for transforming media.

We hope you'll find Transformer an easy-to-use and powerful tool for implementing fantastic media editing experiences on Android! You can send us feature requests and bug reports in the Media3 GitHub issue tracker, and follow this blog to get updates on new features. Stay tuned for our upcoming talk “High quality Android media experiences” at Google I/O.

How to optimize your Android app for large screens (And what NOT to do!)

Posted by the Android team

Large foldables, tablets, and desktop devices like Chromebooks – with more active large screen Android devices each year, it’s more important than ever for apps to ensure they provide their users with a seamless experience on large screens. For example, these devices offer more screen space, and users expect apps to do more with that space. We’ve seen that apps enjoy better business metrics on these devices when they do work to support them.

These devices can also be used in different places and in different ways than we might expect on a handset. For example, foldables can be used in tabletop mode, users may sit further away from a desktop display, and many large screen devices may be used with a mouse and keyboard.

These differences introduce new things to consider. For example:
  • Can a user reach the most important controls when using your app with two hands on a tablet?
  • Does all of your app’s functionality work with a keyboard and mouse?
  • Does your app’s camera preview have the right orientation regardless of the position of the device?
image showing differentiated experiences across large screen devices

Large Screen Design and Quality

Defining great design and quality on large screens can be difficult, because different apps will have different ways to excel. You know your product best, so it’s important to use your app on large screen devices and reflect on what will provide the best experience. If you don’t have access to a large screen device, try one of the foldable, desktop, or tablet virtual devices.

Google also provides resources throughout the development process to help as you optimize your app. If you’re looking for design guidance, there are thoughtful design resources like the large screen Material Design guidance and ready-to-use compositions like the Canonical layouts. For inspiration, there are great examples of a variety of different apps in the large screens gallery. If you’re looking for a structured way to approach large screen quality, the Large screen app quality guidelines provide a straightforward checklist and a set of tests to give you confidence that your app is ready for large screens.


Dos and Don’ts of Optimizing for Large Screens

Whether you already have beautiful large screen designs or not, we want to highlight some helpful tips and common mistakes to avoid when optimizing your app for large screens.

Don’t: assume exclusive access to resources

  • Don’t assume you have exclusive access to hardware resources like the camera. Large screens commonly have more than one app active at a time, and those other apps may try to access the same resources.
  • This means you should test your app side by side with other apps, and never assume a resource is available at any given time.

Do: handle hardware access gracefully

  • Check for hardware resources like the camera before trying to use them. Remember that hardware peripherals can be added and removed at any time via USB.
  • Fail gracefully when access to a given resource isn’t available at runtime.
try {
    // Attempt to use the camera
    ...
} catch (e: CameraAccessException) {
    e.message?.let { Log.e(TAG, it) }
    // Fail gracefully if the camera isn't currently available
}

Do: respond appropriately to lifecycle events

  • Your app may still be visible during onPause(), especially when multiple apps are onscreen, so you need to keep media playing and your UI fresh until onStop() is called.

Don’t: stop your app’s UI in onPause()

override fun onPause() {
    // DON'T clean up resources here.
    // Your app can still be visible.
    super.onPause()
}

Don’t: rely on device-type booleans like “isTablet”

  • In the past, a common pattern was to use screen width to create a boolean like “isTablet” and branch on the kind of device the app is running on, but this approach is fragile. The core problem is that it relies on a proxy to determine the device type, and those proxies are error-prone. For example, if you decide a device is a tablet because it has a large display when your app launches, your app can behave incorrectly when its window is resized to not take up the full screen. And even a device-type boolean that responds to configuration changes can leave your app in the wrong state after unfolding a foldable, with no way back until another configuration change occurs, such as refolding the device.

Do: work to replace existing uses of device-type booleans with the right approach

Query for the information about the device that’s necessary for what you’re trying to accomplish. For example:

  • If you’re using device-type booleans to adapt your layout, use WindowSizeClasses instead. The library has support for both Views and for Jetpack Compose, and it makes it clear and easy to adapt your UI to pre-defined breakpoints.
// androidx.compose.material3.windowsizeclass.WindowSizeClass
class MainActivity : ComponentActivity() {
    …
    setContent {
        val windowSizeClass = calculateWindowSizeClass(this)
        WindowSizeClassDisplay(windowSizeClass)
    }
}

@Composable
fun WindowSizeClassDisplay(windowSizeClass: WindowSizeClass) {
    when (windowSizeClass.widthSizeClass) {
        WindowWidthSizeClass.Compact -> { compactLayout() }
        WindowWidthSizeClass.Medium -> { mediumLayout() }
        WindowWidthSizeClass.Expanded -> { expandedLayout() }
    }
}
  • If you’re using isTablet for changing user-facing strings like “your tablet”, you might not need any more information. The solution can be as simple as using more general phrasing such as “your Android device”.
  • If you’re using a device-type boolean to predict the presence of a hardware feature or resource (e.g., telephony, Bluetooth), check for the desired capabilities directly at runtime before trying to use them, and fail gracefully when they become unavailable. This feature-based approach ensures that your app can respond appropriately to peripheral devices that can be attached or removed. It also avoids cases where a feature is missing even though it could be supported by the device.
val packageManager: PackageManager = context.packageManager
val hasTelephony = packageManager.hasSystemFeature(PackageManager.FEATURE_TELEPHONY)

Do: use Jetpack CameraX when possible

  • There can be a surprising amount of complexity in showing camera previews – orientation, aspect ratio, and more. When you use Jetpack CameraX, the library will handle as many of these details as possible for you.
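As an illustration of how little orientation handling you need with CameraX, here is a minimal preview sketch. It assumes a PreviewView named previewView in your layout, a lifecycleOwner in scope, and the camera permission already granted:

```kotlin
// Sketch only: assumes androidx.camera dependencies and the CAMERA
// permission have already been set up.
val cameraProviderFuture = ProcessCameraProvider.getInstance(context)
cameraProviderFuture.addListener({
    val cameraProvider = cameraProviderFuture.get()
    val preview = Preview.Builder().build().also {
        // CameraX keeps the preview's rotation and aspect ratio in sync with
        // the display, including across folds and window resizes.
        it.setSurfaceProvider(previewView.surfaceProvider)
    }
    cameraProvider.unbindAll()
    cameraProvider.bindToLifecycle(
        lifecycleOwner,
        CameraSelector.DEFAULT_BACK_CAMERA,
        preview
    )
}, ContextCompat.getMainExecutor(context))
```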

Don’t: assume that your camera preview will align with device orientation

  • There are several kinds of orientation to consider when implementing a camera preview in your app - natural orientation, device orientation, and display orientation. Proper implementation of a camera preview requires accounting for the various kinds of orientation and adapting as the device’s conditions change.

Don’t: assume that aspect ratios are static

Do: declare hardware feature requirements correctly

  • When you’re declaring your app’s feature requirements, refer to the guidance in the Large Screens Cookbook. To ensure that you aren’t unnecessarily limiting your app’s reach, be sure to use the most inclusive manifest entries that work with your app.
<uses-feature android:name="android.hardware.camera.any" android:required="false" />
<uses-feature android:name="android.hardware.camera" android:required="false" />
<uses-feature android:name="android.hardware.camera.autofocus" android:required="false" />
<uses-feature android:name="android.hardware.camera.flash" android:required="false" />

Don’t: assume window insets are static

  • Large screens can change frequently, and that includes their WindowInsets. This means we can’t just check the insets when our app is launched and never change them.

Do: use the WindowInsetsListener APIs for Views

  • The WindowInsetsListener APIs notify your app when insets change:
ViewCompat.setOnApplyWindowInsetsListener(view) { view, windowInsets ->
    val insets = windowInsets.getInsets(WindowInsetsCompat.Type.systemBars())
    view.updateLayoutParams<MarginLayoutParams> {
        leftMargin = insets.left
        bottomMargin = insets.bottom
        rightMargin = insets.right
    }
    WindowInsetsCompat.CONSUMED
}

Do: use the windowInsetsPadding Modifier for Jetpack Compose

  • The windowInsetsPadding Modifier will dynamically pad based on the given type of window insets. Additionally, multiple instances of the Modifier can communicate with each other to avoid adding duplicate padding, and they’re automatically animated.
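For example, here is a sketch of insetting content away from the system bars; InsetAwareContent is a hypothetical composable for illustration, and it assumes androidx.compose.foundation is on the classpath:

```kotlin
// Sketch only: pads content by the system bar insets, re-padding (with
// animation) whenever the insets change.
@Composable
fun InsetAwareContent() {
    Box(
        modifier = Modifier
            .fillMaxSize()
            .windowInsetsPadding(WindowInsets.systemBars)
    ) {
        // App content here won't be drawn under the status or navigation bars.
    }
}
```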

Don’t: assume the device has a touch screen

Do: test your app on large screens

  • The most important thing you can do to ensure your app’s experience is great on large screens is to test it yourself. If you want a rigorous test plan that’s already prepared for you, try out the large screen compatibility tests.

Do: leverage the large screen tools in Android Studio

  • Android Studio provides tools to use during development that make it much easier to optimize for large screens. For example, multipreview annotations allow you to visualize your app in many conditions at the same time. There’s also a wide variety of tablet, desktop, and foldable AVDs available in the Android Virtual Device Manager to help you test your app on large screens today.

Stay tuned for Google I/O

These tips are a great starting point as you optimize your app for large screens, and there are even more updates to come at Google I/O on May 10th. Tune in to watch the latest news and innovations from Google, with live streamed keynotes and helpful product updates on demand.

Get ready for I/O ‘23: start planning your sessions, and take a look at some of Android’s favorite moments!

Posted by Maru Ahues Bouza, Director, Android Developer Relations

Google I/O 2023 is just a week away, kicking off on Wednesday, May 10 at 10AM PT with the Google Keynote and followed at 12:15PM PT by the Developer Keynote. The program schedule launched last week, allowing you to save sessions to your calendar and start previewing content.

To help you get ready for this year's Google I/O, we’re taking a look back at some of Android’s favorite moments from past Google I/Os, as well as sharing a playlist of developer content to help you prepare. Take a look below, and start getting ready!


Modern Android Development

Helping you stay more productive and create better apps, Modern Android Development is Android’s set of tools and APIs, and they were born across many Google I/Os. Tor Norbye, Director of Engineering for Android, reflects on how Android development tools, APIs, and best practices have evolved over the years, starting in 2013 when he and the team announced Android Studio. Here are some of the talks we’re excited for in developer productivity at this year’s Google I/O:


Building for a multi-device world

From the launch of Android Auto and Android Wear in 2014 to last year’s preview of the Google Pixel Tablet, Google I/O has always been an important moment for seeing the new form factors that Android is extending to. Sara Hamilton, Developer Relations Engineer for Android, discusses how we are continuing to invest in multi-device experiences and making it easier for you to build for the entire Android device ecosystem. Sara shares her excitement for developers continuing to bring unique experiences to all screen sizes and types, from tablets and foldables to watches and TVs. Some of our favorite talks at this year’s Google I/O in the multi-device world include:


The platform and app quality

From originally playing a smaller part in Google I/O keynotes in the early days to announcing 3 billion monthly active users in 2021, Dan Sandler, Software Engineer for Android, looks back at the tremendous growth of the Android platform and how it’s continuing to evolve. With a focus on helping you make quality apps, here are some of our favorite Android platform talks this year:


We can’t wait to show you all that’s new across Android in just under a week. Be sure to tune in on the Google I/O website on May 10 to catch the latest Android updates and announcements this year!


Developer Journey: Explore I/O through the lens of our developer communities (May 2023)

Posted by Lyanne Alfaro, DevRel Program Manager, Google Developer Studio

Developer Journey is a monthly series to spotlight diverse and global developers sharing relatable challenges, opportunities, and wins in their journey. Every month, we will spotlight developers around the world, the Google tools they leverage, and the kind of products they are building.

With Google I/O season in full swing, we’re sharing diverse perspectives of developers across Google’s developer communities who have been on the ground.

Meet AiJing, Jolina, and Maria – members of Google Developer Student Clubs, Google Developer Groups, and Women Techmakers – who share a passion for learning, creating, and connecting through Google technology as they share what they’re most excited for this year at I/O.


AiJing Wu

Headshot of AiJing Wu, smiling
Madison, Wisconsin
GDSC Lead, Women Techmakers
GDSC University of Wisconsin-Madison
Software Engineer

What does Google I/O mean to you, and what are you looking forward to most this year?

To me, Google I/O is a paradise for embracing cutting-edge technologies. I have followed the keynotes online for two years, and it is so exciting that I will join in person this year! I can’t wait to exchange thoughts with other amazing developers and listen to the game-changing AI topics.


What's your favorite part about Google I/O?

I’m obsessed with live demos for new technologies. Daring to do a live demo shows Google developers’ strong confidence and pride in their work. It is also exciting to see what kinds of use cases are emphasized and what metrics are evaluated.


What Google tools have you used to build?

As a full-stack developer and cloud engineer, I have built progressive apps and distributed services with Chrome, Android Studio, BigQuery, Analytics, Firebase, Google Maps, YouTube, and Google Cloud Platform. Other than those, I love exploring AI and ML features with Google Colab, Cloud TPU, and TensorFlow.


Which tool has been your favorite? Why?

Chrome has been my favorite. To me, it is the best choice for web app development: great compatibility across OS platforms, feature-rich developer tools, and smooth mobile integration. ChromeDriver is a sweet bonus when accessing deployments and automating tests on a server.


Tell us about something you've built in the past using Google tools.

I collaborated with my friends to build a web app aimed at helping people understand and analyze soccer games more easily and quickly with pre-trained ML models. This app includes accessing YouTube video sources, detecting targets with Yolo-v3 in TensorFlow, accelerating computation with Colab GPU, and storing results in Google Cloud.


What advice would you give someone starting in their developer journey?

Actively discuss with people and listen to their ideas, especially if you are a student or a beginner. Participating in GDSC and GDG events is a great way to connect with peers and senior developers near you and across the globe. I benefit so much simply by chatting about random tech topics with others. Good communication will open your mind and guide your direction. Meeting interesting people will also make your journey as a developer much more colorful and enjoyable!


Jolina Li

Headshot of Jolina Li, smiling
Toronto, Ontario, Canada
GDSC Lead
Google Developer Student Club, University of Toronto St. George

What does Google I/O mean to you, and what are you looking forward to most this year?

It has been a dream for me since high school to attend Google I/O. In previous years, I would watch clips of the keynotes online and browse through creators’ YouTube vlogs to see all the incredible technologies at the hands-on stations. This May, I can’t believe I will be traveling to Mountain View and experiencing Google I/O 2023 for the first time live in person. For me, Google I/O is an opportunity to connect with passionate individuals in the developer community, including students, and experts from around the world. It is a full day of learning, inspiration, innovation, community, and growth. This year, I’m looking forward to hearing all the exciting keynotes in person, interacting with transformative technology, and making new connections.


What's your favorite part about Google I/O?

My favorite part about Google I/O is the technical sessions after the keynotes, where I can learn about innovative products from experts and engage in product demonstrations. I love seeing developments in machine learning, so I will definitely visit the TensorFlow station. I’m also excited to explore other Google technology stations, including Google Cloud and Google Maps Platform, and learn as much as I can.


What Google tools have you used to build?

I have used Android to build mobile apps for my software design course and a tech entrepreneurship competition. I have also used Google Colab, a cloud-based Jupyter notebook environment, for my research and deep learning engineering internships.


Which tool has been your favorite? Why?

I love using Google Colab because it’s an accessible and cost-free tool for students working on data science and machine learning projects. The environment requires no setup and offers expensive computing resources such as GPUs at no cost. It uses Python, my favorite language, and contains all the main Python libraries. The user interface features independent code segments you can run and test rather than running the entire script every time you edit code. There is also an option to add text segments between code to document various script components. Google Colab notebooks can be easily shared with anyone for collaboration and stored in Google Drive for convenient access.


Tell us about something you've built in the past using Google tools.

For my software design course project, a few teammates and I built a cooking recipe organizer app using Android Studio that allows users to discover new recipes and build their own portfolio of recipes. Users can save interesting recipes that they found, give ratings and reviews, and also upload their own recipes to the database. I designed a recipe sorting and filtering system that allows users to sort their saved recipes alphabetically, by interest keywords or rating, and filter their recipes by genre.

Android Studio allowed me to preview the mobile app development using an emulator that functions across all types of Android devices. This feature helped me to understand the app from a user’s perspective and develop the UI/UX more efficiently. We also used Google Firebase for its cloud storage, non-relational feature, and high compatibility with Android.


What advice would you give someone starting in their developer journey?

When I began attending university, I had no experience in programming and had to start my computer science career from zero. I pursued computer science, however, because I was interested in learning about AI and building technology to solve global problems such as climate change.

I believe that when you are starting your career, it’s important to have a goal about what you want to achieve. There are so many possibilities in tech, and having a goal can help you make decisions and motivate you when you’re facing challenges. It’s also important to keep an open mind about different opportunities and explore multiple areas in tech to learn more about the field and discover your passions.

Another tip is to look for opportunities and resources to help you grow as a developer. Many opportunities and resources are available for beginners, including online courses, self-guided project tutorials, and beginner-friendly workshops.

Google has amazing developer communities, including student campus clubs (GDSC), professional developer groups (GDG), Google developer expert groups (GDE), and a women in tech community (WTM). You can also create your own opportunities by teaching a hands-on workshop to enhance your technical and soft skills, starting a local developer group to gain leadership and collaboration skills, or building projects to increase your knowledge and apply what you learn.

Learn a lot, discover new opportunities, gain new skills, connect with people in tech, and keep pursuing what you love about technology!

Maria Paz Muñoz Parra

Headshot of Maria Paz Muñoz Parra, smiling
Malmö, Sweden
Google Developer Groups Organizer and Women Techmakers Ambassador
Senior front-end developer, IKEA


What does Google I/O mean to you, and what are you looking forward to most this year?

Google I/O is an opportunity to stay up to date on Google technologies and initiatives. We get to witness innovation, connect with other developers, and generate energetic conversations about what we are passionate about.

Besides Bard, this year I have a special interest in the WebGPU API. Currently, I work as a senior front-end developer on a Knowledge Graph project. There, among the most powerful tools for ontologists and data scientists to model and understand data are the canvases. I’m curious about how we can boost the performance when rendering these graphs on the web, using the new features of WebGPU. Google I/O will surely be an inspiration for my work.


What's your favorite part about Google I/O?

It’s the perfect excuse to meet my colleagues and watch the event together, popcorn included! In the online realm, it’s always fun to follow the discussions on social media, and Google always finds a way to surprise us and keep us engaged in our learning process. I still remember the I/O Adventure platform of 2022. It was an outstanding virtual experience, interacting with people in the community booths. Later, I also followed the recorded talks. A gamified learning experience, top to bottom!


What Google tools have you used to build?

Chrome DevTools have been my everyday tools for the past 10 years. The ones that I have used the most are the Core Web Vitals metrics, DevTools for debugging (extra love for the ones to debug accessibility issues), and tools for testing CSS in the browser (e.g., the grid properties and the media query emulation features).

Since last year, I’ve been testing the Instant Loading and Seamless APIs, and they have allowed me to deliver high-quality interfaces with intuitive navigation, as we are used to having in native mobile apps.


Which tool has been your favorite? Why?

Accessibility guidelines and tools are my favorite: Lighthouse, the accessibility scanner, and Material Design. These tools help us ensure that all users, including those with disabilities, can access and use content and services published on the web. With these tools integrated, other users can start educating themselves on the power of accessibility. My interest in this space started when I noticed that my mother, who has low vision and motor impairments in her hands, couldn’t easily access her favorite music on her phone. The voice search feature on YouTube was revolutionary for her, and probably for many other elders.

Many questions popped into my mind: “Who is considered a user with a disability? How are the interfaces I create used? Am I creating unintentional barriers?”

As a web developer, tools that allow me to test, audit, understand, and improve are a must.


    Tell us about something you've built in the past using Google tools.

    I collaborated with my friends to build a web app aimed at helping people understand and analyze soccer games more easily and quickly with pre-trained ML models. The app pulls in YouTube video sources, detects targets with YOLOv3 in TensorFlow, accelerates computation with Colab GPUs, and stores results in Google Cloud.


    What advice would you give someone starting in their developer journey?

    Many developers who start their journey come from other areas of expertise or industries. Imagine a journalist, nurse, or primary school teacher who wants to start a developer journey. They may feel they need to throw away all the knowledge they have acquired.

    On the contrary, I believe prior knowledge is key to standing out as a developer. Every person has a different combination of interests, talents, and skills. Master the basics, and shine with your own story.

    From meeting talented developers to exciting keynotes, there’s so much to look forward to at Google I/O 2023. To optimize your experience, create or connect a developer profile, and start saving content to My I/O to build your personal agenda. Share your experience with us by using #GoogleIO across your social media so we can find you!

    Get ready for Google I/O

    Posted by Timothy Jordan, Director, Developer Relations & Open Source

    I/O is just a few days away and we couldn’t be more excited to share the latest updates across Google’s developer products, solutions, and technologies. From keynotes to technical sessions and hands-on workshops, these announcements aim to help you build smarter and ship faster.

    Here are some helpful tips to maximize your experience online.


    Start building your personal I/O agenda

    Starting now, you can save the Google and developer keynotes to your calendar and explore the program to preview content. Here are just a few noteworthy examples of what you’ll find this year:

    What's new in Android
    Get the latest news in Android development: Android 14, form factors, Jetpack + Compose libraries, Android Studio, and performance.
    What’s new in Web
    Explore new features and APIs that became stable across browsers on the Web Platform this year.
    What’s new in Generative AI
    Discover a new suite of tools that make it easy for developers to leverage and build on top of Google's large language models.
    What’s new in Google Cloud
    Learn how Google Cloud and generative AI will help you develop faster and more efficiently.

    For the best experience, create or connect a developer profile and start saving content to My I/O to build your personal agenda. With over 200 sessions and other learning material, there’s a lot to cover, so we hope this will help you get organized.

    This year we’ve introduced development focus filters to help you navigate content faster across mobile, web, AI, and cloud technologies. You can also peruse content by topic, type, or experience level so you can find what you’re interested in, faster.


    Connect with the community

    After the keynotes, you can talk to Google experts and other developers online in I/O Adventure chat. Here you can ask questions about new releases and learn best practices from the global developer community.

    If you’re craving community now, visit the Community page to meet people with similar interests in your area or find a watch party to attend.

    We hope these updates are useful, and we can’t wait to connect online in May!

    Let’s go. It’s Google I/O 2023

    Posted by Jeanine Banks, VP & General Manager, Developer X, and Head of Developer Relations

    Google I/O is back and you’re invited to join us online May 10! Learn about Google’s latest solutions, products, and technologies for developers that help unlock your creativity and simplify your development workflow. You’ll also get to hear about ways to use the latest in technology, from AI and cloud to mobile and web. Tune in to watch the live streamed keynotes from Shoreline Amphitheater in Mountain View, CA, then dive into 100+ on-demand technical sessions and engage with helpful learning material. Visit the Google I/O site and register to stay informed about I/O and other related events coming soon.

    Want to get a head start?

    Stay tuned for more updates. We look forward to seeing you in May!
