Tag Archives: google cloud

How to use the App Engine Users service (Module 20)

Posted by Wesley Chun (@wescpy), Developer Advocate, Google Cloud


Introduction and background

The Serverless Migration Station video series and corresponding codelabs aim to help App Engine developers modernize their apps, whether that means upgrading language runtimes, such as from Python 2 to 3 or Java 8 to 17, or moving laterally to sister serverless platforms like Cloud Functions or Cloud Run. For developers who want more control, such as the ability to SSH into instances, Compute Engine VMs or GKE, our managed Kubernetes service, are also viable options.

In order to consider moving App Engine apps to other compute services, developers must first move their apps away from App Engine's original APIs (now referred to as legacy bundled services), either to standalone Cloud replacement services or to alternative 3rd-party services. Once no longer dependent on these proprietary services, apps become much more portable. Apps can stay on App Engine while upgrading to its 2nd-generation platform, or move to other compute platforms as listed above.

Today's Migration Module 20 content focuses on helping developers refamiliarize themselves with App Engine's Users service, a user authentication system serving as a lightweight wrapper around Google Sign-In (now called Google Identity Services). The video and its corresponding codelab (self-paced, hands-on tutorial) demonstrate how to add use of the Users service to the sample baseline app from Module 1. After adding the Users service in Module 20, Module 21 follows, showing developers how to migrate that usage to Cloud Identity Platform.

How to use the App Engine Users service

Adding use of Users service


The sample app's basic functionality consists of registering each page visit in Datastore and displaying the most recent visits. The Users service helps apps support user logins and recognize App Engine administrative ("admin") users. It also provides convenient functions for generating login/logout links and retrieving basic user information for logged-in users. Below is a screenshot of the modified app, which now supports user logins via the user interface (UI):
Sample app now supports user logins and App Engine admin users
Below is the pseudocode reflecting the changes made to support user logins for the sample app, including integrating the Users service and updating what shows up in the UI:
  • If the user is logged in, show their "nickname" (display name or email address) and display a Logout button. If the logged-in user is an App Engine app admin, also display an "admin" badge (between nickname and Logout button).
  • If the user is not logged in, display the username generically as "user", remove any admin badge, and display a Login button.
Because the Users service is primarily a user-facing endeavor, the most significant changes take place in the UI, whereas the data model and core functionality of registering visits remain unchanged. The new support for user management primarily results in additional context to be rendered in the web template. New or altered code is bolded to highlight the updates.
Table showing code 'Before'(Module 1) on left, and 'After' (Module 20) on the right
Adding App Engine Users service usage to sample app
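The login/logout logic summarized above can be sketched in plain Python. This is an illustration only - the StubUser class below is a hypothetical stand-in so the logic can run anywhere; the real app calls users.get_current_user(), users.is_current_user_admin(), and users.create_login_url()/users.create_logout_url() from google.appengine.api:

```python
# Plain-Python mirror of the UI logic described above. StubUser is a
# hypothetical stand-in for the object returned by users.get_current_user().

class StubUser:
    def __init__(self, nickname, is_admin=False):
        self._nickname = nickname
        self.is_admin = is_admin

    def nickname(self):
        return self._nickname

def template_context(user):
    """Build the context rendered by the web template."""
    if user:
        # logged in: show nickname, optional admin badge, and a Logout button
        return {
            'who': user.nickname(),
            'admin': '(admin)' if user.is_admin else '',
            'sign': 'Logout',
        }
    # logged out: generic "user", no admin badge, and a Login button
    return {'who': 'user', 'admin': '', 'sign': 'Login'}
```

The data model and visit registration are untouched; only this context (plus the login/logout link from the Users service) changes what the template renders.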

Wrap-up


Today's "migration" consists of adding usage of the App Engine Users service to support user management and recognize App Engine admin users, starting with the Module 1 baseline app and finishing with the Module 20 app. To get hands-on experience doing it yourself, try the codelab and follow along with the video. Then you'll be ready to upgrade to Identity Platform should you choose to do so.

In Fall 2021, the App Engine team extended support of many of the bundled services to 2nd generation runtimes (that have a 1st generation runtime), meaning you are no longer required to migrate from the Users service to Identity Platform when porting your app to Python 3. You can continue using the Users service in your Python 3 app so long as you retrofit the code to access bundled services from next-generation runtimes.
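The retrofit is modest. As a configuration sketch only (consult the official documentation on accessing bundled services from second-generation runtimes for your language and framework), it amounts to declaring the bundled APIs in app.yaml:

```
# app.yaml (Python 3): declare that the app uses the bundled services APIs
runtime: python39
app_engine_apis: true
```

On the code side, the shim package appengine-python-standard goes in requirements.txt, and the WSGI app is wrapped once at startup (app.wsgi_app = wrap_wsgi_app(app.wsgi_app), with wrap_wsgi_app imported from google.appengine.api) so that calls like users.get_current_user() keep working in Python 3.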

If you do want to move to Identity Platform, see the Module 21 content, including its codelab. All Serverless Migration Station content (codelabs, videos, and source code [when available]) are available at its open source repo. While we're initially focusing on Python users, the Cloud team is covering other runtimes soon, so stay tuned. Also check out other videos in the broader Serverless Expeditions series.

Migrating from App Engine pull tasks to Cloud Pub/Sub (Module 19)

Posted by Wesley Chun (@wescpy), Developer Advocate, Google Cloud

Introduction and background

The Serverless Migration Station series is aimed at helping developers modernize their apps running on one of Google Cloud's serverless platforms. The preceding video (Migration Module 18) demonstrates how to add use of App Engine's Task Queue pull tasks service to a Python 2 App Engine sample app. Today's Module 19 video picks up where that leaves off, migrating that pull task usage to Cloud Pub/Sub.

Moving away from proprietary App Engine services like Task Queue makes apps more portable, giving developers the flexibility to stay on App Engine's 2nd-generation platform or move to other compute platforms.


    Understanding the migrations

    Module 19 consists of implementing three different migrations on the Module 18 sample app:

    • Migrate from App Engine NDB to Cloud NDB
    • Migrate from App Engine Task Queue pull tasks to Cloud Pub/Sub
    • Migrate from Python 2 to Python (2 and) 3

    The NDB to Cloud NDB migration is identical to the Module 2 migration content, so it's not covered in-depth in Module 19. The original app was designed to be Python 2 and 3 compatible, so there's no work there either. Module 19 boils down to three key updates:

    • Setup: Enable APIs and create Pub/Sub Topic & Subscription
    • How work is created: Publish Pub/Sub messages instead of adding pull tasks
    • How work is processed: Pull messages instead of leasing tasks

    Aside from these physical changes, a key hurdle to overcome is understanding the differences in terminology between pull tasks and Pub/Sub. The following chart attempts to demystify this so developers can more easily grasp how they differ:
    Table of terminology with related GAE Pull Tasks and Cloud Pub/Sub
    Terminology differences between App Engine pull tasks and Cloud Pub/Sub

    Reflecting the chart, these differences can be summarized like this:
    1. With Pull Queues, work is created in pull queues while work is sent to Pub/Sub topics
    2. Task Queue pull tasks are called messages in Pub/Sub
    3. With Task Queues, workers access pull tasks; with Pub/Sub, subscribers receive messages
    4. Leasing a pull task is the same as pulling a message from a Pub/Sub topic via a subscription
    5. Deleting a task from a pull queue when you're done is analogous to successfully acknowledging a Pub/Sub message
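These five points can be made concrete with a toy in-memory analogy. To be clear, this is not the Pub/Sub client library (that would be google.cloud.pubsub_v1); it only models the semantics in the chart - a pulled-but-unacknowledged message becomes deliverable again, just as a pull task whose lease expires returns to the queue:

```python
import itertools

class ToySubscription:
    """Toy analogy: publish ~ add a pull task, pull ~ lease, ack ~ delete."""
    _ids = itertools.count()

    def __init__(self):
        self._queue = []     # published, not-yet-delivered messages
        self._pulled = {}    # ack_id -> message, delivered but unacked

    def publish(self, data):
        self._queue.append(data)

    def pull(self, max_messages=10):
        batch = self._queue[:max_messages]
        del self._queue[:max_messages]
        out = []
        for msg in batch:
            ack_id = next(self._ids)
            self._pulled[ack_id] = msg
            out.append((ack_id, msg))
        return out

    def ack(self, ack_id):
        # like deleting a completed task: the message never comes back
        self._pulled.pop(ack_id, None)

    def expire_leases(self):
        # like a pull-task lease expiring: unacked messages are redelivered
        self._queue.extend(self._pulled.values())
        self._pulled.clear()

sub = ToySubscription()
sub.publish('visit:10.0.0.1')
(ack_id, msg), = sub.pull(max_messages=1)
sub.expire_leases()                   # never acked, so it comes back
(ack_id, msg), = sub.pull(max_messages=1)
sub.ack(ack_id)                       # done: analogous to deleting the task
```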
    The video walks developers through the terminology as well as the code changes described above. Below is pseudocode implementing the key changes to the main application (new or updated lines of code bolded):
    Table showing changes in code Before (Module 18) on the left, and After (Module 19) on the right
    Migration from App Engine Task Queue pull tasks to Cloud Pub/Sub

    Observe how most of the code, especially the app operations and data models, is left relatively unchanged. The only visible changes are switching from App Engine NDB and Task Queue to Cloud NDB and Pub/Sub. Complete versions of the app before and after the changes can be found in the Module 18 and Module 19 repo folders, respectively. In addition to the video, be sure to check out the Module 19 codelab, which leads you step-by-step through the migrations discussed.

    Wrap-up

    Module 19 features a migration of App Engine pull tasks to Cloud Pub/Sub, but developers should note that Pub/Sub itself is not based on pull tasks. It is a fully-featured, asynchronous, scalable messaging service with many more capabilities than the pull functionality provided by Task Queue; for example, Pub/Sub also offers streaming to BigQuery and push delivery. Pub/Sub push operates differently from Task Queue push tasks, which is why we recommend push tasks be migrated to Cloud Tasks instead (see Module 8). For more information on all of its features, see the Pub/Sub documentation. Because Cloud Tasks doesn't support pull functionality, we turn to Pub/Sub instead for pull task users.

    While we recommend users move to the latest offerings from Google Cloud, neither of these migrations is required, and should you opt to make them, you can do so on your own timeline. In Fall 2021, the App Engine team extended support of many of the bundled services to 2nd generation runtimes (that have a 1st generation runtime), meaning you don't have to migrate to standalone Cloud services before porting your app to Python 3. You can continue using Task Queue in Python 3 so long as you retrofit your code to access bundled services from next-generation runtimes.

    If you're using other App Engine legacy services, be sure to check out the other Migration Modules in this series. All Serverless Migration Station content (codelabs, videos, source code [when available]) can be accessed at its open source repo. While our content initially focuses on Python users, the Cloud team is working on covering other language runtimes, so stay tuned. For additional video content, check out our broader Serverless Expeditions series.

    Lynn Langit: Turning a passion for learning into online courses viewed by millions

    Posted by Kevin Hernandez, Developer Relations Community Manager

    Lynn Langit is not only a Cloud GDE - she’s one of the first GDEs ever to join the program. Despite joining the GDE program early after its establishment, she got a relatively late start with development. Lynn is a self-taught developer who started coding when she was 38 years old - before the advent of the online educational resources we have today. To teach herself how to code, she relied on certifications and books and went to her local electronics store to buy equipment to build her own server. Through this process, she found that she was a talented developer and became inspired to try her hand at teaching. She started out teaching basic topics such as user applications. Today she has 28 Cloud courses on LinkedIn and an audience of 5 million students. With this immense reach, Lynn runs into her students at various conferences around the world and has even had students recognize her by her voice. She mentions, “Before the pandemic, I used to travel and work globally and it was so gratifying to meet all my students because they would want to come and talk to me. It was incredible to meet students from all over the world.”

    Getting into teaching

    When Lynn left her corporate job, she started her own consultancy in 2011 with two ideas: technical teaching and building. She started out in a classroom with these two ideas but as traditional learning started to usher in a new era of online learning, Lynn followed suit and started to put her lessons on YouTube. This caught the attention of Lynda.com (now LinkedIn Learning) where she was asked to become an author.

    Teaching has proven to be rewarding in several ways. It allows Lynn to have an impact on learners interested in Cloud and dive deeper into topics she’s interested in, all while getting paid for her academic pursuits and instructing. She states, “I can't say I'm an expert in all the services, but I know a lot of the services across all the Clouds. So while I'm learning, I might as well teach and get paid for it.”

    Choosing lessons

    Lynn is in constant pursuit of knowledge, and in the ever-changing world of Cloud, there is always something new to learn or teach - in Lynn’s case, both. “Oftentimes I'll create a course in something that I am genuinely interested in that doesn’t have an existing course. It's so that I can focus my energies, learn it, and then teach it,” she adds. A recent example is a book club she led last year on quantum computing. “I'm just really taking baby steps into it and as part of that, I started exploring the vendor Cloud quantum offerings. Then I decided to share that as a course,” she says.

    She also mentions that there is a preconceived notion that online content has to be super polished. She believes it’s important to put your lessons out there and more importantly, to learn together. “We're one community and we need to share when we discover something,” she observed.

    Teaching style

    Every instructor has their own teaching style and for Lynn, her brand is a conversational style of instruction. Very much like our interview, her lessons feel as if she’s talking to the audience one-on-one. This is in part by design - Lynn doesn’t write a script and she imagines someone sitting across from her. She can also sprinkle in some useful case studies from her consultancy work and can draw from some real-world examples.

    When asked about effective educators, Lynn says, “Don't be a jerk. The point is not to show how smart you are. The point is to communicate information that you have found useful, that you think other people will find useful and in a way they can understand.”

    Advice for educating online

    Lynn has met a lot of educators in her career and has had the fortune of seeing both published and unpublished content. One thing she noticed is that a lot of content simply never sees the light of day. Some content creators feel as if there is a missing piece or their content needs to be ultra polished, but Lynn’s advice is to just click “publish”. She also notes that this hesitation can be attributed to imposter syndrome, which shows emotional intelligence, but as a counterpoint she advises, “There's value in the learning, not just the result. That is probably the biggest insight I've gained over my years because I always thought you just had to show polished content.” Lynn believes that your audience wants to go along with you on your journey and, since people are busy, they think of you as a curator of knowledge.

    She also advises to start small. She is particularly fond of “snack-sized” pieces of content such as the short-form articles on Dev.to. These “snacks” are easier to produce and in reality, it’s easier on the audience. She says, “It's funny because people want to make a course but this is not a Hollywood movie, I am sorry to break it to you, but people are not going to be rapturously glued to your screen for two hours no matter who you are. So just make little snacks.” If you find something interesting, just put it out there. Over time, as you get practice, you can start to produce longer-form content.

    Advice for GDEs

    Lynn offers valuable advice to any present or future GDE. She encourages, “Really get to know the GDEs. We're all kind of doing the same thing and just jump right in. The bar is high to become a GDE and it's a great community that I've learned a lot from.” There is a wealth of knowledge offered by your community. Maybe you’ll learn how to create an Android app, build an ML model, or build an online course with the guidance of Lynn. Just jump right in.

    You can check out Lynn’s LinkedIn courses or find her on LinkedIn or Substack.

    The Google Developer Experts (GDE) program is a global network of highly experienced technology experts, influencers, and thought leaders who actively support developers, companies, and tech communities by speaking at events and publishing content.

    How to use App Engine pull tasks (Module 18)

    Posted by Wesley Chun (@wescpy), Developer Advocate, Google Cloud

    Introduction and background

    The Serverless Migration Station mini-series helps App Engine developers modernize their apps to the latest language runtimes, such as from Python 2 to 3 or Java 8 to 17, or to sister serverless platforms Cloud Functions and Cloud Run. Another goal of this series is to demonstrate how to move away from App Engine's original APIs (now referred to as legacy bundled services) to standalone Cloud replacement services. Once no longer dependent on these proprietary services, apps become much more portable and flexible.

    App Engine's Task Queue service provides infrastructure for executing tasks outside of the standard request-response workflow. Tasks may consist of workloads exceeding request timeouts or periodic tangential work. The Task Queue service provides two different queue types, push and pull, for developers to perform auxiliary work.

    Push queues are covered in Migration Modules 7-9, demonstrating how to add use of push tasks to an existing baseline app followed by steps to migrate that functionality to Cloud Tasks, the standalone successor to the Task Queues push service. We turn to pull queues in today's video where Module 18 demonstrates how to add use of pull tasks to the same baseline sample app. Module 19 follows, showing how to migrate that usage to Cloud Pub/Sub.

    Adding use of pull queues

    In addition to registering page visits, the sample app needs to be modified to track visitors. Visits consist of a timestamp and visitor information such as the IP address and user agent. We'll modify the app to use the IP address and track how many visits come from each address seen. The home page is modified to show the top visitors in addition to the most recent visits:

    Screen grab of the sample app's updated home page tracking visits and visitors
    The sample app's updated home page tracking visits and visitors

    When visits are registered, pull tasks are created to track the visitors. The pull tasks sit patiently in the queue until they are processed in aggregate periodically. Until that happens, the top visitors table stays static. These tasks can be processed in a number of ways: periodically by a cron or Cloud Scheduler job, by a separate App Engine backend service, explicitly by a user (via browser or command-line HTTP request), by an event-triggered Cloud Function, etc. In the tutorial, we issue a curl request to the app's endpoint to process the enqueued tasks. When all tasks have completed, the table then reflects any changes to the current top visitors and their visit counts:

    Screen grab of processed pull tasks updated in the top visitors table
    Processed pull tasks update the top visitors table

    Below is some pseudocode representing the core part of the app that was altered to add Task Queue pull task usage, namely a new data model class, VisitorCount, to track visitor counts, enqueuing a (pull) task to update visitor counts when registering individual visits in store_visit(), and most importantly, a new function fetch_counts(), accessible via /log, to process enqueued tasks and update overall visitor counts. The bolded lines represent the new or altered code.

    Adding App Engine Task Queue pull task usage to sample app showing 'Before' [Module 1] on the left and 'After' [Module 18] with altered code on the right
    Adding App Engine Task Queue pull task usage to sample app
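In plain Python, that flow looks roughly like the following. This is a framework-free sketch only - the real app uses an NDB VisitorCount model and the google.appengine.api.taskqueue pull-queue API (enqueuing tasks with method='PULL', then lease_tasks() and delete_tasks()); here a list and a Counter stand in for the queue and Datastore:

```python
from collections import Counter

pull_queue = []             # stands in for the Task Queue pull queue
visitor_counts = Counter()  # stands in for the VisitorCount entities

def store_visit(remote_addr, user_agent):
    # register the visit in the datastore (elided here) and enqueue a
    # pull task so the visitor count can be updated later, in aggregate
    pull_queue.append(remote_addr)

def fetch_counts():
    # the /log handler: "lease" every pending task, tally in aggregate,
    # then "delete" the leased tasks now that the work is done
    batch = pull_queue[:]
    del pull_queue[:]
    visitor_counts.update(batch)
    return visitor_counts.most_common()

store_visit('10.0.0.1', 'curl/7.79')
store_visit('10.0.0.1', 'curl/7.79')
store_visit('10.0.0.2', 'Mozilla/5.0')
top = fetch_counts()    # [('10.0.0.1', 2), ('10.0.0.2', 1)]
```

Until fetch_counts() runs (via the curl request in the tutorial), tasks simply accumulate in the queue and the top-visitors table stays static.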

    Wrap-up

    This "migration" consists of adding Task Queue pull task usage to support tracking visitor counts to the Module 1 baseline app, arriving at the finish line with the Module 18 app. To get hands-on experience doing it yourself, do the codelab by hand and follow along with the video. Then you'll be ready to upgrade to Cloud Pub/Sub should you choose to do so.

    In Fall 2021, the App Engine team extended support of many of the bundled services to 2nd generation runtimes (that have a 1st generation runtime), meaning you are no longer required to migrate pull tasks to Pub/Sub when porting your app to Python 3. You can continue using Task Queue in your Python 3 app so long as you retrofit the code to access bundled services from next-generation runtimes.

    If you do want to move to Pub/Sub, see Module 19, including its codelab. All Serverless Migration Station content (codelabs, videos, and source code) are available at its open source repo. While we're initially focusing on Python users, the Cloud team is covering other runtimes soon, so stay tuned. Also check out other videos in the broader Serverless Expeditions series.

    Unlocking the potential of technology to support health

    This week kicked off the HLTH Conference in Las Vegas where thousands of healthcare leaders, care providers, patients and other people in the industry — like our teams at Google — are coming together to discuss how to create a healthier world.

    At Google, we believe that technology — especially AI and analytics — can unlock a better future for health globally. Our teams from Search, YouTube, Android, Google Cloud and more are using technology to provide health information and insights for consumers, caregivers and communities. Here’s a look at some of our latest updates.

    Giving people information and insights to take action on their health

    For many, the front door to healthcare is their smartphone. Millions of people turn to Google Search and YouTube for authoritative information or use apps and connected devices, like Fitbit, to help stay on top of health and wellness goals.

    To give Android users a new way to get more from their health and wellness data, we introduced Health Connect earlier this year at Google I/O. Through our Early Access Program, more than 10 health, fitness and wellness apps including MyFitnessPal, Oura and Peloton have already integrated with the platform to help people manage everything from workouts to diet to sleep and more. We are now opening up to more developers with Health Connect (Beta) to give people a single place to manage access to data across their health and fitness apps. In the coming months, we will continue to create an even richer ecosystem of apps and features.

    Image of app icons

    We’ve also made strides on our other platforms, Search and YouTube. In 2021, health videos on YouTube were viewed more than 12 billion times in the U.S. YouTube’s authoritative health content and features are now available in 7 countries, and YouTube recently opened up its features to a wider group of health experts in the U.S. to encompass authoritative services that extend beyond educational institutions and health organizations.

    On Search, there are more ways for people to turn health information into action. After piloting a feature earlier this year that shows available healthcare appointments for primary care, we’re continuing to explore new ways to expand appointments to other specialties and verticals through new and existing partnerships.

    This work is made possible by all our partners who provide the health information, insights and experiences that empower consumers in their health.

    Equipping the healthcare ecosystem with analytics and AI to improve health

    Healthcare is one of the largest and most complex industries that is turning towards technology to help organizations run more effectively — which in turn helps people live healthier lives.

    When organizations commit to digital transformation, it can be a long and overwhelming process, but that doesn’t mean it has to take years to see benefits for developers, clinicians and patients.

    Google Cloud came together with several of our customers and partners — including Hackensack Meridian Health, Lifepoint Health and Mayo Clinic — to find a way to encourage rapid reinvention. As a result, we built Google Cloud’s new Healthcare Data Engine (HDE) accelerators to help organizations reinvent quickly and enable the data interoperability that saves lives. The first three HDE accelerators, available in early 2023, will address common use cases around health equity, patient flow, and value-based care.

    The transformation of healthcare requires an open and collaborative approach to be successful. For example, Electronic Health Records (EHR) are a critical part of this ecosystem and we see many ways to work with EHR companies for the benefit of healthcare organizations. Today marks a critical development in this journey. At HLTH we announced an agreement that will allow healthcare organizations to run Epic — an EHR system — on Google Cloud. Hackensack Meridian Health plans to move its Epic workloads to Google Cloud, with the aim to drive greater innovation, efficiencies and security.

    And with our solution Care Studio, we’ve been working with MEDITECH to bring our advanced search, summarization and sense-making capabilities to their EHR, MEDITECH Expanse. We are now extending this integrated solution to our first two partners, Mile Bluff Medical Center and DCH Health System, to give their health teams a more complete view of their patients and easily find salient information to provide better care. This includes organizing patient records from different sources into a longitudinal view, bringing our advanced search functionality to clinicians directly in their EHR so they can easily and quickly access critical information all in one place.

    Fitbit Health Solutions is bringing our technology to healthcare partners, incorporating Fitbit devices, services and insights into programs focused on managing chronic conditions like diabetes. A study from the All of Us research program found that increasing your daily step count by 1000 steps could cut the risk of type 2 diabetes by more than 25%. This kind of insight is key to promoting lifestyle changes for people, and why we are partnering with Babylon Health to support their high-risk members managing chronic conditions.

    Underpinning all our work is a deep commitment to make sure that we do not leave anyone behind. Technology has the power to eliminate health disparities and democratize access to healthcare. But we need to be intentional in our efforts to live up to our goal of improving the health of billions of people by building for everyone, everywhere.

    Machine Learning Communities: Q3 ‘22 highlights and achievements

    Posted by Nari Yoon, Hee Jung, DevRel Community Manager / Soonson Kwon, DevRel Program Manager

    Let’s explore the highlights and accomplishments of the vast Google Machine Learning communities over the third quarter of the year! We are enthusiastic about and grateful for all the activities by the global network of ML communities. Here are the highlights!


    TensorFlow/Keras

    Load-testing TensorFlow Serving’s REST Interface

    Load-testing TensorFlow Serving’s REST Interface by ML GDE Sayak Paul (India) and Chansung Park (Korea) shares the lessons and findings they learned from conducting load tests for an image classification model across numerous deployment configurations.

    TFUG Taipei hosted events (Python + Hugging Face-Translation+ tf.keras.losses, Python + Object detection, Python+Hugging Face-Token Classification+tf.keras.initializers) in September and helped community members learn how to use TF and Hugging Face to implement machine learning models to solve problems.

    Neural Machine Translation with Bahdanau’s Attention Using TensorFlow and Keras and the related video by ML GDE Aritra Roy Gosthipaty (India) explains the mathematical intuition behind neural machine translation.

    Serving a TensorFlow image classification model as RESTful and gRPC based services with TFServing, Docker, and Kubernetes

    Automated Deployment of TensorFlow Models with TensorFlow Serving and GitHub Actions by ML GDE Chansung Park (Korea) and Sayak Paul (India) explains how to automate TensorFlow model serving on Kubernetes with TensorFlow Serving and GitHub Actions.

    Deploying 🤗 ViT on Kubernetes with TF Serving by ML GDE Sayak Paul (India) and Chansung Park (Korea) shows how to scale the deployment of a ViT model from 🤗 Transformers using Docker and Kubernetes.

    Screenshot of the TensorFlow Forum in the Chinese Language run by the tf.wiki team

    Long-term TensorFlow Guidance on tf.wiki Forum by ML GDE Xihan Li (China) provides TensorFlow guidance by answering the questions from Chinese developers on the forum.

    Photo of a phone with the Hindi letter 'Om' drawn on the top half of the screen; Hindi character recognition shows the letter as the predicted result below

    Hindi Character Recognition on Android using TensorFlow Lite by ML GDE Nitin Tiwari (India) shares an end-to-end tutorial on training a custom computer vision model to recognize Hindi characters. At a TFUG Pune event, he also gave a presentation titled Building Computer Vision Model using TensorFlow: Part 1.

    Using TFLite Model Maker to Complete a Custom Audio Classification App by ML GDE Xiaoxing Wang (China) shows how to use TFLite Model Maker to build a custom audio classification model based on YAMNet and how to import and use the YAMNet-based custom models in Android projects.

    SoTA semantic segmentation in TF with 🤗 by ML GDE Sayak Paul (India) and Chansung Park (Korea). The SegFormer model was previously unavailable in TensorFlow.

    Text Augmentation in Keras NLP by ML GDE Xiaoquan Kong (China) explains what text augmentation is and how the text augmentation feature in Keras NLP is designed.

    The largest vision model checkpoint (public) in TF (10 Billion params) through 🤗 transformers by ML GDE Sayak Paul (India) and Aritra Roy Gosthipaty (India). The underlying model is RegNet, known for its ability to scale.

    A simple TensorFlow implementation of a DCGAN to generate CryptoPunks

    CryptoGANs open-source repository by ML GDE Dimitre Oliveira (Brazil) shows simple model implementations following TensorFlow best practices that can be extended to more complex use-cases. It connects the usage of TensorFlow with other relevant frameworks, like HuggingFace, Gradio, and Streamlit, building an end-to-end solution.


    TFX

    TFX machine learning pipeline from data ingestion in TFRecord to pushing out to Vertex AI

    MLOps for Vision Models from 🤗 with TFX by ML GDE Chansung Park (Korea) and Sayak Paul (India) shows how to build a machine learning pipeline for a vision model (TensorFlow) from 🤗 Transformers using the TF ecosystem.

    First release of TFX Addons Package by ML GDE Hannes Hapke (United States). The package has been downloaded a few thousand times (source). Google and other developers maintain it through bi-weekly meetings. Google’s Open Source Peer Award has recognized the work.

    TFUG São Paulo hosted TFX T1 | E4 & TFX T1 | E5, where ML GDE Vinicius Caridá (Brazil) shared how to train a model in a TFX pipeline. The fifth episode talks about Pusher: publishing your models with TFX.

    Semantic Segmentation model within ML pipeline by ML GDE Chansung Park (Korea) and Sayak Paul (India) shows how to build a machine learning pipeline for a semantic segmentation task with TFX and various GCP products such as Vertex Pipeline, Training, and Endpoints.


    JAX/Flax

    Screenshot of Tutorial 2 (JAX): Introduction to JAX+Flax, with GitHub repo and codelab, via the University of Amsterdam

    JAX Tutorial by ML GDE Phillip Lippe (Netherlands) is meant to briefly introduce JAX, including writing and training neural networks with Flax.


    TFUG Malaysia hosted Introduction to JAX for Machine Learning (video) and Leong Lai Fong gave a talk. The attendees learned what JAX is and its fundamental yet unique features, which make it efficient to use when executing deep learning workloads. After that, they started training their first JAX-powered deep learning model.

    TFUG Taipei hosted Python+ JAX + Image classification and helped people learn JAX and how to use it in Colab, covering the differences between JAX and NumPy and the advantages of JAX.

    Introduction to JAX by ML GDE João Araújo (Brazil) shared the basics of JAX at Deep Learning Indaba 2022.

    A comparison of the performance and overview of issues resulting from changing from NumPy to JAX

    Should I change from NumPy to JAX? by ML GDE Gad Benram (Portugal) compares the performance and overview of the issues that may result from changing from NumPy to JAX.

    Introduction to JAX: efficient and reproducible ML framework by ML GDE Seunghyun Lee (Korea) introduced JAX/Flax and their key features using practical examples. He explained the pure function and PRNG, which make JAX explicit and reproducible, and XLA and mapping functions which make JAX fast and easily parallelized.
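The explicit PRNG handling and the jit/vmap transformations highlighted in that talk can be shown in a few lines. This is our own minimal sketch, not code from the session:

```python
import jax
import jax.numpy as jnp

# Explicit PRNG: randomness comes from keys passed around by hand rather
# than hidden global state, so results are reproducible by construction.
key = jax.random.PRNGKey(42)
key, subkey = jax.random.split(key)
x = jax.random.normal(subkey, (4,))

# jit compiles the pure function with XLA; vmap maps it over a batch axis.
@jax.jit
def relu(v):
    return jnp.maximum(v, 0.0)

batched = jax.vmap(relu)(jnp.stack([x, -x]))  # shape (2, 4), all values >= 0
```

Re-running with the same `PRNGKey(42)` yields the same `x`, which is exactly the explicitness and reproducibility the talk refers to.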

    Data2Vec Style pre-training in JAX by ML GDE Vasudev Gupta (India) shares a tutorial demonstrating how to pre-train Data2Vec using the JAX/Flax version of Hugging Face Transformers.

    Distributed Machine Learning with JAX by ML GDE David Cardozo (Canada) explained what makes JAX different from TensorFlow.

    Image classification with JAX & Flax by ML GDE Derrick Mwiti (Kenya) explains how to build convolutional neural networks with JAX/Flax. He has also written several articles about JAX/Flax: What is JAX?, How to load datasets in JAX with TensorFlow, Optimizers in JAX and Flax, Flax vs. TensorFlow, and more.


    Kaggle

    DDPMs - Part 1 by ML GDE Aakash Nain (India) and cait-tf by ML GDE Sayak Paul (India) were announced as Kaggle ML Research Spotlight Winners.

    Forward process in DDPMs from Timestep 0 to 100

    Fresher on Random Variables, All you need to know about Gaussian distribution, and A deep dive into DDPMs by ML GDE Aakash Nain (India) explain the fundamentals of diffusion models.
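The forward (noising) process pictured above has a simple closed form: x_t = sqrt(alpha_bar_t) * x_0 + sqrt(1 - alpha_bar_t) * eps, where alpha_bar_t is the cumulative product of (1 - beta_t). A minimal NumPy sketch of our own (not taken from the articles):

```python
import numpy as np

def ddpm_forward(x0, t, betas, rng):
    """Sample x_t ~ q(x_t | x_0) directly in one step:
    x_t = sqrt(alpha_bar_t) * x0 + sqrt(1 - alpha_bar_t) * eps, eps ~ N(0, I)."""
    alpha_bar = np.cumprod(1.0 - betas)[t]
    eps = rng.standard_normal(x0.shape)
    return np.sqrt(alpha_bar) * x0 + np.sqrt(1.0 - alpha_bar) * eps

# Linear beta schedule over 100 timesteps, matching the 0-to-100 range above.
betas = np.linspace(1e-4, 0.02, 100)
rng = np.random.default_rng(0)
x0 = rng.standard_normal((8, 8))  # stand-in for an image
x50 = ddpm_forward(x0, 50, betas, rng)
x99 = ddpm_forward(x0, 99, betas, rng)  # by t=99, mostly Gaussian noise
```

As t grows, alpha_bar_t shrinks toward zero, so the sample drifts from the data toward pure noise, which is the progression the timestep figure illustrates.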

    In Grandmasters Journey on Kaggle + The Kaggle Book, ML GDE Luca Massaron (Italy) explained how Kaggle helps people in the data science industry and which skills you must focus on apart from the core technical skills.


    Cloud AI

    How Cohere is accelerating language model training with Google Cloud TPUs by ML GDE Joanna Yoo (Canada) explains what Cohere engineers have done to solve scaling challenges in large language models (LLMs).

    ML GDE Hannes Hapke (United States) chats with Fillipo Mandella, Customer Engineering Manager at Google

    In Using machine learning to transform finance with Google Cloud and Digits, ML GDE Hannes Hapke (United States) chats with Fillipo Mandella, Customer Engineering Manager at Google, about how Digits leverages Google Cloud’s machine learning tools to empower accountants and business owners with near-zero latency.

    TFUG Chennai hosted A tour of Vertex AI for ML, cloud, and DevOps engineers working in MLOps. The session introduced Vertex AI and covered handling datasets and models in Vertex AI, deployment & prediction, and MLOps.

    TFUG Abidjan hosted two events with GDG Cloud Abidjan for students and professional developers who want to prepare for a Google Cloud certification: Introduction session to certifications and Q&A, Certification Study Group.

    Flow chart showing how to deploy a ViT B/16 model on Vertex AI

    Deploying Hugging Face ViT on Vertex AI by ML GDE Sayak Paul (India) and Chansung Park (Korea) shows how to deploy a ViT B/16 model on Vertex AI. They cover some critical aspects of deployment such as auto-scaling, authentication, endpoint consumption, and load-testing.

    Photo collage of AI generated images

    TFUG Singapore hosted The World of Diffusion - DALL-E 2, IMAGEN & Stable Diffusion. ML GDE Martin Andrews (Singapore) and Sam Witteveen (Singapore) gave talks named “How Diffusion Works” and “Investigating Prompt Engineering on Diffusion Models” to bring people up-to-date with what has been going on in the world of image generation.

    ML GDE Martin Andrews (Singapore) has done three projects: GCP VM with Nvidia set-up and Convenience Scripts, Containers within a GCP host server with Nvidia pass-through, and Installing MineRL using Containers, with linked code.

    Jupyter Services on Google Cloud by ML GDE Gad Benram (Portugal) explains the differences between Vertex AI Workbench, Colab, and Deep Learning VMs.

    Google Cloud's Two Towers Recommender and TensorFlow

    Train and Deploy Google Cloud's Two Towers Recommender by ML GDE Rubens de Almeida Zimbres (Brazil) explains how to implement the model and deploy it in Vertex AI.


    Research & Ecosystem

    Slide reading "Women in Data Science, La Paz. Machine Learning paper reading club: Read, Learn and Share the knowledge. #MLPaperReadingClubs, Nathaly Alarcón, @WIDS_LaPaz"

    The first session of #MLPaperReadingClubs (video) was hosted by ML GDE Nathaly Alarcon Torrico (Bolivia) and Women in Data Science La Paz. Nathaly led the session, and community members read the ML paper “Zero-shot learning through cross-modal transfer.”

    In #MLPaperReadingClubs (video) by TFUG Lesotho, Arnold Raphael volunteered to lead the first session “Zero-shot learning through cross-modal transfer.”

    Screenshot of a screenshare of Zero-shot learning through cross-modal transfer to 7 participants in a virtual call

    ML Paper Reading Clubs #1: Zero Shot Learning Paper (video) by TFUG Agadir introduced a model that can recognize objects in images even if no training data is available for the objects. TFUG Agadir prepared this event to make people interested in machine learning research and provide them with a broader vision of differentiating good contributions from great ones.

    Opening of the Machine Learning Paper Reading Club (video) by TFUG Dhaka introduced ML Paper Reading Club and the group’s plan.

    EDA on SpaceX Falcon 9 launches dataset (Kaggle) (video) by TFUG Mysuru & TFUG Chandigarh: organizer Aashi Dutt (presenter) walked through exploratory data analysis on the SpaceX Falcon 9 launches dataset from Kaggle.

    Screenshot of ML GDE Qinghua Duan (China) showing how to apply the MRC paradigm and BERT to solve the dialogue summarization problem.

    Introduction to MRC-style dialogue summaries based on BERT by ML GDE Qinghua Duan (China) shows how to apply the MRC paradigm and BERT to solve the dialogue summarization problem.

    Plant disease classification using Deep learning model by ML GDE Yannick Serge Obam Akou (Cameroon) covered plant disease classification using a deep learning model: an end-to-end Android app (open source project) that diagnoses plant diseases.

    TensorFlow/Keras implementation of Nystromformer

    Nystromformer Github repository by Rishit Dagli provides a TensorFlow/Keras implementation of Nystromformer, a transformer variant that uses the Nyström method to approximate standard self-attention with O(n) complexity, which allows for better scalability.
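The core Nyström trick can be sketched in NumPy. This is our own illustrative simplification: it uses segment-mean landmarks and an exact pseudo-inverse, whereas the repository and paper approximate the pseudo-inverse iteratively:

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def nystrom_attention(Q, K, V, num_landmarks=8):
    """Approximate softmax(Q K^T / sqrt(d)) V without ever materializing
    the (n, n) attention matrix; cost grows linearly in n."""
    n, d = Q.shape
    scale = 1.0 / np.sqrt(d)
    # Landmarks: means over contiguous segments of the queries and keys.
    Q_l = Q.reshape(num_landmarks, n // num_landmarks, d).mean(axis=1)
    K_l = K.reshape(num_landmarks, n // num_landmarks, d).mean(axis=1)
    F = softmax(Q @ K_l.T * scale)    # (n, m): queries vs. landmark keys
    A = softmax(Q_l @ K_l.T * scale)  # (m, m): landmark-landmark block
    B = softmax(Q_l @ K.T * scale)    # (m, n): landmark queries vs. keys
    return F @ np.linalg.pinv(A) @ (B @ V)  # (n, d)
```

Only (n, m) and (m, n) matrices are ever formed, which is where the O(n) scaling comes from when the number of landmarks m is held fixed.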

    Extending support for App Engine bundled services (Module 17)

    Posted by Wesley Chun (@wescpy), Developer Advocate, Google Cloud

    Background

    App Engine initially launched in 2008, providing a suite of bundled services that made it convenient for applications to access a database (Datastore), a caching service (Memcache), independent task execution (Task Queue), Google Sign-In authentication (Users), large "blob" storage (Blobstore), and other companion services. However, apps leveraging those services could run only on App Engine.

    To increase app portability and help Google move toward its goal of having the most open cloud on the market, App Engine launched its 2nd-generation service in 2018, initially without those legacy services. The newer platform allows developers to upgrade apps to the latest language runtimes, such as moving from Python 2 to 3 or Java 8 to 11 (and today, Java 17). One of the major drawbacks of the 1st-generation runtimes is that they're customized, proprietary, and restrictive in what you can and can't use.

    Instead, the 2nd-generation platform uses open source runtimes, meaning the ability to follow standard development practices, use common/known idioms, and face fewer restrictions on 3rd-party libraries, obviating the need to copy or "vendor" them with your code. Unfortunately, using these newer runtimes required migrating away from the App Engine bundled services: while you could upgrade language releases, there was no access to the bundled services, breaking apps or requiring complete rewrites, which made it a showstopper for many users.

    Due to their popularity and the desire to ease the upgrade process for customers, the App Engine team restored access to most (but not all) of those services in Fall 2021. Today's Serverless Migration Station video demonstrates how to continue usage of bundled services available to Python 3 developers.

    Showing App Engine users how to use bundled services on Python 3


    Performing the upgrade

    Modernizing the typical Python 2 App Engine app looks something like this:
    1. Migrate from the webapp2 framework (not available in Python 3)
    2. Port from Python 2 to 3, preserving use of bundled services
    3. Optional migration to Cloud standalone or similar 3rd-party services

    The first step is to move to a standard Python web framework like Flask, Django, Pyramid, etc. Below is some pseudocode from Migration Module 1 demonstrating how to migrate from webapp2 to Flask:

    codeblocks for porting Python 2 sample app from webapp2 to Flask
    Step 1: Port Python 2 sample app from webapp2 to Flask

    The key changes are bolded in the above code snippets. Notice how the App Engine NDB code [the Visit class definition plus the store_visit() and fetch_visits() functions] is unaffected by this web framework migration. The full webapp2 code sample can be found in the Module 0 repo folder, while the completed migration to Flask sample is located in the Module 1 repo folder.

    Once your app has been ported to a new framework, you're free to upgrade to Python 3 while preserving access to the bundled services, if your app uses any. Below is pseudocode demonstrating how to upgrade the same sample app to Python 3, along with the code changes needed to continue using App Engine NDB:

    codeblocks for porting sample app to Python 3, preserving use of NDB bundled service
    Step 2: Port sample app to Python 3, preserving use of NDB bundled service

    The original app was designed to work under both Python 2 and 3 interpreters, so no language changes were required in this case. We added an import of the new App Engine SDK followed by the key update wrapping the WSGI object so the app can access the bundled services. As before, the key updates are bolded. Some updates to configuration are also required, and those are outlined in the documentation and the (Module 17) codelab.
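Per the Module 17 documentation, the key update on Python 3 is wrapping the WSGI object with the new App Engine SDK (the appengine-python-standard package) and opting in via app.yaml. A sketch only runnable on App Engine itself, so check the official docs for the exact configuration for your runtime:

```python
# main.py (Python 3) -- requires the appengine-python-standard package
from flask import Flask
from google.appengine.api import wrap_wsgi_app

app = Flask(__name__)
# Key update: wrapping the WSGI object gives the app access to the
# bundled services (NDB, Memcache, Task Queue, etc.) on Python 3.
app.wsgi_app = wrap_wsgi_app(app.wsgi_app)

# app.yaml additionally needs the bundled services switched on:
#     runtime: python39
#     app_engine_apis: true
```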

    The NDB code is also left untouched in this migration. Not all of the bundled services feature such a hands-free migration; we hope to cover some of the more complex ones in Module 22. Java, PHP, and Go users have it even better, requiring fewer or no code changes at all. The Python 2 Flask sample is located in the Module 1 repo folder, and the resulting Python 3 app can be found in the Module 1b repo folder.

    The immediate benefit of step two is the ability to upgrade to a more current version of language runtime. This leaves the third step of migrating off the bundled services as optional, especially if you plan on staying on App Engine for the long-term.


    Additional options

    If you decide to migrate off the bundled services, you can do so on your own timeline. It becomes a consideration should you ever want to move to modern serverless platforms such as Cloud Functions or Cloud Run, or to lower-level platforms because you want more control, like GKE, our managed Kubernetes service, or Compute Engine VMs.

    Step three is also where the rest of the Serverless Migration Station content may be useful (code samples and codelabs available; videos forthcoming).

    As far as moving to modern serverless platforms, if you want to break apart a large App Engine app into multiple microservices, consider Cloud Functions. If your organization has added containerization as part of your software development workflow, consider Cloud Run. It's suitable for apps if you're familiar with containers and Docker, but even if you or your team don't have that experience, Cloud Buildpacks can do the heavy lifting for you. Here are the relevant migration modules to explore:


      Wrap-up

      Early App Engine users appreciate the convenience of the platform's bundled services, and after listening to user feedback, adding them back to 2nd-generation runtimes is another way we can help developers modernize their apps. Whether upgrading to newer language runtimes to stay on App Engine and continue to use its bundled services, migrating to Cloud standalone products, or shifting to other serverless platforms, the Google Cloud team aims to provide the tools to help streamline your modernization efforts.

      All Serverless Migration Station content (codelabs, videos, source code [when available]) can be accessed at its open source repo. While our content initially focuses on Python users, the Cloud team is working on covering other language runtimes, so stay tuned. Today's video features a special guest to provide a teaser of what to expect for Java. For additional video content, check out the broader Serverless Expeditions series.

      Dev Library Letters: 14th Issue

      Posted by Garima Mehra, Program Manager

      ‘Google Dev Library letters’ is curated to bring you some of the best projects developed with Google tech that have been submitted to the Dev Library platform. We hope this brings you the inspiration you need for your next project!


      Android



      Image-compressor 
      by Vinod Baste

      Check out Vinod’s Android Image compress library that helps reduce the size of the image by 90% without losing any of its pixels.


      SealedX 
      by Jaewoong Eum

      Learn how to auto-generate extensive sealed classes and interfaces for Android and Kotlin.

      Flutter



      GitHub Actions to deploy
      Flutter Web to gh-pages
       
      by Sai Rajendra Immadi

      Tired of manually deploying the app every time? Or do you want to deploy your Flutter web applications to gh-pages? Use this blog as your guide.



      Double And Triple Dots in Flutter 
      by Lakshydeep Vikram

      Learn the reason for using double and triple dots in Flutter and where to use them.



      Machine Learning



      Nystromformer 
      by Rishit Dagli

      Learn how to use the Nystrom method to approximate standard self-attention. 


      Google Cloud



      by Ezekias Bokove

      Learn how to set up a notification system for Cloud Run services. 



      Switch to GCP for cost savings and better performance
      by Gaurav Madan

      Learn why architects who deal with complex application design and use well-known Google services should consider Google Cloud Platform. 




      "The Google community includes people with diverse backgrounds. No matter what an individual circumstance is, the platform should support anyone to explore and be creative. We encourage authors to boldly consider diverse backgrounds and to be inclusive when authoring."

      Vinesh Prasanna M

      Customer Engineer | Google Cloud 





      "Authoring a good code sample is hard. The difficulty comes from the additional pieces you need to add to your repository to keep the code sample fresh and appealing to your developers."

      Brett Morgan

      Developer Relations Engineer | Flutter







      Want to read more? 
      Check out the latest projects and community-authored content by visiting Google Dev Library
      Submit your projects to showcase your work and inspire developers!


      Google Cloud Next: Tech predictions that might surprise you

      Our favorite pastimes at Google Cloud are imagining and building. We like to help organizations think about their biggest opportunities, and then offer technology to help them get there. It's about giving companies access to technology so they can better serve their customers — like helping Ford create connected cars and helping H&M Group optimize their supply chain. It’s also about offering new ways for employees to work together using Google Workspace, with all of the applications they know and love (like Gmail, Docs, Drive, Calendar and Meet).

      This week at Google Cloud Next, our annual developer and tech leader event, we’ll be diving into everything we are building. Our theme is “Today meets tomorrow,” and we'll be sharing the latest and greatest cloud technologies for organizations around the world. In this year’s opening keynote (October 11, 9 a.m. Pacific), Google Cloud CEO Thomas Kurian will share what’s new today in data, AI, infrastructure, security, collaboration and sustainability. Then, for a look into the future, we’re putting our experts on stage at 10 a.m. Pacific to make 10 cloud predictions for 2025. Tune in to learn how artificial intelligence will get us to a four-day work week (!) and how you can build applications without any coding experience.

      Graphic showing Top 10 Cloud Technology Predictions preview, with pictures of various people speaking at the keynote

      The Predictions keynote will air live as part of Innovator’s Hive @ Google Cloud Next, our developer community with localized events in Sunnyvale, California, Mexico City, Tokyo, Munich, and Bengaluru, India. One of my favorite parts about the 2022 version of Cloud Next is that we are localizing our programming for audiences around the globe, so tune in wherever and whenever for our 24-hour livestream.

      Sports and tech are merging

      Google Cloud Next will also bring a new-ish sport to developers. With the baseball playoff season underway, AI and data analytics are making sports even more exciting, and we're helping MLB create personalized fan experiences and helping the winning Golden State Warriors become "Data Champions." That reflects a broader global trend: sports and tech are merging. This includes rapidly growing sports that many people have never heard of, like drone racing.

      What is drone racing, you ask? Imagine strapping on a headset that feeds you the live view from an actual, miniature aircraft that you pilot through the air at speeds of up to 120 mph. Skim above the seats of an empty stadium, navigate through glowing gates at varying heights, and zip through narrow tunnels — all ahead of your fellow pilots (if you’re skilled enough).

      Drone racing has gained massive momentum since its inception in 2011. With its mix of the virtual and the physical, drone racing already feels like a sport of tomorrow. Starting this year, a partnership between Google Cloud and the Drone Racing League (DRL) will advance it even further as engineers on both sides collaborate on new developments in the sport.

      At Next ‘22, developers of all skill levels can get a taste of the action through immersive learning experiences using Google Cloud’s data and analytics services with real race data from DRL.

      Participants in the Google Cloud Fly Cup Challenge can predict race outcomes and give their best tips to pilots in the hopes of enhancing their season performance. Participants will also compete for a chance to win an all-expenses-paid trip to the season finale of the DRL World Championship Race and be crowned the champion on stage. Register on our website to join the race to become the DRL champion today.

      Find out what’s new and what’s next

      Join us October 11 at Google Cloud Next to hear from industry experts about the latest cloud technology trends. Learn about new solutions, engage with peers and even get in a bit of drone racing.