Tag Archives: App Development

#WeArePlay | 4 stories of founders building apps for the LGBTQIA+ community

Posted by Robbie McLachlan, Developer Marketing

#WeArePlay celebrates the inspiring journeys of people behind apps and games on Google Play. In honor of Pride Month, we are highlighting founders who have built tools to empower the LGBTQIA+ community. From dating apps to mental health tools, to storytelling platforms - these founders are paving the way for more inclusive technology.


npckc is a game creator from Kanto, Japan whose stories portray the trans experience

npckc – Game Creator, Kanto, Japan

Born in Hong Kong and raised in Canada, npckc is a trilingual translator based in Japan. A self-taught programmer, they create games that feature stories and characters which are often from marginalized communities. One such game is "one night, hot springs" where players follow Haru, a trans woman, as she embarks on a visit to the hot springs. Players have praised the game's realistic portrayal of trans experiences and the relaxing music composed by npckc's partner, sdhizumi. As a finalist in Google Play's Indie Games Festival in Japan, they hope to attend more gaming conventions to connect with fellow developers in person.


Anshul and Rohan from Mumbai, India built a mental health support app geared to the LGBTQIA+ community’s needs

Anshul and Rohan – App Creators, Mumbai, India

After Anshul returned to India from London, he met Rohan and the pair bonded over their mental health struggles. Together they shared a dream: to create something in the wellness space. This became Evolve, an app with guided meditations, breathing exercises, and daily affirmations. When the pandemic hit, the pair saw first-hand how underserved the LGBTQIA+ community was in mental health support. For Rohan, who identifies as a gay man, this realization hit close to home. Together, Anshul and Rohan redeveloped Evolve around the LGBTQIA+ community’s specific needs - building a safe space where users can share their experiences, seek mentorship, and build a supportive community.


BáiYù from Indiana, U.S. created a platform to publish authentic, queer visual novels and indie games

BáiYù – Game Creator, Indiana, USA

Queer developer BáiYù loves writing stories, and started making games at age 16. Part of a game-development community, BáiYù wanted an affordable way to help get their creations out. So they set up Project Ensō, publishing queer visual novels and narrative indie games. With 10 titles on Google Play, BáiYù supports other developers from under-represented groups to share their own authentic stories on Project Ensō, even polishing their games before release. The most popular title on Project Ensō is “Yearning: A Gay Story”, in which gamers play a newly-out gay man navigating his freshman year of college. BáiYù's efforts have had a profound impact on players, with many sharing how these games have positively transformed their lives.


Alex and Jake from Nevada, U.S. built an inclusive dating app and social community for everyone

Alex and Jake – App Creators, Nevada, USA

Alex and Jake grew up in an environment that didn’t accept the LGBTQIA+ community. They started building apps together after a mutual friend introduced them. When they realized that queer people were looking for a platform that offered support and meaningful connections, they created Taimi. Taimi is not just a dating app for LGBTQIA+ people; it's also a social network where they can bond, build community, and feel safe. Alex and Jake are also proud to partner with NGOs that provide mental health support for the community.


Discover more stories of app and game creators in #WeArePlay.




#WeArePlay | Meet the people creating apps and games in Australia

Posted by Robbie McLachlan – Developer Marketing

Last year #WeArePlay went on a virtual tour of India, Europe and Japan to spotlight the stories of app and game founders. Today, we’re continuing our tour across the world with our next stop: Australia.

From an app helping people during natural disasters to a game promoting wellbeing through houseplants, meet the 50 app and game companies building growing businesses on Google Play.

Let’s take a quick road trip across the territories.

Tristen's app gives accurate information to people during natural disasters

Tristen, founder of Disaster Science

Meet Tristen from Canberra, founder of Disaster Science. When Tristen was stranded by a bushfire with friends during a holiday, he realized the need to have accurate information in a crisis situation. Moved to help others, he leveraged his software development skills to create his app, Bushfire.io. It collects data from multiple sources to give people an overview of fires, floods, road closures, and vital weather updates.

He has recently added real-time satellite imagery and has plans to expand further internationally, with coverage of region-specific events like cyclones, earthquakes, evacuations and heat warnings.


Christina and Lauren's game promotes wellbeing through houseplants

Christina and Lauren, co-founders of Kinder World

Friends Christina and Lauren from Melbourne co-founded gaming company Kinder World. As a child, Lauren used video games to soothe the pain of her chronic ear infections. That was how she discovered they could be a healing experience for people—a sentiment she dedicated her career to. She partnered with engineer Christina to make Kinder World: Cozy Plants.

In the game, players enter the comforting, botanical world of houseplants, home decoration, steaming hot coffee, and freshly baked cookies. Since going viral on several social media platforms, the app has seen huge growth.


Kathryn's app helps reduce stress and anxiety in children

Kathryn, founder of Courageous Kids

Kathryn from Melbourne is the founder of Courageous Kids. When her son became anxious and fearful whenever she dropped him off at school, her instincts as a doctor for early intervention kicked in. She sought advice from pediatric colleagues and created stories to explain his day, making him the main character. When friends in a similar situation began asking for advice and using the stories with their own children, she created Courageous Kids.

A library of real-world stories for parents to personalize, Courageous Kids helps children to visualize their day and manage their expectations. Her app has become popular among families of sensitive and autistic children, and Kathryn is now working with preschools to give even more kids the tools to feel confident.


Discover more #WeArePlay stories from Australia, and stories from across the globe.




Top 3 Updates for Building with AI on Android at Google I/O ‘24

Posted by Terence Zhang – Developer Relations Engineer

At Google I/O, we unveiled a vision of Android reimagined with AI at its core. As Android developers, you're at the forefront of this exciting shift. By embracing generative AI (Gen AI), you'll craft a new breed of Android apps that offer your users unparalleled experiences and delightful features.

Gemini models are powering new generative AI apps both over the cloud and directly on-device. You can now build with Gen AI using our most capable models over the Cloud with the Google AI client SDK or Vertex AI for Firebase in your Android apps. For on-device, Gemini Nano is our recommended model. We have also integrated Gen AI into developer tools - Gemini in Android Studio supercharges your developer productivity.

Let’s walk through the major announcements for AI on Android from this year's I/O sessions in more detail!

#1: Build AI apps leveraging cloud-based Gemini models

To kickstart your Gen AI journey, design the prompts for your use case with Google AI Studio. Once you are satisfied with your prompts, integrate the Gemini API directly into your app to access Google’s latest models such as Gemini 1.5 Pro and 1.5 Flash, both with one million token context windows (with two million available via waitlist for Gemini 1.5 Pro).

If you want to learn more about and experiment with the Gemini API, the Google AI SDK for Android is a great starting point. For integrating Gemini into your production app, consider using Vertex AI for Firebase (currently in Preview, with a full release planned for Fall 2024). This platform offers a streamlined way to build and deploy generative AI features.
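To give a sense of the shape of the API, here is a minimal sketch of a cloud call using the Google AI client SDK for Android; the model name, the repository class, and the BuildConfig.GEMINI_API_KEY field are illustrative assumptions rather than required names.

    // Minimal sketch: calling a cloud-hosted Gemini model from Kotlin with the
    // Google AI client SDK for Android. The model name and the BuildConfig field
    // holding the API key are assumptions for illustration.
    import com.google.ai.client.generativeai.GenerativeModel

    class SummaryRepository {
        private val model = GenerativeModel(
            modelName = "gemini-1.5-flash",      // assumed model choice
            apiKey = BuildConfig.GEMINI_API_KEY  // assumed build-time secret
        )

        // generateContent is a suspend function, so call it from a coroutine.
        suspend fun summarize(article: String): String? {
            val response = model.generateContent("Summarize in two sentences:\n$article")
            return response.text
        }
    }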

We are also launching the first Gemini API Developer competition (terms and conditions apply). Now is the best time to build an app integrating the Gemini API and win incredible prizes! A custom DeLorean, anyone?


#2: Use Gemini Nano for on-device Gen AI

While cloud-based models are highly capable, running inference on-device enables offline use and low-latency responses, and ensures that data doesn’t leave the device.

At I/O, we announced that Gemini Nano will be getting multimodal capabilities, enabling devices to understand context beyond text – like sights, sounds, and spoken language. This will help power experiences like TalkBack, helping people who are blind or have low vision interact with their devices via touch and spoken feedback. Gemini Nano with Multimodality will be available later this year, starting with Google Pixel devices.

We also shared more about AICore, a system service managing on-device foundation models, enabling Gemini Nano to run on-device inference. AICore provides developers with a streamlined API for running Gen AI workloads with almost no impact on the binary size while centralizing runtime, delivery, and critical safety components for Gemini Nano. This frees developers from having to maintain their own models, and allows many applications to share access to Gemini Nano on the same device.

Gemini Nano is already transforming key Google apps, including Messages and Recorder to enable Smart Compose and recording summarization capabilities respectively. Outside of Google apps, we're actively collaborating with developers who have compelling on-device Gen AI use cases and signed up for our Early Access Program (EAP), including Patreon, Grammarly, and Adobe.

Moving image of Gemini Nano operating in Adobe

Adobe is one of these trailblazers: it is exploring Gemini Nano to enable on-device processing for part of its AI Assistant in Acrobat, providing one-click summaries and allowing users to converse with documents. By strategically combining on-device and cloud-based Gen AI models, Adobe optimizes for performance, cost, and accessibility. Simpler tasks like summarization and suggesting initial questions are handled on-device, enabling offline access and cost savings. More complex tasks such as answering user queries are processed in the cloud, ensuring an efficient and seamless user experience.

This is just the beginning - later this year, we'll continue investing to enable, and aim to launch with, even more developers.

To learn more about building with Gen AI, check out the I/O talks Android on-device GenAI under the hood and Add Generative AI to your Android app with the Gemini API, along with our new documentation.


#3: Use Gemini in Android Studio to help you be more productive

Besides powering features directly in your app, we’ve also integrated Gemini into developer tools. Gemini in Android Studio is your Android coding companion, bringing the power of Gemini to your developer workflow. Thanks to your feedback since its preview as Studio Bot at last year’s Google I/O, we’ve evolved our models, expanded to over 200 countries and territories, and now include this experience in stable builds of Android Studio.

At Google I/O, we previewed a number of features available to try in the Android Studio Koala preview release, like natural-language code suggestions and AI-assisted analysis for App Quality Insights. We also shared an early preview of multimodal input using Gemini 1.5 Pro, allowing you to upload images as part of your AI queries — enabling Gemini to help you build fully functional compose UIs from a wireframe sketch.


You can read more about the updates here, and make sure to check out What’s new in Android development tools.

#WeArePlay | How Zülal is using AI to help people with low vision

Posted by Leticia Lago – Developer Marketing

Born in Istanbul, Türkiye with limited sight, Zülal has been a power-user of visual assistive technologies since the age of 4. When she lost her sight completely at 10 years old, she found herself reliant on technology to help her see and experience the world around her.

Today, Zülal is the founder of FYE, her solution to the issues she found with other visual assistive technologies. The app empowers people with low vision to be inspired by the world around them. Employing a team of 4, she heads up technological development and user experience for the app.

Zülal shared her story in our latest film for #WeArePlay, which celebrates people around the world building apps and games. She shared her journey from uploading pictures of her parents to a computer to get descriptions of them as a child, to developing her own visual assistive app. Find out what’s next for Zülal and how she is using AI to help people like herself.

Tell us more about the inspiration behind FYE.

Today, there are around 330 million people with severe to moderate visual impairment. Visual assistive technology is life-changing for these people, giving them back a sense of independence and a connection to the world around them. I’m a poet and composer, and in order to create I needed this tech so that I could see and describe the world around me. Before developing FYE, the visual assistive technology I was relying on was falling short. I wanted to take back control. I didn’t want to sit back, wait and see what technology could do for me - I wanted to harness its power. So I did.

Why was it important for you to build FYE?

I never wanted to be limited by having low vision. I’ve always thought, how can I make this better? How can I make my life better? I want to do everything, because I can. I really believe that there’s nothing I can’t do. There’s nothing WE can’t do. Having a founder like me lead the way in visual assistive technology illustrates just that. We’re taking back control of how we experience the world around us.

What’s different about FYE?

With our app, I believe our audience can really see the world again. It uses a combination of AI and human input to describe the world around them to our users. It incorporates an AI model trained on a dataset of over 15 million data points, so it really encompasses all the varying factors that make up the world of everyday visual experiences. The aim was to have descriptions as vivid as if I was describing my surroundings myself. It’s the small details that make a big difference.

What’s next for your app?

We already have personalized AI outputs so the user can create different AI assistants to suit different situations. You can use it to work across the internet as you’re browsing or shopping. I use it a lot for cooking - where the AI can adapt and learn to suit any situation. We are also collaborating with places where people with low vision might struggle, like the metro and the airport. We’ve built in AI outputs in collaboration with these spaces so that anyone using our app will be able to navigate those spaces with confidence. I’m currently working on evolving From Your Eyes as an organization, reimagining the app as one element of the organization under the new name FYE. Next, we’re exploring integrations with smart glasses and watches to bring our app to wearables.

Discover more #WeArePlay stories and share your favorites.




A Developer’s Roadmap to Predictive Back (Views)

Posted by Ash Nohe and Tram Bui – Developer Relations Engineers

Before you read on, note that this topic is scoped to Views. Predictive Back with Compose is easier to implement and is not covered in this blog post. To learn how to implement Predictive Back with Compose, see the Add predictive back animations codelab and the I/O workshop Improve the user experience of your Android app.

This blog post aims to shed light on the various dependencies and requirements for supporting predictive back animations in your Views-based app.

First, view the Predictive Back Requirements table to understand whether a particular animation requires a manifest flag, a compileSdk version, additional libraries, or hidden developer options to function.

Then, start your quest. Here are your milestones:

  1. Upgrade Kotlin milestone
  2. Back-to-home animation milestone
  3. Migrate all activities milestone
  4. Fragment milestone
  5. Material Components (Views) milestone
  6. [Optional] AndroidX transitions milestone
Milestones

Upgrade Kotlin milestone

The first milestone is to upgrade to Kotlin 1.8.0 or higher, which is required for other Predictive Back dependencies.

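If your project uses the Gradle Kotlin DSL, the upgrade is a one-line change to the Kotlin Android Gradle plugin version; the exact version below is only an example.

    // Project-level build.gradle.kts – a sketch of pinning the Kotlin Android
    // plugin to 1.8.0 or newer (1.9.24 is just an example version).
    plugins {
        id("org.jetbrains.kotlin.android") version "1.9.24" apply false
    }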

Back-to-home animation milestone

The back-to-home animation is the keystone predictive back animation.

To get this animation, add android:enableOnBackInvokedCallback="true" to your AndroidManifest.xml for your root activity if you are a multi-activity app (see per-activity opt-in), or at the application level if you are a single-activity app. After this, you’ll see both the back-to-home animation and a cross-task animation where applicable, which are visible to users in Android 15+ and behind a developer option in Android 13 and 14.

If you are intercepting back events in your root activity (e.g. MainActivity), you can continue to do so but you’ll need to use supported APIs and you won’t get the back-to-home animation. For this reason, we generally recommend you only intercept back events for UI logic; for example, to show a dialog asking the user to save before they quit.
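As a rough sketch of what that looks like with the supported AndroidX API (the activity name and the showSaveDialog() helper are hypothetical):

    // Sketch: intercepting back with OnBackPressedDispatcher instead of
    // overriding onBackPressed(). showSaveDialog() is a hypothetical helper.
    import android.os.Bundle
    import androidx.activity.addCallback
    import androidx.appcompat.app.AppCompatActivity

    class EditorActivity : AppCompatActivity() {
        override fun onCreate(savedInstanceState: Bundle?) {
            super.onCreate(savedInstanceState)

            // Keep the callback disabled while there is nothing to save, so the
            // predictive back-to-home animation still plays in that state.
            val saveGuard = onBackPressedDispatcher.addCallback(this, enabled = false) {
                showSaveDialog()
            }
            // e.g. saveGuard.isEnabled = true once the user has unsaved changes
        }

        private fun showSaveDialog() {
            // hypothetical: ask the user to save before leaving
        }
    }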

See the Add support for the predictive back gesture guide for more details.


Migrate all activities milestone

If you are a multi-activity app, you’ll need to opt in and handle back events within those activities too to get a system-controlled cross-activity animation. Learn more about per-activity opt-in, available for devices running Android 14+. The cross-activity animation is visible to users in Android 15+ and behind a developer option in Android 13 and 14.

Custom cross-activity animations are also available with overrideActivityTransition.
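A sketch of what that might look like on Android 14 or later, with placeholder anim resources (R.anim.open_enter and R.anim.open_exit are assumptions):

    // Sketch (API 34+): custom open transition for a cross-activity navigation.
    // R.anim.open_enter and R.anim.open_exit are placeholder resources.
    import android.os.Build
    import android.os.Bundle
    import androidx.appcompat.app.AppCompatActivity

    class DetailsActivity : AppCompatActivity() {
        override fun onCreate(savedInstanceState: Bundle?) {
            super.onCreate(savedInstanceState)
            if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.UPSIDE_DOWN_CAKE) {
                overrideActivityTransition(
                    OVERRIDE_TRANSITION_OPEN,  // applies when this activity is opened
                    R.anim.open_enter,
                    R.anim.open_exit
                )
            }
        }
    }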


Fragment milestone

Next, you’ll want to focus on your fragment animations and transitions. This requires updating to AndroidX fragment 1.7.0 and transition 1.5.0 or later and using Animator or AndroidX Transitions. Assuming these requirements are met, your existing fragment animations and transitions will animate in step with the back gesture. You can also use material motion with fragments. Most material motions support predictive back as of 1.12.0-alpha02 or higher, including MaterialFadeThrough, MaterialSharedAxis and MaterialFade.
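For instance, a fragment using one of the supported material motions might look like the following sketch (assuming the Material Components dependency at a predictive-back-capable version):

    // Sketch: a fragment whose enter and return transitions use MaterialSharedAxis.
    // With fragment 1.7.0+, transition 1.5.0+, and a recent Material Components
    // release, these transitions track the back gesture automatically.
    import android.os.Bundle
    import androidx.fragment.app.Fragment
    import com.google.android.material.transition.MaterialSharedAxis

    class DetailsFragment : Fragment() {
        override fun onCreate(savedInstanceState: Bundle?) {
            super.onCreate(savedInstanceState)
            enterTransition = MaterialSharedAxis(MaterialSharedAxis.X, /* forward= */ true)
            returnTransition = MaterialSharedAxis(MaterialSharedAxis.X, /* forward= */ false)
        }
    }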

Don’t strive to make your fragment transitions look like the system’s cross-activity transition. We recommend this full screen surface transition instead.

Learn more about Fragments and Predictive Back.


Material Components milestone

Finally, you’ll want to take advantage of the Material Component View animations available for Predictive Back. Learn more about available components.


After this, you’ve completed your quest to support Predictive Back animations in your Views-based app.

[Optional] AndroidX Transitions milestone

If you’re up for more, you might also ensure your AndroidX transitions are supported with Predictive Back. Read more about AndroidX Transitions and the Predictive Back Progress APIs.
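If you drive your own animations, the progress APIs report the gesture as it happens; a rough sketch of the callback shape in androidx.activity 1.8+ follows (how you map progress to an animation is up to you, and the lambdas here are hypothetical hooks):

    // Sketch: reacting to the predictive back gesture's progress with
    // OnBackPressedCallback (androidx.activity 1.8+). The three lambdas are
    // hypothetical hooks into whatever animation you are driving.
    import androidx.activity.BackEventCompat
    import androidx.activity.OnBackPressedCallback

    class SheetBackCallback(
        private val onProgress: (Float) -> Unit,
        private val onCommit: () -> Unit,
        private val onCancel: () -> Unit
    ) : OnBackPressedCallback(true) {

        override fun handleOnBackStarted(backEvent: BackEventCompat) {
            // The gesture has started; prepare or seek your animation here.
        }

        override fun handleOnBackProgressed(backEvent: BackEventCompat) {
            onProgress(backEvent.progress)  // 0f..1f as the user drags back
        }

        override fun handleOnBackPressed() {
            onCommit()  // gesture committed: finish the animation and navigate back
        }

        override fun handleOnBackCancelled() {
            onCancel()  // gesture cancelled: rewind to the resting state
        }
    }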



#WeArePlay | Meet the founders changing women’s lives: Women’s History Month Stories

Posted by Leticia Lago – Developer Marketing

In celebration of Women’s History Month, we’re spotlighting the founders behind groundbreaking apps and games from around the world - made by women or for women. Let's discover four of my favorites in this latest batch of nine #WeArePlay stories.


Múkami Kinoti Kimotho

Royelles Revolution / Royelles Revolution: Gaming For Girls (USA)


Múkami's journey began when she noticed the lack of representation for girls in the gaming industry. Determined to change this narrative, she created Royelles, a game designed to inspire girls and non-binary people to pursue careers in STEAM (science, technology, engineering, art, math) fields. The game is anchored in fierce female avatars like the real life NASA scientist Mara who voices a character. Royelles is revolutionizing the gaming landscape and empowering the next generation of innovators. Múkami's excited to release more gamified stories and learning modules, and a range of extended reality and AI-powered avatars based on the game’s characters.

"If we're going to effectively educate Gen Z and Gen Alpha, we have to meet them in the metaverse and leverage gamified play as a means of driving education, awareness, inspiration and empowerment.” 

- Múkami

Leonika Sari Njoto Boedioetomo

Reblood: Blood Services App (Indonesia)


When her university friend needed an urgent blood transfusion but discovered there was none available in the blood bank, Leonika became aware of the blood donation shortage in Indonesia. Her mission to address this led her to create Reblood, an app connecting blood donors with those in need. With over 140,000 blood donations facilitated to date, Reblood is not only saving lives but also promoting healthier lifestyles with a recently added feature that allows people to find the most affordable medical checkups.

“Our goal is to save more lives by raising awareness of blood donation in Indonesia and promoting healthier lifestyles for blood donors.” 

- Leonika

Luciane Antunes dos Santos and Renato Hélio Rauber

CARSUL / Car Sul: Urban Mobility App (Brazil)


Luciane was devastated when she lost her son in a car accident. The loss led her and her husband Renato to develop Carsul, an urban mobility app prioritizing safety and security. By providing safe transportation options and partnering with government health programs to chauffeur patients long distances to larger hospitals, Carsul is not only preventing accidents but also saving lives. Luciane and Renato's dedication to protecting others from the pain they've experienced is ongoing, and they plan to expand to more cities in Brazil.

“Carsul was born from this story of loss, inspiring me to protect other lives. Redefining myself in this way is very rewarding.” 

- Luciane

Diariata (Diata) N'Diaye

Resonantes / App-Elles: Safety App for Women (France)


After hearing the stories of young people who had experienced abuse similar to her own, spoken word artist Diata developed App-Elles – an app that allows women to send alerts when they're in danger. By connecting users with support networks and professional services, App-Elles is empowering women to reclaim their safety and seek help when needed. Diata also runs writing and recording workshops to help victims overcome their experiences with violence and has plans to expand her app with the introduction of a discreet wearable that sends out alerts.

“I realized from my work on the ground that there were victims of violence who needed help and support systems. This was my inspiration to create App-Elles." 

- Diata


Discover more #WeArePlay stories and share your favorites.




#WeArePlay | How two sea turtle enthusiasts are revolutionizing marine conservation

Posted by Leticia Lago – Developer Marketing

When environmental science student Caitlin returned home from a trip monitoring sea turtles in Western Australia, she was inspired to create a conservation tool that could improve tracking of the species. She connected with French developer and fellow marine life enthusiast Nicolas to design their app We Spot Turtles!, which allows anyone to support tracking efforts by uploading pictures of turtles they spot in the wild.

Caitlin and Nicolas shared their journey in our latest film for #WeArePlay, which showcases the amazing stories behind apps and games on Google Play. We caught up with the pair to find out more about their passion and how they are making strides towards advancing sea turtle conservation.

Tell us about how you both got interested in sea turtle conservation?

Caitlin: A few years ago, I did a sea turtle monitoring program for the Department of Biodiversity, Conservation and Attractions in Western Australia. It was probably one of the most magical experiences of my life. After that, I decided I only really wanted to work with sea turtles.

Nicolas: In 2010, in French Polynesia, I volunteered with a sea turtle protection project. I was moved by the experience, and when I came back to France, I knew I wanted to use my tech background to create something inspired by the trip.

How did these experiences lead you to create We Spot Turtles!?

Caitlin: There are seven species of sea turtle, and all are critically endangered. Or rather there’s not enough data on them to inform an accurate endangerment status. This means the needs of the species are going unmet and sea turtles are silently going extinct. Our inspiration is essentially to better track sea turtles so that conservation can be improved.

Nicolas: When I returned to France after monitoring sea turtles, I knew I wanted to make an app inspired by my experience. However, I had put the project on hold for a while. Then, when a friend sent me Caitlin’s social media post looking for a developer for a sea turtle conservation app, it re-ignited my inspiration, and we teamed up to make it together.

close up image of a turtle resting in a reef underwater

What does We Spot Turtles! do?

Caitlin: Essentially, members of the public upload images of sea turtles they spot – and even get to name them. Then, the app automatically geolocates, giving us a date and timestamp of when and where the sea turtle was located. This allows us to track turtles and improve our conservation efforts.

How do you use artificial intelligence in the app?

Caitlin: The advancements in AI in recent years have given us the opportunity to make a bigger impact than we would have been able to otherwise. The machine learning model that Nicolas created uses the facial scale and pigmentations of the turtles to not only identify its species, but also to give that sea turtle a unique code for tracking purposes. Then, if it is photographed by someone else in the future, we can see on the app where it's been spotted before.

How has Google Play supported your journey?

Caitlin: Launching our app on Google Play has allowed us to reach a global audience. We now have communities in Exmouth in Western Australia, Manly Beach in Sydney, and have 6 countries in total using our app already. Without Google Play, we wouldn't have the ability to connect on such a global scale.

Nicolas: I’m a mobile application developer and I use Google’s Flutter framework. I knew Google Play was a good place to release our title as it easily allows us to work on the platform. As a result, we’ve been able to make the app great.

Photo of Caitlin and Nicolas on the beach in Australia at sunset. Both are kneeling in the sand. Caitlin is using her phone to identify something in the distance, and gesturing to Nicolas who is looking in the same direction

What do you hope to achieve with We Spot Turtles!?

Caitlin: We Spot Turtles! puts data collection in the hands of the people. It’s giving everyone the opportunity to make an impact in sea turtle conservation. Because of this, we believe that we can massively alter and redefine conservation efforts and enhance people’s engagement with the natural world.

What are your plans for the future?

Caitlin: Nicolas and I have some big plans. We want to branch out into other species. We'd love to do whale sharks, birds, and red pandas. Ultimately, we want to achieve our goal of improving the conservation of various species and animals around the world.


Discover other inspiring app and game founders featured in #WeArePlay.




#WeArePlay | Learn how a childhood experience with an earthquake shaped Álvaro’s entrepreneurial journey

Posted by Leticia Lago – Developer Marketing

Being trapped inside a house following a major earthquake as a child motivated Álvaro to research destructive, large-scale quakes in Mexico and improve their outcomes. SkyAlert's technology uses sensors to detect incoming earthquakes and send out warnings, giving people valuable time to prepare and get to safety.

Álvaro shared his story in our latest film for #WeArePlay, which spotlights the founders and creatives behind inspiring apps and games on Google Play. We caught up with him to find out his motivations for SkyAlert, the impact the app’s had and what his future plans are.


What was the inspiration behind SkyAlert?

Being in Colima near the epicenter of a massive earthquake as a kid had a huge impact on me. I remember feeling powerless to nature and very vulnerable watching everything falling apart around me. I was struck by how quick and smart you had to be to get to a safe place in time. I remember hugging my family once it was over and looking towards the sea to watch out for an impending tsunami – which fortunately didn’t hit my region badly. It was at this moment that I became determined to find out what had caused this catastrophe and what could be done to prevent it being so destructive another time.

Through my research, I learned that Mexico sits on five tectonic plates and, as a result, it is particularly prone to earthquakes. In fact, there've been seven major quakes in the last seven years, with hundreds losing their lives. Reducing the threat of earthquakes is my number one goal and the motivation behind SkyAlert. The technology we’ve developed can detect the warning signs of an earthquake early on, deliver alerts to vulnerable people and hopefully save lives.


How does SkyAlert work exactly?

SkyAlert collects data from a network of sensors and translates that information into alerts. People can put their zip code in order to filter updates for their locality. We’re constantly investing in getting the most reliable and fast technology available so we can make the service as timely and effective as possible.


Did you always imagine you’d be an entrepreneur?

Since I was a kid I knew I wanted to be an entrepreneur. This was inspired by my grandfather who ran a large candy company with factories all over Mexico. However, what I really wanted, beyond just running my own company, was to have a positive social impact and change lives for the better: a feat I feel proud to have achieved with SkyAlert.


How is Google Play helping your app to grow?

Being on Google Play helps us to reach the maximum number of people. We’ve achieved some amazing numbers in the last 10 years through Google Play, with over 7 million downloads. With 35% of our income coming from Google Play, this reach has helped us invest in new technologies and sensors.

We also often receive advice from Google Play and they invite us to meetings to tell us how to do better and how to make the most of the platform. Google Play is a close partner that we feel really takes care of us.


What impact has SkyAlert had on the people of Mexico?

The biggest advantage of SkyAlert is that it can help people prepare for an earthquake. In 2017, we were able to notify people of a massive quake 12 seconds before it hit Mexico City. At least with those few seconds, many were able to get themselves to a safe place. Similarly, with a large earthquake in Oaxaca, we were able to give a warning of over a minute, allowing teachers to get students in schools away from infrastructure – saving kids’ lives.

Also, many find having SkyAlert on their phone gives them peace of mind, knowing they’ll have some warning before an earthquake strikes. This can be very reassuring.


What does the future look like for SkyAlert?

We’re working hard to expand our services into new risk areas like flooding, storms and wildfires. The hope is to become a global company that can deliver alerts on a variety of natural phenomena in countries around the world.


Read more about Álvaro and other inspiring app and game founders featured in #WeArePlay.


