Editor’s Note: Today, we’re GIFted with the presence of a guest author. Bethany Davis, current University of Pennsylvania student and former software engineering summer intern at GIPHY, shares the details of her summer project, which was powered by Google Cloud Vision. This is a condensed and modified version of a post published on the GIPHY Engineering blog.
When my friend was starting her first full-time job, I wanted to GIF her a pep talk before her first day. I had the perfect movie reference in mind: Becca from “Bridesmaids” saying, “You are more beautiful than Cinderella! You smell like pine needles and have a face like sunshine!”
I searched GIPHY for “you are more beautiful than Cinderella” to no avail, then searched for “bridesmaids” and scrolled through several dozen results before giving up.
It was easy to search for GIFs with popular tags, but because no one had tagged this GIF with the full line from the movie, I couldn’t find it. Yet I knew this GIF was out there. I wished there were a way to find the exact GIF pulled from a line in a movie, a scene from a TV show, or a lyric from a song. Luckily, I was about to start my internship at GIPHY, and I had the opportunity to tackle the problem head on—by using optical character recognition (OCR) and Google Cloud Vision to help you (and me) find the perfect GIF.
GIF me the tools and I’ll finish the job
When I started my internship, GIPHY engineers had already generated metadata about our collection of GIFs using Google Cloud Vision, an image recognition tool powered by machine learning. Specifically, Cloud Vision had performed OCR on our entire GIF library to detect text or captions within each image. The OCR results we got back from Google Cloud Vision were so good that my team was ready to incorporate the data directly into our search engine. I was tasked with parsing the data and indexing each GIF, then updating our search query to leverage the new, bolstered metadata.
Using Luigi, I wrote a batch job that processed the JSON data generated by Google Cloud Vision. Then I used AWS Simple Queue Service to coordinate data transfer from Google Cloud Vision to documents in our search index. GIPHY search is built on top of Elasticsearch, which stores GIF documents, and the search query returns results based on the data in our Elasticsearch index. Bringing all these components together looks something like this:
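The parsing step of a pipeline like this can be sketched in a few lines. This is an illustration, not GIPHY's actual code: it assumes the standard Cloud Vision OCR response shape, where the first entry in `textAnnotations` holds the full detected text and the remaining entries are per-word bounding boxes.

```python
def extract_caption(ocr_response: dict) -> str:
    """Pull the full detected text out of a Cloud Vision OCR response.

    In Cloud Vision's text detection output, textAnnotations[0].description
    is the complete detected text; later entries are individual words.
    """
    annotations = ocr_response.get("textAnnotations", [])
    if not annotations:
        return ""  # GIF had no detectable text
    full_text = annotations[0].get("description", "")
    # Collapse line breaks and runs of whitespace, and lowercase
    # so the indexed caption matches consistently at query time.
    return " ".join(full_text.split()).lower()
```

A GIF with no caption simply yields an empty string, so untexted GIFs can be skipped rather than indexed with empty fields.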
One of the biggest challenges in building this update was ensuring that we could process data for millions of GIFs quickly. I had to learn how to optimize the runtime of the code that prepares GIF updates for Elasticsearch. My first iteration took 80+ hours, but eventually I got it to run in just eight.
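The post doesn't spell out which optimizations produced that 10x speedup, but one common lever for this kind of workload, shown here purely as an illustration, is batching document updates for Elasticsearch's bulk endpoint instead of issuing one request per GIF. The field and index names below are hypothetical:

```python
from itertools import islice


def chunked(iterable, size):
    """Yield successive lists of up to `size` items from `iterable`."""
    it = iter(iterable)
    while batch := list(islice(it, size)):
        yield batch


def build_bulk_body(gif_updates, index="gifs"):
    """Turn (gif_id, caption) pairs into Elasticsearch bulk-update actions.

    The bulk API interleaves an action line with a document line, so a
    single request can update thousands of documents at once.
    """
    actions = []
    for gif_id, caption in gif_updates:
        actions.append({"update": {"_index": index, "_id": gif_id}})
        actions.append({"doc": {"caption": caption}})
    return actions
```

Each batch from `chunked` would then be passed through `build_bulk_body` and sent in one request, amortizing network and request overhead across many GIFs.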
Once all the data was indexed, the next step was to incorporate the text/caption metadata into our query. I used what’s called a match phrase query, which looks for words in the caption that appear in the same order as the words in the search input—guaranteeing that a substring of my movie quote is intact in the results. I also had to decide how much weight to give the data from Google Cloud Vision relative to other signals we have about a GIF (like its tags or the frequency with which users click on it) when determining the most relevant results.
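A query along these lines can be expressed as an Elasticsearch bool query, sketched below. The field names and boost value are assumptions for illustration, not GIPHY's actual schema; the `match_phrase` clause is the standard Elasticsearch query type described above, and `boost` is how the caption signal is weighted against other clauses.

```python
def build_search_query(user_input: str, caption_boost: float = 2.0) -> dict:
    """Build an Elasticsearch bool query matching the user's input
    against tags and the OCR-derived caption.

    match_phrase requires the input words to appear in the caption
    in order; boost controls how heavily a caption match counts
    relative to an ordinary tag match.
    """
    return {
        "query": {
            "bool": {
                "should": [
                    # Loose match on tags: any of the words can hit.
                    {"match": {"tags": user_input}},
                    # Strict, ordered match on the detected caption text.
                    {
                        "match_phrase": {
                            "caption": {
                                "query": user_input,
                                "boost": caption_boost,
                            }
                        }
                    },
                ],
                "minimum_should_match": 1,
            }
        }
    }
```

With a structure like this, a GIF whose caption contains the searched phrase verbatim outranks GIFs that merely share a tag word, which is exactly the behavior needed for long quote-style searches.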
It was time to see how the change would affect results. Using an internal GIPHY tool called Search UX, I searched for “where are the turtles,” a quote from “The Office.” The difference between the old query and the new one was dramatic:
I also used a tool that examines the change on a larger scale by running the old and new queries against a random set of search terms—useful for ensuring that the change won’t disrupt popular searches like “cat” or “happy birthday,” which already deliver high-quality results.
See the GIFference
After our internal tools indicated a positive change, I launched the updated query as an A/B experiment. The results looked promising, with an overall increase in click-through rate of 0.5 percent. But my change affects a very specific type of search, especially longer phrases, and the impact of the change is even more noticeable for queries in this category. For example, click-through rate when searching for the phrase “never give up never surrender” (from “Galaxy Quest”) increased 32 percent, and click-through rate for the phrase “gotta be quicker than that” increased 31 percent. In addition to quotes from movies and TV shows, we saw improvements for general phrases like “everything will be ok” and “there you go.” The final click-through rate for these queries is almost 100 percent!
The ultimate test was my own, though. I revisited my search query from the beginning of the summer:
Success! The search results are much improved. Now, the next time you use GIPHY to search for a specific scene or a direct quote, the results will show you exactly what you were looking for.
To learn more about the technical details behind my project, see the GIPHY Engineering blog.