Introducing Style Detection for Google Cloud Vision API



(Cross-posted on the Google Cloud Platform Blog.)

At Google Cloud Platform, we’re thrilled by the developer community’s enthusiastic response to the beta release of Cloud Vision API and our broader Cloud Machine Learning product family unveiled last week at GCP NEXT.

Cloud Vision API is a tool that enables developers to understand the contents of an image, from identifying prominent natural or man-made landmarks to detecting faces and emotions. Right now, Vision API can even recognize clothing in an image and label dominant colors, patterns and garment types.
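For a sense of what a request looks like, here is a minimal sketch that builds the JSON body for a Vision API `images:annotate` call combining label, face and landmark detection (authentication and the actual HTTP POST are omitted; the helper name and placeholder image bytes are ours):

```python
import base64
import json

def build_annotate_request(image_bytes, max_labels=10):
    """Build an images:annotate request body asking for labels,
    faces and landmarks in a single image."""
    return {
        "requests": [
            {
                # Image content is sent inline as base64-encoded bytes.
                "image": {"content": base64.b64encode(image_bytes).decode("utf-8")},
                "features": [
                    {"type": "LABEL_DETECTION", "maxResults": max_labels},
                    {"type": "FACE_DETECTION"},
                    {"type": "LANDMARK_DETECTION"},
                ],
            }
        ]
    }

# Placeholder bytes stand in for a real image file.
body = build_annotate_request(b"\x89PNG...")
print(json.dumps(body, indent=2))
```

The same body can carry several images per call by appending more entries to the `requests` list.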

Today, we’re taking another step forward. Why only evaluate individual components of an outfit when we could evaluate the full synthesis — the real impact of what you wear in today’s culture?

We’re proud to announce Style Detection, the newest Cloud Vision API feature. Drawing on millions of hours of deep learning, convolutional neural networks and petabytes of source data, Vision API can now not only identify clothing but also evaluate the nuances of style, to a relative degree of uncertainty.
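Since the feature is still in alpha, the request surface below is purely hypothetical: the `STYLE_DETECTION` feature type and the helper function are our own illustration of how it might slot in next to the existing detection types, not a published API:

```python
import base64

def build_style_request(image_bytes):
    """Hypothetical sketch: combine the real LABEL_DETECTION feature
    with an assumed STYLE_DETECTION type for an alpha-stage request."""
    return {
        "requests": [
            {
                "image": {"content": base64.b64encode(image_bytes).decode("utf-8")},
                "features": [
                    {"type": "LABEL_DETECTION", "maxResults": 5},
                    # Assumed feature type; not part of the published API.
                    {"type": "STYLE_DETECTION"},
                ],
            }
        ]
    }
```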

Style Detection aims to help people improve their style — and lives — by navigating the complex and fickle landscape of fashion. Does a brown belt go with black shoes? Pleats or no pleats? “To tuck or not to tuck?” is no longer a question. With Style Detection, we’re able to distill our nearly bottomless combined data sets of selfies, fashion periodicals and the unstructured ramblings of design bloggers into a coherent, actionable tool for picking tomorrow’s trousers.

We’re already seeing incredible results. Across our training corpus, we detected the majority of personal style choices and gleaned, with 52–97% accuracy, not just what people were wearing, but what those clothes might say about them. The possibilities are endless — and it could mean the end of spandex forever!

Learn more about Style Detection and the Cloud Vision API here. We’re offering it to a small group of developers in alpha today (obviously, there are still details to iron out).