Exhibits and experiments that are music to your ears

Today is World Music Day, also known as “Fête de la musique.” It’s an annual celebration of music that encourages amateur and professional musicians to play and perform outside in parks or in the streets. At Google Arts & Culture we took the name “Fête de la musique,” which translates to “music party,” literally, and made sure the internet will also celebrate, with music created by you.

Our artists in residence at the Google Arts & Culture lab created two new experiments, “Paint with Music” and “Assisted Melody,” to offer you an easy and fun way to get creative with sound.

A collage of images including dolphins and birds

Paint with Music is an interactive experience that connects two major forms of artistic expression: painting and musical composition. That means – with the help of machine learning – you can turn a paintbrush into a musical instrument that translates the movement of your brushstrokes into musical notes performed by an instrument of your choice. A wide range of canvases, from the sky to the ocean, are ready to take your composition to the next level by incorporating special sound effects unique to each setting. Try it now on your desktop or Android device.

Portraits of three male composers

Over a year ago we launched Assisted Melody, an experiment that helped you compose your own tune on a virtual sheet of music, and with the click of a button make it sound like Bach. For World Music Day, a new version of Assisted Melody invites you to play classical music and compose your own melody in the signature style of the maestros Mozart and Beethoven. Not only will you be able to compose directly on each maestro’s favorite instrument (Mozart's harpsichord, for example), but you can also hear your stylized output on a wide range of modern instruments, from the flute to the synthesizer.

And while you transform your melodies, you can also check out some fun facts about each maestro or dive deeper into the rich music-themed stories and online exhibitions our newest partners are also launching today on Google Arts & Culture. 


MILA, a French association fostering the local music scene, tells the stories of a true Parisian gem, the “Rue de la Musique” – the street of music. Nestled in Paris’ Montmartre district, it’s where independent music makers, record labels and record shops have transformed an entire neighborhood into an effervescent music scene. One example: Record Makers, one of the most influential French Touch labels, settled in an old bakery where French pop legend Joe Dassin used to buy his croissants.

In Pune, India, the Baithak Foundation publishes its collection of rare archival interviews and recordings of Indian classical music. They’re also launching new stories about Indian ragas and the correct time to sing them, and exhibits on maestros like Pandit Vijay Sardeshmukh or Ustad Amir Khan, founder of the Indore gharana.

Banglanatak is showcasing folk musicians from western Rajasthan (like the Langas and Manganiyars), their unique instruments and their informal and organic way of training the next generation. And last but not least, the Salar Jung Museum is launching an exhibit on the ragamala paintings from their collection, which are visual representations of Indian ragas.


Got into the groove? Check out our Performing Arts collection online and via our free app for iOS or Android, or try some of our most recent experiments such as AR Synth or Blob Opera.

Daydream Labs: Interactive scenes with Blocks objects

Since the launch of Blocks, people have been enthusiastically creating and sharing their amazing models with the community. So we asked ourselves: what would it be like to use Blocks objects to create an entire interactive scene?

Turns out it’s possible. In an experiment our team built recently, we created a system that lets people make their own "Escape the Room" experience in VR. Every object in the game is made from Blocks objects, including typical stuff like a flashlight, desk, bookcase, and the obligatory keypad, and even the room itself.

Throw in some lighting, and the result is a scene with exactly the cartoonishly spooky vibe we were going for. Not a room you'd want to be trapped in for too long!


To get everything to work, we had to define how objects interact. We could’ve just written that directly in our code, but our goal was to allow anybody to create these experiences—no programming knowledge required. So we created a simple system of triggers and actions that allows the creator to indicate what happens next in response to certain events.

The system can express concepts such as "when the battery object collides with the flashlight object, activate the light object." The light happens to be a spotlight located at the tip of the flashlight object, so when the player places the battery in the right place, a cone of light will shine forward and move with the flashlight.
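The post doesn't share the system's actual code, but a minimal Python sketch of that kind of declarative rule could look like the following; the `Rule` fields and the object names are hypothetical illustrations, not the real implementation:

```python
from dataclasses import dataclass

# Hypothetical shape of one trigger/action rule. The real Daydream Labs
# system isn't published in this post, so all names here are made up.

@dataclass
class Rule:
    trigger: str        # kind of event, e.g. "collision"
    subject: str        # object that causes the trigger
    target: str         # object the subject must interact with
    action: str         # what to do, e.g. "activate"
    action_target: str  # object the action applies to

# "When the battery object collides with the flashlight object,
#  activate the light object."
battery_rule = Rule(
    trigger="collision",
    subject="battery",
    target="flashlight",
    action="activate",
    action_target="light",
)
```

Expressing interactions as data like this, rather than as code, is what makes the approach usable without programming knowledge: a creator only picks the objects, the trigger and the action, and the runtime does the rest.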

Using this simple trigger/action system, we built a number of other puzzles in the room, like opening a locked chest with a key, placing a book in a sliding bookcase and figuring out the combination to enter on a keypad.


Combining Blocks objects to create interactive scenes was a lot of fun. Because Blocks has a consistent low-poly visual style, the result of our efforts was an engaging environment where everything fit well together, even though objects were made by many different people on our team.

We learned a few other things along the way. First, the ability to add interactivity to a scene is super important, and a wide range of interactive scenes can be built from the simple primitives we had set up with our trigger and action system. Most of the interactions could be expressed as collisions (key and lock, battery and flashlight, book and bookcase) and simple actions like showing/hiding or animating particular objects.
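To make that concrete, here is a rough, self-contained sketch of how collision events could be dispatched to a table of such rules; the rule shape, action names and objects are assumptions for illustration, not the system we actually built:

```python
# Hypothetical dispatcher: when the physics engine reports a collision,
# find every rule that matches that pair of objects and run its action.

RULES = [
    {"trigger": "collision", "subject": "battery", "target": "flashlight",
     "action": "activate", "action_target": "light"},
    {"trigger": "collision", "subject": "key", "target": "chest",
     "action": "show", "action_target": "chest_contents"},
]

ACTIONS = {
    "activate": lambda scene, name: scene[name].update(active=True),
    "show":     lambda scene, name: scene[name].update(visible=True),
    "hide":     lambda scene, name: scene[name].update(visible=False),
}

def on_collision(scene, rules, obj_a, obj_b):
    for rule in rules:
        if rule["trigger"] != "collision":
            continue
        # Collisions are symmetric, so accept the pair in either order.
        if {obj_a, obj_b} == {rule["subject"], rule["target"]}:
            ACTIONS[rule["action"]](scene, rule["action_target"])

# The player drops the battery onto the flashlight:
scene = {"light": {"active": False}, "chest_contents": {"visible": False}}
on_collision(scene, RULES, "battery", "flashlight")
print(scene["light"])  # {'active': True}
```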

Next, setting up the rendering was almost no work at all, because Blocks objects are low-poly and work well with simple materials. We just used the standard diffuse shaders for the opaque surfaces and a simple translucent one for the glass surfaces. Combining that with an ambient light and a spotlight achieved the rendering effect that we wanted.

Last, we set up a simple animation system where we pre-recorded the motions of certain objects and expressed them as a sequence of transformations (position, rotation, scale). This rudimentary animation system worked well when moving solid objects like a bookcase or the lid of a chest, but we’d need something more elaborate if we were to do character animation, perhaps using what we learned from our experiments on animating Blocks models. What’s more, adjusting the colliders for the objects to ensure they interacted correctly required some manual tweaks. In order to scale this, it might be worth looking into automatically generating simple colliders for objects.
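The animation format isn't shown in the post either, but a "sequence of transformations" could be sketched roughly as below, assuming keyframes with linear interpolation and Euler-angle rotations for simplicity; the real system may well differ:

```python
from dataclasses import dataclass

# Hypothetical keyframe playback for pre-recorded motion: each keyframe
# stores a time plus position, rotation and scale, and playback linearly
# interpolates between the two neighbouring keyframes.

@dataclass
class Keyframe:
    time: float      # seconds from the start of the clip
    position: tuple  # (x, y, z)
    rotation: tuple  # (rx, ry, rz) Euler angles in degrees
    scale: tuple     # (sx, sy, sz)

def _lerp(a, b, t):
    return tuple(x + (y - x) * t for x, y in zip(a, b))

def sample(keyframes, t):
    """Return (position, rotation, scale) of the clip at time t."""
    if t <= keyframes[0].time:
        k = keyframes[0]
        return k.position, k.rotation, k.scale
    if t >= keyframes[-1].time:
        k = keyframes[-1]
        return k.position, k.rotation, k.scale
    for k0, k1 in zip(keyframes, keyframes[1:]):
        if k0.time <= t <= k1.time:
            f = (t - k0.time) / (k1.time - k0.time)
            return (_lerp(k0.position, k1.position, f),
                    _lerp(k0.rotation, k1.rotation, f),
                    _lerp(k0.scale, k1.scale, f))

# A chest lid swinging open over one second (values are made up):
lid_clip = [
    Keyframe(0.0, (0, 1, 0), (0, 0, 0), (1, 1, 1)),
    Keyframe(1.0, (0, 1, 0), (-100, 0, 0), (1, 1, 1)),
]
print(sample(lid_clip, 0.5))  # halfway through the opening motion
```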

Scene building and interactivity with Blocks objects are exciting areas for experimentation, and we're looking forward to seeing what other applications developers will come up with in this space.