Author Archives: Jesse Friedman

The Arrival of our 32nd Word Lens Language, Heptapod B

We’re honored to have partnered with Dr. Louise Banks, esteemed linguistics professor, to develop instant camera translation for our 32nd language, Heptapod B. Following our experience with logograms in Chinese and Japanese, as well as the many characters containing circles in Korean, we were ready to combine our expertise in low-memory-footprint convolutional modeling with Dr. Banks’ linguistic background to decipher circular logograms in the Word Lens feature of the Google Translate app.

[Image: Translate_Arrival_1.png]
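
To give a rough sense of what “low-memory-footprint convolutional modeling” means here, the sketch below runs a handful of tiny filters over a glyph image and pools each feature map down to a single number, which is the kind of small model that can run entirely on-device. It is purely illustrative: the filter count, kernel size, and the random stand-in glyph are assumptions, not the actual Word Lens model.

```python
import numpy as np

rng = np.random.default_rng(42)

N_FILTERS, K = 4, 3                      # four tiny 3x3 filters: a deliberately small model
filters = rng.normal(size=(N_FILTERS, K, K))

def conv2d_valid(image, kernel):
    """Plain 'valid' 2D convolution (cross-correlation), no padding."""
    h, w = image.shape
    k = kernel.shape[0]
    out = np.empty((h - k + 1, w - k + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + k, j:j + k] * kernel)
    return out

def glyph_features(glyph_image):
    """Convolve with each filter, apply ReLU, then global-max-pool each
    feature map to one number: a small fixed-size vector per glyph."""
    return np.array([np.maximum(conv2d_valid(glyph_image, f), 0).max()
                     for f in filters])

glyph = rng.random((16, 16))             # stand-in for a cropped camera frame of one glyph
print(glyph_features(glyph))             # 4-dimensional feature vector a classifier could score
```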

The challenge in understanding Heptapod B is its nonlinear orthography. Fortunately, Google's neural machine translation system uses an encoder/decoder architecture that internally represents each sentence as a high-dimensional vector. Those vectors map naturally onto the nonlinear orthography of the Heptapod language, and they are the key technical enabler for translating Heptapod B.
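
To make the encoder/decoder idea concrete, here is a toy sketch: the encoder folds a whole sentence into one fixed-size vector, and a decoder step reads a distribution over target tokens back out of that vector. The vocabulary, dimensions, and random weights are illustrative assumptions, not Google's production system.

```python
import numpy as np

rng = np.random.default_rng(0)

VOCAB = ["<s>", "</s>", "hello", "world"]      # toy source vocabulary
EMB_DIM, HIDDEN_DIM = 8, 16                    # tiny, illustrative sizes

embeddings = rng.normal(size=(len(VOCAB), EMB_DIM))
W_enc = rng.normal(size=(EMB_DIM + HIDDEN_DIM, HIDDEN_DIM))
W_dec = rng.normal(size=(HIDDEN_DIM, len(VOCAB)))

def encode(token_ids):
    """Toy recurrent encoder: fold each token embedding into a fixed-size
    hidden state; the final state is the 'sentence vector'."""
    h = np.zeros(HIDDEN_DIM)
    for t in token_ids:
        h = np.tanh(np.concatenate([embeddings[t], h]) @ W_enc)
    return h

def decoder_step(sentence_vector):
    """One toy decoder step: turn the sentence vector into a probability
    distribution over target tokens (a real decoder is recurrent too)."""
    logits = sentence_vector @ W_dec
    exp = np.exp(logits - logits.max())
    return exp / exp.sum()

ids = [VOCAB.index(t) for t in ("<s>", "hello", "world", "</s>")]
vec = encode(ids)                      # the high-dimensional sentence representation
print(vec.shape)                       # (16,)
print(decoder_step(vec).round(3))      # toy next-token distribution
```

The point of the sketch is that the whole sentence lives in a single vector before any output is produced, so the representation is not tied to a strictly left-to-right reading order.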

We translate Heptapod B into English, Chinese, Danish, Japanese, Urdu, Russian, French, Spanish, and Arabic. As with our other Word Lens languages, it works offline, which is handy if you happen to need to read a circular logogram in an isolated location. Dr. Banks assures us that the app will continue to work for at least 3,000 years.

[Image: Translate_Arrival_2.png]

Communicating across language (and glass) barriers can be a rather alienating experience. Learning a new writing system can be rewarding, even mind-altering, but not everyone has time for that. So whether the world’s fate hangs in the balance or you’re simply trying to discern whether your coffee-stain ring means something, we wish you success as you integrate this tool into the story of your life.

(Okay, if you haven’t guessed already... we’re just having some fun here. But we really are eager to bring Word Lens and neural machine translation to more languages, so stay tuned.)

Source: Translate

