Project Tango: Google To Install 3D Mapping Sensors In Future Cell Phones


Image Source: Google Images – Project Tango Prototypes

Just as Apple secured patents on laser technology with the aim of mapping everything around us, Google set out to do the same thing – it could hardly stay behind – but with a twist of its own, much like its previous ventures (Loon, Ara, Wing and other projects under Google X). "Project Tango" is said to be the company's next big gig, and its main purpose is to bring "virtual reality to mobile devices".

Amir Rubin, co-founder of Paracosm, partnered with Google to develop Project Tango. The partners say the project will empower handheld devices to create a 3D map of each and every place on Earth in high-resolution detail. Rubin also stated, "Paracosm helped Google place sensors on phones that are capable of capturing depth-perception imagery that can be turned into 3D maps."


Image Source: Google Images – A man testing the software and sensors on the device.

With these maps, artificial intelligence and 3D applications, handheld devices will be able to navigate the surroundings around them. Rubin also stated, "Cell phones do not perceive physical reality. But by applying machine-learning techniques and three-dimensional processing, applications and robots will know the exact shape of our world." The best part, Rubin believes, is the long-term storage of this content. He says, "Once every phone has a sensor and records its surroundings, Paracosm and Google will be able to update things in applications that will help improve other gadgets."

These devices contain customized hardware and software designed to track the full 3D motion of the device while simultaneously creating a map of the environment. The sensors allow the device to make over a quarter of a million 3D measurements every second, updating its position and orientation in real time and combining that data into a single 3D model of the environment around us.
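The fusion step described above – using the device's continuously tracked pose to place each frame's depth measurements into one shared 3D model – can be illustrated with a minimal sketch. This is not Tango's actual implementation; the function names and data layout are assumptions made for the example, and the core idea is just the standard rigid-body transform world = R·p + t applied to every depth point.

```python
# Illustrative sketch (not Tango's real pipeline): fuse per-frame depth
# measurements into one world-frame point cloud, using the pose
# (rotation R, translation t) the device tracks for each frame.

def transform_point(R, t, p):
    """Apply a rigid-body transform: world = R * p + t (row-major 3x3 R)."""
    return tuple(
        R[i][0] * p[0] + R[i][1] * p[1] + R[i][2] * p[2] + t[i]
        for i in range(3)
    )

def fuse_frames(frames):
    """frames: list of (R, t, points) per capture; returns one world cloud."""
    cloud = []
    for R, t, points in frames:
        for p in points:
            cloud.append(transform_point(R, t, p))
    return cloud

IDENTITY = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
frames = [
    (IDENTITY, (0.0, 0.0, 0.0), [(0.0, 0.0, 1.0)]),  # first frame, at origin
    (IDENTITY, (0.5, 0.0, 0.0), [(0.0, 0.0, 1.0)]),  # device moved 0.5 m right
]
cloud = fuse_frames(frames)
print(cloud)  # two world points, offset by the device's motion
```

The same depth point seen from two device positions lands at two different world coordinates, which is exactly why accurate real-time pose tracking is the prerequisite for building a consistent model.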


Image Source: Google Images – Device mapping its environment and generating 3D graphics.

These prototypes will run Android and include development APIs that provide position, orientation and depth data to standard Android applications written in Java, C or C++, as well as to the Unity game engine. The early "not for sale" prototypes, algorithms and APIs are still in active development.
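An API like the one described would plausibly deliver position, orientation and depth data to applications through registered callbacks. The sketch below shows that listener pattern in miniature; every class and method name here is hypothetical and for illustration only, not the real Tango SDK.

```python
# Hypothetical listener-style API sketch (names are illustrative, not the
# actual Tango SDK): an app registers a callback and receives sensor updates.

class PoseListener:
    """Dispatches position/orientation/depth updates to registered callbacks."""

    def __init__(self):
        self.callbacks = []

    def register(self, callback):
        self.callbacks.append(callback)

    def publish(self, position, orientation, depth_points):
        # Called by the (simulated) sensor stack on each new measurement.
        for cb in self.callbacks:
            cb(position, orientation, depth_points)

updates = []
listener = PoseListener()
listener.register(lambda pos, quat, depth: updates.append((pos, quat, len(depth))))

# Simulated update: position in metres, orientation as a quaternion,
# and a small batch of depth points.
listener.publish((0.1, 0.0, 0.2), (0, 0, 0, 1), [(0.0, 0.0, 1.5)] * 3)
print(updates)  # [((0.1, 0.0, 0.2), (0, 0, 0, 1), 3)]
```

A callback design like this lets games and mapping apps consume the same pose stream without polling, which is presumably why the data is exposed to Java, C/C++ and Unity alike.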

Furthermore, as they are still in development, we cannot say precisely when 3D sensors will appear on Google's Nexus or other Android devices, but what we can tell you is that this will take phones to a completely different level.

"We hope you will take this journey with us. We believe it will be one worth travelling." – Johnny Lee & the ATAP Project Tango Team







