I have very good memories of the times when I used to speak at events about Microsoft Robotics. It was 2006, and my girlfriend gave me the best present ever: a Lego Mindstorms NXT. I spent hours and hours with the Lego, and also playing around with Visual Studio trying to control my bot using C#. (In those days I really learned about the Microsoft approach to Bluetooth.) Microsoft Robotics supported several types of bots, but the most popular ones were the Lego Mindstorms and the Roomba vacuum. Yes, the Roomba used to have a special model with Bluetooth connectivity and, you know, hack the Bluetooth, hack the world! You could control the Roomba using Microsoft Robotics.
Of course, once you were there, the next step was to get a second Roomba and stage a Sumo fight between the two devices. I think it was at TechEd 2008 that the final of the Microsoft Robotics challenge took place, and I was lucky enough to be there as an attendee.
At this point, you are probably thinking: vacuums? Bots? How is this related to Augmented Reality? Bruno has finally lost it.
Let’s get back to the AR world for a moment. Most AR SDKs rely on some type of sensor to create a model of the surrounding environment. In ARKit and ARCore this is done with the camera and the motion sensors; on other devices and SDKs you may also have depth sensors.
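To get a feel for why the motion sensors matter, here is a toy sketch of the classic complementary filter, one simple way to blend a gyroscope (accurate over short intervals, but it drifts) with an accelerometer-derived angle (noisy, but stable long-term). This is a generic illustration, not how ARKit or ARCore actually implement their tracking, and the function name and parameters are my own.

```python
def complementary_filter(gyro_rates, accel_angles, dt, alpha=0.98):
    """Fuse gyroscope rates (rad/s) with accelerometer-derived
    angles (rad) into a single smoothed angle estimate.

    alpha close to 1.0 means 'trust the gyro short-term'."""
    angle = accel_angles[0]  # start from the accelerometer's reading
    estimates = [angle]
    for rate, accel_angle in zip(gyro_rates[1:], accel_angles[1:]):
        # Integrate the gyro: good short-term, drifts long-term
        gyro_angle = angle + rate * dt
        # Blend with the accelerometer: noisy, but anchors the drift
        angle = alpha * gyro_angle + (1 - alpha) * accel_angle
        estimates.append(angle)
    return estimates
```

Even this tiny version shows the core trade-off: the `alpha` blend keeps the responsiveness of the gyro while the accelerometer slowly pulls the estimate back toward the truth.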
And what happens if we add information from external sensors into the process of creating, analyzing, or exploring that surrounding 3D model? Why not use external sensors, like the ones that can make a full scan of the inside of a house? Take a look at this video.
If the SDK could somehow detect “where it is” and request a 3D model of the current environment from a 3rd-party service, that would be a huge advance for the mapping process. Today in the AR world, tracking and mapping are among the main challenges for all the players (Apple, Microsoft, Google, Facebook, etc.).
Note: Of course, once you have the map, you still need to understand “where you are inside the map,” and that is not an easy task. Today we see this as the “lost tracking” scenario, but for now this is just an idea for the near future.
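The “where am I inside the map” problem can be sketched in a toy form: given a known occupancy grid (the kind of floor map a scanning vacuum or external sensor could provide) and a small local scan from the device, a brute-force scan matcher slides the scan over the map and picks the best-fitting position. Real systems use far more robust techniques (particle filters, feature descriptors, loop closure); the names and shapes below are my own illustration.

```python
import numpy as np

def localize(global_map, local_scan):
    """Slide the local scan over the global occupancy grid and
    return the (row, col) offset where the most cells agree."""
    gh, gw = global_map.shape
    lh, lw = local_scan.shape
    best_score, best_pos = -1, None
    for y in range(gh - lh + 1):
        for x in range(gw - lw + 1):
            window = global_map[y:y + lh, x:x + lw]
            score = int(np.sum(window == local_scan))  # matching cells
            if score > best_score:
                best_score, best_pos = score, (y, x)
    return best_pos
```

The toy version already hints at why lost tracking is hard: with a repetitive environment (long corridors, identical rooms) several positions score equally well, and the device cannot tell them apart from the scan alone.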
There are plenty of paths to take from here. In our Avanade TechVision 2017, we constantly suggest that our clients take advantage of the amazing ecosystem of services around us. I’m sure we didn’t think about using information from a vacuum in a Mixed Reality experience!
Greetings @ Burlington