So, today was supposed to be the day I continued my series of Myo posts, but Fusion4D deserves a quick entry.
Microsoft Research’s team has created and shared a new learning-based technique to analyze movement across RGBD frames. Fusion4D combines volumetric fusion with the estimation of a smooth deformation field across RGBD views to handle large frame-to-frame motion. It supports both incremental reconstruction, improving the surface estimate over time, and the parameterization of non-rigid scene motion.
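To give a flavor of the "volumetric fusion" half of that sentence, here is a toy sketch of classic TSDF (truncated signed distance function) fusion, the incremental building block that Fusion4D extends with a non-rigid deformation field. Everything below is a hypothetical 1-D illustration under my own assumptions, not the paper's actual code:

```python
import numpy as np

def fuse_tsdf(tsdf, weights, new_sdf, new_weight=1.0, max_weight=20.0):
    """Incrementally fuse a new observation into the volume as a
    weighted running average of signed distances per voxel.
    (Hypothetical helper for illustration; max_weight caps confidence
    so old observations can still be overwritten by new motion.)"""
    w = weights + new_weight
    fused = (tsdf * weights + new_sdf * new_weight) / w
    return fused, np.minimum(w, max_weight)

# Fuse two noisy depth observations of the same surface.
tsdf = np.zeros(4)      # initial signed distances (4 voxels along a ray)
weights = np.zeros(4)   # per-voxel confidence, starts at zero
frame1 = np.array([0.5, 0.1, -0.1, -0.5])   # SDF values from view 1
frame2 = np.array([0.4, 0.0, -0.2, -0.6])   # SDF values from view 2

tsdf, weights = fuse_tsdf(tsdf, weights, frame1)
tsdf, weights = fuse_tsdf(tsdf, weights, frame2)
print(tsdf)  # averaged distances; the zero crossing marks the surface
```

Each new frame refines the surface estimate instead of replacing it, which is what "improving the surface estimate over time" means; Fusion4D's contribution is warping the incoming frame with an estimated deformation field first, so this averaging still works when the scene itself is moving.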
And, just in case it wasn't clear: THIS IS REAL TIME!
Now it's time to imagine the new world of opportunities from here: real-time virtual interaction through devices like HoloLens or Oculus? Performing this real-time 4D scan with Kinects? And more.
Times are changing, and these types of technologies will change the way we interact with devices, with other people, and maybe even the way we catch Pokémon in a park.
Greetings @ Toronto