#Hololens – 29 new countries added to the Hololens market, for a total of 39, and still targeting developers


Hello!

Today’s post is a quick one; I’m still trying to understand how qubits work, and that is consuming a lot of my time.

At the Future Decoded event in London, Microsoft announced that Microsoft Hololens will be available in 29 new markets.

The new countries that are added to the list are Austria, Belgium, Bulgaria, Croatia, Cyprus, Czech Republic, Denmark, Estonia, Finland, Greece, Hungary, Iceland, Italy, Latvia, Liechtenstein, Lithuania, Luxembourg, Malta, the Netherlands, Norway, Poland, Portugal, Romania, Slovakia, Slovenia, Spain, Sweden, Switzerland and Turkey.

These join the markets where Hololens could already be purchased: the United States, Canada, Australia, France, Germany, Ireland, New Zealand, the United Kingdom and Japan. The sale of Hololens was also officially announced in China; however, it seems it still cannot be obtained in the Chinese market.

In general, this is good news. Judging by similar moves in the past, we can expect that when Hololens V2 is released to the market, it will launch in these 39 countries where Hololens V1 is already sold.

 

Happy Coding!

Greetings @ Burlington

El Bruno

References


#MixedReality – #Hololens news, a #Steam partnership, a new Halo, new devices, maybe #Minecraft VR and more!


Hi !

Today I’ll share some updates from the Mixed Reality and Hololens world. Before the big announcements at IFA, here is a nice idea related to Hololens:

LimpidArmor presented a hybrid between a helmet and a Hololens. The device integrates with cameras mounted outside a tank to give the Hololens user a 360-degree view from both optical and thermal systems without exposing the crew to additional risk.

I really like the photo, and if you want to see more pictures and some field tests, their Facebook page has some more material.


And, after this quick intro, it’s time to share some of the big Microsoft announcements from the IFA keynote in Berlin, Germany.

Microsoft Mixed Reality to get support for SteamVR, Minecraft and Halo starting this holiday season

Wow! In the references section you can read some articles with more details. I’ll try to highlight the most important topics:

  • Microsoft announced a collaboration with 343 Industries to add Mixed Reality experiences to Halo. I’m a long-time Halo fan, so this is amazing for me!
  • As part of the presentation, Microsoft also shared a long-term view on which types of devices will be required to use MR capabilities. This is important because we usually assume VR needs expensive hardware, and Microsoft took a different approach here. There will be 2 types of devices:
    • Windows Mixed Reality PCs will consist of desktops and laptops with integrated graphics. When plugged into these devices, the immersive headsets will run at 60 frames per second.
    • Windows Mixed Reality Ultra PCs will consist of desktops and laptops with discrete graphics. When plugged into these devices, the immersive headsets will run at 90 frames per second.
  • And, IMHO, the best one: Microsoft is partnering with Steam, or more precisely SteamVR. You probably know Steam; they have been investing in and working on their VR branch for a while, so Virtual Reality is not new to them. Using SteamVR you can already play several titles for Oculus Rift or HTC Vive. I’m sure the path to Mixed Reality will receive some great ideas, and maybe more, from the Steam group. Also, the chance to have some titles in the Steam Store for the new Mixed Reality devices is a very interesting approach.

So, these partnerships, new projects and devices may open the chance to enjoy Minecraft VR on the new Windows 10 Mixed Reality devices, maybe even in 4K. If you don’t know Minecraft VR, please take a look at this video.

Greetings @ Burlington

El Bruno

References

 

#Hololens – Getting Started with #MixedRealityToolkit #MRToolkit

Hi!

So, HoloToolkit is gone (until you look at the code inside the new toolkit) and now it’s time to start using the new Mixed Reality Toolkit. There are a couple of ways to do this; IMHO the best one is to import a custom package into Unity3D with all the contents of the Mixed Reality Toolkit.

I used to create and maintain my own custom packages for HoloToolkit; however, I’ll follow the guidelines and start using the official ones. We can find them in the Releases section of the GitHub repository (link). Then we import them via [Assets / Import Package / Custom Package] and we have all the assets in our project.

On the releases page:

  • We can see the status of the latest releases: known bugs, new features, fixed bugs and more.
  • We have 3 different packages to download and import into Unity3D:
    • Toolkit
    • Toolkit + Tests
    • Toolkit + Tests + Examples
  • We can also download the source code of the toolkit (if you don’t want to clone the Git repository).

If you already know how to work with the Mixed Reality Toolkit, the 1st package is all you need. On the other hand, if you want to see some examples of how to use it, the 3rd package is probably the best option. For example, it contains several samples; one of them is an interaction demo like the following animation.


Among the new features included in this version, we can find some very useful ones, such as a keyboard to be used in AR / VR applications. There is a Prefab we can use in our applications under [Assets / HoloToolkit / UI / Prefabs].


Important: this new virtual keyboard does not automatically appear when you focus or gaze at a TextBox. Some code is required to do this; I’ll write a post about it in the near future.
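
Until that post is ready, here is a minimal sketch of one way you could wire it up yourself: keep a reference to the keyboard prefab instance in the scene and activate it when the user air-taps the text field. The IInputClickHandler interface and InputClickedEventData type are the toolkit’s tap handler as I know them from HoloToolkit.Unity.InputModule; verify the exact names in the package you imported, and note that the keyboard prefab also exposes its own show/present API that a production app should prefer.

```csharp
// Minimal sketch, based on my assumptions about the imported package:
// IInputClickHandler / InputClickedEventData come from
// HoloToolkit.Unity.InputModule in the version I'm using -- verify in yours.
// Attach this to the object holding the TextBox (it needs a Collider so the
// gaze/tap can hit it) and drag the keyboard prefab instance into the field.
using HoloToolkit.Unity.InputModule;
using UnityEngine;

public class ShowKeyboardOnTap : MonoBehaviour, IInputClickHandler
{
    // The keyboard prefab instance placed in the scene (inactive by default)
    public GameObject keyboardInstance;

    public void OnInputClicked(InputClickedEventData eventData)
    {
        // Simplest possible approach: just activate the keyboard object.
        // A real app would call the keyboard's own Present/Show method instead.
        keyboardInstance.SetActive(true);
    }
}
```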

Other elements to consider are the assets that allow us to manage transitions between Scenes. In [Assets / HoloToolkit / Utilities / Prefabs] we can find a button to launch the navigation to a new Scene and also a prefab to return to the previous scene; a minimal sketch of the underlying call is shown below.
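
These prefabs presumably load scenes through Unity’s SceneManager. If you only need the basic behavior without the prefabs, a minimal sketch using standard Unity APIs looks like this (“NextScene” is a placeholder name; the target scene must be added to the Build Settings):

```csharp
// Minimal sketch: navigate to another scene from a button / tap handler.
// "NextScene" is a placeholder; the target scene must be listed in
// File > Build Settings, otherwise LoadScene will not find it.
using UnityEngine;
using UnityEngine.SceneManagement;

public class SceneNavigationSketch : MonoBehaviour
{
    public string targetScene = "NextScene";

    // Hook this method up to the navigation button's click/tap event
    public void GoToScene()
    {
        SceneManager.LoadScene(targetScene);
    }
}
```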

Finally, I have to highlight the gallery of examples included in this release. In [Assets / HoloToolkit-Examples] we can find all the samples. At first glance I noticed a couple of new ones, like the “Medical” scenes and the “Prototyping” scenes.

The 2nd one is especially interesting since it shows different ways to interact with elements in an MR world. The animation at the beginning of the post corresponds to the “CycleArray” scene.

Happy Holocoding!

Greetings @ Burlington

El Bruno

References

El Bruno, my posts

#Hololens – Goodbye #HoloToolkit, now it’s time for #MixedRealityToolkit for Unity!


Hi !

During the past few weeks I’ve been writing about some samples that create Hololens Apps using Unity3D and other assets outside of the HoloToolkit world. Now the HoloToolkit change is official and I can start to share some information about it. Long story short:

HoloToolkit is getting a big update and now it’s called Mixed Reality Toolkit.

This new version includes some big changes. I really like the official support for Visual Studio 2017 and Unity3D 2017.1, and if we take a look at the roadmap, we can see the dates when future versions of Unity3D will be supported (see references). The main architecture of a Hololens App is not changing; however, now that we have some new devices in the Mixed Reality family, the way to create Apps is a little more complex (and still a lot of fun!).


As in the previous version, we have 2 flavors of the product: the generic version focused on Mixed Reality and the specific one for Unity3D. Maybe in the near future we will also have a set of tools for URHOSharp and Unreal Engine.

I’ll write a little more about this in the near future, mostly on the big changes and new features included in this “new toolkit”.

Happy Holocoding!

Greetings @ Toronto

El Bruno

References

El Bruno, my posts

#Hololens – Tutorial to select a hologram, resize it or move it using #MRDesignLab

Hi!

Another feature we can implement using the Mixed Reality Design Labs kit for Microsoft Hololens is

Adding the ability to select a hologram, resize it, move it or remove it from a scene.

This is very similar to what we can do with the Holograms gallery app that ships with the device. The following image shows an example of this scenario.


Once we have imported the MRDesignLabs assets and added the Hololens prefab, we add a Capsule to work with. First, we add a collection to hold the Capsule. We can do this from the [HUX / Create Collection] menu.


Inside the collection we add a 3D element of type Capsule, and then edit the Collection’s properties so it works on this Capsule:

  • Node List / Size = 1
  • Drag the Capsule to the Capsule 1 / Transform property
  • Rows = 1


Now it’s time to add a couple of assets to the Capsule so we can select the hologram, resize it, move it or remove it from the scene.

In the Capsule we add the following components (a code sketch of the same setup follows the list):

  • Sphere Collider, so the element is “tangible” in the virtual world.
  • Compound Button (Script): this script handles the interactions with the associated element. Here we define the type of interaction with the Button State property.
  • Bounding Box Target (Script): this script defines when to show and hide the action menu. We can also define which actions are enabled and whether the menu is displayed in Toolbar mode.
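
If you prefer to add these components from code instead of the Inspector, a minimal sketch follows. The CompoundButton and BoundingBoxTarget class names match the scripts shown in the Inspector, but the HUX namespaces are my assumption; check them against your copy of MRDesignLabs.

```csharp
// Minimal sketch: add the same three components to the Capsule from code.
// CompoundButton / BoundingBoxTarget are the MRDesignLab scripts shown in
// the Inspector above; the HUX.* namespaces below are assumptions -- verify
// them against your imported package.
using HUX.Buttons;       // assumed location of CompoundButton
using HUX.Interaction;   // assumed location of BoundingBoxTarget
using UnityEngine;

public static class CapsuleSetupSketch
{
    public static void MakeManipulable(GameObject capsule)
    {
        // Collider so the hologram can be hit by gaze / air-tap
        if (capsule.GetComponent<Collider>() == null)
        {
            capsule.AddComponent<SphereCollider>();
        }

        // Handles the tap interaction (Button State property in the Inspector)
        capsule.AddComponent<CompoundButton>();

        // Shows the bounding box / action menu for move, resize and remove
        capsule.AddComponent<BoundingBoxTarget>();
    }
}
```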


And with this we already have a fully functional App with these elements in it. In upcoming posts I’ll cover the changes needed to customize the interaction menu.

The sample source code can be downloaded from here (link).

Happy Coding!

Greetings @ Burlington

El Bruno

References

El Bruno, my posts

#Hololens – Tutorial to use Buttons, Dialogs and more with #MRDesignLab (#HoloToolkit ++)

Hi!

Last week I wrote a post about Lunar Module, a sample App for Hololens released by the Mixed Reality Design Labs team. Besides the sample App, there are a couple of interesting Assets / Prefabs which are very useful if you are creating Hololens Apps.

So, after a question about this, I’ll write a couple of posts on how to use these assets. The first one covers a very common scenario:

Create and display a Button, and on click display a Dialog with a message text and action buttons, i.e. OK and Cancel.

This is a very common scenario in any app. The result will be something similar to the following image.


We start by cloning the Mixed Reality Design Labs repository from GitHub (check the references). There are a couple of submodules here for external tools, so it’s time to use your Git skills to get everything up and running.

Important: this package already includes a HoloToolkit version compatible with Unity3D 2017.1, so we can use the latest official Unity3D version for this sample.

I’ve created a Unity3D package named “MRDesignLabs_Unity-0.1.3-Unity-2017.unitypackage” which includes HoloToolkit, MRDesignLab and HUX; you can download it from here (link).

Let’s create an empty Unity project, clean the scene and import this package.

Important: at this point we usually add some elements from HoloToolkit to get the Hololens Camera, InputManager and more. This new package already has a Prefab that includes all these elements. It’s named Hololens and it’s located at [Assets / HUX / Prefabs / Interface].


Inside this Prefab we find a CameraRig, InputMapping and more.

I’ll add a 3D Text element to display the output of the selected dialog button.

Now, a few tutorial steps:

  • Add an empty element to host and control our buttons. Name it DialogMenus.
  • Add 2 buttons inside DialogMenus, from the prefab located at [Assets / HUX / Prefabs / Buttons / SquareButton.prefab].
  • Name them SquareButtonDiag1 and SquareButtonDiag2.


  • In each new button, the [Compound Button Text] script allows us to define the text displayed on the button.


  • Now it’s time to put these buttons in the Hololens interaction environment, I mean, in front of the Hololens user. We usually do this by setting specific positions for each element. Now, thanks to MRDesignLabs, we have new Prefabs that arrange these elements in a more developer-friendly way.
  • In [DialogMenus] let’s add a new component, [Object Collection]. This new script will help us to arrange elements in a 3D environment.


  • Set some script properties:
    • Node List / Size = 2; we will use 2 elements, our 2 buttons.
  • I’ve renamed these elements to Dialog1 and Dialog2.
    • Inside [Dialog1 / Transform], let’s drag our [SquareButtonDiag1]
    • Inside [Dialog2 / Transform], let’s drag our [SquareButtonDiag2]
  • I only want a single row of elements [Rows = 1]
  • The final property definition is similar to the following image; a rough code sketch of the same setup is shown after this list.
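
For reference, the same configuration expressed in code would look roughly like the sketch below. The ObjectCollection type, its Rows field and the UpdateCollection() call are the names as I recall them from the HUX/HoloToolkit collection scripts; treat every identifier as an assumption and verify it against the script you actually added in the Inspector.

```csharp
// Rough sketch of the same [Object Collection] setup from code.
// ObjectCollection, Rows and UpdateCollection() are assumed names from the
// HUX / HoloToolkit package -- verify them in your imported version.
using HUX.Collections;   // assumed namespace of ObjectCollection
using UnityEngine;

public class DialogMenusSetupSketch : MonoBehaviour
{
    public Transform squareButtonDiag1;
    public Transform squareButtonDiag2;

    private void Start()
    {
        var collection = GetComponent<ObjectCollection>();

        // One single row of elements, as configured in the Inspector
        collection.Rows = 1;

        // Parent both buttons under the collection so it can arrange them
        squareButtonDiag1.SetParent(transform, false);
        squareButtonDiag2.SetParent(transform, false);

        // Ask the collection to (re)arrange its elements
        collection.UpdateCollection();
    }
}
```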


Now it’s time to add some code to display the Dialogs on each button’s Click action. In [DialogMenus] let’s add a new script based on [Dialog and Menu Example]. This sample script helps us to get this interaction working.

  • We need to set the number of interactible elements to 2: [Interactibles / Size = 2]
  • In [Interactibles / Element 0], drag [SquareButtonDiag1]
  • In [Interactibles / Element 1], drag [SquareButtonDiag2]
  • Our output will be a simple dialog element [Targets / Dialog Prefab = SimpleDialogShell]
  • Fill in some other properties, like the output text.


The complete code can be downloaded from GitHub; however, there are some interesting parts of the code worth highlighting (a simplified sketch follows the list):

  • In the OnTapped event we check which element was clicked.
  • Based on its name we decide how many buttons the dialog will have.
  • In the OnClosed event, we refresh the 3D text with the selected option.
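
Here is a simplified, hypothetical version of that flow. LaunchDialog is a stand-in for the actual MRDesignLab call that opens the SimpleDialogShell prefab (check the real dialog API in the package); everything else uses only standard Unity types.

```csharp
// Simplified sketch of the flow in the [Dialog and Menu Example] script.
// LaunchDialog below is a hypothetical placeholder for the MRDesignLab call
// that opens the SimpleDialogShell prefab -- check the real dialog API in
// the package. Everything else is standard Unity.
using UnityEngine;

public class DialogFlowSketch : MonoBehaviour
{
    public GameObject dialogPrefab;   // e.g. the SimpleDialogShell prefab
    public TextMesh resultText;       // 3D Text that shows the selected option

    // Wired to the buttons' tap event (the Interactibles list in the Inspector)
    public void OnTapped(GameObject tappedButton)
    {
        // Based on the clicked element's name, decide how many buttons to show
        bool showOkAndCancel = tappedButton.name == "SquareButtonDiag2";
        LaunchDialog(showOkAndCancel);
    }

    // Called when the dialog closes, with the name of the pressed button
    private void OnClosed(string pressedButton)
    {
        // Refresh the 3D text with the selected option
        resultText.text = "Dialog result: " + pressedButton;
    }

    private void LaunchDialog(bool showOkAndCancel)
    {
        // Placeholder: here the real script instantiates dialogPrefab with
        // OK (or OK + Cancel) buttons and subscribes OnClosed to its close event.
        Debug.Log("Open dialog with " + (showOkAndCancel ? "OK / Cancel" : "OK"));
        OnClosed(showOkAndCancel ? "Cancel" : "OK"); // simulate a close for the sketch
    }
}
```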

Full Source Code (link)
Happy Coding!

Greetings @ Mississauga

El Bruno

References

El Bruno, my posts

#Hololens – Lunar Module, new sample App with some very cool PreFabs for motion controllers


Hi !

Last week I spent a lot of time between Cognitive Services and UWP. Among all this AI stuff, I also put in some time to try, test and understand one of the latest samples available for Hololens developers in the Holographic Academy: Lunar Module.

The sample is very complete. The main goal is to recreate a Lunar Module game where the objective is to help the Lunar Module land safely on the Moon. We can use hand gestures or an Xbox One Controller to control the lunar module. The HoloApp also scans the environment to find a “landing plane”.

So, the cool part is to go to the GitHub repository and review some of the assets included in the sample. I will highlight 2 of them.

Simple Menu Collection. Finally we have an easy way to create 2D interactive menus to be used in our Hololens Apps.


Hand Coach, another cool one! Every time we create an App for Hololens, there is a chance we’ll want to include a “how to use this app” tutorial. This Prefab has some animations showing how a hand can interact with holograms using gestures in the Hololens world.


Happy Unity3D coding!

Greetings @ Burlington

El Bruno

References

El Bruno, my posts

#Opinion – Some news on #Hololens V2, HPU V2 and how #Microsoft chose the hardware path and builds its own chips

Hi !

I was planning to write this post yesterday; however, a Canadian wasp decided it was better to leave me almost immobilized by attacking my foot, forcing me to plan my agenda differently.

Well, Marc Pollefeys (Director of Science on the Hololens team) shared some information about the new version of Hololens. Until the code name is made public, I will refer to the new device as Hololens 2. What he tells us is a simple and powerful message:

The new HPU chip included in Hololens 2 will have Deep Neural Networks capabilities. (It means Artificial Intelligence!)

Let’s not forget that this is not new for Microsoft, but let’s also keep in mind that Microsoft is not dedicated to designing and building chips the way Intel or AMD are. For years now, Microsoft has been investing in R&D for a new generation of chips, which are currently used mostly in Azure data centers. In fact, it all started back in 2012, when Doug Burger presented a risky bet to Steve Ballmer: Project Catapult.

Doug told Steve that, in the near future, the Internet would be controlled by a handful of companies providing essential services for users. This “new Internet” would require a different architecture as a base platform. If Microsoft wanted to be part of this “new Internet”, they should not only build the OS and the software, but also take care of the server hardware, manage the networks and more. It seems that at this moment Steve Ballmer’s face turned into that of a Gears of War bad boss, his eyes all red, and he responded with “I thought this would be a research meeting, not a strategy one.”

Note: I have been fortunate to meet Steve Ballmer face to face, and the energy he has is impressive. I have only seen him in happy and animated mode, so I imagine that a 1:1 in discussion mode must require special skills to move the conversation forward.

Then Qi Lu (who was in charge of Bing) was brought into the discussion. It seems that Qi Lu had a similar idea in his head: the need to build reprogrammable chips, allowing much faster upgrades than the hardware running at that time.

And there was more: the Bing team had already started to work on this, and from here we began to see the term FPGA (field-programmable gate array) much more often in some areas. That’s it from me on this part of the story; it is quite interesting and I recommend reading the Wired article (see references).

Let’s go back to 2017 with Hololens 2 and the new HPU 2 (HPU: Holographic Processing Unit). The task performed by the HPU in Hololens version 1 is to coordinate, analyze and present a coherent result from the information obtained by all of the device’s sensors. In other words:

The HPU merges information from different sources: motion sensors, cameras, depth sensors and infrared sensors. With all this information, the HPU can determine our position in the space around us, and with that, the holographic projectors can determine how and where to place the holograms projected into our field of vision.

To this day, this type of processing is unique to the Hololens. Combined with a GPU, a CPU and a battery, it allows Microsoft to offer a 100% autonomous, untethered device: Microsoft Hololens 😀

Update: Thanks @AlexDrenea for some typos help here!

Now, what would happen if this processor also had some kind of DNN capability? Some blogs call it an “AI co-processor”, and we can imagine it helping with tasks such as voice recognition, face detection, shape detection, image analysis and more. The first thing they presented during CVPR17 is how these new capabilities can be used to improve Hololens’s hand tracking and hand gesture capabilities. This is the demo recorded by a conference attendee:


Now it’s time to think about what we can do with a device that does not have to constantly send all this information to the cloud; many of these tasks will be done locally. This will allow more fluid applications, much more natural interactions and a couple of other interesting surprises.

What is certain is that 2018 will be a year where we see what’s new in Hololens 2, and surely we will have many interesting surprises along the way.

Greetings @ Burlington

El Bruno

References

PS: This is my foot 12 hours after the “Bruno vs The Wasp” moment


#Hololens – How to improve the App deployment time from VS2017 to Hololens using USB

Hi !

Another post with some questions noted during the Hololens Tour (link). During the session, I shared some of my “lessons learned”, and one of them was related to application deployment scenarios when working with Microsoft Hololens and Visual Studio 2017.

My main recommendation was to use a (long) USB cable connected to the PC running Visual Studio to deploy and debug Apps on the Hololens. This way of working is almost 3 times faster than deploying over a WiFi network (in the tests below, 14.276 s over WiFi vs. 5.232 s over USB, roughly 2.7×).

The following image displays the amount of time consumed during a deploy process using WiFi as the connection channel. For this sample I used a simple “Hello World” demo app: only a cube in front of the user’s PoV and some HoloToolkit prefabs to save some code and time.


The final time for the complete deploy process from Visual Studio 2017 to Hololens is 0:00:14.276.

The next image displays the same scenario with a different deployment channel, changed from “Remote Machine” to “Device”. For this, I connected my Hololens device to my Visual Studio 2017 PC using a USB cable.

In this scenario the time is 0:00:05.232.


In both scenarios:

  • I cleaned up the device, so the App is deployed from scratch.
  • Before the deploy, the app solution was generated from Unity3D, and then the build and deploy started from Visual Studio.

There is another option to deploy an App: use the Device Developer Portal and upload a UWP App package directly to the device. I do not cover this option in this post.

Finally, in the Visual Studio Dev Center there is a good article covering all the necessary steps for these deploy operations with Visual Studio 2017 and Hololens (link). It walks you step by step through everything you need when deploying and debugging with VS2017 and Microsoft Hololens.

Greetings @ Toronto

El Bruno

References

El Bruno, my posts

#Hololens – How To save a #3D model of the environment around the device

Hi !

During the Hololens tour, I got a couple of questions and I promised to answer them later. Here is one of them:

Can we save the surrounding environment of the Hololens in a 3D model and work with this model later?

The answer is YES, and now I’ll share the necessary steps to do this.

We need to access the Hololens Developer Portal of our device and navigate to the “3D View” option.


When we click on the [Update] button we will see a 3D model of the current environment scanned by the device.


We can use the mouse and keyboard to “navigate” this view and inspect details of the environment.


Important: the Hololens device is always scanning the environment and updating the mesh of this model. If we press the [Update] button again, we will see the latest changes to the model. Also, the model is not restricted to “only one room”; we can work with complex models that include several rooms. The next image is a photo and the 3D view of a corner of my home office.


We can export this scene model to a file in OBJ format, and then import and work with it using several tools like Unity 3D, Paint 3D or 3D Builder. The next screenshot shows my office while I polish the model in 3D Builder.
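
As a side note, the OBJ format itself is plain text, which makes it easy to work with meshes from code. If you ever need to go the other way and write a mesh you already have in Unity out as OBJ, a minimal sketch using only standard Unity / .NET APIs could look like this (a real exporter would also handle normals, UVs and Unity’s left-handed coordinates):

```csharp
// Minimal sketch: write a Unity Mesh out as a Wavefront OBJ file.
// Uses only standard Unity / .NET APIs; normals, UVs, coordinate handedness
// and error handling are left out to keep the sketch short.
using System.IO;
using System.Text;
using UnityEngine;

public static class ObjExportSketch
{
    public static void Save(Mesh mesh, string path)
    {
        var sb = new StringBuilder();

        // One "v x y z" line per vertex
        foreach (Vector3 v in mesh.vertices)
        {
            sb.AppendLine(string.Format("v {0} {1} {2}", v.x, v.y, v.z));
        }

        // One "f a b c" line per triangle (OBJ indices are 1-based)
        int[] t = mesh.triangles;
        for (int i = 0; i < t.Length; i += 3)
        {
            sb.AppendLine(string.Format("f {0} {1} {2}", t[i] + 1, t[i + 1] + 1, t[i + 2] + 1));
        }

        File.WriteAllText(path, sb.ToString());
    }
}
```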


Greetings @ Toronto

El Bruno

References

El Bruno, my posts