#Event – Materials used on “Lessons learned building #Hololens 3D apps from a 2D app developer”

 

Hi!

Yesterday I was lucky to be invited to participate in an event, where I shared my experience learning how to develop apps for Hololens (link). I was still kind of sick, so my voice was not as good as I wanted; however, I think the ideas were there :D.

As always, here are the materials from the event.

The UrhoSharp sample apps are available here (link).

And these are the links I talked about during the event:

Greetings @ Toronto

El Bruno

#Hololens – Moving and rotating holograms using an #XBoxOne Controller

Hello!

Now that I have connected the Xbox One controller to the Hololens and written about how to use some code to work with it, today’s post will show some of the features of the HoloToolkit class “Game Controller Manipulator“.

This class allows us to do several things without having to add a single line of code:

  • Select and move a hologram within the Hololens user’s point of view
  • Rotate a hologram on its 3 axes

Here is an example of this class in use.

(animation: moving and rotating a hologram with the Xbox One controller)

Important: I found an issue that affects the behavior in a final app: “ControllerTriggerAxis” must be defined as a new input element for the Game Controller script to use. The steps below already include this fix.

We start with the basics, following these steps:

  • Create a 3D project in Unity3D
  • Configure project to support HoloLens projects
  • Clean Scene elements
  • Import HoloToolkit package
  • Add
    • HololensCamera
    • SpatialMapping
      • Uncheck the property “Draw Visual Meshes”
    • DefaultCursor
    • InputManager
  • Add Empty element, Managers
    • Add existing scripts
      • Game Controller Manipulator
      • Check the option “Move Gaze Target”
  • Add Empty element, HoloCollection
    • Add 3D elements
      • Cube
        • Position, x:0 y:0 z:2
        • Rotation, x:0 y:0 z:0
        • Scale, x:0.3 y:0.3 z:0.3
  • Edit – project settings – Input
    • Add element
      • Name: ControllerLeftStickX
      • Gravity: 0
      • Dead: 0.19
      • Sensitivity: 1
      • Type: Joystick Axis
      • Axis: X Axis
    • Add element
      • Name: ControllerLeftStickY
      • Gravity: 0
      • Dead: 0.19
      • Sensitivity: 1
      • Type: Joystick Axis
      • Axis: Y Axis
    • Add element
      • Name: ControllerTriggerAxis
      • Gravity: 0
      • Dead: 0.19
      • Sensitivity: 1
      • Type: Joystick Axis
      • Axis: Y Axis

The configuration of the Cube would look similar to the following image:

(image: Cube configuration in the Unity Inspector)

The input section would be similar to the following image:

(image: custom entries in the Input settings)
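
To see how these named entries are consumed at runtime, here is a minimal sketch of my own (not part of the HoloToolkit script) that reads the axes we just defined through Unity’s Input API:

using UnityEngine;

public class ControllerAxisDebug : MonoBehaviour
{
  void Update()
  {
    // These names must match the entries added in Edit - Project Settings - Input
    var stickX = Input.GetAxis("ControllerLeftStickX");
    var stickY = Input.GetAxis("ControllerLeftStickY");
    var trigger = Input.GetAxis("ControllerTriggerAxis");

    Debug.Log(string.Format("X: {0} Y: {1} Trigger: {2}", stickX, stickY, trigger));
  }
}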

Now let’s take a quick look at the Game Controller Manipulator script. The first thing we see is that it has the following properties:

(image: Game Controller Manipulator properties)

In these properties, we use the custom values we added in the project’s Input settings. The last 2 properties are also important:

  • “Rotate modifier button name” indicates the name of a button on the Xbox One controller. “Fire2” corresponds to the B button. While we hold it and move the joystick on the controller, the hologram pointed to by the Gaze will rotate on its X, Y and Z axes.
  • If we select “Move Gaze Target”, pressing “Fire1” (the A button on the controller) lets us move the selected hologram using the joystick.

In both scenarios, the properties of the class define which Input elements are used to manipulate the target object.

Finally, note that this class can be used in 2 ways:

  • If the option “Move Gaze Target” is selected, the movement and rotation apply to the hologram the Gaze is pointing at.
  • If the script is attached to an element and “Move Gaze Target” is not selected, the move and rotate actions are applied to that element.

Greetings @ Toronto

El Bruno

References

#Hololens – #HoloToolkit compiled packages for #Unity3D in #GitHub

Hello!

A while ago I wrote about how I use HoloToolkit in my projects: I export the latest version of HoloToolkit-Unity as a Unity 3D package and then import it into my Unity 3D projects (link); a small sketch of how that export can be automated follows below.

This week I created a repo on GitHub where I will begin to publish the compiled packages: HoloToolkit-Unity-Packages (link).

In each compiled package I include:

  • HoloToolkit
  • HoloToolkit-Examples
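
For reference, here is a minimal sketch of an editor script that could automate the export; the folder and package names are assumptions based on a standard HoloToolkit-Unity checkout:

using UnityEditor;

public static class HoloToolkitPackageExporter
{
  [MenuItem("Tools/Export HoloToolkit Package")]
  public static void Export()
  {
    // Folders to bundle into the package
    var assetPaths = new[] { "Assets/HoloToolkit", "Assets/HoloToolkit-Examples" };

    // Recurse so every asset below these folders is included
    AssetDatabase.ExportPackage(
      assetPaths, "HoloToolkit-Unity.unitypackage", ExportPackageOptions.Recurse);
  }
}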


This seems like a good way to share or update HoloToolkit if you work on several devices and with a distributed team. I will comment on the experience in a couple of months.

Greetings @ Toronto

El Bruno

References

#Hololens – Detect user hand interactions using #HoloToolkit (update!)

Hello!

During the last few months, HoloToolkit has changed a lot. These changes have invalidated some of the examples I wrote in earlier posts. In today’s post, I’ll quickly explain how to implement hand detection with the current version of HoloToolkit.

We start with the basics, following these steps:

 

  1. Create a 3D project in Unity3D
  2. Configure project to support HoloLens projects
  3. Clean Scene elements
  4. Import HoloToolkit package
  5. Add
    1. HololensCamera
    2. SpatialMapping
    3. CursorWithFeedback
  6. Add Empty element, Managers
    1. Add existing scripts
      1. Gaze Manager
      2. Gaze Stabilizer
      3. Input Manager
  7. Add Empty element, HoloCollection
    1. Add 3D elements
      1. Cube
        1. Position, x:0 y:0 z:2
        2. Rotation, x:0 y:0 z:0
        3. Scale, x:0.3 y:0.3 z:0.3
      2. AnchorText (from HoloToolkit)
        1. Position, x:0 y:0.35 z:2
        2. Rotation, x:0 y:0 z:0
        3. Scale, x:0.3 y:0.3 z:0.3
  8. Into the Manager collection
    1. Add the Text Debug Manager script (shown later in this post)
      1. Drag the Anchor Text (7.1.2) into the Anchor Debug Text Property

This is the process to create a basic Hololens project. Now let’s create a script that detects the user’s hand actions. The script will display these actions in the AnchorText I added to the hologram collection, as a simple debug display.

For this example, I will call the script “TextDebugManager.cs“. The code is as follows:

 

using HoloToolkit.Unity.InputModule;
using UnityEngine;

public class TextDebugManager : MonoBehaviour, IHoldHandler, IInputHandler
{
 // TextMesh used to display the last hold / input events on the AnchorText hologram
 public TextMesh AnchorDebugText;
 private string _debugTextHold = "";
 private string _debugTextInput = "";
 
 void Update()
 {
   UpdateText();
 }

 private void UpdateText()
 {
   if (AnchorDebugText != null)
     AnchorDebugText.text = string.Format(
       "Hold: {0}\nInput: {1}", _debugTextHold, _debugTextInput);
 }

 public void OnHoldStarted(HoldEventData eventData)
 {
   _debugTextHold = "OnHoldStarted";
 }

 public void OnHoldCompleted(HoldEventData eventData)
 {
   _debugTextHold = "OnHoldCompleted";
 }

 public void OnHoldCanceled(HoldEventData eventData)
 {
   _debugTextHold = "OnHoldCanceled";
 }

 public void OnInputUp(InputEventData eventData)
 {
   _debugTextInput = "OnInputUp";
 }

 public void OnInputDown(InputEventData eventData)
 {
   _debugTextInput = "OnInputDown";
 }
}

Within the class, I implemented the “IHoldHandler“ and “IInputHandler“ interfaces. The values raised by those interface operations are then displayed from the Update() of the script. In this way, we can quickly implement a simple debug display for Hololens and, in this case, capture the user’s interactions with the holograms.

Greetings @ Toronto

El Bruno

References

#Hololens – How to detect AirTap and Click actions using #HoloToolkit (updated!)

Hello!

During the last few months, HoloToolkit has evolved a lot. Some of the changes in the Toolkit have invalidated some of my blog samples; for example, basic actions such as detecting an AirTap. In today’s post, I’ll quickly explain how to implement the AirTap or Click.

We start with the basics, following these steps:

  • Create a 3D project in Unity3D
  • Configure project to support HoloLens projects
  • Clean Scene elements
  • Import HoloToolkit package
  • Add
    • HololensCamera
    • SpatialMapping
    • CursorWithFeedback
  • Add Empty element, Managers
    • Add existing scripts
      • Gaze Manager
      • Gaze Stabilizer
      • Input Manager
  • Add Empty element, HoloCollection
    • Add 3D elements
      • Cube
        • Position, x:0 y:0 z:2
        • Rotation, x:0 y:0 z:0
        • Scale, x:0.3 y:0.3 z:0.3

This is the basic process to create a Hololens project in Unity3D. Now let’s attach a script to the Cube that detects some user actions, such as the AirTap. For this example, I will call the script “CubeManager.cs“. This is the source code of the script:

using HoloToolkit.Unity.InputModule;
using UnityEngine;

public class CubeManager : MonoBehaviour, IInputClickHandler, IInputHandler
{
  public void OnInputClicked(InputClickedEventData eventData)
  {
    // AirTap code goes here
  }

  public void OnInputDown(InputEventData eventData)
  {
    // Raised when the finger / clicker is pressed down
  }

  public void OnInputUp(InputEventData eventData)
  {
    // Raised when the finger / clicker is released
  }
}

To capture AirTap actions, we need to implement the “IInputClickHandler“ interface. In the OnInputClicked(InputClickedEventData eventData) method, we can perform actions when the user AirTaps on the element. In the same way, if we implement the “IInputHandler“ interface, we can capture the “Click/Tap Down” and “Click/Tap Up” events on an element of our app.

Note: Besides the AirTap action, this example also handles the Click action of the Hololens clicker.
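
As a quick illustration (my own example, not part of the original sample), OnInputClicked could change the cube’s color on every AirTap or Click:

using HoloToolkit.Unity.InputModule;
using UnityEngine;

public class CubeManager : MonoBehaviour, IInputClickHandler
{
  public void OnInputClicked(InputClickedEventData eventData)
  {
    // Assign a random color to the cube's material on each AirTap / Click
    GetComponent<Renderer>().material.color =
      new Color(Random.value, Random.value, Random.value);
  }
}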

IMHO, this way of working with events through interfaces is much cleaner, although there is still a way to go. In my next post, I will show how to work with hands and Hololens; for that sample, the approach is not interface driven.

The sample code can be downloaded from GitHub (link)

Greetings @ Toronto

El Bruno

References

Opinion – #Hololens, thanks to #Kinect, #SurfacePro, #Windows10 #Mojang and #Azure

Hi!

A few months ago I wrote a post where I shared my personal view on how, in the not-too-distant future, most of the apps we create will be for 3D environments (link). To give the idea a little more backing, I wrote another post where I compared the history of the mouse with the “current moment of 3D environments” (link).

(image: Hackerman)

Note: Hackerman was ahead of his time.

For me, one of the important points right now is seeing how Microsoft has looked for hardware partners beyond its own Hololens to continue maturing the Windows Holographic platform. Take, for example, the case of Acer and their “Mixed Reality Development Edition” (link), or the brand new “Windows Mixed Reality Portal” included in the latest version of Windows 10 Insiders.


There is a very large group of people behind all this innovation. However, one of the best-known public faces is Alex Kipman (@akipman), popularly known as the “father of Hololens“. In his view, the Augmented Reality revolution will be even bigger than the smartphone revolution. The following quote sums up his vision of the future:

“The potential of these devices,” he said, is that they could one day “replace your phones, TVs, and all these screens.” Once your apps, videos, information, and even social life are projected into your line of sight, you won’t need any other screen-based gadgetry. Kipman calls it the “natural conclusion” of mixed reality.

Note: The full article in Business Insider is almost mandatory reading.

The interesting thing about this change, in my opinion, is that it is not a product/platform/technology focused only on the enterprise, or only on consumers and entertainment. The shift to AR/MR affects every aspect of our way of life. Once we begin to interact with virtual environments, we will see them at home and at work, and little by little they will become part of our day to day.

But well, I don’t want to write about my vision of the future; I have already done so. What I want in this case is to highlight the words of Alex Kipman, who gives thanks to the vision and support of Satya Nadella. He stresses how, thanks to the effort of and support for different teams over the years, Microsoft today has all the tools necessary to create a mixed reality platform and a device like Microsoft Hololens.

 

The teams that have allowed Mixed Reality to get this far are:

  • Azure, a must-have for all the hologram calculations and model processing
  • The Windows team: Microsoft needed a paradigm change in its vision of an OS, and Windows 10 was the tool that made it happen
  • The Microsoft Surface product team, which has been instrumental in the design and development of the Hololens hardware
  • Mojang, yes, the Minecraft guys, who have provided clarity and experience to improve the user experience
  • Microsoft Kinect: movement recognition, spatial perception, and other Kinect technologies have made it possible to bring these capabilities to Microsoft Hololens
  • And more.

Note: Many people think that Kinect was a failure as a program and a product. I totally disagree with this. Personally, I think the device did not work as a consumer product, even if 29 million units were sold. Anyway, the Kinect experience has left a lot of knowledge available for building new ideas, for example Microsoft Hololens.

Once you understand how Microsoft Hololens works (link), you also realize the number of resources and capabilities required to bring forward a concept such as Mixed Reality. It is also at that moment that you begin to understand the best scenarios where this technology can be applied.

If you have not seen it yet, this video summarizes the concept of Mixed Reality perfectly.

 

 

Greetings @ Toronto

El Bruno

References

 

#Review – #Hololens, #hardware and how the hologram process works!


Hello!

I had this post in draft mode for a while, and just today @Rfog asked me on Twitter about the speed (FPS) of the Hololens. So I’ve tweaked the post a bit to comment on how Hololens apps work.

Hololens Device

Let’s start from the beginning. Hololens has everything we can find in a standard Windows 10 device:

  • 579 grams
  • Micro USB port for charging
  • 2 to 3 hours of battery life
  • 32-bit Intel chip
  • 2GB RAM (with an additional 1GB of RAM for the HPU)
  • 64GB Flash storage
  • 2MP front camera
  • Video recording at 30FPS – 720p
  • Bluetooth 4.0 and WiFi

On the HPU, the best option is to quote what was shared in The Register:

HPU is a TSMC-fabricated 28nm coprocessor that has 24 Tensilica DSP cores. It has around 65 million logic gates, 8MB of SRAM, and an additional layer of 1GB of low-power DDR3 RAM. That RAM is separate to the 1GB that’s available for the Intel Atom Cherry Trail processor, and the HPU itself can handle around a trillion calculations per second.

Tom Warren, from The Verge, had exclusive access to a piece-by-piece teardown of the Hololens. If anyone is interested in more details, the following 2 minutes are essential.

By the way, after seeing this, there is no doubt that the Hololens is a work of art!

Hololens Holographic Features

And now let’s talk about some of the holographic capabilities of the device. One of the most frequent complaints is the small size of the FOV. FOV stands for Field of View, the area in which the user sees holograms. Here, for each eye, we have 720p resolution, that is, 1268 × 720.

The main difference between the Hololens and devices dedicated to Virtual Reality is that this second group of devices needs to run at 90 FPS or more to create a “realistic” experience. At lower FPS, symptoms such as dizziness, disconnection, etc. appear.

The Hololens scenario is different. Since the user sees reality through the lenses, he is always connected to reality; holograms are projected onto the lenses, which is why the device avoids these feelings of dizziness or loss of connection. And this is why it is possible to lower the FPS for apps: Microsoft’s recommendation is that Hololens apps run at 60 FPS. Here we must also understand a little about how the Hololens works to understand this concept.

HoloLens continuously calculates the position and orientation of the user’s head in relation to its surroundings. When an app begins to prepare the next frame, Hololens predicts where the user’s head will be at the exact moment that frame is displayed on the screens. Based on this prediction, the system calculates the view and projection for that frame. This is where the HPU comes into play, since it is responsible for all this work.

Holograms Interaction Distance

Another important detail is the distance at which holograms are projected and displayed. In the Design Guidelines and Principles for Mixed Reality (see references) we find a nice and very detailed explanation:

Interaction with holograms offers the best experience between 1.25 m and 5 m.

 

(image: hologram placement distances)

2 meters is the optimal distance, and the experience degrades if we get closer than 1 m. At shorter distances, we will not see the holograms, or we will see them “cut”. The design recommendations suggest using fade-out or clipping techniques in these scenarios.
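
As an illustration of that guideline (a sketch of my own, not from the official guide), an app could clamp the distance at which it places a hologram along the user’s gaze to the recommended range:

using UnityEngine;

public class HologramPlacer : MonoBehaviour
{
  // Desired placement distance, e.g. coming from a spatial mapping hit
  public float DesiredDistance = 2.0f;

  void Update()
  {
    var head = Camera.main.transform;

    // Clamp to the recommended 1.25 m - 5 m interaction range
    var distance = Mathf.Clamp(DesiredDistance, 1.25f, 5.0f);
    transform.position = head.position + head.forward * distance;
  }
}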

Hololens Audio

Audio time! This is a very powerful topic in Windows Holographic, and one that is not mentioned very often.

In HoloLens, there is an audio engine that completes the mixed reality experience by simulating 3D sound, taking into account the environment, distance and direction. This is known as Spatial Sound.


When we use Spatial Sound in an app, it lets us developers place sounds in a 3-dimensional space all around the Hololens user. Sounds will then appear as if they came from real physical objects or from the mixed reality holograms in the user’s environment. I personally think Spatial Sound helps to create a much more credible and immersive experience.
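
As a rough illustration in Unity (assuming the Microsoft HRTF Spatializer plugin is selected under Edit - Project Settings - Audio), enabling Spatial Sound on an AudioSource from code could look like this:

using UnityEngine;

public class SpatialSoundSetup : MonoBehaviour
{
  void Start()
  {
    var audioSource = GetComponent<AudioSource>();

    // Route this source through the selected spatializer plugin
    audioSource.spatialize = true;

    // 1.0 = fully 3D, so distance and direction affect the sound
    audioSource.spatialBlend = 1.0f;

    audioSource.Play();
  }
}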

This is an interesting topic to read about and learn. By analysing how sound reaches each of our ears, our brain determines the distance and direction of the object emitting it. HRTFs, or Head Related Transfer Functions, allow us to simulate this interaction, characterizing how an ear receives sound from a point in space. Spatial Sound uses custom HRTFs to extend the holographic mixed world experience and simulate sounds coming from different directions and distances.

Greetings @ Toronto

El Bruno

References

#Hololens – Fire Buttons actions with a #XBoxOne Controller

Hello!

Yesterday I wrote about how to connect an Xbox One controller to a Hololens. Today I’ll share a bit of code showing how to interact with the device. The example is simple:

Once the Spatial Mapping capability is added to the project, we will toggle the display of the visual mesh when the A button is pressed on the Xbox One controller.

For example:

 

(animation: toggling the Spatial Mapping visual mesh with the A button)

We start with the basics, creating an empty project in Unity3D and following these steps:

  • Configure project to support HoloLens projects
  • Clean Scene elements
  • Import HoloToolkit package
  • Add
    • HololensCamera
    • SpatialMapping
    • CursorWithFeedback
  • Add Empty element, Managers
    • Add existing scripts
      • Gaze Manager
      • Gaze Stabilizer
      • Input Manager
    • Add new empty script
      • XboxControllerManager

After completing these steps, the project will look similar to the following:

(image: project hierarchy)

Now let’s look at the code required for the “XboxControllerManager“ script. We will work in Update(): check the pressed state of the A button, and assign that state to the DrawVisualMeshes property of the active Spatial Mapping instance.

(image: XboxControllerManager script code)
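
Since the original code was shared as an image, here is a minimal sketch of what the script might look like; the exact button name (“joystick button 0” for the A button) and the SpatialMappingManager namespace are assumptions on my side:

using HoloToolkit.Unity;
using UnityEngine;

public class XboxControllerManager : MonoBehaviour
{
  void Update()
  {
    // "joystick button 0" maps to the A button on an Xbox One controller on Windows
    var buttonAPressed = Input.GetKey("joystick button 0");

    // Draw the Spatial Mapping mesh only while A is held down
    if (SpatialMappingManager.Instance != null)
      SpatialMappingManager.Instance.DrawVisualMeshes = buttonAPressed;
  }
}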

Done! This is good enough for a sample, although it is worth studying the concepts Unity uses for “Inputs” (see references). In the project properties, we can see the default input settings for 3D projects in Unity. The main class used to access the state of the controller buttons is UnityEngine.Input.

In the next post, when working with a HoloToolkit class for the Xbox controller, I will comment on the changes needed to make it work properly.

The example source code can be downloaded from (link).

Greetings @ Toronto

El Bruno

References

#Hololens – Windows 10, Xbox One Controller, Bluetooth and some lessons learned

Hello!

Since Windows 10 Anniversary Edition, we can connect an Xbox One controller wirelessly to a computer and use it. I’m not a gamer, so the couple of times I have used one, it has always been for very useful scenarios. For example: controlling a USB missile launcher with an Xbox One controller (link).

And what better way to show this than a video?

 

Now came the time when I needed to connect a controller wirelessly. So, I followed the steps suggested by common sense. First, go to Settings and search for “Connect wireless Xbox One controller”.


Then, in the Bluetooth section, I tried to add my device.


But nothing… I had the controller in “discovery” mode, but I found no way to connect it. I thought maybe I needed an update, so I went back to the old-school steps: I connected the controller with a USB cable and installed the “Xbox Accessories” app.


Then I checked whether the device needed a firmware upgrade. In this case, the device was up to date.


I went back and tried to connect the device via Bluetooth again, and nothing. The problem began to get interesting, so I started to browse around to see what could be going on.

After a while of browsing and reading, I went back to the support page for Xbox controller connection problems (link). There it is made very clear that there are 2 models of Xbox controllers, the “new ones” and the “old ones”. As logic dictates, the old ones do not support the wireless connection, and the new ones do.


As you can see in the missile launcher video, I have a couple of “old” controllers. Lucky me, when I bought the Xbox One S a few months ago, it came with a “new” controller. So I made a swap with Valentino (my son) and could move ahead.


Now the important part: connecting the Xbox controller to the Hololens. This is quite simple; we just have to go to Settings, Bluetooth, add device.


In the list of devices, we’ll see the Xbox controller. We press the Pair button.


And a few seconds later, the controller is connected to our Hololens.


From here, I can get back to writing about Unity 3D and Hololens with the Xbox controller!

Greetings @ Toronto

El Bruno

References

#Event – March 22, Lessons learned building #Hololens 3D apps from a 2D app developer


Hello!

After a year in Canada, it’s time to participate more actively in the technical communities. In this case, the amazing team of the Canadian Technology Triangle .NET User Group has given me the chance to speak a bit about Hololens development.

The event will be on March 22 in Kitchener at 18:30, and you can read the details here (link).

Greetings @ Toronto

El Bruno