#Windows10 – Windows #VisionSkills sample UWP App

Hi!

Yesterday the Windows Team announced the preview version of Windows Vision Skills. So today I was browsing the samples on GitHub and created a simplified version of the skeleton tracker using a live feed from a webcam.

Here are some notes about my GitHub sample:

  • The UWP App must target Windows 10 version 1809
  • I added the NuGet packages [Microsoft.AI.Skills.Vision.SkeletalDetectorPreview] and [Microsoft.Toolkit.Uwp.UI.Controls]
  • The MainView uses the CameraPreview control from the [Microsoft.Toolkit.Uwp.UI.Controls] toolkit.
  • Each frame is processed, and I use a SkeletalDetectorBinding to detect skeletons / bodies
  • The core detection is performed here:
        private async Task RunSkillAsync(VideoFrame frame, bool isStream)
        {
            m_evalPerfStopwatch.Restart();

            // Update input image and run the skill against it
            await m_skeletalDetectorBinding.SetInputImageAsync(frame);
            await m_skeletalDetectorSkill.EvaluateAsync(m_skeletalDetectorBinding);

            m_evalPerfStopwatch.Stop();
            m_skeletalDetectionRunTime = m_evalPerfStopwatch.ElapsedMilliseconds;

            // Update the UI (renderer and output text) on the dispatcher thread
            await Dispatcher.RunAsync(Windows.UI.Core.CoreDispatcherPriority.Normal, () =>
            {
                m_bodyRenderer.Update(m_skeletalDetectorBinding.Bodies, !isStream);
                m_bodyRenderer.IsVisible = true;
                UISkillOutputDetails.Text = $"Found {m_skeletalDetectorBinding.Bodies.Count} bodies (took {m_skeletalDetectionRunTime} ms)";
            });
        }
  • There is also a BodyRenderer.cs class used to draw the skeletons on top of the CameraPreview image control. It draws lines on an empty Canvas.

You can download the sample code from here https://github.com/elbruno/Blog/tree/master/20190501%20VisionSkills%20Skeleton%20Sample

Greetings @ Burlington

El Bruno

References


#Windows10 – Windows Vision Skills (Preview), an amazing set of AI APIs to run in the edge!

Hi!

Today’s announcement is a big one if you are interested in moving AI capabilities to the edge. The Windows team made public the preview of the Windows Vision Skills framework:

Windows Vision Skills framework is meant to standardize the way AI and CV is put to use within a WinRT application running on the edge. It aims to abstract away the complexity of AI and CV techniques by simply defining the concept of skills which are modular pieces of code that process input and produce output. The implementation that contains the complex details is encapsulated by an extensible WinRT API that inherits the base class present in this namespace, which leverages built-in Windows primitives which in-turn eases interop with built-in acceleration frameworks or external 3rd party ones.
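The skill pattern the quote describes (a modular unit that binds an input, evaluates, and exposes output on the binding) can be sketched in a few lines. This is an illustrative Python sketch of the concept only, not the real WinRT API; the class and method names are hypothetical:

```python
# Hypothetical sketch of the "skill" concept: input goes into a binding,
# the skill evaluates the binding, results come back out of the binding.
class SkeletalDetectorBinding:
    def __init__(self):
        self.frame = None
        self.bodies = []

    def set_input_image(self, frame):
        self.frame = frame


class SkeletalDetectorSkill:
    def create_binding(self):
        return SkeletalDetectorBinding()

    def evaluate(self, binding):
        # A real skill runs CV/AI inference here; this fake reports one
        # body per non-empty frame, just to show the data flow.
        binding.bodies = [{"joints": []}] if binding.frame else []


skill = SkeletalDetectorSkill()
binding = skill.create_binding()
binding.set_input_image("frame-0")
skill.evaluate(binding)
print(len(binding.bodies))
```

The encapsulation is the point: the app only sees the binding's input/output surface, while the complex inference details stay hidden inside the skill.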

The official blog post explains the basic features of the framework and describes a set of scenarios like Object Detector, Skeletal Detector, and Emotion Recognizer.

There are UWP sample apps in the repo, and it only took a minute to set everything up and get the app running. In the following image, it smoothly detects a person and a chair.

The next image is the sample for the Skeletal Detector (as an old Kinect dev, this really makes me happy!)

This is a big announcement, because all of these APIs are native, and that means we can easily use them in our own apps.

Greetings @ Toronto

El Bruno

References


#AI – AI for Earth, AI tools in the hands of those working to solve global environmental challenges

Hi !

When I was in Ohio @CodeMash, I was lucky enough to meet Jennifer Marsman, Principal Engineer & speaker on the AI for Earth team at Microsoft (@jennifermarsman). She hosted an amazing session where she shared details about some projects on AI for Earth.

AI for Earth puts Microsoft cloud and AI tools in the hands of those working to solve global environmental challenges

See references

The work that the AI for Earth teams are doing is amazing, and I was really impressed by the “Mexican whale story”. The team uses image analysis to identify individual animals in ordinary people’s photos or videos, and using metadata like the date and location of a photo or video, they can generate paths of animal migration. And yes, the photos come from public social media spaces like Facebook, Instagram or YouTube.

I’ve had this information in a draft for a while, and now that I have some more details it makes sense to share it. The project name is Wild Me:

Wild Me is using computer vision and deep learning algorithms to power Wildbook, a platform that can identify individual animals within a species.  They also augment their data with an intelligent agent that can mine social media. 

And as usual, a video is the best way to explain this:

Besides Wild Me, there are other amazing projects like SilviaTerra or FarmBeats. You can find the complete list of projects and challenges here (link).

Happy Coding!

Greetings @ Burlington

El Bruno

References

#Event – Resources for the session [Getting Started with Machine Learning.Net & #Azure] on the #GlobalAzure Bootcamp in GTA

Hi!

Another post-event post, this time with a big thanks to the team behind one of the biggest community events globally: the Global Azure Bootcamp.

Avanade Canada sponsored this session and I had the amazing chance to share some insights around Machine Learning.Net and Azure.

As usual, now it’s time to share slides, code and more.


Source Code in GitHub https://github.com/elbruno/events/tree/master/2019%2004%2027%20GAB%20MLNet

And some Machine Learning.Net and Azure Notebook resources:

Resources

See you at the next one in Chicago for some Deep Learning fun!

Happy coding!

Greetings @ Toronto

El Bruno

From GitHub to Azure App Service through Azure DevOps pipelines

Juanlu, ElGuerre

In previous posts we saw how to build, run tests, and generate code coverage from the command line, and we even ran static analysis with SonarQube:

Well, in this post we will see how to do all of that with Azure DevOps Pipelines, and we will wrap up with a deployment to two environments (Development and Production) based on Azure App Service. In short, we are going to build a CI and CD system, following the steps below and keeping in mind, as on previous occasions, that our code lives on GitHub, specifically here (MyBudget):

  • Configure “Azure Pipelines” in GitHub by searching for this feature in the Marketplace and choosing the free plan.

  • From the Azure Portal (although we can choose…


#Python – The best way to explain how jupyter notebooks works with Visual Studio Code @Code

Hi !

So, after yesterday’s post [Edit and work with Jupyter notebooks in Visual Studio Code], today some people asked me how the Jupyter Notebooks and Python integration works.

The best way to explain this is with a simple animated video with the following actions:

  • Create a cell using the prefix # %%
  • Run the cell and display the output in Python Interactive
  • Create a new cell
  • Run the new cell and the previous one
  • Analyze output in Python Interactive
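The steps above can be sketched in a plain .py file. A minimal example (the `# %%` markers are what the Python extension detects as cells; the second cell reuses state from the first one):

```python
# %%
# First cell: define some data
numbers = [1, 2, 3, 4]

# %%
# Second cell: reuse the state produced by the previous cell
total = sum(numbers)
print(total)  # 10
```

Running each cell sends it to the Python Interactive window, where the output appears inline and the variables stay alive between cells.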

I think these 15 seconds are good enough to understand the benefits of Jupyter Notebooks in Visual Studio Code.

Happy Coding!

Greetings @ NY

El Bruno

References

#VSCode – Edit and work with #jupyter notebooks in Visual Studio Code

Hi !

I’ve been using Python and Jupyter notebooks more and more. And somewhere along this learning path I also realized that I can use Visual Studio Code to code amazing Python apps, and also to edit and work with Jupyter notebooks.

If you are a VSCode Python developer, you may know some of the features available in the tool. I won’t describe them here, because you may find the official documentation very useful (see the links in the references below).

The Python extension provides many features for editing Python source code in Visual Studio Code:

However, during the past months I’ve also been working a lot with Jupyter notebooks, and I was very happy when I realized that VSCode also has some cool features for working with notebooks. The core of a notebook is its cells, and we can use them with the prefix #%%.
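For example, any standard .py file becomes cell-aware just by adding the markers (a minimal sketch; the function and names are only illustrative):

```python
# %%
# A cell that defines a function
def greet(name):
    return f"Hello {name} from a VSCode cell"

# %%
# A later cell can call anything defined in earlier cells
message = greet("Bruno")
print(message)
```

Each `# %%` line gets a “Run Cell” code lens above it, and the result shows up in the Python Interactive window.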

This is how it looks inside the IDE, running a cell in the code

Another interesting feature is to run notebooks in a remote Jupyter server, maybe using Azure Notebooks. I haven’t tried this one, and it’s on my ToDo list for the near future.

On top of adding cell features to standard Python [.py] files, we can also edit standard Jupyter files. I’ve installed Jupyter into one of my Anaconda local environments, and now I can edit the files inside VSCode.

First, I’ll be prompted to import the file as a standard Python file

And, done! Now I have my Jupyter notebook inside VSCode

The final step will be to export my file or debug session, and for this we have the command [Python: Export …]

Super useful!

Happy coding!

Greetings @ NY

El Bruno

References

#event – #DeepLearning for everyone @chicagocodecamp

Photo by Venkata Goli on Pexels.com

Hi !

So, I’m lucky enough to be one of the speakers at the amazing Chicago CodeCamp on May 11th. This will be my perfect excuse to visit the city of Chicago and also to meet some of the amazing people in the Chicago tech community.

My session will be about Deep Learning for regular devs (like myself)

Deep Learning for Everyone

You probably read a lot about Machine Learning and Deep Learning these days; however, if you are a standard developer (like me), it’s hard to find a way to start with ML or DL. So, let’s avoid learning specific ML languages and tools, and let’s have some fun using a popular language like C# to create a DL model. And let’s also try to run this model on a popular device like a Raspberry Pi (why not?). We may add some cloud and some IoT pieces to the scene; however, keep in mind that during this session the idea is to LEARN and HAVE FUN. Creating something from zero is one of the best ways to understand how Deep Learning works!

More information: https://www.chicagocodecamp.com/

Happy Coding !

Greetings @ Burlington

El Bruno

#Humor – To Understand Recursion, You Must First Understand Recursion (Note: #Google does Understand Recursion)

Just search for recursion in Google, and you’ll get the joke …
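And for anyone who wants the non-joke version: recursion is just a function calling itself until a base case stops it. The classic factorial, in a few lines of Python:

```python
def factorial(n):
    # Base case: stops the chain of self-calls.
    # Without it, this function would be the joke.
    if n <= 1:
        return 1
    # Recursive case: the function calls itself with a smaller input
    return n * factorial(n - 1)

print(factorial(5))  # 120
```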

Greetings @ Burlington

El Bruno