#AZURE – Complete sample of the Face API and Emotion API #ProjectOxford

Hello

Today we will take a look at a small implementation and the detail that Project Oxford provides when we want to work with the Emotion API. First, a GIF with an example of age, gender and emotion recognition in a WPF app.
(Animated GIF: 2015-11-14 Project Oxford Emotions demo)

In it we see a photo with several recognized faces: a group of 64 MVPs (a nice set of people), then a bit of Han Solo, and another photo with Martina on the beach.

If we take a look at the code, we will see that in the FaceAPI class I have created an operation for emotion detection. This operation takes the stream of the selected image, plus an array with the locations (rectangles) of the faces that were found earlier in the face detection step.

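A minimal sketch of what that operation could look like, assuming the Microsoft.ProjectOxford.Emotion NuGet client and its RecognizeAsync overload that accepts face rectangles (class and method names here are illustrative, not the exact code from the repository):

```csharp
// Sketch only: emotion detection inside the FaceAPI class, reusing the rectangles
// returned by the previous face detection step so the service does not have to
// locate the faces again.
using System.IO;
using System.Threading.Tasks;
using Microsoft.ProjectOxford.Emotion;
using Microsoft.ProjectOxford.Emotion.Contract;

public class FaceAPI
{
    // Assumption: the Emotion API client is created with your subscription key.
    private readonly EmotionServiceClient _emotionClient =
        new EmotionServiceClient("YOUR_EMOTION_API_KEY");

    // 'faceRectangles' are the face locations found earlier by the Face API.
    public async Task<Emotion[]> DetectEmotionsAsync(Stream imageStream, Rectangle[] faceRectangles)
    {
        return await _emotionClient.RecognizeAsync(imageStream, faceRectangles);
    }
}
```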

Once emotion detection is complete, I update this information in the collection of detected faces. For now I do this by looking for the same rectangles in both collections; I should probably improve this routine.
In addition, in the class that represents a face, I've created a property that exposes the emotion with the highest score.
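The matching and the "highest value" property could look roughly like this, assuming the Scores contract type and its ToRankedList() helper; FaceViewModel and MergeEmotions are hypothetical names used only for this sketch:

```csharp
using System.Collections.Generic;
using System.Linq;
using Microsoft.ProjectOxford.Emotion.Contract;

// Hypothetical view model for a detected face (names are illustrative only).
public class FaceViewModel
{
    public int Left { get; set; }
    public int Top { get; set; }
    public int Width { get; set; }
    public int Height { get; set; }
    public Scores EmotionScores { get; set; }

    // The emotion with the highest score, e.g. "Happiness".
    public string TopEmotion =>
        EmotionScores == null
            ? string.Empty
            : EmotionScores.ToRankedList().First().Key; // scores ordered descending
}

public static class EmotionMerger
{
    // Match each emotion result to the face with the same rectangle and copy the scores over.
    public static void MergeEmotions(IEnumerable<FaceViewModel> faces, Emotion[] emotions)
    {
        foreach (var emotion in emotions)
        {
            var match = faces.FirstOrDefault(f =>
                f.Left == emotion.FaceRectangle.Left &&
                f.Top == emotion.FaceRectangle.Top &&
                f.Width == emotion.FaceRectangle.Width &&
                f.Height == emotion.FaceRectangle.Height);

            if (match != null)
            {
                match.EmotionScores = emotion.Scores;
            }
        }
    }
}
```

Matching by exact rectangle works here because the same rectangles are sent to the Emotion API; a more robust routine would tolerate small differences or use an overlap test.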

This value is then shown in the app along with the face frame.

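And a rough idea of how the frame and the label can be drawn over the image in WPF (again a sketch, reusing the hypothetical FaceViewModel from above; 'scale' maps image coordinates to the size at which the picture is rendered):

```csharp
// Sketch only: draw each face frame on a Canvas overlay and show its top emotion
// next to the rectangle.
using System.Collections.Generic;
using System.Windows;
using System.Windows.Controls;
using System.Windows.Media;
using System.Windows.Shapes;

public static class FaceOverlay
{
    public static void Draw(Canvas canvas, IEnumerable<FaceViewModel> faces, double scale)
    {
        canvas.Children.Clear();

        foreach (var face in faces)
        {
            // Face frame, scaled to the rendered image size.
            var frame = new Rectangle
            {
                Width = face.Width * scale,
                Height = face.Height * scale,
                Stroke = Brushes.LimeGreen,
                StrokeThickness = 2
            };
            Canvas.SetLeft(frame, face.Left * scale);
            Canvas.SetTop(frame, face.Top * scale);
            canvas.Children.Add(frame);

            // Top emotion label just above the frame.
            var label = new TextBlock
            {
                Text = face.TopEmotion,
                Foreground = Brushes.LimeGreen,
                FontWeight = FontWeights.Bold
            };
            Canvas.SetLeft(label, face.Left * scale);
            Canvas.SetTop(label, face.Top * scale - 20);
            canvas.Children.Add(label);
        }
    }
}
```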

The source code is available on GitHub: https://github.com/elbruno/ProjectOxford

Greetings @ Madrid

El Bruno
