Hello

Today we will take a look at a small implementation and the details Project Oxford provides when we want to work with the Emotion APIs. First, a gif with an example of age, gender and emotion recognition in a WPF app.
[Animated gif: 2015 11 14 Project Oxford Emotions]

We see a photo with several recognized faces: a group of 64 MVPs (a nice set of people), then a shot of Han Solo, and another one of Martina on the beach.

If you look at the code a little, you will see that in the FaceAPI class I have created an operation for emotion detection. This operation takes the stream of the selected image, plus an array with the locations of the faces that were previously found during the face detection step.

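The original post showed this as a screenshot. As a reference, here is a minimal sketch of what that operation might look like, assuming the Microsoft.ProjectOxford.Emotion client library and its EmotionServiceClient; the type and method names below follow that 2015-era package, but treat the exact signatures and the key handling as assumptions rather than the repo's exact code:

```csharp
using System.IO;
using System.Threading.Tasks;
using Microsoft.ProjectOxford.Common;
using Microsoft.ProjectOxford.Emotion;
using Microsoft.ProjectOxford.Emotion.Contract;

public class FaceAPI
{
    // Placeholder key: use your own Emotion API subscription key here.
    private readonly EmotionServiceClient _emotionClient =
        new EmotionServiceClient("YOUR_EMOTION_API_KEY");

    // Detects emotions in an image, reusing the rectangles returned by the
    // earlier face detection step so both services agree on the same faces.
    public async Task<Emotion[]> DetectEmotionsAsync(
        Stream imageStream, Rectangle[] faceRectangles)
    {
        return await _emotionClient.RecognizeAsync(imageStream, faceRectangles);
    }
}
```

Passing the face rectangles instead of letting the Emotion API redo its own face detection keeps the two result sets aligned and saves a detection pass.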

Once the emotion detection is complete, I update this information in the collection of detected faces. For now I do this by looking for the same rectangles in both collections; perhaps I should improve this routine.
In addition, in the class that represents a face, I've created a property that exposes the emotion with the highest score.
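The screenshot is gone, so here is a rough reconstruction of both pieces: matching each emotion result to a previously detected face by comparing rectangles, and a property that returns the emotion with the highest score. FaceViewModel, UpdateEmotions and the collection shape are my own naming, not necessarily the repo's; the Scores property names follow the old Emotion client library:

```csharp
using System.Collections.Generic;
using System.Collections.ObjectModel;
using System.Linq;
using Microsoft.ProjectOxford.Emotion.Contract;

// Hypothetical view model for a detected face; the class in the sample
// repo may be shaped differently.
public class FaceViewModel
{
    public Microsoft.ProjectOxford.Face.Contract.FaceRectangle FaceRectangle { get; set; }
    public Scores Scores { get; set; }

    // Name of the emotion with the highest score, shown next to the face frame.
    public string TopEmotion
    {
        get
        {
            if (Scores == null) return string.Empty;
            var ranked = new Dictionary<string, float>
            {
                ["Anger"] = Scores.Anger,
                ["Contempt"] = Scores.Contempt,
                ["Disgust"] = Scores.Disgust,
                ["Fear"] = Scores.Fear,
                ["Happiness"] = Scores.Happiness,
                ["Neutral"] = Scores.Neutral,
                ["Sadness"] = Scores.Sadness,
                ["Surprise"] = Scores.Surprise
            };
            return ranked.OrderByDescending(p => p.Value).First().Key;
        }
    }
}

public static class EmotionUpdater
{
    // Matches each emotion result to a detected face by comparing the two
    // rectangles field by field (the Face and Emotion client libraries use
    // different rectangle types, so they cannot be compared directly).
    public static void UpdateEmotions(
        ObservableCollection<FaceViewModel> faces, Emotion[] emotions)
    {
        foreach (var emotion in emotions)
        {
            var match = faces.FirstOrDefault(f =>
                f.FaceRectangle.Left == emotion.FaceRectangle.Left &&
                f.FaceRectangle.Top == emotion.FaceRectangle.Top &&
                f.FaceRectangle.Width == emotion.FaceRectangle.Width &&
                f.FaceRectangle.Height == emotion.FaceRectangle.Height);
            if (match != null)
            {
                match.Scores = emotion.Scores;
            }
        }
    }
}
```

Exact rectangle equality works here because the emotion call was given exactly the rectangles found by face detection; as noted above, a tolerance-based match would be more robust.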

This value is then shown in the app along with the face frame.

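For completeness, drawing the frame and the label in WPF can be as simple as adding shapes to a Canvas overlaid on the image. This is a generic sketch rather than the repo's code; the canvas parameter and the assumption that the canvas matches the bitmap's pixel size are mine:

```csharp
using System.Windows;
using System.Windows.Controls;
using System.Windows.Media;
using System.Windows.Shapes;

// Draws a yellow frame plus the top emotion for one face. The Canvas is
// assumed to sit over the Image control and share the bitmap's pixel size.
private void DrawFace(Canvas facesCanvas, FaceViewModel face)
{
    var frame = new Rectangle
    {
        Width = face.FaceRectangle.Width,
        Height = face.FaceRectangle.Height,
        Stroke = Brushes.Yellow,
        StrokeThickness = 2
    };
    Canvas.SetLeft(frame, face.FaceRectangle.Left);
    Canvas.SetTop(frame, face.FaceRectangle.Top);
    facesCanvas.Children.Add(frame);

    var label = new TextBlock
    {
        Text = face.TopEmotion,
        Foreground = Brushes.Yellow,
        FontWeight = FontWeights.Bold
    };
    Canvas.SetLeft(label, face.FaceRectangle.Left);
    Canvas.SetTop(label, face.FaceRectangle.Top + face.FaceRectangle.Height + 2);
    facesCanvas.Children.Add(label);
}
```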

The source code is available on GitHub: https://github.com/elbruno/ProjectOxford

Greetings @ Madrid

El Bruno
