Hello!
Some time ago, I wrote a couple of posts where I explained how to use the Azure Machine Learning Face APIs in apps such as Console apps, Windows Presentation Foundation (WPF) apps, and Windows Store apps (see the References section). These features are part of the Project Oxford initiative, which also includes other useful features such as Vision Recognition and LUIS (Language Understanding Intelligent Service).
In my series of posts, I shared some code samples on how to detect and display faces in static images. Now we have a cool new service to add to our apps: the Emotion APIs. The emotion features allow us to add a new layer of information on top of face detection. When we run the Machine Learning process on an image, we get the face detection information along with extra emotion information, grouped in a section named “Scores”.
For example, in the next picture (myself and my little princess at the beach), the Emotion APIs detect only one face (thanks, glasses!) and return the following emotion information:
"Scores": {
  "Anger": 0.0000022761285,
  "Contempt": 0.000227006109,
  "Disgust": 1.07316886e-7,
  "Fear": 8.930022e-7,
  "Happiness": 0.5243546,
  "Neutral": 0.475262463,
  "Sadness": 0.000147109211,
  "Surprise": 0.000005583676
}
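Since the “Scores” section is plain JSON, here is a minimal sketch (in Python, just for illustration, not the official SDK) of how we could parse it and pick the dominant emotion for the face:

```python
import json

# The "Scores" payload returned by the Emotion API for the detected face
payload = '''{
  "Scores": {
    "Anger": 0.0000022761285,
    "Contempt": 0.000227006109,
    "Disgust": 1.07316886e-7,
    "Fear": 8.930022e-7,
    "Happiness": 0.5243546,
    "Neutral": 0.475262463,
    "Sadness": 0.000147109211,
    "Surprise": 0.000005583676
  }
}'''

scores = json.loads(payload)["Scores"]

# The dominant emotion is simply the key with the highest score
dominant = max(scores, key=scores.get)
print(dominant)  # Happiness edges out Neutral in this picture
```

Note that the eight scores add up to (roughly) 1.0, so they can be read as the model's confidence distribution over the emotions.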
And the SDK is also very developer friendly, exposing a property for each one of these values.
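To show the idea of one strongly-typed property per score (as the .NET SDK does), here is a rough Python equivalent; the class and field names are my own for illustration, not the official Project Oxford client:

```python
from dataclasses import dataclass

# Hypothetical wrapper mirroring the SDK idea of one property per emotion score.
# Field names match the JSON keys; this is NOT the official Project Oxford client.
@dataclass
class EmotionScores:
    Anger: float
    Contempt: float
    Disgust: float
    Fear: float
    Happiness: float
    Neutral: float
    Sadness: float
    Surprise: float

scores = EmotionScores(
    Anger=0.0000022761285,
    Contempt=0.000227006109,
    Disgust=1.07316886e-7,
    Fear=8.930022e-7,
    Happiness=0.5243546,
    Neutral=0.475262463,
    Sadness=0.000147109211,
    Surprise=0.000005583676,
)

# Each value is now a plain property instead of a dictionary lookup
print(scores.Happiness > scores.Neutral)  # True: a happy picture
```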
In the next few days, I'll write a series of posts with detailed samples on the Emotion APIs. Today I've updated my public repository on GitHub with WPF and Console App samples.
The source code is available on GitHub: https://github.com/elbruno/ProjectOxford
Regards @ Madrid
-El Bruno
References
- Project Oxford Emotion APIs
- My Azure ML Face APIs series