Microsoft’s Cognitive Services Demonstrate Emotion Detection in Human Faces
NEW YORK—On August 25, Nick Landry, senior technical evangelist at Microsoft, held a demonstration of Microsoft’s Cognitive Services and its 22 APIs (previously called Project Oxford) at Microsoft Reactor at Grand Central Tech.
Imagine an API that can detect the emotion in a person’s face in an image, or even guess someone’s age. The age-guessing technology became hugely popular through the How Old Do You Look app, and Microsoft is continuing to improve it; not good news for those who hide their real age.
Microsoft is now allowing developers to customize the new Cognitive Services, a capability it highlighted at earlier events this year.
The rebranding to Cognitive Services also brings together the Bing, Project Oxford, and Translator APIs under one roof.
The new Cognitive Services APIs include Emotion (reading emotions from facial expressions), Entity Linking (a text-analysis function), Face (facial recognition), Linguistic Analysis, Speaker Recognition, Speech (speech to text), Video (vision analysis), and WebLM (an SDK for the Web Language Model).
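Each of these services is exposed as an authenticated REST endpoint, so a developer can try one with a single HTTP request. The sketch below shows roughly what a call to the Emotion API looked like at the time; the regional endpoint URL, the placeholder subscription key, and the recognize_emotion helper are illustrative assumptions, not code from the demo.

```python
# Minimal sketch of an Emotion API call, assuming the v1.0 REST endpoint
# and a valid Cognitive Services subscription key from the Azure portal.
import requests

ENDPOINT = "https://westus.api.cognitive.microsoft.com/emotion/v1.0/recognize"
SUBSCRIPTION_KEY = "YOUR_SUBSCRIPTION_KEY"  # placeholder, not a real key


def recognize_emotion(image_url: str) -> list:
    """Send an image URL to the Emotion API; returns one entry per detected face."""
    response = requests.post(
        ENDPOINT,
        headers={
            "Ocp-Apim-Subscription-Key": SUBSCRIPTION_KEY,
            "Content-Type": "application/json",
        },
        json={"url": image_url},
    )
    response.raise_for_status()
    return response.json()


if __name__ == "__main__":
    # Hypothetical image URL for illustration only.
    for face in recognize_emotion("https://example.com/portrait.jpg"):
        scores = face["scores"]  # happiness, sadness, anger, surprise, etc.
        top = max(scores, key=scores.get)
        print(face["faceRectangle"], "->", top, round(scores[top], 3))
```

The Face API accepted a request of much the same shape, with an optional returnFaceAttributes=age query parameter for the kind of age estimation described above.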
Seeing AI shows how these new capabilities can help people who are blind or visually impaired understand who and what is around them. See how it works here: https://youtu.be/R2mC-NUAmMk
At the meetup, Landry demonstrated how developers can use all of these services.
Cognitive Services is reportedly a nod to IBM’s Watson, which has been marketed as a “cognitive computing” product modeled on the way the human brain works.