Microsoft Cognitive Services
With Azure to back it up, Microsoft’s Cognitive Services are proving to be a fantastic piece of tech. Inspired by the way the human brain works, Microsoft has created a set of services that can give any AI-based application a more human side. These APIs are scalable, rooted in deep learning, and can be put to use in many ways.
Yes, you could build these APIs yourself, but why would you? That’s time spent when you could be making something bigger, better, and more captivating than what already exists. Take what has already been achieved, and use it to create.
Deep learning allows a computer to solve a complex problem by attempting to replicate the way the human mind processes senses such as light and sound. Instead of programming a computer to address every case directly, it’s far easier to develop an algorithm that defines a set of rules for the computer to follow; the computer then parses a given set of data and learns by example. In the end, you get a machine that can draw more accurate and intricate conclusions than before.
Remember Microsoft’s How Old website that used machine learning to guess your age? People were quick to dismiss the application for producing inaccurate results. But the catch with learning is that you make a lot of mistakes, then try to avoid them in the future to arrive at a correct answer. The more people used How Old, the more accurate its results became. In our most recent attempts, the website guessed ages to within a year.
With Cognitive Services, Microsoft has made a plethora of APIs available to cater to a varied set of needs, with the ability to be implemented into almost any application. Just pick the type of API you need from categories like Vision, Speech, Language, Knowledge, and Search, then go. You can even combine APIs for a more immersive experience. Want to find your face in a sea of thousands of photos, while automatically adding captions? Entirely possible.
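As a rough sketch of what combining two services looks like, the snippet below assembles the pieces of two Cognitive Services REST calls against the same photo: one to the Face API to find faces, one to the Computer Vision API to generate a caption. The endpoint region, API versions, and key shown here are placeholder assumptions, so the code only builds the requests rather than sending them.

```python
# Sketch only: combining the Face API ("detect") with the Computer Vision
# API ("describe") on one image. Endpoint region and key are assumptions.
ENDPOINT = "https://westus.api.cognitive.microsoft.com"  # region is an assumption
KEY = "<your-subscription-key>"  # placeholder, supply your own

def build_request(service_path: str, params: dict) -> dict:
    """Assemble the URL, headers, and query parameters for one REST call."""
    return {
        "url": ENDPOINT + service_path,
        "headers": {
            "Ocp-Apim-Subscription-Key": KEY,
            "Content-Type": "application/octet-stream",  # raw image bytes in the body
        },
        "params": params,
    }

# One photo, two services: detect the faces, then caption the scene.
face_req = build_request("/face/v1.0/detect", {"returnFaceIds": "true"})
caption_req = build_request("/vision/v1.0/describe", {"maxCandidates": "1"})
print(face_req["url"])  # -> https://westus.api.cognitive.microsoft.com/face/v1.0/detect
```

Both calls share the same subscription-key header pattern, which is what makes chaining services together so straightforward.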
At Lixar’s recent Data Meetup, Jim Provost gave a live demonstration of how to integrate the emotion API into a program that worked using a webcam and a Sphero. If the program detected he was smiling, the Sphero would change to the colour green. If he wasn’t wearing glasses, it would move forward. While programming a Sphero may seem like a small thing to do, it opens the door to a larger realm of possibilities.
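The decision logic of a demo like this can be sketched in a few lines. The sample below parses a JSON response shaped like the Face API’s detect call (which can return both a smile score and a glasses attribute) and maps it to Sphero commands; the response values, threshold, and function names are all illustrative assumptions, not Jim’s actual code.

```python
import json

# Illustrative response shaped like a Face API "detect" result with
# returnFaceAttributes=smile,glasses -- the values here are made up.
SAMPLE_RESPONSE = json.dumps([
    {
        "faceRectangle": {"top": 68, "left": 120, "width": 95, "height": 95},
        "faceAttributes": {"smile": 0.92, "glasses": "NoGlasses"},
    }
])

def sphero_commands(face_api_json: str, smile_threshold: float = 0.5):
    """Translate the first detected face into (colour, move_forward) for the Sphero."""
    faces = json.loads(face_api_json)
    if not faces:
        return ("red", False)  # no face in frame: stop and show red
    attrs = faces[0]["faceAttributes"]
    colour = "green" if attrs["smile"] >= smile_threshold else "white"
    move_forward = attrs["glasses"] == "NoGlasses"
    return (colour, move_forward)

print(sphero_commands(SAMPLE_RESPONSE))  # -> ('green', True)
```

Run against each webcam frame, this loop is all it takes to turn a cloud vision service into robot behaviour.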
Uber’s driver verification system uses the facial recognition API to determine a driver’s identity, Mattel made a Cortana-powered personal assistant for kids that has vocal recognition, and Volvo used the emotion API to have their cars recognize a distracted driver and draw their attention back to the road.
There are some incredibly impressive ways Cognitive Services are being put to use – so what will you do?
Interested in Data Science? Join us for our quarterly Meetup!