An Emotion Recognition API for Analyzing Facial Expressions

Using Emotion Recognition Algorithms

Reading emotional expression is one of the most difficult tasks for humans, let alone computers. Two people looking at the same photo might not agree whether someone is grimacing or grinning. Until recently, computers weren't much better at the job, either.

Fortunately, advances in deep learning have brought us speed, efficiency, and accuracy in detecting people's emotions in photos.

Algorithmia’s Emotion Recognition algorithm is based on convolutional neural networks and mapped binary patterns via Gil Levi and Tal Hassner’s research.

Emotion Recognition classifies images quickly using cutting-edge research that is accessible via a single API call.

What is Emotion Recognition?

Many of the most popular deep learning algorithms involve image classification problems, and emotion recognition is one of them. The Emotion Recognition algorithm returns the emotions detected in a given photo, each with a corresponding confidence score.

Emotion recognition algorithms are based on convolutional neural networks (CNNs), a network architecture loosely modeled on the human visual cortex.

Even though CNNs have been around for decades, it's only recently that computing power has caught up with the algorithm's demands. This enables CNNs to run in scaled, production environments.
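The core operation of a CNN is convolution: sliding a small learned filter over the image to produce feature maps that highlight local patterns like edges. As a rough illustration of that single operation (not the Emotion Recognition algorithm's actual implementation), here is a minimal 2D convolution in Python with NumPy:

```python
import numpy as np

def conv2d(image, kernel):
    """Valid 2D convolution: slide the kernel over the image,
    summing elementwise products at each position."""
    kh, kw = kernel.shape
    oh = image.shape[0] - kh + 1
    ow = image.shape[1] - kw + 1
    out = np.zeros((oh, ow))
    for i in range(oh):
        for j in range(ow):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

# A tiny "image" with a vertical edge down the middle
image = np.array([
    [0, 0, 1, 1],
    [0, 0, 1, 1],
    [0, 0, 1, 1],
    [0, 0, 1, 1],
], dtype=float)

# A simple horizontal-difference kernel that responds to vertical edges
edge_kernel = np.array([[-1, 1]], dtype=float)

print(conv2d(image, edge_kernel))  # strongest response where the edge is
```

In a real CNN, many such kernels are learned from data and stacked in layers, with nonlinearities and pooling in between; frameworks then run those layers on GPUs at scale.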

Why You Need Emotion Recognition

Emotion detection in images is useful for many applications, from tagging social media images to marketing analysis such as usability studies and focus groups, as well as security, sales training, and healthcare.

However, it's not easy to take research and turn it into a usable product available at scale.

Algorithmia's implementation of the Emotion Recognition in the Wild paper converts this research into a scalable API you can call in a few lines of code. Leave the system infrastructure for running CNNs in a GPU-enabled environment to us.

Algorithmia has client support for Ruby, Rust, Python, JavaScript, Scala, Java, and R, so it's easy to identify emotions in images using the language of your choice, even in real time.

How to Use Emotion Recognition

To get started using Emotion Recognition, you’ll need a free API key from Algorithmia.

After creating your account, go to your profile page and navigate to the Credentials tab. There you'll find the API key you'll need later.

For this example we'll show how to use the Emotion Recognition algorithm with Python, but you could call it using any of our supported clients.

Sample API Call:

import Algorithmia

client = Algorithmia.client("your_algorithmia_api_key")

input = {
    "image": "",
    "numResults": 7
}

algo = client.algo('deeplearning/EmotionRecognitionCNNMBP/0.1.2')

result = algo.pipe(input).result


Note: The Emotion Recognition algorithm accepts several types of "image" values, including a Data API URL, a Web (http/https) URL, a binary image, or a base64-encoded image.
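For instance, to send a local file as a base64-encoded image, you could build the input payload like this (a sketch only; `build_emotion_input` is a hypothetical helper, and you should check the algorithm's docs for the exact encoding it expects):

```python
import base64

def build_emotion_input(image_path, num_results=7):
    """Read a local image file and build an input payload with a
    base64-encoded image string (illustrative helper, not part of
    the Algorithmia client)."""
    with open(image_path, "rb") as f:
        encoded = base64.b64encode(f.read()).decode("ascii")
    return {"image": encoded, "numResults": num_results}
```

You'd then pass the returned dictionary to `algo.pipe(...)` exactly as in the sample call above.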

Sample output:

{
  "results": [
    [0.6013028, "Neutral"],
    [0.3834005, "Angry"],
    [0.0142599, "Disgust"],
    [0.0004335, "Happy"],
    [0.000291, "Fear"],
    [0.0002885, "Sad"],
    [0.0000237, "Surprise"]
  ]
}
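Since each result is a [confidence, label] pair, picking the dominant emotion from the response is straightforward. A small sketch, using a response shaped like the sample output above:

```python
# Response shape assumed from the sample output: a list of
# [confidence, label] pairs under the "results" key.
response = {
    "results": [
        [0.6013028, "Neutral"],
        [0.3834005, "Angry"],
        [0.0142599, "Disgust"],
    ]
}

# Pick the pair with the highest confidence score
confidence, label = max(response["results"], key=lambda pair: pair[0])
print(label, confidence)  # Neutral 0.6013028
```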

While emotion detection is interesting in still images, it can be even more useful when combined with video algorithms. For example, you could split a video into frames and get the emotion of each frame, then categorize the whole video as happy, sad, or another emotion. Or, in the case of the focus groups in our marketing example, you could pinpoint which parts of the experience trigger joy or act as pain points.
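That frame-by-frame idea can be sketched in a few lines. Assuming you've already run Emotion Recognition on each frame and collected the top label per frame, the video-level emotion could simply be the most frequent label:

```python
from collections import Counter

def dominant_emotion(frame_labels):
    """Return the most common per-frame emotion label, e.g. to tag a
    whole video. frame_labels is a list of strings like "Happy" or
    "Sad" (one top label per frame, gathered from prior API calls)."""
    if not frame_labels:
        return None
    return Counter(frame_labels).most_common(1)[0][0]

labels = ["Happy", "Happy", "Neutral", "Happy", "Sad"]
print(dominant_emotion(labels))  # Happy
```

A fancier version might weight each frame's label by its confidence score, but majority vote is a reasonable starting point.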

Let us know what you plan on building with emotion detection in images @Algorithmia.