Algorithmia

Building Intelligent Applications

Algorithmia was delighted to speak at Seattle’s Building Intelligent Applications meetup last month.  We provided attendees with an introductory view of machine learning, walked through a bit of sample code, touched on deep learning, and talked about various tools for training and deploying models.

For those who were able to attend, we wanted to send out a big “thank you!” for being a great audience.  For those who weren’t able to make it, you can find our slides and notes below, and we hope to see you at the next meetup on Wednesday, April 26.  Data Scientists Emre Ozdemir and Stephanie Peña will be presenting two Python-based recommender systems at Galvanize in Pioneer Square.

To come to Wednesday’s talk, RSVP via Eventbrite.  To keep an eye out for future events, join the Building Intelligent Applications Meetup Group.

Read More…

Quickly Classify Clothing and Fashion Items in Images

You may already know that Algorithmia hosts scalable deep learning models. If you are a developer, you’ve seen how easy it is to run over 3,000 microservices through any of our supported languages and frameworks.

But sometimes it’s nice just to play with a simple demo.

The Deep Fashion microservice is a deep CNN that performs multi-category classification and has been trained with humans in the loop to recognize dozens of different articles of clothing. It can be used standalone to locate specific items in an image set, or combined with a nearest-neighbors service such as KNN or Annoy to recommend similar items to online shoppers. And since the service provides bounding-box coordinates for each item within the image, it could even be used to censor or modify the images themselves.
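If you'd prefer to call it from code rather than the demo page, a minimal sketch in Python might look like the one below. The algorithm path and the input and output field names here are assumptions for illustration; check the Deep Fashion algorithm's page for the exact schema.

# Minimal sketch: calling a Deep Fashion-style classifier hosted on Algorithmia.
# The algorithm path and the "articles" output fields below are assumptions --
# consult the algorithm's documentation for the real endpoint and schema.
import Algorithmia

client = Algorithmia.client("YOUR_API_KEY")
algo = client.algo("algorithmiahq/DeepFashion/1.3.0")  # placeholder path/version

result = algo.pipe({"image": "https://example.com/outfit.jpg"}).result

# Each detected article is expected to carry a label, a confidence score,
# and bounding-box coordinates within the image.
for article in result.get("articles", []):
    print(article.get("article_name"), article.get("confidence"), article.get("bounding_box"))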

To see it in action, just head over to the Deep Fashion Demo, click (or upload) an image, and watch as state-of-the-art deep learning models scan the image to identify clothing and fashion items.

Create a Custom Color Scheme From Your Favorite Website


In a recent Algorithm Spotlight post we introduced the Color Scheme Extraction algorithm, which retrieves the top 15 colors that best approximate the color scheme of an image.

The Color Scheme Extraction algorithm is a great way to retrieve the colors of an image quickly with a serverless API call in a few lines of code using the programming language of your choice.

This recipe allows you to pipe in several images, including ones that are montages of other images, in order to get a personalized color scheme.

While there are plenty of online color scheme generators, none can generate random color palettes from multiple images found on your favorite design blog or shopping site.

Maybe you’re looking for a unique and compelling color palette for your website, or you want to design a living room inspired by the latest trends in fashion. Perhaps you’re selling your house and want to see what color schemes the best-selling houses are using, so you can come up with your own unique palette based on those images.

This recipe will show you how to extract image links from a URL that contains multiple images, get the color schemes from those images, and generate a new custom color scheme made of five random hexadecimal colors. Then you can plug those colors into Adobe Color Wheel or another online color scheme creator to visualize your new palette.

Step 1: Install the Algorithmia Client

This tutorial is in Python, but it could be built using any of the supported clients, such as Scala, Ruby, Java, or Node. See the Python client guide for more information on using the Algorithmia API.

Install the Algorithmia client from PyPi:

pip install algorithmia

You’ll also need a free Algorithmia account, which includes 5,000 free credits a month.

Sign up here, and then grab your API key.

Step 2: Retrieve Image URLs

After you’ve gotten your API key, you’ll want to call the Get Image Links algorithm, which retrieves all of the image URLs from a web page. These are the images that will be used to create your custom color palette.

"""Get image URL's from a site and extract unique color palette from images."""
import Algorithmia

client = Algorithmia.client("YOUR_API_KEY")


def get_image(url):
    """Retrieve images from site."""
    algo = client.algo("diego/Getimagelinks/0.1.0")
    if url.startswith("http:") or url.startswith("https:"):
        try:
            response = algo.pipe(url).result
            print(response)
            return response
        except Algorithmia.algo_response.AlgoException as e:
            print(e)
            # Return an empty list so downstream code can still iterate safely
            return []
    else:
        raise Exception("Please pass in a valid url")

The code above does a simple check to make sure the URL passed in is valid, then calls the algorithm, piping in the URL as its input. The output is a list of image URLs:

['http://n.nordstrommedia.com/id/84c91f6a-acb1-4251-85f7-b5ef95ac602b.jpeg?w=2560&h=723', 'http://n.nordstrommedia.com/id/19addc67-3fc2-4460-b962-f7e6acaf1b9c.jpeg?w=760&h=855', 'http://n.nordstrommedia.com/id/3561c21c-3ae7-4e0d-84da-b1151cf1dff7.jpeg?w=760&h=640', 'http://n.nordstrommedia.com/id/999eea01-4072-4366-ac3f-6d1f59e81e3f.jpeg?w=760&h=638', 'http://n.nordstrommedia.com/id/32daaaac-410f-4075-9995-10847b00e737.jpeg?w=760&h=644', 'http://n.nordstrommedia.com/id/242715f5-cf11-4c1e-864f-6bcb6ae7dfec.jpeg?w=760&h=649', 'http://n.nordstrommedia.com/id/c5d5f797-08bc-45b4-9cec-c4a8f83d74b1.jpeg?w=2560&h=995', 'http://n.nordstrommedia.com/id/87c47e87-5e21-48b4-9d1a-e8cf3dca6342.jpeg?w=760&h=955', 'http://n.nordstrommedia.com/id/f96fb198-d426-48fe-a222-319a4b78bcc5.jpeg?w=940&h=106', 'https://secure.nordstromimage.com/images/store/common/flagicons/US.gif']
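
Notice that the scraped list can include bits of site chrome, such as the small flag-icon GIF at the end of the sample above. If you want the palette to reflect only the photography, an optional filter like this one (a convenience helper, not part of the original recipe) keeps just the typical photo formats:

def filter_images(urls, extensions=(".jpeg", ".jpg", ".png")):
    """Keep only URLs whose path ends in a typical photo extension."""
    filtered = []
    for url in urls:
        path = url.split("?")[0].lower()  # drop query strings like ?w=760&h=855
        if path.endswith(extensions):
            filtered.append(url)
    return filtered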

Step 3: Extract Colors

Next you’ll call the Color Scheme Extraction algorithm, which chooses the top 15 colors from each image that the previous function found at your URL.

def color_extractor(url):
    """Extract top 15 colors from images."""
    urls = get_image(url)
    algo = client.algo("vagrant/ColorSchemeExtraction/0.2.0")
    try:
        response = [algo.pipe({"url": path}).result for path in urls]
        print(response)
        return response
    except Algorithmia.algo_response.AlgoException as e:
        print("Exception ")
        print(e)

The previous snippet iterates over each image and outputs a nested list of colors in both RGB and hexadecimal form. Here’s a sample of the output generated:

[{'colors': [{'rgb': [255, 255, 255], 'hex': '#ffffff'}, {'rgb': [196, 85, 104], 'hex': '#c45568'}, {'rgb': [237, 207, 176], 'hex': '#edcfb0'}, {'rgb': [100, 43, 32], 'hex': '#642b20'}, {'rgb': [184, 130, 95], 'hex': '#b8825f'}, {'rgb': [181, 35, 56], 'hex': '#b52338'}, {'rgb': [116, 116, 116], 'hex': '#747474'}, {'rgb': [203, 177, 162], 'hex': '#cbb1a2'}, {'rgb': [0, 0, 0], 'hex': '#000000'}, {'rgb': [209, 56, 78], 'hex': '#d1384e'}, {'rgb': [155, 99, 64], 'hex': '#9b6340'}, {'rgb': [240, 236, 231], 'hex': '#f0ece7'}]

Step 4: Create Your Custom Color Palette

Finally, you’ll create a function that builds your custom color palette by pulling five colors from the schemes returned by the Color Scheme Extraction algorithm.

import random

def shuffle_colors(url):
    """Create a unique color palette from the images passed in."""
    data = color_extractor(url)
    a_list = []
    for scheme in data:
        # Use "color" rather than "hex" to avoid shadowing the built-in hex()
        for color in scheme["colors"]:
            a_list.append(color["hex"])

    # random.sample needs a sequence, so convert the set of unique colors to a list
    new_color_scheme = random.sample(list(set(a_list)), 5)
    print(new_color_scheme)

shuffle_colors(
    "http://shop.nordstrom.com/c/designer-collections?origin=topnav&cm_sp=Top%20Navigation-_-Designer%20Collections")

This code iterates over all the colors that the Color Scheme Extraction algorithm found and randomly selects five hexadecimal colors to make up your custom color palette.

['#ece9e5', '#f0cfc8', '#08030b', '#b58297', '#d0a695']

Now comes the fun part: seeing what your custom color scheme looks like.

Go ahead and open Adobe Color Wheel or another color palette visualization tool and use the custom option on the color wheel. Then simply type in your hex colors to see your custom color palette.

We used Nordstrom’s Designer Collections Spring 2017 splash page to generate our color scheme and it came out pretty nice!

Custom Adobe Color Wheel Color Palette


This recipe showed you how to extract images and their colors in order to create a random color scheme using the Algorithmia API. Once you’re done, you can take it a step further by creating a plugin to generate random color schemes, or by using the Get Zillow Image algorithm to find image palettes from the top-selling houses in your neighborhood.
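
As a rough sketch of that extension, swapping the image source only means replacing the first algorithm call. The Zillow algorithm path and its input format shown here are assumptions, so look up the actual Get Zillow Image algorithm before running it:

# Sketch: reuse the palette pipeline with a different image source.
# "util/GetZillowImage/0.1.0" and its {"address": ...} input are placeholders --
# check the real Get Zillow Image algorithm on Algorithmia for the exact schema.
import random
import Algorithmia

client = Algorithmia.client("YOUR_API_KEY")


def get_listing_images(address):
    """Hypothetical call that returns image URLs for a property listing."""
    algo = client.algo("util/GetZillowImage/0.1.0")  # placeholder path
    return algo.pipe({"address": address}).result


def palette_from_urls(urls, size=5):
    """Extract colors from each image and sample a small palette."""
    extractor = client.algo("vagrant/ColorSchemeExtraction/0.2.0")
    hexes = []
    for url in urls:
        for color in extractor.pipe({"url": url}).result["colors"]:
            hexes.append(color["hex"])
    return random.sample(list(set(hexes)), size)

print(palette_from_urls(get_listing_images("123 Main St, Seattle, WA")))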

Resources:

For ease of reference, here is the whole script (also available in Algorithmia’s sample-apps repository on GitHub):

"""Get image URL's from a site and extract unique color palette from images."""
import Algorithmia
import random

client = Algorithmia.client("YOUR_API_KEY")


def get_image(url):
    """Retrieve images from site."""
    algo = client.algo("diego/Getimagelinks/0.1.0")
    if url.startswith("http:") or url.startswith("https:"):
        try:
            response = algo.pipe(url).result
            print(response)
            return response
        except Algorithmia.algo_response.AlgoException as e:
            print(e)
            # Return an empty list so downstream code can still iterate safely
            return []
    else:
        raise Exception("Please pass in a valid url")


def color_extractor(url):
    """Extract top 15 colors from images."""
    urls = get_image(url)
    algo = client.algo("vagrant/ColorSchemeExtraction/0.2.0")
    try:
        response = [algo.pipe({"url": path}).result for path in urls]
        print(response)
        return response
    except Algorithmia.algo_response.AlgoException as e:
        print("Exception ")
        print(e)


def shuffle_colors(url):
    """Create a unique color palette from the images passed in."""
    data = color_extractor(url)
    a_list = []
    for scheme in data:
        # Use "color" rather than "hex" to avoid shadowing the built-in hex()
        for color in scheme["colors"]:
            a_list.append(color["hex"])

    # random.sample needs a sequence, so convert the set of unique colors to a list
    new_color_scheme = random.sample(list(set(a_list)), 5)
    print(new_color_scheme)

shuffle_colors(
    "http://shop.nordstrom.com/c/designer-collections?origin=topnav&cm_sp=Top%20Navigation-_-Designer%20Collections")

Machine Learning with Humans in the Loop

TL;DR The most accurate machine learning systems to date are those that use a “human-in-the-loop” computing paradigm.

Though we have seen huge advances in the quality and accuracy of purely machine-driven systems, they still tend to fall short of acceptable accuracy rates. The combination of machine-driven classification enhanced by human correction, on the other hand, provides a clear path toward acceptable accuracy. Below we describe a real-world use case of building and scaling these types of systems.
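
The core pattern is straightforward to express in code: accept the machine's prediction when its confidence is high, route everything else to a person, and keep the corrections so the next model can learn from them. The sketch below is purely illustrative; the function names and threshold are assumptions, not any particular product's API.

# Illustrative human-in-the-loop routing; every name here is a stand-in.
def label_item(item, classify, ask_human, corrections, threshold=0.9):
    """classify(item) -> (label, confidence); ask_human(item, suggestion) -> label."""
    label, confidence = classify(item)
    if confidence >= threshold:
        return label                        # the machine is confident: accept as-is
    corrected = ask_human(item, label)      # low confidence: a person decides
    corrections.append((item, corrected))   # corrections become future training data
    return corrected

# Toy usage with a deliberately unsure "model" and a human who overrides it.
corrections = []
print(label_item("photo_001.jpg",
                 classify=lambda item: ("cat", 0.55),
                 ask_human=lambda item, suggestion: "dog",
                 corrections=corrections))  # -> dog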

Read More…

Making University Research Discoverable and Accessible

Distribution, Reach, and Monetization of University Research and Algorithms

One of the most rewarding parts of working at Algorithmia is that we get to collaborate with amazing university researchers across the globe.

Last May, Richard Zhang, Phillip Isola, and Alexei A. Efros from the Vision Lab at UC Berkeley published their work “Colorful Image Colorization.” This paper describes a novel use of a convolutional neural net (learn more about deep learning) for colorizing black-and-white pictures.
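
The colorization model is one of the algorithms hosted on Algorithmia; a short sketch of calling it from Python follows. The exact algorithm path, version, and output fields are assumptions here, so confirm them on the algorithm's page.

# Sketch: colorizing a black-and-white photo with the hosted model.
# The path/version and the shape of the result are assumptions -- check the
# Colorful Image Colorization algorithm page for the exact details.
import Algorithmia

client = Algorithmia.client("YOUR_API_KEY")
algo = client.algo("deeplearning/ColorfulImageColorization/1.1.13")  # placeholder version

result = algo.pipe({"image": "https://example.com/old_photo.jpg"}).result
print(result)  # expected to include a path to the colorized output image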

Read More…