
AlphaGo’s Historic Victory, The Brain vs Deep Learning, and more from the Department of Bots

You may have heard about AlphaGo: Go has officially fallen to machines, just like Jeopardy did before it to Watson, and chess before that to Deep Blue. Now that artificial intelligence has mastered Go, New Scientist asks what game it should take on next. Deep Q-learning is being used to play not only Pong (with TensorFlow and PyGame) but also Flappy Bird. If that wasn’t enough, here’s a timeline of artificial intelligence victories from 1997-3041. You read that correctly.

Department of Bots
Motherboard argues that joke-telling robots represent the final frontier of A.I., since humor requires self-awareness, spontaneity, linguistic sophistication, and empathy. That’s not an easy task for a bot. Speaking of, why do developers love chatbots so much? Facebook’s Messenger Bot Store is coming, and it could be the most important launch since the App Store. We’re believers, but will robots take your job?

The Brain vs Deep Learning
Want to know why the singularity is anywhere but near? Read this great examination of the brain’s electrochemical and biological information processing pipeline as it relates to deep learning. There are a few problems with consciousness as it relates to superintelligence. The DeepMind founder has plans beyond just Go. He’s designing for healthcare, robots, and your phone. Use Neural Doodle to turn your two-bit doodles into fine artworks with deep neural networks. Very cool.

Point/Counterpoint
Harvard Business Review argues that you need an algorithm, not a data scientist. Not so fast, says Data Science Central. You need a data scientist, and then an algorithm. But, what you’re really looking for is the Algorithm Economy.

Debunking A.I. Myths
Thanks to the pioneering work of scientists, a clearer picture of A.I. is emerging, along with its most common misconceptions and myths. Here are the 7 biggest myths about A.I., and 17 predictions about the future of big data.

The Internet of (Broken) Things
A security expert hacked a hotel’s Android-based light-switch tablet, and then gained control of the electronics in every single room. Oof. This is going to be a continual challenge for companies as they integrate digital technologies in meaningful ways to enhance homes and improve people’s lives. Here’s your chance to meet the 10 pigeons(!) live tweeting London’s air pollution. Oh, and by the way, they’re wearing tiny backpacks.


Emergent Future is a weekly, hand-curated dispatch exploring technology through the lens of artificial intelligence, data science, and the shape of things to come. Subscribe below.

The Emergent Future and the Shape of Things to Come

We’ve started a newsletter called the Emergent Future, which is a weekly, hand-curated dispatch exploring technology through the lens of artificial intelligence, data science, and the shape of things to come. EF is published every Tuesday and goes out to Algorithmia subscribers. Stay on top of emerging trends by subscribing to Emergent Future today. 


Google vs Go

Google's DeepMind defeats legendary Go player Lee Se-dol in historic victory.
You might have heard: Lee Se-dol finally won his first match after losing three in a row in the best-of-five competition. The two meet for the final time today.
+ ‘I’m in shock!’ How an AI beat the world’s best human at Go


The Future of Computing

The Economist weighs in now that the era of predictable improvement in computer hardware is ending.
+ Chris Dixon: What’s Next in Computing?


How To Think About Bots

To better comprehend the possibilities and perils of social bots, we must ask about their design, implementation, regulation, and ethics.
+ Motherboard presents In Our Image, a week of stories on AI


Minecraft Will Soon Be Able to Play Itself

Microsoft is using Minecraft to train artificial intelligence to play the hugely popular game.
+ Microsoft invites artificial intelligence developers to test their creations within Minecraft's virtual landscapes.

Is This the Year the Internet Finally Learns to See?

Initially, the Internet was built for text, and technology has learned how to read at a pretty advanced level. However, the web has become increasingly visual, and tech has not fully kept up: the Internet can read, but it can’t see.
+ Using Neural Networks to Combine Random Images

 

Algorithmia is now free, forever

Algorithmia has grown rapidly since we launched last year. We now have more than 16,000 developers using the Algorithmia library of algorithms to help them build smart apps in just five lines of code. Over the past year we’ve consistently heard from developers that they need a free tier so they can try, experiment, and integrate Algorithmia worry-free.

Today, we’re excited to announce a new, free tier to make access to Algorithmia even easier: all users now receive 5,000 free credits to use every month. Forever.

Free Forever pricing with Algorithmia makes getting started even easier. Algorithmia is about giving developers access to world-class algorithms. With our new Free Forever tier, we’ll help more developers get started faster. This is perfect for researchers, hackers, and hobbyists looking to get started using state-of-the-art algorithms in their projects. Focus on building your apps without worrying about running out of credits. Upgrade to a paid plan only when you’re ready to scale. All new accounts also receive 5,000 bonus credits to be used at any time. Ready to get started? Sign up here to start for free.

When you’re ready to upgrade, we have a pay-as-you-go plan to fit the needs of your growing business. Packages start at $20 for 200,000 credits, and we offer a 10% volume discount starting at $100 for 1.1MM credits. Our pay-as-you-go plan includes the ability to purchase credits on demand, and auto-reload when you run out – all without minimum monthly fees or contracts. You only pay for what you use. All pay-as-you-go accounts also receive the 5,000 free monthly credits.

It has been a pleasure building Algorithmia with you, and it’s incredible to see the amazing apps the community has built with us. More features are coming soon, including support for more programming languages.

To recap: Algorithmia is free forever. No contracts. Upgrade when you’re ready.

Happy Coding!

–Team Algorithmia


What’s Changed 

Your dashboard now features a thermometer showing the number of free credits remaining in your cycle, and when the cycle next resets. We also provide a rollup of your total credits remaining (granted + earned + purchased credits). Learn more here.


On your account page, we display a 30-day account balance history to help you better understand your usage.

Algorithmia account details page


Pricing FAQ

Can I get started for free?
Yes! All accounts receive 5,000 free credits that reload automatically every 30 days. In addition, we offer all new users an additional 5,000 credits to use at any time.

Do I need a credit card to sign up?
Nope. Simply sign up and start making API calls today.

What’s the cost to run an algorithm?
All algorithms are charged a fee of 1 credit per second of execution time. Algorithm developers may charge an additional royalty per call, which is specific to the version of the algorithm and guaranteed in perpetuity.

What is a credit worth?
The exchange rate for credits is 10,000 credits to $1 USD.
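
To make the math concrete, here’s a quick, hypothetical cost estimate (the workload, execution time, and royalty below are made-up numbers for illustration, not actual algorithm prices):

# Hypothetical cost estimate -- all workload numbers below are illustrative.
CREDITS_PER_DOLLAR = 10000       # 10,000 credits = $1 USD
EXECUTION_FEE = 1                # platform fee: 1 credit per second of execution

calls = 1000                     # example workload
avg_seconds_per_call = 0.5       # assumed average execution time
royalty_per_call = 2             # assumed developer royalty, in credits

total_credits = calls * (avg_seconds_per_call * EXECUTION_FEE + royalty_per_call)
print("Total credits: %d" % total_credits)                                       # 2500
print("Approximate cost: $%.2f" % (total_credits / float(CREDITS_PER_DOLLAR)))   # $0.25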

What happens if I run out of credits?
If you’ve purchased credits and have enabled auto-reload, you will be billed when your account balance reaches zero remaining credits.

If you don’t enable auto-reload or are on our Free Forever plan, you’ll receive alerts that your account balance is insufficient. To resume making API calls, simply purchase additional credits, or wait until your free credits reload at the end of your cycle.

Can I rollover unused credits?
Credits you’ve purchased never expire and can be used at any time. The 5,000 free credits you receive every 30 days do not roll over or accumulate in any way.

How do I buy credits?
From your Account page, select Purchase Credits. We offer two base packages: 200,000 credits for $20, and 1MM credits with a 10% bonus for $100.

Need volume pricing? Contact us.

How do I get billed?
We charge your credit card at the time of purchase. If you enable auto-reload, you will be billed when your account balance reaches zero remaining credits, so that your API calls continue uninterrupted.

I’m an algorithm developer, can I spend the credits I earn?
Yes! Earned credits can be used to call any algorithm at any time.

Where can I see my usage metrics?
A detailed transaction log, available credits, and more are available under the Account tab on your Profile page.

Do you have academic researcher or student discounts?
Yes! Contact us about Algorithmia for Academia.

More questions?
Contact us at any time via email.

Hey Zuck, We Built Your Office A.I. Solution

Office Facial Recognition A.I.

Like many, we were pretty inspired by Mark Zuckerberg’s 2016 personal challenge to build artificial intelligence tools to help him at home and work. We spend a lot of time at Algorithmia helping developers add algorithmic intelligence to their apps. So, with a hat tip to Zuck, we challenged ourselves to see what kind of A.I. solution we could come up with during a recent internal hackathon.

UPDATE (12.29.16): Zuckerberg made good on his 2016 personal challenge in a post where he details how he built an AI assistant to control aspects of his home, like the lights, temperature, appliances, music and security, what he learned over the 100-150 hours spent working on it, and what’s next for his AI assistant.


What We Made


In less than 24 hours, we created an automated front desk A.I. that uses facial recognition to identify and greet our coworkers as they arrive at the office.

We used an Amazon Fire tablet taped to the wall as our front desk kiosk, with the front-facing camera shooting video. As a user walks up to the tablet, we start sampling frames from the video, which are sent to the CMU OpenFace library to check for a match.

If we’ve seen you before, we welcome you to the office by doing three things: 1) our front desk A.I. announces in our Slack channel that you’ve arrived, 2) Slackbot then sends you a summary of Git commits made since the last time you checked in, and 3) the office Spotify changes to your favorite song.

If you’re new here, we have you run through a training exercise where you mimic some emojis, and pick your song. The next time you arrive at the office, you’ll be in the system, and ready to go.

The icing on the cake: we built this entirely on Algorithmia, which means we didn’t have to set up or configure servers.


Building The Facial Recognition Service


From the start, our biggest concern was that we needed a facial recognition algorithm that could build an accurate model with as few images as possible, since we didn’t want our users having to train for more than a few seconds. Using the CMU OpenFace library we were able to accomplish this with as few as 10 images, which was perfect for handling our training and facial recognition tasks.

We had just heard about this library, and were eager to test their claim that the update improved recognition accuracy from 76.1% to 92.9% in half the execution time. Although we haven’t done any benchmarking, we were impressed by the anecdotal results, and are looking forward to making the CMU OpenFace library publicly available in the Algorithmia Marketplace as soon as possible. The speed and accuracy could be a game-changer for anybody interested in deep neural network training, cutting the time needed to stand up facial recognition from weeks to days.

We created a simple training routine where the user looks at the camera and makes a series of faces. This ensures we capture enough variety of facial expressions for the model. While the user is training, we’re sampling images from the video, labeling them, and getting them ready to process.

Once you’re done making faces, we send them as a batch to the library, which detects faces using OpenCV and then estimates the position of each face in real time using dlib. It takes about a minute for the model to train in the background, after which it’s ready to use on future visits.

Real-Time Face Pose Estimation

Example of face pose estimation using dlib
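
For reference, that detect-with-OpenCV, estimate-with-dlib step looks roughly like the sketch below (the cascade and landmark-model paths are the standard public ones rather than our exact setup, and OpenFace layers its own alignment and embedding on top):

import cv2
import dlib

# OpenCV's bundled Haar cascade for face detection and dlib's 68-point
# landmark model (downloaded separately); both paths are assumptions.
face_cascade = cv2.CascadeClassifier(cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
predictor = dlib.shape_predictor("shape_predictor_68_face_landmarks.dat")

def face_landmarks(image_path):
    frame = cv2.imread(image_path)
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

    results = []
    # Detect faces with OpenCV, then estimate landmark positions with dlib
    for (x, y, w, h) in face_cascade.detectMultiScale(gray, 1.3, 5):
        rect = dlib.rectangle(int(x), int(y), int(x + w), int(y + h))
        shape = predictor(gray, rect)
        results.append([(p.x, p.y) for p in shape.parts()])
    return results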

We wrapped this entire process in a couple of algorithms running on Algorithmia, which operate like microservices. When a user walks up, the tablet takes photos and sends them to our FaceClassify algorithm, which checks each image with OpenFace to see if we recognize the user. If we do, we send back the user’s UID and kick off the GetUserData algorithm to retrieve data about them.

The GetUserData algorithm grabs the user’s name and Spotify song URI they selected when they first trained. We then pass the name to the GreeterActions algorithm, which handles both our GitHub and Slack integrations.
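
Stitched together with the Algorithmia Python client, the flow looks roughly like this (the algorithm paths, payload fields, and helper name are placeholders rather than our exact internal versions):

import Algorithmia

client = Algorithmia.client("YOUR_API_KEY")

def greet_from_frame(image_url):
    # 1) Ask FaceClassify whether the sampled frame matches a known user
    match = client.algo("ourteam/FaceClassify/0.1.0").pipe({"image": image_url}).result
    if not match.get("uid"):
        return None  # unknown face: the tablet falls back to the training flow

    # 2) Look up the user's name and the Spotify URI they picked during training
    user = client.algo("ourteam/GetUserData/0.1.0").pipe({"uid": match["uid"]}).result

    # 3) Kick off the Slack/GitHub greeting
    client.algo("ourteam/GreeterActions/0.1.0").pipe({"name": user["name"], "uid": match["uid"]})
    return user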

Our Greeter Bot for Slack uses an incoming webhook from our app to send a message to the team that somebody has arrived and is checked in.

Welcome messages for Slack

Greeter Bot welcomes user to the office via Slack

We then grab all the commits from GitHub since you were last in the office, format them, and pass them to our Slack webhook. The webhook handles both sending you a direct message with the commit summary, and announcing in our team channel that you’ve arrived.

Sending GitHub commits into Slack

User receiving all the GitHub commits they’ve missed
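
Here’s a rough sketch of that step using the public GitHub commits API and a Slack incoming webhook (the webhook URL, repo name, and message format are placeholders, and the direct-message path is omitted):

import requests

SLACK_WEBHOOK_URL = "https://hooks.slack.com/services/XXX/YYY/ZZZ"  # placeholder webhook

def announce_arrival(name, repo, last_seen_iso):
    # Fetch commits made since the user's last check-in via the GitHub API
    commits = requests.get(
        "https://api.github.com/repos/%s/commits" % repo,
        params={"since": last_seen_iso},
    ).json()

    summary = "\n".join(
        "- %s (%s)" % (c["commit"]["message"].splitlines()[0], c["commit"]["author"]["name"])
        for c in commits
    )

    # Announce the arrival (and the commit summary) in the team channel
    requests.post(SLACK_WEBHOOK_URL, json={
        "text": "%s just arrived at the office!\nCommits since their last visit:\n%s"
                % (name, summary or "(none)")
    })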


Integrating Spotify


We wanted the process of choosing your office walk-up music to be simple, intuitive, and fun. So we created a choose-your-own-adventure flow:

Music Selection

The user first selects a genre of music, and then chooses between “happy,” “dancing,” or “celebration” music. The user is then presented with three songs that match the chosen genre and mood.

To get the music playing, the first thing we needed to do was create a service that could take requests for tracks and play them. We landed on Pi MusicBox, a free, headless audio server for the Raspberry Pi based on Mopidy.

Spotify on PiMusicBox

PiMusicBox running Spotify on the Raspberry Pi

Getting Pi MusicBox up and running was straightforward, but we realized that it didn’t have an officially documented API endpoint we could hack on – it’s intended to act more like a replacement for Sonos that lets you stream music from Spotify, Google Music, SoundCloud, Webradio, Podcasts, and more. So, we had to reverse engineer it.

The first thing we noticed was that all communication was handled with websockets. This controlled the various functions like play, pause, change song, etc. Once we figured out the pattern, it was as easy as setting up another microservice on Algorithmia to pass this information through:

import Algorithmia
import websocket

def apply(input):
    # Connect to the Mopidy websocket exposed by Pi MusicBox
    ws = websocket.WebSocket()
    ws.connect("ws://[AlgoJamzBoxIP]:6680/mopidy/ws")

    # Clear the current tracklist, queue the requested Spotify URI, then start playback
    ws.send('{"method":"core.tracklist.clear","jsonrpc":"2.0","id":600}')
    print("Got back: '%s'" % ws.recv())
    ws.send('{"method":"core.tracklist.add","params":[null,null,"' + str(input) + '"],"jsonrpc":"2.0","id":601}')
    print("Got back: '%s'" % ws.recv())
    ws.send('{"method":"core.playback.play","jsonrpc":"2.0","id":602}')
    print("Got back: '%s'" % ws.recv())

    return 'Done'

It’s kind of ugly, but with that figured out, we now have a way of calling the endpoint directly from the tablet using:

var input = <INPUT>;
Algorithmia.client("YOUR_API_KEY")
    .algo("algo://jambox/CallTrack/0.1.0")
    .pipe(input)
    .then(function(response) {
        console.log(response.get());
    });

This passes in the Spotify URI, connects to the Raspberry Pi, and the song plays on the office stereo.


Conclusion


We’re pleased with how quickly we could create and stack together serverless microservices to power our automated front desk A.I. We’ll be adding the CMU OpenFace library to the platform soon, which will enable all kinds of interesting use cases for app developers – including Zuckerberg’s.

Interested in building your own? Sign up here to get started with Algorithmia.

We have some cleanup to do on the code before we’re ready to share the sample app with everybody, but in the meantime, we built this hack using the following technology:

  • Raspberry Pi
  • Pi MusicBox
  • Algorithmia
  • CMU OpenFace
  • Slack
  • Spotify
  • GitHub Pages
  • Node | Angular
  • Ratchet (mobile-first UI based on Bootstrap)
  • Fire Tablet

Thanks for reading!

How the Algorithm Economy and Containers are Changing the Way We Build and Deploy Apps Today

The algorithm economy creates a new value chain

In the age of Big Data, algorithms give companies a competitive advantage. Today’s most important technology companies all have algorithmic intelligence built into the core of their product: Google Search, Facebook News Feed, Amazon’s and Netflix’s recommendation engines.

“Data is inherently dumb,” Peter Sondergaard, senior vice president at Gartner and global head of Research, said in The Internet of Things Will Give Rise To The Algorithm Economy. “It doesn’t actually do anything unless you know how to use it.”

Google, Facebook, Amazon, Netflix and others have built both the systems needed to acquire a mountain of data (i.e. search history, engagement metrics, purchase history, etc), as well as the algorithms responsible for extracting actionable insights from that data. As a result, these companies are using algorithms to create value, and impact millions of people a day.

“Algorithms are where the real value lies,” Sondergaard said. “Algorithms define action.”

Many technology companies have done a good job of capturing data, but they’ve come up short on doing anything valuable with it. Thankfully, there are two fundamental shifts happening in technology right now that are leading to the democratization of algorithmic intelligence, and changing the way we build and deploy smart apps today:

  1. The Algorithm Economy
  2. Containers

The confluence of the algorithm economy and containers creates a new value chain, where algorithms as a service can be discovered and made accessible to all developers through a simple REST API. Algorithms as containerized microservices ensure both interoperability and portability, allowing for code to be written in any programming language, and then seamlessly united across a single API.
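
To make that concrete, calling a containerized algorithm over a REST API looks something like the sketch below (written against Algorithmia’s public HTTP API; the algorithm path and payload are illustrative):

import requests

# Any containerized algorithm becomes an HTTP endpoint; the caller doesn't need
# to know what language it was written in or where it runs.
response = requests.post(
    "https://api.algorithmia.com/v1/algo/demo/Hello/0.1.1",
    headers={
        "Authorization": "Simple YOUR_API_KEY",
        "Content-Type": "application/json",
    },
    json="world",
)
print(response.json())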

By containerizing algorithms, we ensure that code is always “on,” and always available, as well as being able to auto-scale to meet the needs of the application, without ever having to configure, manage, or maintain servers and infrastructure. Containerized algorithms shorten the time for any development team to go from concept, to prototype, to production-ready app.

Running algorithms in containers as microservices is a strategy for companies looking to discover actionable insights in their data. This structure makes software development more agile and efficient. It reduces the infrastructure needed, and abstracts an application’s various functions into microservices, making the entire system more resilient.


The Algorithm Economy

Algorithm marketplaces and containers create microservices
The “algorithm economy” is a term established by Gartner to describe the next wave of innovation, where developers can produce, distribute, and commercialize their code. The algorithm economy is not about buying and selling complete apps, but rather functional, easy to integrate algorithms that enable developers to build smarter apps, quicker and cheaper than before.

Algorithms are the building blocks of any application. They provide the business logic needed to turn inputs into useful outputs. Similar to Lego blocks, algorithms can be stacked together in new and novel ways to manipulate data, extract key insights, and solve problems efficiently. The upshot is that these same algorithms are flexible, and easily reused and reconfigured to provide value in a variety of circumstances.

For example, we created a microservice at Algorithmia called Analyze Tweets, which searches Twitter for a keyword, determining the sentiment and LDA topics for each tweet that matches the search term. This microservice stacks our Retrieve Tweets With Keywords algorithm with our Social Sentiment Analysis and LDA algorithms to create a simple, plug-and-play utility.

The three underlying algorithms could just as easily be restacked to create a new use case. For instance, you could create an Analyze Hacker News microservice that uses the Scrape Hacker News and URL2Text algorithms to extract the text for the top HN posts. Then, you’d simply pass the text for each post to the Social Sentiment Analysis and LDA algorithms to determine the sentiment and topics of all the top posts on HN.
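
As a rough sketch of what that stacking looks like with the Algorithmia client (algorithm paths, versions, and input fields here are illustrative rather than the exact marketplace signatures):

import Algorithmia

client = Algorithmia.client("YOUR_API_KEY")

def analyze_tweets(keyword):
    # Step 1: retrieve tweets that match the keyword
    tweets = client.algo("twitter/RetrieveTweetsWithKeyword/0.1.0").pipe({"query": keyword}).result
    texts = [t["text"] for t in tweets]

    # Step 2: score the sentiment of each matching tweet
    sentiment = client.algo("nlp/SocialSentimentAnalysis/0.1.0").pipe(texts).result

    # Step 3: extract LDA topics across the whole set
    topics = client.algo("nlp/LDA/0.1.0").pipe({"docsList": texts}).result

    return {"sentiment": sentiment, "topics": topics}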

The algorithm economy also allows for the commercialization of world class research that historically would have been published, but largely under-utilized. In the algorithm economy, this research is turned into functional, running code, and made available for others to use. The ability to produce, distribute, and discover algorithms fosters a community around algorithm development, where creators can interact with the app developers putting their research to work.

Algorithm marketplaces function as the global meeting place for researchers, engineers, and organizations to come together to make tomorrow’s apps today.


Containers

Putting algorithms in containers enables the algorithm economy

Containers are changing how developers build and deploy distributed applications. In particular, containers are a form of lightweight virtualization that can hold all the application logic, and run as an isolated process with all the dependencies, libraries, and configuration files bundled into a single package that runs in the cloud.

“Instead of making an application or a service the endpoint of a build, you’re building containers that wrap applications, services, and all their dependencies,” Simon Bisson at InfoWorld said in How Containers Change Everything. “Any time you make a change, you build a new container; and you test and deploy that container as a whole, not as an individual element.”

Containers create a reliable environment where software can run when moved from one environment to another, allowing developers to write code once, and run it in any environment with predictable results — all without having to provision servers or manage infrastructure.

This is a shot across the bow for large, monolithic code bases. “[Monoliths are] being replaced by microservices architectures, which decompose large applications – with all the functionality built-in – into smaller, purpose-driven services that communicate with each other through common REST APIs,” Lucas Carlson from InfoWorld said in 4 Ways Docker Fundamentally Changes Application Development.

The hallmark of microservice architectures is that the various functions of an app are unbundled into a series of decentralized modules, each organized around a specific business capability.

Martin Fowler, the co-author of the Agile Manifesto, describes microservices as “an approach to developing a single application as a suite of small services, each running in its own process and communicating with lightweight mechanisms, often an HTTP resource API.”

By decoupling services from a monolith, each microservice becomes independently deployable, and acts as a smart endpoint of the API. “There is a bare minimum of centralized management of these services,” Fowler said in Microservices: A Definition of this New Architectural Term, “which may be written in different programming languages and use different data storage technologies.”

Similar to the algorithm economy, containers are like Legos for cloud-based application development. “This changes cloud development practices,” Carlson said, “by putting larger-scale architectures like those used at Facebook and Twitter within the reach of smaller development teams.”


tl;dr

  • The algorithm economy and containers are changing the way developers build and ship code.
  • The algorithm economy allows for the building blocks of algorithmic intelligence to be made accessible, and discoverable through marketplaces and communities.
  • Containerizing algorithms enables them to be packaged as microservices, making them accessible via an API, and hosted on scalable, serverless infrastructure in the cloud.