
Emergent // Future – Amazon’s Echo Look, machine learning models at scale, and more

Issue 50

This week we let Amazon “Look” at us using computer vision, check out why you want to deploy your machine learning model as a scalable, serverless microservice, and look at why machine intelligence is the key to building sustainable businesses. Plus what we’re reading this week and things to try at home.

Spread the love of E//F on Twitter and Facebook


Amazon’s Echo Look

Amazon debuted the Echo Look, an Alexa-enabled camera and style assistant that uses computer vision to tell people what to wear.

Of course that’s just the start. Security, mood monitoring, and other visual services seem like obvious next steps.

It’s a smart way for Amazon to begin collecting data about what we like to wear. By adding a camera and moving its personal voice assistant from the kitchen to the bedroom, Amazon is familiarizing us with the idea that wherever we are in our homes (or our cars), Amazon is at our beck and call.


Mobile ML

Training machine learning models destined for mobile apps should clearly happen off-device, whether on your own machines or on hardware you rent in the cloud.

But when it comes to production inference, you have a choice: run the model on the device, or host it as a serverless microservice in the cloud. Going the microservice route keeps your app simple and separates the app layer from the service layer.

Your app simply communicates with the hosted model via a scalable API endpoint, allowing you to continually improve and redeploy the model without having to update the app itself.
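Here’s a minimal sketch of that pattern in Python, assuming a hypothetical hosted endpoint; the URL, API key, and request/response shape are illustrative placeholders, not any specific provider’s API:

import requests

API_URL = "https://api.example.com/v1/models/style-classifier/predict"  # hypothetical endpoint
API_KEY = "YOUR_API_KEY"  # hypothetical credential

def predict(features):
    """Send input features to the hosted model and return its prediction."""
    response = requests.post(
        API_URL,
        json={"input": features},
        headers={"Authorization": f"Bearer {API_KEY}"},
        timeout=10,  # fail fast if the service is unreachable
    )
    response.raise_for_status()  # surface HTTP errors instead of parsing a bad body
    return response.json()

if __name__ == "__main__":
    # The app depends only on this request/response contract, so the model
    # behind the endpoint can be retrained and redeployed without an app update.
    print(predict({"text": "example input"}))

Because the contract lives at the API boundary, swapping out or retraining the model behind the endpoint never forces an app-store release.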


What We’re Reading 


Things to Try at Home 🛠


Emergent // Future is a weekly, hand-curated dispatch exploring technology through the lens of artificial intelligence, data science, and the shape of things to come. 

Sign up for Emergent // Future here.

Follow @EmergentFuture for more on frontier technology.

Lovingly curated for you by Algorithmia