Algorithmia

Emergent // Future – Amazon’s Echo Look, Machine Learning models at scale and more

Issue 50

This week we let Amazon “Look” at us using computer vision, check out why you want to deploy your machine learning model as a scalable, serverless microservice, and look at why machine intelligence is the key to building sustainable businesses. Plus what we’re reading this week and things to try at home.

👋 Spread the love of E//F on Twitter and Facebook


Amazon Look AI

Amazon debuted the Echo Look, an Alexa-enabled speaker that uses computer vision to tell people what clothes to wear.

Of course that’s just the start. Security, mood monitoring, and other visual services seem like obvious next steps.

It’s a smart way for Amazon to begin collecting data about what we like to wear. By adding a camera and moving its personal voice assistant from the kitchen to the bedroom, Amazon is familiarizing us with the idea that wherever we are in our homes (or our cars), Amazon is at our beck and call.


Mobile ML

It's fairly obvious that training machine learning models for mobile apps should happen off-device, whether on your own computers or on computers you rent.

But when you put your model into production to serve predictions, you have a choice: run inference on the device, or as a serverless microservice in the cloud. Going the microservice route keeps your app simple and separates the app layer from the service layer.

Your app simply communicates with your hosted trained model via a scalable API endpoint, allowing you to continually improve the model and redeploy without having to update the app itself.
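As a minimal sketch of that pattern, here is what the app-side client might look like, assuming a hosted model exposed as a JSON-over-HTTPS endpoint. The endpoint URL, payload shape, and auth header are placeholders, not any specific provider's API; substitute whatever your hosting service actually expects.

```python
import json
from urllib import request

# Hypothetical endpoint for your hosted, trained model -- replace with your own.
ENDPOINT = "https://api.example.com/v1/models/my-model/predict"


def build_payload(features):
    """Serialize input features into the JSON body the (assumed) endpoint expects."""
    return json.dumps({"input": features}).encode("utf-8")


def predict(features, api_key):
    """Send one inference request to the hosted model and return the parsed response.

    The app never touches model weights; redeploying an improved model behind
    this endpoint requires no app update.
    """
    req = request.Request(
        ENDPOINT,
        data=build_payload(features),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {api_key}",
        },
    )
    with request.urlopen(req) as resp:
        return json.loads(resp.read())
```

Because all the app holds is a URL and a credential, you can retrain, A/B test, or scale the model service independently of app release cycles.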


What We’re Reading 📚

  • Why Systems of Intelligence are the Next Defensible Business Model. To build a sustainable and profitable business, you need strong defensive moats around your company. Startups today need to build systems of intelligence — AI powered applications — “the new moats.” (Greylock)
  • The End of Human Doctors. To understand if automation is possible, we need to understand the medical process. In particular, we need to know what is going on in the heads of doctors, since that is what we want to replicate or improve upon. (Luke Oakden-Rayner)
  • A Brief History of CNNs in Image Segmentation: From R-CNN to Mask R-CNN. Convolutional Neural Networks have become the gold standard for image classification and have improved to the point where they now outperform humans on the ImageNet challenge. (Medium)
  • The Myth of a Superhuman AI. The most common question about AI is whether it will become so much smarter than us that it will take all our jobs and resources, and humans will go extinct. Is this true? (Backchannel)
  • Meet the People Who Train the Robots to Do Their Own Jobs. Before the machines become smart enough to replace humans they need to be taught. (New York Times)

Things to Try at Home 🛠


Emergent // Future is a weekly, hand-curated dispatch exploring technology through the lens of artificial intelligence, data science, and the shape of things to come. 

🚀 Sign up for Emergent // Future here.

Follow @EmergentFuture for more on frontier technology

Lovingly curated for you by Algorithmia

Product manager at Algorithmia, helping to give developers superpowers.
