Sometimes the best advertising is a small, nondescript company name etched onto an equally nondescript door in a back alley, only accessible by foot traffic. Lucky for us, Paul Borza of TalentSort—a recruiting search engine that mines open-source code and ranks software engineers based on their skills—was curious about Algorithmia when he happened to walk by our office near Pike Place Market one day.
“It’s funny how I stumbled on Algorithmia. I was waiting for a friend of mine in front of The Pink Door, but my friend was late so I started walking around. Next door, I noticed a cool logo and the name ‘Algorithmia.’ Working in tech, I thought it must be a startup, so I looked up the name and learned that Algorithmia was building an AI marketplace. It was such a coincidence!”
Paul Needed an Algorithm Marketplace
“Two weeks before, I had tried monetizing my APIs on AWS but had given up because it was too cumbersome. So rather than waste my time with bad development experiences, I was willing to wait for someone else to develop a proper AI marketplace; then I stumbled upon Algorithmia.”
Paul Found Algorithmia
“I went home that day and in a few hours I managed to publish two of my machine learning models on Algorithmia. It was such a breeze! Publishing something similar on AWS would have taken at least a week.”
We asked Paul what made his experience using Algorithmia’s marketplace so easy:
“Before I started publishing algorithms, I wanted to see if Algorithmia fit our company’s needs. The ‘Run an Example’ feature was helpful in assessing the quality of an algorithm on the website; no code required. I loved the experience as a potential customer.”
“To create an API, I started the process on the Algorithmia website. Each API has its own git repository with some initial boilerplate code. I cloned that repository and added my code to the empty function that was part of the boilerplate code, and that was it! The algorithm was up and running on the Algorithmia platform. Then I added a description, a default JSON example, and documentation via Markdown.”
“The beauty of Algorithmia is that as a developer, you only care about the code. And that’s what I wanted to focus on: the code, not the customer sign-up or billing process. And Algorithmia allowed me to do that.”
Paul is Smart; Be like Paul
Paul’s algorithms are the building blocks of TalentSort; they enable customers to improve their recruiting efficiency. The models are trained on 1.5 million names from more than 30 countries and determine country of origin and gender with more than 95 percent accuracy. The algorithms also don’t call out to any external service, so there’s no data leakage. Try them out in the Algorithmia marketplace today.
Paul’s relentless curiosity led him to Algorithmia’s marketplace where his tools became part of more than 7,000 unique algorithms available for use now.
At Algorithmia, we’ve always been maniacally focused on the deployment of machine learning models at scale. Our research shows that deploying algorithms is the main challenge for most organizations exploring how machine learning can optimize their business.
In a survey we conducted this year, more than 500 business decision makers said that their data science and machine learning teams spent less than 25% of their time on training and iterating models. Most organizations get stuck deploying and productionizing their machine learning models at scale.
The challenge of productionizing models at scale comes late in the lifecycle of enterprise machine learning but is often critical to getting a return on investment on AI. Being able to support heterogeneous hardware, conduct versioning of models, and run model evaluations is underappreciated until problems crop up from not having taken these steps.
At the AWS re:Invent conference in Las Vegas this week, Amazon announced several updates to SageMaker, its machine learning service. Notable were mentions of forthcoming forecast models, a tool for building datasets to train models, an inference service for cost savings, and a small algorithm marketplace to—as AWS describes—“put [machine learning] in the hands of every developer.”
“What AWS just did was cement the notion that discoverability and accessibility of AI models are key to success and adoption at the industry level, and offering more marketplaces and options to customers is what will ultimately drive the advancement.”
–Kenny Daniel, CTO, Algorithmia
Amazon and other cloud providers are increasing their focus on novel uses for machine learning and artificial intelligence, which is great for the industry writ large. Algorithmia will continue to provide users seamless deployment of enterprise machine learning models at scale in a flexible, multi-cloud environment.
Deploying at Scale
For machine learning to make a difference at the enterprise level, deployment at scale is critical and making post-production deployment of models easy is mandatory. Algorithmia has four years of experience putting customer needs first, and we focus our efforts on providing scalability, flexibility, standardization, and extensibility.
We are heading toward a world of standardization for machine learning and AI, and companies will pick and choose the tools that will make them the most successful. We may be biased, but we are confident that Algorithmia is the best enterprise platform for companies looking to get the most out of their machine learning models because of our dedication to post-production service.
Being Steadfastly Flexible
Users want to be able to select from the best tools in data labeling, training, deployment, and productionization. Standard, customizable frameworks like PyTorch and TensorFlow and common file formats like ONNX increase flexibility for users for their specific needs. Algorithmia has been preaching and executing on this for years.
“Standard, customizable frameworks increase flexibility for users for their specific needs. Algorithmia has been preaching this for years.”
–Kenny Daniel, CTO, Algorithmia
For at-scale enterprise machine learning, companies need flexibility and modular applications that easily integrate with their existing infrastructure. Algorithmia hosts the largest machine learning model marketplace in the world, with more than 7,000 models, and more than 80,000 developers use our platform.
“I expect more AI marketplaces to pop up over time, and each will have its strengths and weaknesses. We have been building these marketplaces inside the largest enterprises, and I see the advantages of doing this kind of build-out to accelerate widespread adoption.”
–Diego Oppenheimer, CEO, Algorithmia
It is Algorithmia’s goal to remain focused on our customers’ success, pushing the machine learning industry forward. We encourage you to try out our platform, or better yet, book a demo with one of our engineers to see how Algorithmia’s AI layer is the best in class.
At Algorithmia, we have much to be thankful for—it’s even one of our core tenets. So in the spirit of Thanksgiving, we have compiled a list of all that we’re particularly appreciative of this year. Some of our staff are thankful for the little things, like snacks and a dog-friendly office, while others appreciate more practical benefits, like the freedom to develop skills and experience for career growth. Regardless, Algorithmia is eternally grateful for our customers and contributors.
“I’m thankful for the flexibility in where we live and when and how we get our work done!”
–Stephanie, Developer Advocate
“I am thankful that I work at a company that has a great set of values that we live by. One of them is actually, ‘We are thankful’! We are thankful for every single one of our users, customers, and contributors. We do not exist without them and always strive to make their experiences better.”
–Jonah-Kai, Head of Growth Marketing
“I’m thankful for working with some incredibly talented people.”
–Besir, Algorithm Engineer
“I’m thankful for how helpful and supportive my team is.”
–Adnaan, Back End Engineer
“I’m thankful for board game night.”
–James, Product Designer
“I’m thankful that I get to work on complex and creative projects!”
–Whitney, Content Marketing Manager
“I’m thankful for the growth mindset and intellectually curious culture!”
–Ken, Enterprise Sales Development Rep
“I am thankful for the team’s willingness to jump in and fix problems, always. I call it a winner attitude.”
“I am thankful for the remote friendly culture.”
–Rowell, Senior Platform Engineer
“I’m thankful for interesting, challenging, and creative opportunities every day.”
–Jon, Developer Advocate
“I’m thankful for the awesome views from the devpit (even if the blinds are down more often than not).”
–Ryan, Front End Engineering Lead
As we look toward the end of the year, we are also thankful to have the opportunity to give back to others and help underserved communities.
Image source: Deep Ideas
If you remember anything from calculus (not a trivial feat), it might have something to do with optimization. Finding the best numerical solution to a given problem is an important part of many branches of mathematics, and machine learning is no exception. Optimizers, combined with their cousin the loss function, are the key pieces that enable machine learning to work for your data.
This post will walk you through the optimization process in Machine Learning, how loss functions fit into the equation (no pun intended), and some popular approaches. We’ll also include some resources for further reading and experimentation.
The loss function is the bread and butter of modern machine learning; it takes your algorithm from theoretical to practical and transforms neural networks from glorified matrix multiplication into deep learning.
This post will explain the role of loss functions and how they work, while surveying a few of the most popular from the past decade.
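To make the optimizer–loss-function pairing concrete before diving in, here is a minimal sketch in plain Python: a mean-squared-error loss and a gradient descent optimizer fitting a one-parameter linear model. The toy data, learning rate, and function names are illustrative assumptions, not code from any particular library.

```python
import random

random.seed(0)

# Hypothetical toy data: y = 3x plus a little noise.
xs = [random.uniform(-1, 1) for _ in range(50)]
ys = [3.0 * x + random.gauss(0, 0.1) for x in xs]

def mse_loss(w):
    """Mean squared error between predictions w*x and targets y."""
    return sum((w * x - y) ** 2 for x, y in zip(xs, ys)) / len(xs)

def mse_grad(w):
    """Derivative of the MSE loss with respect to the weight w."""
    return sum(2 * (w * x - y) * x for x, y in zip(xs, ys)) / len(xs)

# Gradient descent: repeatedly step against the gradient of the loss.
w = 0.0
learning_rate = 0.5
for _ in range(200):
    w -= learning_rate * mse_grad(w)

print(w)  # converges near the true slope of 3
```

The loss function scores how wrong the current weight is; the optimizer uses the loss's gradient to decide which direction (and how far) to move. Deep learning frameworks automate the gradient computation and offer fancier update rules, but the loop is the same in spirit.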