Three Forces Accelerating Artificial Intelligence Development

Artificial intelligence is set to hit the mainstream, thanks to improved machine learning tools, cheaper processing power, and a steep decline in the cost of cloud storage. As a result, firms are piling into the AI market and pushing the pace of AI development across a range of fields.

“Given sufficiently large datasets, powerful computers, and the interest of subject-area experts, the deep learning tsunami looks set to wash over an ever-larger number of disciplines.”

When it does, Nvidia will be there to capitalize. The company announced a new chip design built specifically for deep learning, with 15 billion transistors (a 3x increase over its predecessor) and the ability to process data 12x faster than previous chips.

“For the first time we designed a [graphics-processing] architecture dedicated to accelerating AI and to accelerating deep learning.”

Improved hardware is part of why Facebook can use artificial intelligence to describe photos to blind users, and why Microsoft can now build a JARVIS-like personal digital assistant for smartphones.

Google, meanwhile, has its sights set on “solving intelligence, and then using that to solve everything else,” thanks to better processors and its cloud platform.

This kind of machine intelligence wouldn’t be possible without improved algorithms.

“I consider machine intelligence to be the entire world of learning algorithms, the class of algorithms that provide more intelligence to a system as more data is added to the system,” Shivon Zilis, creator of the machine intelligence framework, told Fast Forward Labs.  “These are algorithms that create products that seem human and smart.”
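To make that definition concrete, here is a minimal sketch (not from the article) of the “more data, more intelligence” property: a simple classifier trained on progressively larger slices of a dataset, with its held-out accuracy measured at each step. It assumes scikit-learn and its bundled digits dataset are available; the model and sizes are arbitrary illustrations.

```python
# Minimal sketch: a learning algorithm tends to get more accurate as it sees more data.
# Uses scikit-learn's bundled digits dataset; not tied to any system mentioned above.
from sklearn.datasets import load_digits
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

X, y = load_digits(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

# Train on progressively larger slices of the training set and score on the same test set.
for n in (50, 200, 800, len(X_train)):
    model = LogisticRegression(max_iter=5000)
    model.fit(X_train[:n], y_train[:n])
    acc = accuracy_score(y_test, model.predict(X_test))
    print(f"trained on {n:4d} examples -> test accuracy {acc:.2f}")
```

Run as-is, the accuracy printed on the last line should generally be noticeably higher than on the first, which is exactly the behavior Zilis describes: the same algorithm produces a smarter system simply because it was given more data.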

If you want to go deeper on the subject, O’Reilly has a free ebook out on The Future of Machine Intelligence, which unpacks the “concepts and innovations that represent the frontiers of ever-smarter machines” through ten interviews, spanning NLP, deep learning, autonomous cars, and more.

So, you might be wondering: Is the singularity near? Not at all, the NY Times says.