There's an arms race among public cloud providers to provide businesses with the best machine learning capabilities.
Enterprises are increasingly interested in creating intelligent applications, and companies like Amazon, Microsoft and Google are rushing to help meet their needs.
Google fired its latest salvo on Tuesday, announcing a set of enhancements to its existing suite of cloud machine-learning capabilities. The first was a new Jobs API aimed at helping match job applicants with the right openings. In addition, the company is slashing the prices on its Cloud Vision API and launching an enhanced version of its translation API.
On top of that, Google is offering GPUs in its cloud both through the company's managed services and its infrastructure-as-a-service product. Companies that want to roll their own machine learning systems and algorithms can now take advantage of the new hardware.
These moves are important steps for the company as it continues to compete with Microsoft, Amazon, IBM and other vendors. One of the major benefits of the public cloud is that businesses have relatively cheap and easy access to massive amounts of computing power necessary for machine learning tasks.
Google's push is also focused on attracting more enterprises to its cloud platform, which has been a major focus under the leadership of its cloud chief, Diane Greene. Here's the rundown of what's new:
Cloud Jobs API
The new Google Cloud Jobs API gives companies a tool to help them match the best candidates to the right jobs, based on the skills that each person can bring to bear. The API is designed to work with a company's job search system. Users plug in their skills, experience and location, then the API takes that information to match them with jobs it thinks would suit them.
It's designed to help deal with the confusion that arises when people try to match themselves to a non-standardized set of job titles and descriptions. For example, someone looking for a sysadmin position could end up facing a wide variety of jobs that have similar titles but wildly divergent skill needs. Using the Jobs API means that job seekers would get recommendations based on their particular skill sets, and companies could see who the system thinks is the best fit for each gig.
Right now, the API is in public alpha for customers in the U.S. and Canada, with FedEx, CareerBuilder and Dice all giving it a shot. Google hasn't provided a timeline for when it would be more broadly available.
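The skills-based matching the Jobs API performs can be illustrated with a toy sketch. To be clear, this is not Google's actual API; the function names and the overlap-based scoring below are entirely hypothetical, just a way to picture the idea of ranking openings by how well a candidate's skills cover a job's requirements:

```python
# Toy illustration of skill-based job matching (hypothetical logic,
# not the actual Cloud Jobs API).

def match_score(candidate_skills, job_skills):
    """Fraction of a job's required skills that the candidate has."""
    candidate = {s.lower() for s in candidate_skills}
    required = {s.lower() for s in job_skills}
    if not required:
        return 0.0
    return len(candidate & required) / len(required)

def rank_jobs(candidate_skills, jobs):
    """Return job titles sorted by how well the candidate matches them."""
    scored = [(match_score(candidate_skills, required), title)
              for title, required in jobs]
    return [title for score, title in sorted(scored, reverse=True) if score > 0]

jobs = [
    ("Systems Administrator", ["linux", "bash", "networking"]),
    ("DevOps Engineer", ["linux", "docker", "ci/cd"]),
    ("Frontend Developer", ["javascript", "css", "react"]),
]
# A candidate with Linux, Bash and Docker skills matches the two ops roles
# but not the frontend one, which is filtered out entirely.
print(rank_jobs(["Linux", "Bash", "Docker"], jobs))
```

The real service layers far more on top of this, of course: normalizing non-standard titles, weighting experience and location, and learning from hiring outcomes.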
Cloud Vision Price Cuts
Speaking of APIs, Google is slashing the prices on one of its most popular machine-learning-based cloud services. The Google Cloud Vision API now costs about 80 percent less for companies to implement. That means it's easier for them to build apps that can recognize the content of images without needing to build the machine learning algorithms behind that functionality.
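For a sense of what "implementing" the Vision API involves, here is a sketch of building a label-detection request in the shape of the v1 `images:annotate` REST endpoint. Actually sending it requires a valid API key and a real image, both omitted here; the stand-in bytes and placeholder handling are assumptions for illustration:

```python
import base64
import json

# Sketch of a Cloud Vision API label-detection request body. The JSON
# shape follows the v1 images:annotate REST endpoint; authentication and
# the HTTP call itself are omitted.
VISION_ENDPOINT = "https://vision.googleapis.com/v1/images:annotate"

def build_label_request(image_bytes, max_results=5):
    """Build the JSON body for a label-detection call."""
    return {
        "requests": [
            {
                # Image content is sent base64-encoded inline.
                "image": {"content": base64.b64encode(image_bytes).decode("ascii")},
                "features": [
                    {"type": "LABEL_DETECTION", "maxResults": max_results}
                ],
            }
        ]
    }

# Stand-in bytes, not a real image; a real call would read a file.
body = build_label_request(b"\x89PNG-stand-in")
print(json.dumps(body, indent=2))
```

The appeal of the price cut is that this request, plus an HTTP POST, is the entire integration; there is no model to train or host.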
"One of the things we want to do is to push the overall market to making fully trained models a lot more affordable, and therefore a lot more used across common application types or enterprise customer use cases," Rob Craft, Google's group product manager for cloud machine learning, said in an interview.
Cloud Translation Premium
Google is also launching a new Cloud Translation API Premium service that's designed to provide additional translation speed and accuracy to businesses that need it, beyond what Google already provides with its existing APIs. Craft said that the new models behind the API provide a 55 percent to 85 percent reduction in error compared to Google's existing Cloud Translation API.
The API is powered by a switch in approach, from primarily phrase-based statistical machine translation to deep neural networks. Craft said that it makes sense for Google to offer the new premium service to businesses first, before making it available to a broader consumer base.
Furthermore, he said that business customers were clamoring for additional features like the ability to train translation models with custom dictionaries. For companies looking to do translation of particular document types like legal agreements or financial papers, giving translation models a dictionary of specialized terms to work with is important for maximum accuracy.
"It's a different kind of language translation, which means that we will need a tighter coupling between the customer and Google around what those documents contain," Craft said. "And the customer would need to opt in to allow us to train on those documents, et cetera, but the premium relationship gives us that opportunity."
Right now, the company is just focused on reducing error rates, and custom dictionaries are still on the horizon. Looking toward the future, Craft said that he sees additional opportunity for the company to launch other premium services that are focused on large enterprises and industry-specific needs.
The company isn't making the pricing for the new service public yet, but it will cost more than the existing Cloud Translation service. Google will make it publicly known at the new API's general availability, which isn't far off, Craft said.
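For context, a call to the existing Translation API is a simple parameterized request. The sketch below follows the public v2 REST interface; the `model` parameter used here to select the neural model, the placeholder API key, and the exact endpoint URL should all be treated as illustrative assumptions rather than the premium service's actual interface, which Google had not yet documented publicly:

```python
# Sketch of a Cloud Translation API v2 request. Parameter names follow
# the public v2 REST interface; the "model" parameter distinguishing a
# neural model from the phrase-based one is an illustrative assumption,
# and the API key is a placeholder.
TRANSLATE_ENDPOINT = "https://translation.googleapis.com/language/translate/v2"

def build_translate_params(text, target, source=None, use_neural=True):
    """Build the query parameters for a translation request."""
    params = {
        "q": text,          # text to translate
        "target": target,   # target language code, e.g. "en"
        "key": "YOUR_API_KEY",  # placeholder, not a real key
    }
    if source:
        params["source"] = source  # omit to let the API detect the language
    if use_neural:
        params["model"] = "nmt"    # hypothetical neural-model selector
    return params

params = build_translate_params("Der schnelle braune Fuchs", target="en", source="de")
print(params)
```

A custom-dictionary feature of the kind Craft describes would presumably layer on top of a request like this, but no such parameter existed at the time of the announcement.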
Cloud Natural Language API
Separately, the Cloud Natural Language API, which is used to parse sentences written by humans, is now generally available. That launch comes with some additional features, like the ability to determine the sentiment of text on a sentence-by-sentence basis, rather than generating one sentiment score for the content of an entire document.
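The sentence-by-sentence sentiment feature can be pictured by parsing a response in the shape of the API's `analyzeSentiment` output. The response below is a hand-written sample in the v1 response format, not real API output, and the scores in it are invented for illustration:

```python
import json

# Sketch of reading sentence-level sentiment from a Cloud Natural Language
# analyzeSentiment-style response. The response here is a hand-written
# sample in the v1 shape, not real API output.
sample_response = json.loads("""
{
  "documentSentiment": {"score": 0.2, "magnitude": 1.4},
  "sentences": [
    {"text": {"content": "The keynote was great."},
     "sentiment": {"score": 0.8, "magnitude": 0.8}},
    {"text": {"content": "The demo crashed twice."},
     "sentiment": {"score": -0.6, "magnitude": 0.6}}
  ]
}
""")

def sentence_sentiments(response):
    """Return (sentence, score) pairs from an analyzeSentiment response."""
    return [
        (s["text"]["content"], s["sentiment"]["score"])
        for s in response.get("sentences", [])
    ]

for sentence, score in sentence_sentiments(sample_response):
    print(f"{score:+.1f}  {sentence}")
```

The difference from the earlier behavior is visible in the sample: a single document-level score of 0.2 would hide the fact that one sentence is strongly positive and another clearly negative.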
GPUs are coming to the Google Cloud
While traditional CPUs still have a significant role to play in computing, graphics processing units (GPUs) are better suited to certain types of high-performance workloads, including some machine learning applications. For that reason, Google is bringing GPUs to its cloud infrastructure and managed machine learning offerings next year.
Users will be able to add GPUs to their infrastructure instances for custom machine learning and high-performance computing tasks. The offering will be similar to the GPU-equipped compute instances that Microsoft and Amazon introduced this year.
In addition, Google Cloud Machine Learning will start taking advantage of the new hardware. The system, which lets users set up their own custom machine learning algorithms while managing the underlying infrastructure, will automatically tap into GPUs when the system believes that it's appropriate.
Google will be using AMD's Radeon-based FirePro S9300 x2 Server GPUs. That's a marked difference from Microsoft and Amazon, which are both using Nvidia GPUs. AMD is no stranger to the public cloud, though: it's also partnered with Alibaba on bringing GPUs to the Chinese tech giant's cloud offering.
All of this comes as Google continues to expand its cloud platform. Urs Hölzle, the company's senior vice president of technical infrastructure, said during a press event last month that Google plans to open a new cloud region roughly once a month next year, a massive undertaking that would bring its availability in line with other platforms.