For the past several years, tech giants have been trying to make artificial intelligence in its many guises, along with HPC, data analytics, and other advanced workloads, more available and easier to use for enterprises.
Traditional OEMs such as Hewlett Packard Enterprise, Dell Technologies, and Lenovo are using a combination of hardware, software, and services to make technologies that in years gone by would only be employed by research institutions and the largest of corporations more widely accessible.
The public clouds, particularly those that have their own hyperscale applications driven by machine learning and scale, also function a bit like OEMs when it comes to these workloads.
They have exposed the tools they built for their own use through their clouds, giving customers another path to becoming a modern computing organization. In many cases this reduces time to market for AI-driven applications and can also cut costs – not just the huge capital outlays for GPU-laden infrastructure, but also the high salaries of AI experts, who are in short supply and high demand.
The global AI space is expected to grow from $27.23 billion in 2019 to almost $267 billion by 2027, according to a report from Fortune Business Insights. While on-premises deployments will grab revenue share, “the cloud deployment segment [will] gain traction owing to less implementation expenses,” the report states. “Also, the cloud offers tools and pre-trained networks, which makes building AI applications convenient.”
Amazon Web Services – the largest of the hyperscale cloud providers – offers a range of services, from Fraud Detector and Forecast (for predicting demand) to Kendra (enterprise search) and CodeGuru (automating code reviews). Microsoft Azure offers an AI platform that includes services reaching from machine learning to knowledge search to various apps and agents.
For about a decade, Google has focused on AI and machine learning, seeing such technologies as keys to advancing capabilities throughout its ever-expanding array of services. That has been on display this week during the company’s virtual Google I/O 2021 developer conference. In his keynote address, Sundar Pichai, CEO of both Google and its parent company, Alphabet, spoke about how Google continues to infuse AI and machine learning into everything from search to security to Android-based devices.
Even a new facility aimed at accelerating Google’s quantum computing capabilities includes AI in its name: the Quantum AI campus in Santa Barbara, California, which will be down the road from the University of California campus where Urs Hölzle, senior vice president for technical infrastructure at Google, was a professor of computer science before joining the search engine giant as one of its earliest employees.
At the same time, Google Cloud took steps to make it easier for data scientists and developers to build AI-based applications and for enterprises to get those applications deployed. Vertex AI is a platform that brings a range of existing machine learning services under a unified user interface and API. According to Google, developers using Vertex AI can train a model with almost 80 percent fewer lines of code than on platforms from other cloud providers, opening up model development and the management of machine learning projects to data scientists and machine learning engineers with varying levels of skill.
“Today, data scientists grapple with the challenge of manually piecing together ML point solutions, creating a lag time in model development and experimentation, resulting in very few models making it into production,” Craig Wiley, director of product for Vertex AI and AI applications at Google Cloud, wrote in a blog post. “To tackle these challenges, Vertex AI brings together the Google Cloud services for building ML under one unified UI and API, to simplify the process of building, training, and deploying machine learning models at scale. In this single environment, customers can move models from experimentation to production faster, more efficiently discover patterns and anomalies, make better predictions and decisions, and generally be more agile in the face of shifting market dynamics.”
Andrew Moore, vice president and general manager of cloud AI and industry solutions at Google Cloud, said the goals of Vertex AI were to remove orchestration burdens from data scientists and engineers and “create an industry-wide shift that would make everyone get serious about moving AI out of pilot purgatory and into full-scale production.”
Organizations using Vertex AI will get access to the same AI toolkit – which includes such capabilities as computer vision, language and conversation as well as structured data – that Google engineers use internally for the company’s own operations. They also get new MLOps features: Vertex Vizier to speed up experimentation, Vertex Feature Store (a fully managed repository where data scientists and developers can serve, share, and reuse machine learning features), and Vertex Experiments, which uses faster model selection to accelerate getting machine learning models into production.
An experimental application called Vertex ML Edge Manager will enable organizations to deploy and monitor models on the edge via automated processes and APIs, allowing the data to stay on the device or on site. Other tools, such as Vertex Model Monitoring, Vertex ML Metadata, and Vertex Pipelines, are designed to streamline machine learning workflows.
Vertex AI also integrates with such open-source frameworks as TensorFlow, PyTorch and scikit-learn.
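As a rough illustration of that framework support, a model built in scikit-learn can be trained anywhere and later registered with Vertex AI. The sketch below trains locally and runs as-is; the Vertex half is commented out because it needs the google-cloud-aiplatform SDK and GCP credentials, and the project, bucket, and container image names in it are placeholders, not real resources.

```python
# Minimal sketch: local scikit-learn training, with a hypothetical
# hand-off to Vertex AI shown in comments at the end.
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Train a small classifier on the bundled iris dataset.
X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=42
)
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)

accuracy = model.score(X_test, y_test)
print(f"held-out accuracy: {accuracy:.2f}")

# To serve the same model from Vertex AI, you would export it (e.g. with
# joblib) to a Cloud Storage bucket and register it, roughly:
#
#   from google.cloud import aiplatform
#   aiplatform.init(project="my-project", location="us-central1")
#   registered = aiplatform.Model.upload(
#       display_name="iris-classifier",
#       artifact_uri="gs://my-bucket/iris-model/",
#       serving_container_image_uri=(
#           "us-docker.pkg.dev/vertex-ai/prediction/sklearn-cpu.1-0:latest"
#       ),
#   )
#   endpoint = registered.deploy(machine_type="n1-standard-2")
```

The same pattern applies to TensorFlow and PyTorch models, which is the point of the unified API: the training framework stays the developer's choice while deployment and monitoring move to the managed platform.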
Google is promising more innovations around Vertex AI, which will be important for the company as it tries to gain ground on AWS and Azure, which together accounted for more than half of global cloud revenues in the first quarter, in a market where spending reached more than $39 billion, a 37 percent year-over-year increase, according to Synergy Research Group.
Google Cloud is third on the list and, along with Alibaba, Tencent, and Baidu, was among the cloud providers whose growth rates surpassed the overall growth of the market.