Should NVIDIA Be Concerned About Amazon's Custom AI Chip?

The dawn of artificial intelligence (AI) has set off an arms race of sorts, with the largest companies in technology seeking to benefit from the nascent tech. NVIDIA Corporation (NASDAQ: NVDA) has been one of the biggest beneficiaries of this trend, as its graphics processing units (GPUs) were the early choice for training AI systems. The ability of GPUs to perform a significant number of complex mathematical calculations simultaneously made them the perfect choice for AI applications.

NVIDIA has seen its data center business explode, producing triple-digit year-over-year growth for seven consecutive quarters, and its stock price has risen over 1,000% since early 2015. A number of high-profile companies that once embraced NVIDIA's GPUs are now looking to develop other solutions to augment or replace the illustrious GPU. Amazon.com (NASDAQ: AMZN) may be the latest tech giant to enter the fray.

Alexa's connection to the cloud

Amazon was an early adopter of AI, and according to a report in The Information, it is working on custom chips that could do on-device processing -- or at-the-edge processing -- rather than relying solely on a device's connection to the cloud. Currently, when a user makes a request to Amazon's digital assistant Alexa, the information is transmitted to the cloud, which processes the request and sends a response back to the device. This round trip results in a slight delay. The ability to handle speech recognition locally would improve response times for any device powered by the digital assistant, including the Echo family of smart speakers.

Amazon bolstered its processor expertise with the $350 million acquisition of Annapurna Labs in early 2015. Annapurna developed networking chips for data centers capable of transmitting greater quantities of data while consuming less power. Amazon now has a stable of more than 450 employees with some level of chip experience, which could lead the company to develop other specialized chips. The report also suggests that Amazon may be developing an AI processor for Amazon Web Services, its cloud computing segment.

To each their own

In early 2016, AI pioneer Google, a division of Alphabet (NASDAQ: GOOGL) (NASDAQ: GOOG), revealed that it had created a custom AI chip dubbed the tensor processing unit (TPU). The application-specific integrated circuit (ASIC) was designed to deliver more energy-efficient performance for the company's deep learning AI applications, which are capable of learning by processing massive quantities of data. The chip was built to accelerate TensorFlow, the framework used to train the company's AI systems.

The latest version of the TPU can handle both the training and inference phases of AI. As the name implies, an AI system "learns" during the training phase, while the inference phase is when the algorithms do the job for which they have been trained. Google recently announced that access to these processors will now be available to its cloud customers.

Apple (NASDAQ: AAPL) has long been a proponent of user privacy and has taken a different path than its tech brethren. The company's mobile devices add electronic noise to any data transmitted to the cloud while stripping away any personally identifiable information, which provides a greater degree of user privacy and security. With the release of the iPhone 8 and iPhone X models, the company developed a neural engine as part of its new A11 Bionic chip -- a cutting-edge processor that could handle many AI functions locally. This vastly reduces the amount of user information that is transmitted to the cloud, which helps to secure the data.

Microsoft Corporation (NASDAQ: MSFT) decided early on to stake a claim in a customizable processor known as a field programmable gate array (FPGA) -- a specialty chip that can be configured for a specific purpose by the customer after it has been manufactured. These have become the foundation for Microsoft's Azure cloud computing system and offer flexible architecture and lower power consumption than traditional offerings like GPUs.

Growth continues unabated

While each of these companies has employed a different processor strategy, they still use a significant number of NVIDIA's GPUs in their operations.

NVIDIA's growth has continued to shine. In its most recent quarter, NVIDIA reported record revenue of $2.91 billion, up 34% over the prior-year quarter. The company's data center segment -- which houses AI sales -- grew 105% year over year to $606 million and now accounts for 21% of NVIDIA's total revenue.

The competition was inevitable, but so far no solution can completely replace the GPU. NVIDIA can rest easy -- for now.


John Mackey, CEO of Whole Foods Market, an Amazon subsidiary, is a member of The Motley Fool's board of directors. Suzanne Frey, an executive at Alphabet, is a member of The Motley Fool's board of directors. Teresa Kersten is an employee of LinkedIn and is a member of The Motley Fool's board of directors. LinkedIn is owned by Microsoft. Danny Vena owns shares of Alphabet (A shares), Amazon, and Apple. The Motley Fool owns shares of and recommends Alphabet (A shares), Alphabet (C shares), Amazon, Apple, and Nvidia. The Motley Fool has the following options: long January 2020 $150 calls on Apple and short January 2020 $155 calls on Apple. The Motley Fool has a disclosure policy.