How NXP Semiconductors Is Moving Into Artificial Intelligence

What is "Industry 4.0?"

To NXP Semiconductors (NASDAQ: NXPI), a leading provider of embedded chips (the electronics found in things like automobiles and factories), Industry 4.0 centers on four technologies: fast networking, fast processing (with artificial intelligence), human-machine interfaces, and cybersecurity.

The confluence of these four technologies has the potential to more fully automate factories, automobiles, and wearable devices. According to IHS, there will be over 70 billion internet-connected devices by 2025.

But while the NXP conversation largely centers on its impending acquisition by Qualcomm (NASDAQ: QCOM) (which may or may not happen), investors may be missing some exciting recent news about NXP products geared toward Industry 4.0.

Man smart, chips smarter

On artificial intelligence, NXP is perceived to lag competitors such as NVIDIA, the leading producer of AI-focused graphics processing units (GPUs). But NXP has recently found ways to make its embedded processors and microcontrollers more "intelligent." Last month, the company unveiled a new software tool for its EdgeScale platform-as-a-service offering (which itself only launched in February). The software lets customers more easily load machine-learning algorithms from other sources onto NXP's embedded chips and systems-on-chips (SoCs). Here's why the new offering could be important to NXP's customers and investors.

Pushing to the edge

The Internet of Things is an architecture in which "edge" devices send data to, and receive it from, a cloud-based server over an internet connection. This has a lot of benefits but also some complications. Companies that use cloud-based infrastructure-as-a-service have to pay for processing power and networking. Also, the time it takes for data to travel between the cloud and the "edge" can hurt performance, especially if a sensor is supposed to react to stimuli in real time.
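
To put rough numbers on that latency argument, here is a back-of-the-envelope sketch in Python comparing a cloud round trip against on-device inference for a sensor with a tight reaction deadline. All of the timing figures are illustrative assumptions, not measurements of any particular network or NXP chip.

# Illustrative latency budget for a real-time sensor (all numbers are assumptions).
CONTROL_LOOP_BUDGET_MS = 10   # the sensor must react within 10 ms
CLOUD_ROUND_TRIP_MS = 60      # send a reading to the cloud and get a decision back
LOCAL_INFERENCE_MS = 2        # run the model on the edge chip itself

print("Cloud path meets the deadline:", CLOUD_ROUND_TRIP_MS <= CONTROL_LOOP_BUDGET_MS)
print("Edge path meets the deadline: ", LOCAL_INFERENCE_MS <= CONTROL_LOOP_BUDGET_MS)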

Companies are therefore racing to develop products that facilitate "edge" computing. For instance, Amazon released Amazon FreeRTOS, a microcontroller operating system that connects edge devices to AWS, at its re:Invent conference last November, then announced a partnership with NXP competitor Texas Instruments to integrate the latter's SimpleLink microcontroller (MCU) platform with FreeRTOS.

Responding in kind, NXP came out with its EdgeScale platform-as-a-service offering in February. EdgeScale is designed to work securely with NXP's Layerscape embedded chip platform, a line of ARM-based processors with considerably more computing power than low-power microcontrollers. EdgeScale thus appears to be a more advanced, specialized edge-computing platform.

EdgeScale is also an open architecture: it connects with all major cloud providers, lets end users customize their software via APIs, and includes a point-and-click dashboard for quickly and securely provisioning large numbers of devices.

Can the edge really handle machine learning?

Despite ever-more-capable software, the limits of memory cost and battery power still mean machine-learning algorithms generally need to be developed and stored in the cloud. But in June (just four months after EdgeScale's release), NXP unveiled an open-source software tool that compresses machine-learning algorithms, which can then be sent over EdgeScale to NXP-enabled devices. Developers can take the machine-learning models they've built, compress them, and push them to NXP's chips over a network. The algorithm can then run at the device level, without having to send data back to the cloud.
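
For a sense of what that compress-then-deploy workflow looks like in practice, here is a minimal sketch using TensorFlow Lite's post-training quantization, a common open-source approach to shrinking a trained model for embedded hardware. It is a generic illustration of the technique, not NXP's actual tool, and the toy model below is purely for demonstration.

import tensorflow as tf

# A toy Keras model standing in for one trained in the cloud (illustrative only).
model = tf.keras.Sequential([
    tf.keras.Input(shape=(28, 28, 1)),
    tf.keras.layers.Conv2D(8, 3, activation="relu"),
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(10, activation="softmax"),
])

# Convert and compress: default optimizations quantize 32-bit float weights
# down to 8-bit integers, shrinking the file that gets pushed to the device.
converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
compressed_model = converter.convert()

# This compact binary is what would be shipped over the network to the edge
# device, where a lightweight runtime executes it locally.
with open("model_for_edge.tflite", "wb") as f:
    f.write(compressed_model)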

Playing to its strengths

NXP's new software also has a feature that evaluates the trade-offs between different types of AI models, rating them according to power consumption, processing requirements, and application-specific capabilities. This "shop-and-compare" tool could be very useful for customers, given the large number of AI frameworks and programming languages constantly being developed.
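
The article doesn't detail how NXP's tool scores models, but the general idea of comparing candidates against a device's constraints can be sketched in a few lines. The model names and figures below are hypothetical, purely to illustrate the kind of trade-off analysis described above.

# Hypothetical candidate models with made-up power, latency, and accuracy figures.
candidates = [
    {"name": "tiny-cnn",  "power_mw": 40,  "latency_ms": 12, "accuracy": 0.91},
    {"name": "mid-cnn",   "power_mw": 120, "latency_ms": 30, "accuracy": 0.95},
    {"name": "large-rnn", "power_mw": 300, "latency_ms": 85, "accuracy": 0.97},
]

# Pick the most accurate model that still fits the device's power and latency budget.
POWER_BUDGET_MW = 150
LATENCY_BUDGET_MS = 50
feasible = [m for m in candidates
            if m["power_mw"] <= POWER_BUDGET_MW and m["latency_ms"] <= LATENCY_BUDGET_MS]
best = max(feasible, key=lambda m: m["accuracy"])
print(f"Selected {best['name']}: {best['accuracy']:.0%} accurate, "
      f"{best['power_mw']} mW, {best['latency_ms']} ms per inference")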

With more than 25,000 customers across a range of industries and applications, NXP has a wide base of expertise to draw on in providing this kind of unbiased guidance. Rather than attempting to compete on machine-learning processing itself, NXP seems focused on making it as seamless as possible to link its chips with various external machine-learning architectures.

Which companies will be the AI winners?

While AI and machine learning are very likely to transform business, many of these tools are open source, and the technology can change quickly. Thus, NXP is taking an interesting approach: rather than developing its own AI processor core just yet, it is playing to its strengths as an expert in "smart" device applications across a wide array of industries. It's interesting to watch how each large semiconductor company leverages its brands and technology to capture some of the benefits of AI, even if they're not all competing over which company has the fastest AI chip.


John Mackey, CEO of Whole Foods Market, an Amazon subsidiary, is a member of The Motley Fool’s board of directors. Billy Duberstein owns shares of Amazon, Nvidia, and Texas Instruments. The Motley Fool owns shares of and recommends Amazon and Nvidia. The Motley Fool owns shares of Qualcomm. The Motley Fool recommends NXP Semiconductors. The Motley Fool has a disclosure policy.