Redefining the PC Experience With AI


By Akash Palkhiwala, Chief Financial Officer, Qualcomm Incorporated

We are entering the age of generative AI. In a massive leap forward, generative AI will use neural networks to enable new experiences and use cases, but that work can't be done efficiently in the cloud alone. On-device AI will change the PC industry by creating a new class of powerful and efficient AI PCs, where your device becomes a true intelligent assistant that is more intuitive and seamlessly integrated into your life. The companies that thrive in this new PC future will be those that look ahead to a world of integrated, personalized experiences enabled by AI.

When you look at the dominant conversations about generative AI today, they are focused on using the cloud to train models and then run inference. But not everything needs to go to the cloud. The record-setting adoption of generative AI-enabled experiences is creating massive demand for compute, which makes it essential to pursue an approach that combines the best of both the cloud and the device. This is hybrid AI, in which workloads are distributed and coordinated between cloud and edge devices to deliver more powerful, efficient, and highly optimized experiences.
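To make the idea concrete, here is a minimal sketch of how a hybrid AI router might decide between the device and the cloud. The thresholds, request fields, and the run_on_device / run_in_cloud helpers are illustrative assumptions, not an actual Qualcomm or cloud-provider API.

```python
# Hybrid AI routing sketch: privacy-sensitive or lightweight requests stay on
# the device; heavy generations go to the cloud when it is reachable.
from dataclasses import dataclass

@dataclass
class Request:
    prompt: str
    needs_personal_context: bool  # e.g. local files, calendar, messages
    max_tokens: int

def run_on_device(req: Request) -> str:
    # Placeholder for a small, quantized model running on the local NPU.
    return f"[on-device answer to: {req.prompt[:40]}]"

def run_in_cloud(req: Request) -> str:
    # Placeholder for a large model behind a cloud endpoint.
    return f"[cloud answer to: {req.prompt[:40]}]"

def route(req: Request, cloud_available: bool = True) -> str:
    """Coordinate workloads between the edge device and the cloud."""
    if req.needs_personal_context:            # keep personal data on device
        return run_on_device(req)
    if req.max_tokens <= 256 or not cloud_available:
        return run_on_device(req)             # fast, low-latency local path
    return run_in_cloud(req)                  # heavyweight workloads go remote

if __name__ == "__main__":
    print(route(Request("Summarize my notes from today", True, 128)))
    print(route(Request("Write a 2,000-word market analysis", False, 2048)))
```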

Applications that run AI pervasively and continuously on the device learn about you in order to anticipate and respond to your needs, delivering immediacy and reliability without delays from capacity constraints. They use the context and content available on the device to personalize interactions, prompts, and responses. And because queries don't have to leave the device, on-device AI helps protect personal information and enhance privacy.

With on-device intelligence, users can simply converse naturally with an AI assistant. Rather than going to the start button to navigate from one app to another, you will go to your AI assistant, which will manage the interaction with the device much more efficiently. For example, if you would like to book a flight, you can simply ask the assistant for that information. Even more importantly, if your AI assistant sees you make an airline reservation, it can provide hotel information before you even ask for it.

Efficiently running AI models on device requires heterogeneous computing. In addition to the CPU and the GPU, the NPU is now the third key processing core for on-device AI and a key measure of a PC's performance. An on-device AI engine offloads tasks from the CPU and GPU for accelerated performance and better battery efficiency. This enables the processing necessary to run demanding AI applications ubiquitously, with sustained usage, in ultra-portable designs at very low power.
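As one illustration of this offload pattern, the sketch below uses ONNX Runtime and assumes a build that exposes the Qualcomm QNN execution provider for the NPU; the model file, input name, and tensor shape are hypothetical placeholders, not a specific product's API.

```python
# NPU-offload sketch: prefer the NPU (QNN) when the runtime exposes it,
# otherwise fall back to the CPU.
import numpy as np
import onnxruntime as ort

MODEL_PATH = "assistant_model.onnx"  # hypothetical quantized on-device model

available = ort.get_available_providers()
providers = (["QNNExecutionProvider", "CPUExecutionProvider"]
             if "QNNExecutionProvider" in available
             else ["CPUExecutionProvider"])

session = ort.InferenceSession(MODEL_PATH, providers=providers)

# Run one inference; the input name and shape are assumptions for illustration.
dummy_input = np.random.rand(1, 3, 224, 224).astype(np.float32)
outputs = session.run(None, {"input": dummy_input})
print("ran on:", session.get_providers()[0], "| output shape:", outputs[0].shape)
```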

The size of this opportunity is massive. In fact, 94% of business leaders report that AI is critical to success over the next five years[1]. Among IT decision makers, AI is the biggest factor impacting their buying decisions[2]. New research shows that nearly half of IT decision makers are ready to switch PC brands based on AI performance[3].

We're at the cusp of an exciting generational change in PCs. We now understand the importance of integrating AI in both hardware and software, which opens doors to exciting opportunities and new avenues for innovation. This new way of interacting with our AI PCs is already starting to come to market.

[1] Deloitte State of AI in the Enterprise, 5th Edition, October 2022

[2] IDC 2023 US Commercial PCD Survey, August 2023

[3] IDC 2023 US Commercial PCD Survey, August 2023

The views and opinions expressed herein are the views and opinions of the author and do not necessarily reflect those of Nasdaq, Inc.
