Intel wants 100 million AI PCs by 2025, and they come with benefits

Forget E, O, and U... It's all about AI.

Like it or not, artificial intelligence is the future, according to tech companies. It’s as much a hardware effort as a software one, which is where Intel’s AI PC Acceleration Program comes in. Through it, the company is making a hard commitment to seeing 100 million AI PCs hit the market by 2025.

The initiative begins with the launch of Meteor Lake on December 14, 2023, now known as Intel Core Ultra. Making room for dedicated AI silicon doesn’t come at the cost of cores, either. In fact, senior director of client technology and performance marketing Robert Hallock says the team has managed to cram two more ‘Low-Power Island E-Cores’ onto the SoC. These are designed for maximum energy efficiency when the more powerful standard E-Cores and P-Cores don’t need to be powered up; video playback is a good example. There is also a separate neural processing unit (NPU) block that houses two neural compute engines for AI acceleration.

With so much going on under the hood, you’d be forgiven for thinking it would eat into the power budget. Instead, Hallock tells us that AI PCs often return to idle more quickly after a task thanks to better prioritisation. Directing a task to the right core means it either finishes sooner or runs on a more efficient core that draws less power in the first place. This should mean better battery life, but we’ll need to test that for ourselves.

The initial wave of Intel Core Ultra processors will target laptops and all-in-one desktops, so those hoping for a DIY solution will need to wait a little longer. Hallock believes it’ll take “two to three years” for the market to fully form, and that “eventually all consumer processors will have accelerators for AI enhancement.”

All about the apps

Alongside new processors, Intel provides over 100 partners with access to its engineering talent, core development tools like OpenVINO, and go-to-market opportunities. Adobe, Audacity, BlackMagic, XSplit, and Zoom are among the most notable to join, with plenty more predicted to follow. How software developers use Intel’s new hardware remains to be seen, but Hallock has an idea of what to expect.
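
To get a sense of what access to OpenVINO means in practice, here’s a minimal sketch of how a partner app might offload a model onto the Core Ultra NPU. It’s an assumption-laden illustration rather than Intel’s or any partner’s actual code: the model file, input shape, and workload are hypothetical, and only the general flow of reading a model, compiling it for a device, and running inference follows OpenVINO’s documented usage in recent releases (2023.2 or newer, where the NPU device plugin appears).

```python
# Minimal sketch (not Intel reference code): offloading a hypothetical
# partner model onto the Core Ultra NPU through OpenVINO.
import numpy as np
import openvino as ov

core = ov.Core()
model = core.read_model("denoise_model.xml")              # placeholder model file
compiled = core.compile_model(model, device_name="NPU")   # target the NPU block

frame = np.zeros((1, 480), dtype=np.float32)              # placeholder audio input
result = compiled([frame])[compiled.output(0)]            # inference runs on the NPU
```

In principle, pointing device_name at “CPU” or “GPU” instead would run the same model on other parts of the SoC, which is how a single codebase can light up across different Intel silicon.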

“The immediate impact is going to be OS-level with assistants, generative AI, and non-gaming workflows,” he tells us. “It’ll enhance content creation by improving compression, reducing bandwidth overheads, and automating tiring tasks like video clean-up and object removal. Stream quality will be better than ever by reducing overall CPU and GPU load. And, eventually, it’ll help game developers generate voices, textures, quest text, NPC information, and more.”

Ethical dilemmas

Using AI in content creation is contentious, whether the issue is crediting, creative value, or outsourcing. On the topic of ethics, Hallock says we can only judge on an app-by-app basis.

“In big RPGs, no developer can voice every character unless you’re making millions,” he says. “When you have a big studio firing all its staff and replacing them with AI, it feels bad. But when you have a small indie studio that can enhance its output and truly compete, it levels the playing field. The voice generation in particular is really good. There are legislative questions, such as ‘should AI creations be marked or identifiable?’ but this is in the hands of each developer.”

In the meantime, Intel debuted a tool called FakeCatcher in 2022 that can detect fake videos with 96 per cent accuracy. It’s completely separate from the AI PC Acceleration Program, but it could help navigate these ethical waters as AI matures.

Local vs. hybrid

Granting users the power of hardware-based AI isn’t exactly new. Nvidia leverages the tensor cores in GeForce RTX graphics cards to enable DLSS upsampling. Because the underlying model is trained and updated in the cloud before being delivered to the GPU, however, Nvidia’s approach is what’s known as hybrid AI. There’s a lot of value in centralising the model: it learns faster and there’s parity between all devices.

Intel Core Ultra, on the other hand, lets developers build completely localised AI models. Removing the reliance on an internet connection immediately improves company security and privacy, as offline workflows reduce the risk of data leaks. Eventually, consumers could benefit too, with AI performing tasks on your PC without companies hoovering up all your data.
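
As a rough illustration of what completely local means, a developer could probe the machine for an on-device accelerator and fall back to the CPU when one isn’t present, with no network access at any point. The device names and fallback logic below are assumptions based on OpenVINO’s plugin naming, and the model file is again hypothetical.

```python
# Sketch: pick a local inference device so data never leaves the machine.
import openvino as ov

core = ov.Core()
print(core.available_devices)    # e.g. ['CPU', 'GPU', 'NPU'] on a Core Ultra laptop

device = "NPU" if "NPU" in core.available_devices else "CPU"
model = core.read_model("assistant.xml")                  # placeholder local model
compiled = core.compile_model(model, device_name=device)
# All inference now happens on-device; there is no cloud round trip to leak data.
```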

Naturally, implementation rests on the developers, and it’s a big ask to think companies don’t want your data. Fortunately, for every Google, there’s a counterculture startup out there willing to profit in a different way. It’s only a matter of time.

EDIT (20/10/23): We’ve corrected some of the technical information surrounding how the cores and NPU work in Meteor Lake, thanks to input from the Intel team.

Damien Mason
Senior hardware editor at Club386, he first began his journey with consoles before graduating to PCs. What began as a quest to edit video for his Film and Television Production degree soon spiralled into an obsession with upgrading and optimising his rig.
