Intel’s Panther Lake CPUs Make the Best Case for ‘Fake Frames’ in Gaming

If your gaming machine isn’t rendering any frames itself, what’s the point of having a costly computer at all? I know some of you out there are mumbling resentments, asking that question already knowing the answer. To add to the grumbling, Intel has finally come out of the gate with its first batch of Panther Lake CPUs, dubbed Core Ultra Series 3, plus a new version of its existing upscaler that promises to push gaming on lightweight devices. I’ve already tested it out on the Asus Zenbook Duo. It’s a great laptop with a great chip. Now comes the nitty-gritty of what that actually means for players desperate for one machine to finally do it all.

Which brings me to the elephant in the room. Along with Panther Lake, Intel is offering players the chance to test multi-frame generation—aka “fake frames”—for themselves without needing an increasingly expensive Nvidia or AMD GPU. Intel’s head of Arc graphics, Tom Petersen, previously told Gizmodo he’s used it in multiple games, and he doesn’t even mind the odd graphical glitches it creates. The more important thing for him, he recently told Digital Foundry, is a general sense of smoothness for gaming—mostly cutting down on CPU timings to eliminate awkward in-game stuttering.

Frame generation is one of those software sleight-of-hand tricks that PC gamers have come to loathe. For some players, the barest concept of “fake frames” drives them mad, especially since they’re paying thousands of dollars for a gaming-capable PC. I’m hardly the crabbiest stickler for graphical purity, and I can compromise on some things for a good gaming experience, but not on others. This latest rendition won’t change their minds.

Why is Intel’s frame generation different from the rest?

Intel XeSS frame gen on Panther Lake. © Kyle Barr / Gizmodo

Intel’s latest XeSS 3 model is one of several AI upscalers that take an image rendered at a lower resolution and use AI to massage those pixels into something resembling the promised resolution. This boosts frame rates at the cost of some visual fidelity. To the PC purists’ chagrin, many modern PC games enable upscalers such as Nvidia’s DLSS (Deep Learning Super Sampling) and AMD’s FSR (FidelityFX Super Resolution) by default. Every major gaming console from Sony, Microsoft, and Nintendo includes some upscaling capability as well.
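To see the basic idea in miniature, here is a toy sketch. This is deliberately the dumbest possible upscaler, nearest-neighbor pixel copying; DLSS, FSR, and XeSS instead use trained networks, motion vectors, and data from previous frames. The point is only that the GPU renders fewer pixels than the display shows, and software fills in the rest.

```python
def nearest_neighbor_upscale(image, scale):
    """Blow up a tiny 'image' (a list of rows of pixel values) by an
    integer factor, copying each pixel into a scale x scale block.
    Real AI upscalers reconstruct detail instead of copying pixels."""
    out = []
    for row in image:
        # Stretch the row horizontally...
        wide_row = [px for px in row for _ in range(scale)]
        # ...then repeat it vertically.
        for _ in range(scale):
            out.append(list(wide_row))  # copy so rows stay independent
    return out

# A 1x2 "image" upscaled 2x becomes a 2x4 block of repeated pixels.
print(nearest_neighbor_upscale([[10, 200]], 2))
# [[10, 10, 200, 200], [10, 10, 200, 200]]
```

The visual cost the article describes (fuzzy textures, soft UI) comes from exactly this gap: the upscaler has to invent the missing detail, and naive methods invent nothing at all.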

While Nvidia’s DLSS 4.5 and AMD’s FSR Redstone are both locked to proprietary hardware, Intel’s XeSS (Xe Super Sampling) is device agnostic (though AMD’s open FSR 3 and 3.5 models remain active on many existing titles). XeSS multi-frame gen uses AI to inspect the scene and create multiple frames, which are then inserted between two fully rendered frames. XeSS previously topped out at 2x frame generation. Multi-frame generation only exacerbates the promise and the problems inherent to frame interpolation.
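The "inserted between two fully rendered frames" part can be sketched in a few lines. This is a naive per-pixel linear blend, not Intel's actual XeSS 3 pipeline (which uses a trained model and motion data), but it shows how a 2x, 3x, or 4x factor maps to the number of generated in-between frames.

```python
def interpolate_frames(frame_a, frame_b, factor):
    """Return the generated frames between two real frames.

    factor=2 yields 1 intermediate frame (2x frame gen);
    factor=4 yields 3 (4x frame gen). Frames here are flat lists
    of pixel intensities; real interpolators work on full images
    with motion vectors, not naive per-pixel blends.
    """
    generated = []
    for i in range(1, factor):
        t = i / factor  # fractional position between the real frames
        generated.append([(1 - t) * a + t * b
                          for a, b in zip(frame_a, frame_b)])
    return generated

# A pixel brightening from 0 to 100: 4x frame gen fills in 3 steps.
print(interpolate_frames([0.0], [100.0], 4))
# [[25.0], [50.0], [75.0]]
```

The ghosting the article describes later comes from exactly this guesswork: when the scene changes faster than the interpolator can track, the in-between frames blend things that were never on screen together.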

Intel touted the gaming abilities of the Core Ultra Series 3 chips, and they are enticing. There are two varieties of these CPUs among the 14 offerings the chipmaker showed off earlier this year. The versions with an “X” in their name, namely the Intel Core Ultra X7 and X9, include an extra 12 Xe3 GPU cores. These graphics cores, based on the Arc B390 architecture, are supposed to offer strong performance for tasks like rendering and gaming, all without pushing these laptops’ total power package into the stratosphere.

What does frame generation look like on an Intel PC?

While XeSS is hardware agnostic, it does come with certain advantages on an Intel machine. Like Nvidia, Intel has special software to override in-game graphics settings to ensure they’re using the latest XeSS 3 model. You can set universal XeSS settings or frame generation to 2x, 3x, or 4x on a per-game basis. Unfortunately, the software sometimes fails to recognize which games are actually installed.

A title like Cyberpunk 2077 can get close to 50 fps on “Ultra” settings at 1080p on the Asus Zenbook Duo. At the Zenbook Duo’s max 2,880 x 1,800 resolution, you’ll only maintain around 36 fps in benchmarks. Enable XeSS at that resolution and the frame rate jumps closer to 45 fps, give or take depending on whether you opt for “performance” or “quality” settings. If you want to play with any ray tracing at all, XeSS is the only way to reach playable frame rates.

Cyberpunk 2077 running on an Asus Zenbook Duo with ray tracing “low” settings and max resolution. It can still hit that frame rate without frame gen thanks to XeSS upscaling.

Once frame generation comes online, that’s when I can start to play the game at near 60, near 80, and closer to 90 fps with 2x, 3x, or 4x frame generation, respectively. But here’s the major caveat: frame generation introduces two big issues, latency and visual artifacts. The game running at 4x frame gen feels noticeably floatier, though not unplayable. When flicking the camera quickly, I would see ghostly flickering of streetlights and screens over the streets of Night City.
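A little arithmetic shows why higher frame-gen factors feel floatier. Displayed fps scales with the factor, but your inputs still only register on real rendered frames, and interpolation has to hold the next real frame before it can insert anything between the two. The numbers below are illustrative back-of-the-envelope math, not measurements from Intel or the Zenbook Duo.

```python
def frame_gen_feel(rendered_fps, factor):
    """Return (displayed fps, rough added display latency in ms).

    Interpolation must buffer one real frame (it needs frame B
    before it can generate frames between A and B), so the floor
    on added latency is about one real frame time, regardless of
    how many fake frames get inserted.
    """
    displayed_fps = rendered_fps * factor
    real_frame_time_ms = 1000 / rendered_fps
    return displayed_fps, round(real_frame_time_ms, 1)

# ~22 real fps at 4x looks like ~88 fps but adds ~45 ms of delay.
print(frame_gen_feel(22, 4))
# (88, 45.5)
```

That is why the advice further down, to get the base frame rate near 60 fps before enabling frame gen, matters: at 60 real fps the buffered frame costs about 17 ms, while at 22 real fps it costs more than double that.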

What Intel doesn’t tell you is that to avoid visual glitches, you want to be running as close to 60 fps as possible before you enable frame generation. If I drop Cyberpunk 2077 down to 1080p, I can edge closer to 40 fps with ray tracing still on and see less of the odd flickering. There were fewer problems with the existing 2x frame gen, in any case.

In a game like Hogwarts Legacy, I can net more than 90 fps indoors when the laptop is rendering fewer than 30 frames per second itself. When running around, I noticed numerous graphical glitches, including creeping shadows that climbed up the player character’s robes.

Here you can see Hogwarts Legacy without and with frame generation. The game runs fairly well even without generated frames. 

Without frame generation, with Intel XeSS on balanced settings, I can hit a solid 40 fps to 50 fps in indoor environments and between 30 fps and 40 fps outdoors with all the graphics cranked up to max. If I really need 60 fps, I may as well reduce some graphics settings rather than make excuses for awkward graphical artifacting.

So what if we go truly ludicrous and try to play Cyberpunk 2077 with 4x frame gen and ray tracing set to ultra? Sure, I can get more than 60 fps when the actual rendered frame rate is close to 20 fps at the Zenbook Duo’s highest, 3K resolution. All I can say is that the laptop is doing its best, but there is some obvious and glaring visual artifacting that even the most tolerant player would find hard to excuse.

Maybe I’ll like frame gen more on a handheld

Chippin’ in. © Kyle Barr / Gizmodo

Gamers are no monolith. Some players may not care about reducing their resolution to half of what their screen actually supports if it means a playable game. I’m one of those players who cannot stand to look at fuzzy textures and reduced UI detail for the sake of higher frame rates. I would rather reduce graphics settings than reduce my resolution.

Mind you, those base frame rates I can get in games without frame generation are still impressive. On a Zenbook Duo, games like Cyberpunk 2077 are indeed playable and visually stunning in places. It’s the closest we’ve seen to the single-chip performance of AMD’s Strix Halo chips, at a much lower TDP (thermal design power) and without AMD’s touted GPU architecture. And there’s the rub. You’ll need to sacrifice something for gaming on a mobile Intel Panther Lake device.

For smaller, cheaper devices, upscaling and—yes—even frame generation make much more sense. It’s worth a look on a laptop (even one as expensive as the $2,300 Asus Zenbook Duo), though I wouldn’t pin my hopes on it. Intel has already called its shot by promising we’ll see a handheld-specific chip, dubbed Intel Core G3. We may even see Intel-based handheld gaming PCs from companies like Acer and MSI later this year, or so we hope. There, 1080p gaming is the norm, and on a smaller screen, it’s much harder to spot any visual inconsistencies. For the sake of playable frame rates on the go, Intel may already be kicking in AMD’s teeth on its way through the door.

For now, Intel’s fake frames on laptops will still be a mixed bag. Some who buy an expensive laptop with one of Intel’s higher-end GPU chips may not care about floatier controls or awkward visuals if they can push their frame rate to near the max of what their display is capable of. Others may be more gun-shy, and rightfully so.
