Topic on Talk:The Last of Us Part I

DLSS 2.5.1 has better temporal stability than the game's default DLSS 3.1.1 in The Last of Us Part I on PC

Johnhitman2194

Okay, I've spent about 7 hours testing graphics settings on an RTX 3070 Ti in The Last of Us Part I on PC, and I've found that replacing the default DLSS 3.1.1 with DLSS 2.5.1 gives better temporal stability than even the game's native TAA. DLSS 3.1.1 is great, but its temporal stability isn't, and DLSS 3.1.2 suffers from the same issue. Now, with DLSS on Quality at 1440p, I can run High environment textures, High character model textures, Low object textures, and Low effects textures, with everything else on Ultra, without crashing anymore, and VRAM usage is down to 7.4 GB instead of the ridiculous 10 GB at native.

Now the game runs at 80+ FPS without problems
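For anyone who wants to try the same swap, here is a rough sketch of the backup-and-replace step in Python. The DLSS DLL NVIDIA ships is named nvngx_dlss.dll; the `game_dir` and `replacement_dll` paths are placeholders you'd point at your own install folder and downloaded 2.5.1 DLL.

```python
from pathlib import Path
import shutil

def swap_dlss(game_dir: str, replacement_dll: str) -> Path:
    """Back up the game's nvngx_dlss.dll, then copy the replacement over it.

    Returns the path of the backup so you can restore it later.
    """
    target = Path(game_dir) / "nvngx_dlss.dll"
    backup = target.with_suffix(".dll.bak")
    if not backup.exists():
        # Keep the original DLL safe for rollback; don't overwrite an
        # existing backup on repeated swaps.
        shutil.copy2(target, backup)
    shutil.copy2(replacement_dll, target)
    return backup
```

To revert (e.g. before verifying game files), just copy the `.bak` file back over nvngx_dlss.dll. Note that game updates or a Steam file-integrity check will restore the stock DLL, so the swap may need to be repeated.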

Mrtnptrs

I assume performance is the same between these three DLSS versions at same upscaling quality-modes right? VRAM usage going down when using DLSS is normal, as the whole game and its effects are rendered at a lower resolution and then upscaled. I assume there is no noticeable difference regarding VRAM when looking at DLSS 3.1.1 vs 2.5.1 right? And what signs of temporal instability do you see in case of using DLSS 3.1.1 DLL from the game vs DLSS 2.5.1 DLL drop-in replacement? So, in what ways does 2.5.1 seem to improve the situation compared to 3.1.1?

Pippo baudo

Not the first time that 2.5.1 has performed better than 3.x.x. @Johnhitman2194, can you check with DLSSTweaks which profile DLSS 3.x.x is using?