These past weeks I found myself, once again, in a position vulnerable to ridicule.

What happened?

I dared to ask, on an open/specific forum, if it was possible to add DLSS and FSR to a certain game.

Long story short, I noticed more than 400 watts of GPU power draw on the 'lowest' game settings, with a fixed 60 fps and a 4K resolution.
I also presented another example of a 'similar' type of game and called its engine more efficient.

Of course these were all lies, the developers had far more important issues to deal with, costs of course (the board partners again), and please, how could I even imagine someone else's engine being better!!! What kind of fan are you... etc. etc.

With a heavy heart I set about answering all the accusations... during the second round I had a sudden spark of inspiration and cooked up this example:

If at 1080p your GPU draws 100W,
this would mean that at 2K, on similar settings, the draw doubles to 200W,
and at 4K it doubles again, to 400W.

Pixel count increase, etc.

If another game only draws 250W at 4K on 'similar' settings and FPS,

then you could conclude the other game's engine is more efficient!

Right?
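
To make the reasoning explicit, here is a minimal sketch of the assumption behind it: that GPU power draw scales roughly 1:1 with pixel count at otherwise similar settings and fps. It's a simplification, and the numbers and names are just illustrative.

```python
# Sketch of the naive assumption: GPU power draw scales linearly with
# pixel count at otherwise similar settings and fps (a simplification).

RESOLUTIONS = {            # width, height in pixels
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
}

def naive_power_estimate(base_watts: float, base_res: str, target_res: str) -> float:
    """Scale measured power draw by the ratio of pixel counts."""
    base_px = RESOLUTIONS[base_res][0] * RESOLUTIONS[base_res][1]
    target_px = RESOLUTIONS[target_res][0] * RESOLUTIONS[target_res][1]
    return base_watts * target_px / base_px

# Game A: 100 W measured at 1080p -> naive 4K estimate of ~400 W,
# compared against the other game's ~250 W at 4K on 'similar' settings.
estimate_4k = naive_power_estimate(100, "1080p", "4K")
print(f"Naive 4K estimate: {estimate_4k:.0f} W vs other game: 250 W")
```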
Mr. Zim: If at 1080p your GPU draws 100W, this would mean that at 2K, on similar settings, the draw doubles to 200W, and at 4K it doubles again, to 400W. Pixel count increase, etc.

If another game only draws 250W at 4K on 'similar' settings and FPS, then you could conclude the other game's engine is more efficient! Right?
Pixel count & GPU usage doesn't exactly double as those "1k vs 2k" meaningless marketing labels aren't close to being accurate:-

1920x1080 = 2.07 Megapixel
2560x1440 = 3.69 Megapixel (+78% vs 1080p)
3840x2160 = 8.29 Megapixel (+125% vs 1440p / 300% vs 1080p)

So going from 1080p to 1440p increases pixel count by around +78%, but quite often you'll see only a +50-60% real-world GPU workload increase for the same game, as not everything a GPU does scales 1:1 with resolution. 4K vs 1440p is a 2.25x increase in pixels, but that will probably result in roughly doubling the GPU requirements vs 1440p. 4K vs 1080p is 4x the pixels, but quite often you'll need a GPU that's about 3.5x more powerful.
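
If you want to check the ratios yourself, here's a quick sketch that reproduces the megapixel figures above and sets them against the rough real-world workload factors I mentioned (the workload numbers are ballpark estimates, not measurements):

```python
# Megapixel counts per resolution, plus pixel-count ratios vs the rough
# real-world GPU workload factors quoted above (ballpark estimates).

RESOLUTIONS = {            # width, height in pixels
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
}

def megapixels(res: str) -> float:
    w, h = RESOLUTIONS[res]
    return w * h / 1e6

for res in RESOLUTIONS:
    print(f"{res}: {megapixels(res):.2f} MP")

# (base, target, rough real-world GPU workload multiplier)
comparisons = [
    ("1080p", "1440p", 1.55),  # ~+50-60% workload despite +78% pixels
    ("1440p", "4K",    2.0),   # ~2x workload for 2.25x pixels
    ("1080p", "4K",    3.5),   # ~3.5x workload for 4x pixels
]
for base, target, workload in comparisons:
    px_ratio = megapixels(target) / megapixels(base)
    print(f"{base} -> {target}: {px_ratio:.2f}x pixels, ~{workload}x GPU workload")
```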

As for the games, it depends. If one game was "heavier" because it was better looking, then it could be justified. The problem is that Unreal Engine 5 (and sometimes UE4) forces TAA (Temporal Anti-Aliasing), which makes everything look a blurry mess unless you use DLSS to 'hack' around it, and we haven't had an honest comparison in years of what native UE5 (without TAA) actually looks like vs DLSS. The "DLSS looks better than native" you often hear is because the native version has been made to look artificially worse by TAA blur (something they started doing back in UE4).

UE5 is also terribly unoptimised in general, to the point that average-looking games run badly anyway. Turning the settings down from Ultra to High / Medium / Low doesn't increase frame rates like it used to; it just makes games look worse whilst still running badly. In fact it makes me appreciate just how well optimised many Unreal Engine 3 games like Dishonored, Bioshock Infinite, etc., were on £150 GPUs. Even CryEngine, which has always had a reputation for being "heavy", both looks and runs much better on a £200 GPU in 10-year-old indie walking simulators from 2015 than half the 2025 AAA games do on £400 GPUs, which is quite ridiculous. ("Nice looking" wire fence there; if only they could remember how sharp & clear transparent wire-fence textures managed to look in 2004 games without TAA...)

So I certainly wouldn't call an engine "efficient" when half its games are only playable because of upscaling. DLSS-style upscaling should always have been an enhancement, not the "minimum requirement" crutch the industry has butchered it into.
AB2012: 1920x1080 = 2.07 Megapixel
2560x1440 = 3.69 Megapixel (+78% vs 1080p)
3840x2160 = 8.29 Megapixel (+125% vs 1440p / 300% vs 1080p)
This is nice, I can work with this in the future to strengthen certain points.
AB2012: So I certainly wouldn't call an engine "efficient" when half its games are only playable because of upscaling. DLSS-style upscaling should always have been an enhancement, not the "minimum requirement" crutch the industry has butchered it into.
The funny thing is, the game I compared it to also has no DLSS or any other scaling option that can be enabled. By the way, both games have a quite large world map with moving parts, just illustrated differently.

I know it might be a bit like comparing apples to pears, since a comparison might only really be valid if everything were similar, graphics-wise.

But as a gamer, and in terms of what we may expect from developers and their product, I think it is perfectly valid to call the game that draws 250W at 4K on similar settings the better optimised product, regardless of the how, the why, and the graphical choices.
Post edited July 09, 2025 by Mr. Zim