I know that the following view is partly speculation, but it's also reinforced by an analysis of real-world performance.

Basically, I don't think there is actually going to be as much of a performance impact as everyone fears. Obviously, with the supposedly enhanced AI and the reworked systems, there will be a CPU impact, but I don't believe it's going to be very meaningful. If you examine the "old" system requirements, they were way off base and badly understated. My supposition is that the "new" requirements are actually the real Day 1 requirements, just a little more demanding.

If you examine the new charts, you need a 12900K coupled with a 3080 Ti to hit 60 fps at 1080p with RT Ultra. Now, this is where my real-world experience comes into play. I had a 10900K paired with a 3080. Using all Ultra settings, RT Ultra, and DLSS Balanced at 1440p, I had to lock the fps to 50 with RivaTuner (beats the in-game limiter) to get a mostly smooth experience, with dips only in front of V's apartment, behind Tom's Diner (not a demanding area, just a memory leak which hasn't been fixed), and in City Center around the digifish (drops to 47). Take note that 1440p with DLSS Balanced is a render resolution of 1707x960, a bit smaller than 1080p.

The limiting factor with my setup was, is, and will be the CPU (proven by dropping RT to its lowest setting with no change in the fps lows). Crowd Density and RT hit the CPU hard (proven by lowering Crowd Density to Medium, which pushed fps above 60 everywhere except those memory-leak areas). To verify that notion, I upgraded the 3080 to a 3090 for the VRAM boost (not for the performance boost, which is only about 10% on average). So now I can run RT Psycho and Screen Space Reflections at Psycho at 1440p DLSS Balanced... at a mostly locked 50 fps. Also take note that if I had a 4K monitor, I'd be using DLSS Performance, which is actually 1920x1080. Now we need a 4080 paired with a 12900K to get 60 fps with path tracing at 4K, and that's with lame-ass Frame Generation. Also note that Intel 11th Gen did not outperform 10th Gen by much at all, and sometimes performed worse.
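For anyone who wants to check the render-resolution math, here's a quick sketch. The per-axis scale factors are the commonly cited DLSS ratios (roughly 0.667 for Quality, 0.58 for Balanced, 0.5 for Performance); those are my assumptions, not values pulled from this game specifically:

# Back-of-the-envelope DLSS internal render resolutions.
# Scale factors below are assumed defaults, not values taken from the game itself.

DLSS_SCALE = {
    "Quality": 2 / 3,
    "Balanced": 0.58,
    "Performance": 0.50,
    "Ultra Performance": 1 / 3,
}

OUTPUT_RES = {
    "1440p": (2560, 1440),
    "4K": (3840, 2160),
}

for label, (w, h) in OUTPUT_RES.items():
    for preset, scale in DLSS_SCALE.items():
        print(f"{label:5s} {preset:17s} -> {round(w * scale)}x{round(h * scale)}")

# 4K Performance works out to 1920x1080, as noted above. 1707x960 at 1440p
# actually lines up with the ~0.667 (Quality) ratio, while Balanced (0.58)
# would be roughly 1485x835.

Either way, the internal resolution at these settings sits at or below native 1080p, which is the point: the GPU isn't being asked to push more pixels than a 1080p panel would.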

Any thoughts related to this?
Post edited July 06, 2023 by DemonKiller49
I find it hard to play FPS games at lower than 90fps now, so I'll likely keep RT off and use quality FSR2 at 4k for a smooth 90fps experience that still looks amazing. That's what I did last time I loaded it up.

RT looks nice, but 90fps feels nicer.
StingingVelvet: I find it hard to play FPS games at lower than 90fps now, so I'll likely keep RT off and use quality FSR2 at 4k for a smooth 90fps experience that still looks amazing. That's what I did last time I loaded it up.

RT looks nice, but 90fps feels nicer.
Normally, I'd agree with you, but I'm using a G-sync Ultimate monitor. It feels like a non-G-Sync 90 fps. A lot of high-end stuff is needed to make this game smooth. Let's see...a 3090, G-sync Ultimate, RivaTuner. lol It's buttery smooth on my end....'cept behind Tom's diner.
Post edited July 06, 2023 by DemonKiller49
DemonKiller49: Normally, I'd agree with you, but I'm using a G-sync Ultimate monitor. It feels like a non-G-Sync 90 fps.
Not sure what you mean here. I've had a G-Sync monitor for a couple of years and it is indeed a wonderful thing, but 60 fps is still 60 fps.
DemonKiller49: Normally, I'd agree with you, but I'm using a G-sync Ultimate monitor. It feels like a non-G-Sync 90 fps.
StingingVelvet: Not sure what you mean here. I've had a g-sync monitor for a couple years and it is indeed a wonderful thing, but 60fps is still 60fps.
Well, I didn't say 60 fps. I said 50. I also said G-sync Ultimate...not G-sync. Two different things and two different worlds of performance and capability. But, play your game how you want. The topic wasn't even about your preferences, guy. lol
Post edited July 06, 2023 by DemonKiller49
DemonKiller49: Well, I didn't say 60 fps. I said 50. I also said G-sync Ultimate...not G-sync. Two different things and two different worlds of performance and capability. But, play your game how you want. The topic wasn't even about your preferences, guy. lol
G-sync "ultimate" doesn't make 50fps into anything higher either. But yes, play at 50fps if you want. Genuinely hope you enjoy the experience.
DemonKiller49: Well, I didn't say 60 fps. I said 50. I also said G-sync Ultimate...not G-sync. Two different things and two different worlds of performance and capability. But, play your game how you want. The topic wasn't even about your preferences, guy. lol
StingingVelvet: G-sync "ultimate" doesn't make 50fps into anything higher either. But yes, play at 50fps if you want. Genuinely hope you enjoy the experience.
Have a good day there, fella. lol You didn't contribute anything at all to this technical discussion whatsoever. No one said G-Sync Ultimate made it any higher, Einstein. It makes it smoother, without that bogus LFC frame doubling below 45-ish fps that FreeSync and G-Sync Compatible use. Regardless, quit trying to force your preference on others. No one cares about your graphics or fps preferences in the least. You came here to nitpick and argue with others over their play choices, pretending that yours are the only ones that matter. Stick to the topic at hand if you are capable. Oh wait... you never even addressed the topic at hand, clown. Arrogance and smarminess get you nowhere, buddy. I mean, this was a technical look at actual maximum performance output and requirements, then vs. now. What kind of mouth breather reads all that and all he says is, "Duh... me like 90fps"? Not asked for, and certainly not on-topic. One day reading comprehension will find you. Don't worry.
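Quick aside for anyone who hasn't run into LFC (Low Framerate Compensation): when the frame rate falls below a panel's minimum VRR refresh, the driver repeats each frame so the effective refresh stays inside the VRR window. Here's a rough, purely illustrative sketch of that arithmetic; the 48-144 Hz range is just an assumed example, not any particular monitor:

# Rough, illustrative sketch of LFC (Low Framerate Compensation) behaviour.
# Assumes a 48-144 Hz VRR window; real monitors, drivers, and hardware G-Sync
# modules handle the low-fps case differently.

VRR_MIN, VRR_MAX = 48, 144

def lfc_refresh(fps: float) -> tuple[int, float]:
    """Return (frame multiplier, resulting panel refresh) for a given frame rate."""
    if fps >= VRR_MIN:
        return 1, fps  # inside the VRR window: refresh tracks fps 1:1
    multiple = 1
    # Repeat each frame until the effective refresh lands back inside the window.
    while fps * multiple < VRR_MIN and fps * (multiple + 1) <= VRR_MAX:
        multiple += 1
    return multiple, fps * multiple

for fps in (30, 40, 45, 50, 60, 90):
    m, hz = lfc_refresh(fps)
    print(f"{fps} fps -> each frame shown {m}x, panel refreshing at {hz:.0f} Hz")

The disagreement above is really about how smoothly that re-display is handled around 45-50 fps, which is where driver-side LFC and a hardware G-Sync module differ.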
Post edited July 06, 2023 by DemonKiller49
Okie dokie.
DemonKiller49: I know that the following view is partly speculation, but it's also reinforced by an analysis of real-world performance.

Basically, I don't think there is actually going to be as much of a performance impact as everyone fears. Obviously, with the supposedly enhanced AI and the reworked systems, there will be a CPU impact, but I don't believe it's going to be very meaningful. If you examine the "old" system requirements, they were way off base and badly understated. My supposition is that the "new" requirements are actually the real Day 1 requirements, just a little more demanding.

If you examine the new charts, you need a 12900K coupled with a 3080 Ti to hit 60 fps at 1080p with RT Ultra. Now, this is where my real-world experience comes into play. I had a 10900K paired with a 3080. Using all Ultra settings, RT Ultra, and DLSS Balanced at 1440p, I had to lock the fps to 50 with RivaTuner (beats the in-game limiter) to get a mostly smooth experience, with dips only in front of V's apartment, behind Tom's Diner (not a demanding area, just a memory leak which hasn't been fixed), and in City Center around the digifish (drops to 47). Take note that 1440p with DLSS Balanced is a render resolution of 1707x960, a bit smaller than 1080p.

The limiting factor with my setup was, is, and will be the CPU (proven by dropping RT to its lowest setting with no change in the fps lows). Crowd Density and RT hit the CPU hard (proven by lowering Crowd Density to Medium, which pushed fps above 60 everywhere except those memory-leak areas). To verify that notion, I upgraded the 3080 to a 3090 for the VRAM boost (not for the performance boost, which is only about 10% on average). So now I can run RT Psycho and Screen Space Reflections at Psycho at 1440p DLSS Balanced... at a mostly locked 50 fps. Also take note that if I had a 4K monitor, I'd be using DLSS Performance, which is actually 1920x1080. Now we need a 4080 paired with a 12900K to get 60 fps with path tracing at 4K, and that's with lame-ass Frame Generation. Also note that Intel 11th Gen did not outperform 10th Gen by much at all, and sometimes performed worse.

Any thoughts related to this?
Ditch Ultra Settings. Ditch DLSS and FSR. Just run your games at native resolutions without software gimmicks if you want higher frame rates.
I actually just installed 2077 and gave it a try now that it's said to be doing a lot better than at launch. It's only using 20% CPU at 1080p, and it's smooth as butter.

But then again, that's on medium/custom video settings with a 1050 Ti, a Ryzen 7 5700G and 32 GB of RAM, and the more demanding effects (ambient occlusion, blur, anti-aliasing, ray tracing, etc.) are turned off or set low. It still looks and plays just fine, and feels a lot like Deus Ex.