Posted July 06, 2023
I know that the following view is partly speculation, but it is also reinforced by an analysis of real-world performance.
Basically, I don't think there is actually going to be as much of a performance impact as everyone is fearing. Obviously, with the supposedly enhanced AI and reworking of systems, there will be a CPU impact... but I don't believe it is going to be very meaningful. If you examine the "old" system requirements, they were way off base and badly understated. My supposition is that the "new" requirements are the real requirements from the Day 1 release, just a little more demanding.
If you examine the new charts, you need a 12900K coupled with a 3080 Ti to run 60 fps at 1080p with RT Ultra. Now, this is where my real-world experience comes into play. I had a 10900K paired with a 3080. Using all Ultra settings, Ultra RT, and DLSS Balanced at 1440p, I had to lock the fps to 50 with RivaTuner (beats the in-game limiter) to get a mostly smooth experience, with dips only in front of V's apartment, behind Tom's Diner (not a demanding area, just a memory leak which hasn't been fixed), and in City Center with the Digifishes (drops to 47).

Take note that 1440p with DLSS Balanced renders internally at roughly 1485x835 (about 58% per axis; 1707x960 would be Quality mode), which is smaller than 1080p. The limiting factor with my setup was, is, and will be the CPU (proven by lowering RT to its lowest setting without any change in the fps lows). Crowd Density and RT hit the CPU hard (proven by lowering crowd density to Medium, which pushed the fps above 60 except in those memory-leak areas). To verify that notion, I upgraded the 3080 to a 3090 for the VRAM boost (not for the performance boost, which is only about 10% on average). So now I can run RT Psycho and Screen Space Reflections Psycho at 1440p DLSS Balanced... at a mostly locked 50 fps.

Also take note that if I had a 4K monitor, I'd be using it with DLSS Performance, which renders at... 1920x1080. Now we need a 4080 paired with a 12900K to get 60 fps with path tracing at 4K... and that's with lame-ass Frame Generation. Also take note that Intel 11th Gen did not outperform 10th Gen by much at all, and sometimes performed worse.
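For anyone who wants to check the internal render resolutions themselves, here's a quick sketch using the commonly cited per-axis DLSS scale factors (Quality ≈ 2/3, Balanced ≈ 0.58, Performance = 0.5); the exact factors are assumptions based on published figures, not something pulled from the game itself:

```python
# Approximate internal render resolution for each DLSS mode,
# based on the commonly cited per-axis scale factors.
DLSS_SCALES = {
    "Quality": 2 / 3,   # ~0.667 per axis
    "Balanced": 0.58,
    "Performance": 0.5,
}

def render_resolution(width, height, mode):
    """Return the approximate internal render resolution for a DLSS mode."""
    scale = DLSS_SCALES[mode]
    return round(width * scale), round(height * scale)

# 1440p Balanced renders below 1080p:
print(render_resolution(2560, 1440, "Balanced"))     # ~ (1485, 835)
# 4K Performance is exactly 1080p:
print(render_resolution(3840, 2160, "Performance"))  # (1920, 1080)
```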
Any thoughts related to this?
Post edited July 06, 2023 by DemonKiller49