I have an RTX 2060 (6 GB VRAM), AMD Ryzen 5 2600, 16 GB RAM. I game on a 1080p, 16:9, 75 Hz screen. Overall, I would say it's doing its job well - I mean, if I could rewind time, I would have gotten a 2060S or a 2070, or waited for the 30 series (which had not yet been announced at the time), but it's doing its job.

However, sometimes when playing on Ultra settings, I do get stutters. For example, most of the time Rise of the Tomb Raider runs at 60 FPS on Ultra, but when I got to the last fight in the Baba Yaga DLC, where there are a lot of flying objects, it dropped to 40 FPS, which made me lower some settings.

So, I have three options: choose Ultra settings and stick with them no matter what; choose Ultra and manually lower it whenever the game starts to stutter; or choose lower settings, which will somewhat decrease the experience. I mean, gameplay is the most important thing, but I like good graphics. Who doesn't?

So, what if games automatically detected that your FPS is dropping, lowered the graphics when it does, and then brought them back up when the graphics card has less work to do? Would you like that?
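For what it's worth, the basic idea is simple to sketch. Below is a rough, purely illustrative Python snippet of the kind of loop a game could run - the preset names, thresholds and timings are made up, not taken from any real engine:

import time

QUALITY_LEVELS = ["Low", "Medium", "High", "Ultra"]  # hypothetical presets

class AutoQuality:
    """Drop the preset when FPS dips, raise it again once there's headroom."""
    def __init__(self, target_fps=60.0, level=3):
        self.target_fps = target_fps
        self.level = level            # start at "Ultra"
        self.last_change = 0.0

    def update(self, current_fps):
        now = time.monotonic()
        if now - self.last_change < 3.0:       # cooldown so it doesn't flicker
            return QUALITY_LEVELS[self.level]
        if current_fps < self.target_fps * 0.9 and self.level > 0:
            self.level -= 1                    # e.g. 40 FPS at Ultra -> drop to High
            self.last_change = now
        elif current_fps > self.target_fps * 1.15 and self.level < len(QUALITY_LEVELS) - 1:
            self.level += 1                    # headroom again -> raise it back
            self.last_change = now
        return QUALITY_LEVELS[self.level]

The cooldown and the gap between the "drop" and "raise" thresholds are the important part - without that hysteresis the game would bounce between presets, which is exactly the jarring effect mentioned further down the thread.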
Hmm, before you do anything, try this. Play the game again, but when you start it, Alt+Tab out and open up your Task Manager. Then set the priority of the game from Normal to Above Normal. Some background processes which have equal priority may be giving you issues.

Likewise, when I do heavy processing in Linux or MinGW, I usually use the nice command, which gives the program a much lower priority and doesn't make my other programs stutter at all.
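If anyone wants to do the same thing without clicking through Task Manager every launch, a small script can handle both cases. This is only a sketch using the third-party psutil package, and "ROTTR.exe" is just an example process name:

import sys
import psutil  # third-party: pip install psutil

def bump_priority(name="ROTTR.exe"):
    """Find a running process by name and nudge its priority up."""
    for proc in psutil.process_iter(["name"]):
        if proc.info["name"] == name:
            if sys.platform == "win32":
                # the same as Task Manager's "Above Normal"
                proc.nice(psutil.ABOVE_NORMAL_PRIORITY_CLASS)
            else:
                # on Linux, a lower nice value means higher priority (may need privileges)
                proc.nice(-5)
            print(f"Re-prioritised {name} (pid {proc.pid})")

if __name__ == "__main__":
    bump_priority()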

Though that's probably not the answer you were looking for. As for games, I've mostly heard of PS4 games having dynamic resolution or settings to combat FPS issues; I'm not so sure about PC games.
Post edited November 23, 2020 by rtcvb32
VRS anyone?

...and besides numerous optimization techniques, dynamic resolution has been around for a while now.
Post edited November 23, 2020 by WinterSnowfall
There have been a couple of games that experimented with it, but the result was always crap and very jarring. It hits you like a ton of bricks if the draw distance suddenly gets smaller, textures disappear, or details become very blurry in the middle of a game. The conclusion at the time was that it was a bad idea.
WinterSnowfall: VRS anyone?

...and besides numerous optimization techniques, dynamic resolution has been around for a while now.
vehicle renting service?
virtual racing school?
Post edited November 23, 2020 by amok
I was going to mention G-Sync monitors (which adjust the refresh rate of the monitor to match your graphics card's frame rate, which somehow removes stuttering), but I just read an article that says that Nvidia dropped it in favor of FreeSync.

The article said:

"This feature will work on 10-series GTX cards and of course the newer 20-series RTX lineup." meaning that some Freesync cards will work on those Nvidia cards.

Seems confusing. I'm glad I never bothered with it.
Post edited November 23, 2020 by hudfreegamer
GeraltOfRivia_PL: So, what if games automatically detected that your FPS is dropping, lowered the graphics when it does, and then brought them back up when the graphics card has less work to do? Would you like that?
No. Some games already do "Dynamic Resolution Scaling", etc. And as amok said, it's far more jarring / irritating to have your resolution suddenly halve than to just lower the preset from Ultra to High permanently whilst remaining at the same resolution. It's still amusing how many people insist on struggling with stutter / poor performance whilst refusing to drop the settings "because that's how real gamers play / how they are benchmarked", when in reality "High" is probably the first high-quality preset that's actually optimised, and "Ultra" for many games is more like the developers saying "let's see how much over-the-top cr*p we can fill this with, starting with Chromatic Aberration, 8x MSAA, 128x Tessellation, severe-myopia levels of DoF, etc", whilst a lot of low-end gamers with a bit of common sense have long figured out that tweaking can often gain +20-40% with virtually no visual loss. Edit: In fact, half the time games can look better when you turn off some of the excessive over-the-top shader junk.
hudfreegamer: There's another alternative: a G-Sync monitor (which works with newer Nvidia video cards). I've heard they work by adjusting their refresh rate to match whatever your computer is able to do. Somehow that removes stuttering. AMD has something similar for their video cards called FreeSync. So, you need an AMD video card and a monitor with FreeSync to do that, but it's the same basic idea.
It works well. Just to clear something up - G-Sync is the nVidia-specific version. FreeSync used to be AMD-specific, but nVidia has been supporting it too ("G-Sync Compatible"), and I can confirm that a FreeSync monitor + nVidia card + DisplayPort cable works very well. The bulk of the stutter (VSync on) or tearing (VSync off) disappears when the frame rate drops underneath the maximum refresh rate but remains within the FreeSync range of the monitor. My next monitor will definitely be another FreeSync one.
Post edited November 23, 2020 by AB2012
I was editing my previous post when you replied. I just learned about Nvidia supporting Free sync. Thanks for the added clarification.
amok: vehicle renting service?
virtual racing school?
As much as I want it to be "various ranting statements", it's actually Variable Rate Shading. But point taken, I should not throw acronyms around ;).
But maybe there are some ways to implement this?

Say, a game detects that too many pedestrians may cause you to lose FPS, so it generates fewer of them.
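Crowd density is already a manual setting in plenty of open-world games; making it react to frame time would look something like this toy illustration (the function and its numbers are entirely made up):

def pedestrian_cap(frame_time_ms, budget_ms=16.7, hard_min=20, hard_max=200):
    """Spawn fewer pedestrians when frames run long, more when there's headroom."""
    # ratio >= 1 means we're within budget (fast frames), < 1 means over budget
    ratio = budget_ms / max(frame_time_ms, 0.001)
    cap = int(hard_max * min(ratio, 1.0))
    return max(hard_min, min(hard_max, cap))

# at 60 FPS (16.7 ms per frame) all 200 pedestrians can spawn;
# at 40 FPS (25 ms per frame) the cap drops to about 133, and it never goes below 20
print(pedestrian_cap(16.7), pedestrian_cap(25.0))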
GeraltOfRivia_PL: Say, a game detects that too many pedestrians may cause you to lose FPS, so it generates fewer of them
Well, in all fairness, Carmageddon devs found a good way to fix this...
GeraltOfRivia_PL: I have an RTX 2060 (6 GB VRAM), AMD Ryzen 5 2600, 16 GB RAM. [...] Most of the time Rise of the Tomb Raider runs at 60 FPS on Ultra, but the last fight in the Baba Yaga DLC dropped to 40 FPS, which made me lower some settings. [...] So, what if games automatically detected that your FPS is dropping, lowered the graphics when it does, and then brought them back up when the graphics card has less work to do? Would you like that?
I should be able to set it how I see fit. I don't need this dynamic setting-change stuff happening when I'm in the middle of action and gameplay.

Here are some thoughts:

1. Take your low FPS and lock the framerate there, or 5 frames above it. So, in this case, cap your FPS at 40 with NVIDIA Inspector or MSI Afterburner. Or cap it at 45 for a bit of wiggle room (see the sketch after this list).

2. Forget Ultra, turn it down to High or Very High...or the setting right below Ultra.

3. Play with the settings. Maybe there's one taxing setting that you should just turn off to get your desired 60 FPS. I remember that in Tomb Raider (2013), TressFX could cut framerates in half. In some games way back, PhysX murdered framerates - in, say, Alice: Madness Returns and Mafia II - so find that one crazy feature (if it's there, and especially if it's new!) and turn it off. Sometimes it's really just one feature kicking your PC in the rear.

4. You have a 2060, so turn DLSS 2.0 on (in games that actually support it) and get some frames back.
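On the frame cap in point 1: NVIDIA Inspector and MSI Afterburner (RTSS) handle this for you, but the mechanism is simple enough to sketch - each frame, sleep off whatever is left of the time slice. Rough illustration only; render_frame is a stand-in for the game's actual render call:

import time

def run_with_cap(render_frame, cap_fps=40):
    """Call render_frame in a loop, sleeping so we never exceed cap_fps."""
    frame_budget = 1.0 / cap_fps
    while True:
        start = time.perf_counter()
        render_frame()
        elapsed = time.perf_counter() - start
        if elapsed < frame_budget:
            time.sleep(frame_budget - elapsed)  # burn off the leftover time

Capping at a rate the card can always hit keeps frame pacing even, which is usually what makes a locked 40 FPS feel smoother than an uncapped 40-60 FPS swing.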
Post edited November 30, 2020 by MysterD
GeraltOfRivia_PL: Say, a game detects that too many pedestrians may cause you to lose FPS, so it generates fewer of them
There's no need to go to that high a level. Graphics engines work with triangles, and we've had dynamic LODing ever since the days of 3dfx, if memory serves. Modern games already optimize quite a lot of things.

Settings still play a part in how it works, though. "Medium" might allow only fluctuations between certain lower levels of LOD, while "High" may remove the caps. Having the game look gorgeous when you're staring at an empty wall, then turn into a cubist painting once you face an open landscape, is not, I think, what anyone would want, which is why dynamic resolution is usually employed instead when a fixed framerate is desired.
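To make the dynamic resolution part concrete, the usual approach is to nudge a render-scale factor toward whatever keeps the frame time on target - a minimal sketch, with made-up thresholds and no real engine hook behind it:

def update_render_scale(scale, frame_time_ms, target_ms=16.7,
                        min_scale=0.5, max_scale=1.0, step=0.05):
    """Shrink the internal render resolution when frames run long, grow it back when they don't."""
    if frame_time_ms > target_ms * 1.05:
        scale -= step   # e.g. 100% of 1080p -> 95%
    elif frame_time_ms < target_ms * 0.90:
        scale += step   # headroom again, creep back toward native
    return max(min_scale, min(max_scale, scale))

The UI typically stays at native resolution and only the 3D scene buffer gets scaled, which is part of why this is less jarring than having LOD or draw distance visibly pop in the middle of a fight.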
Post edited November 24, 2020 by WinterSnowfall
There was a mod for Skyrim that did this, which I needed to use with my old GTX 1060. It would reduce the resolution while you were moving, to dynamically keep your FPS within a range. It was great, until I no longer needed it after buying my GTX 1080.