Posted June 12, 2023
Yea, I could tell :\
I am aware. That's how CPUs are advertised, but it doesn't take into account the number of calculations or the complexity of the task. However, those are things the developer would be aware of and would consider when determining the clock rate that would adequately accommodate those operations.
Ufff.. there's a ton of other factors as well, which is why there is an absolutely massive difference between a 2 GHz Pentium 4 released in 2002 vs a 2 GHz Core i7 released in 2023. But it's not just the CPU. You could have a CPU running at a terahertz and that does absolutely nothing for you if your program is waiting for data to arrive from RAM. Caches, latencies, bandwidth, branch prediction, et cetera are critically important, while other features such as pipeline depth can have a big impact on IPC, and even that still doesn't translate directly into performance due to other differences. My point is: a 5 GHz CPU can be slower or faster than a 2 GHz CPU.
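To put toy numbers on that (purely hypothetical figures, not measurements of any real chip), effective throughput is roughly clock rate times average instructions per cycle, and the 2 GHz part can easily come out ahead:

```c
#include <stdio.h>

int main(void) {
    /* Hypothetical numbers for illustration only -- not real benchmarks. */
    double old_ghz = 5.0, old_ipc = 0.8;  /* deep pipeline, frequent stalls */
    double new_ghz = 2.0, new_ipc = 4.0;  /* wide core, big caches, good branch prediction */

    /* Rough throughput = clock frequency (GHz) * average instructions per cycle. */
    printf("Hypothetical 5 GHz CPU: ~%.1f billion instructions/s\n", old_ghz * old_ipc);
    printf("Hypothetical 2 GHz CPU: ~%.1f billion instructions/s\n", new_ghz * new_ipc);
    return 0;
}
```

And even that is an oversimplification, because average IPC depends on the workload as much as on the hardware.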
I'll say this as someone who writes software for a living and has coded for over two decades: virtually nobody is counting instructions, game developers least of all. If you're writing embedded applications (with microcontrollers that have predictable throughput and single-cycle SRAM) or doing RTOS work with tight timing requirements, you may count instructions; if you're optimizing a tight loop, as, for example, someone working on a high-end video codec might, you might actually count instructions. If you're a compiler developer, you might count instructions (keeping in mind that a count of instructions isn't a measure of performance). Game developers? Practically speaking, never. Games today are far too high level and abstracted from the machine code for anyone to care, plus there are too many parts of the pipeline that the game developer does not actually control.
I'm going to suggest that most game developers don't even understand machine code / assembly at all these days, and would have no idea how to measure retired instructions, let alone know what to make of that knowledge.
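For the curious: "measuring retired instructions" in practice means reading a hardware performance counter, e.g. with `perf stat` on Linux, or programmatically via perf_event_open. A minimal sketch, assuming Linux and a CPU that exposes the standard hardware counters (error handling trimmed for brevity):

```c
#include <stdio.h>
#include <string.h>
#include <unistd.h>
#include <sys/types.h>
#include <sys/ioctl.h>
#include <sys/syscall.h>
#include <linux/perf_event.h>

/* Thin wrapper: glibc provides no perf_event_open() function, only the syscall. */
static long perf_event_open(struct perf_event_attr *attr, pid_t pid,
                            int cpu, int group_fd, unsigned long flags) {
    return syscall(SYS_perf_event_open, attr, pid, cpu, group_fd, flags);
}

int main(void) {
    struct perf_event_attr attr;
    memset(&attr, 0, sizeof(attr));
    attr.type = PERF_TYPE_HARDWARE;
    attr.size = sizeof(attr);
    attr.config = PERF_COUNT_HW_INSTRUCTIONS;  /* retired instructions */
    attr.disabled = 1;
    attr.exclude_kernel = 1;
    attr.exclude_hv = 1;

    int fd = (int)perf_event_open(&attr, 0, -1, -1, 0);  /* this process, any CPU */
    if (fd < 0) { perror("perf_event_open"); return 1; }

    ioctl(fd, PERF_EVENT_IOC_RESET, 0);
    ioctl(fd, PERF_EVENT_IOC_ENABLE, 0);

    /* The work you want to measure goes here. */
    volatile long sum = 0;
    for (long i = 0; i < 1000000; i++) sum += i;

    ioctl(fd, PERF_EVENT_IOC_DISABLE, 0);
    long long count = 0;
    read(fd, &count, sizeof(count));  /* single u64 counter value */
    printf("retired instructions: %lld\n", count);
    close(fd);
    return 0;
}
```

And note that even with an exact count in hand, it tells you nothing about wall-clock time on someone else's machine, which is kind of the point.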
All that is to say: those CPU frequency specs are pretty much completely meaningless.
So what they do instead is exactly what we've stated in this thread: pick a baseline system, test that the software runs passably, and call that the minimum. That's all "minimum" means in this context.. no point taking the word any more literally than that.
clarry: No game on any modern PC expects to run a fixed number of instructions before a task switch occurs. I didn't say it did. However, I expect that the developer can determine how many instructions will be executed at any point in the execution, which then leads into how complex those instructions are and how long they will take to complete with a given set of specs.
Given the variability of the hardware and software stack outside of the developer's control, that would be a completely pointless exercise, and in any case these specs never include the other variables that impact performance.
So it's easier and much more pragmatic to just grab a system that represents the low end you're willing to deal with (giving some headroom), test on it, and call it the minimum.
No, I understood what you're saying. But nothing there is persuading me to think of it as anything other than a resource to utilize, same as I was saying about the rest.
Aight. Well, as I said, you are welcome to insist that everyone is doing it wrong.. but that doesn't change how the industry sees it :\
clarry: But since you are aware that future systems are not guaranteed to be compatible, you should now understand why they list every OS they support. The opposite in fact. If I need compatibility options for a currently modern game to run on whatever is contemporary 20 years from now I need to know what it was designed to run on. Listing every system imaginable someone might have been using the year it came out doesn't tell me that.
Yeah, in some cases it is handy to know what system the developer originally used while writing the software.. but again this really shows that you have zero software development experience. Software isn't so much "designed to run on" a particular version of an OS as it is just written... Naturally it tends to run on the system you start with, but it is hardly ever tied to that specific version and will happily run on other systems just fine; the OS is fairly well abstracted away and doesn't play such a big role when it comes to high level software.
It's usually the dependencies that ultimately dictate what the baseline is, but developers may not care; they pick a set of systems they're willing to test on and "support." It's not uncommon to see a game that "requires" Windows 10 but runs fine on 7, for example. I also wouldn't waste time testing on an OS released 14 years ago.