Maighstir: Many games I run seem to be using Mono, which is a third-party project to run .NET applications on non-Windows platforms (it existed long before Microsoft released .NET Core, but I think it now works together with the latter). So yeah, I see .NET rather frequently together with games.
gamesfreak64: mono? the only mono i know of is the mono dir and file that seems to be around in all unity folders, and even in other games that don't use unity.
mono is open source .net, basically. .net was a project by microsoft to replace the win32 API with something else, but also to serve as the built-in standard library of new languages, which would then force anyone who only knows that language to be stuck on windows. i remember when C# became a thing, i was warning people that it was an overall bad idea, since it would lock them into windows development (.net effectively being C#'s standard library), which is especially bad if they want to become console developers or do any kind of development that isn't a windows GUI. No one listened to me (and given how formal programming education works, most really just didn't have a choice), so i'm thankful for mono, since it branches them out a little bit. They still won't be doing arduino development or anything, but at least they have a little freedom.
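To make it concrete, here's what mono buys you in practice: the same trivial C# source builds and runs unchanged off Windows with Mono's own toolchain. The file name is mine, picked for illustration; mcs is Mono's C# compiler.

```csharp
// hello.cs -- on Linux/macOS with Mono installed:
//   mcs hello.cs && mono hello.exe
// The same source compiles with Microsoft's toolchain on Windows;
// both produce CIL bytecode, which is what makes the portability work.
using System;

class Hello
{
    static void Main()
    {
        // Reports whichever OS the runtime is actually hosted on.
        Console.WriteLine($"Hello from {Environment.OSVersion}");
    }
}
```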
I think the big problem is the fps: many devs are not aware of how to actually make a game perform well on any machine. (i googled and found some discussions, and it became clear that the average dev, or newbie dev, is too occupied with creating the perfect game with lots of jawdropping effects and eyecandy.)
That's why most devs usually seem to 'ignore' any questions about that subject: simply because they do not know.
Most gamedevs don't even program anymore: why do so if the engine is provided for you? I can respect this, even if i don't agree with it. When i was thinking about making 2d games recently, i realized that it's really just a matter of physics: basically all 2d game engines are like 90% similar, whether it be shmups, RPGs, or whatever (see the sketch below). After spitting out sounds, throwing graphics onto the screen, and getting user input, all that's left is to implement the basic physics, and the rest (menus, the graphics themselves, gameplay, etc.) isn't part of the engine anyway. While i think it's bad to get yourself stuck to an engine like that (portability and maintainability issues in the long run), i do understand how it can save time, effort, and money in development. However, just like programmers rely on the compiler to optimize their algorithms (which it doesn't do; it optimizes instructions, not algorithms), these people rely on the game engine to optimize their game, which doesn't happen either.
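To make that "90% similar" claim concrete, here's a minimal sketch of the shared skeleton, in C# since that's what these engines host. The names (World, Step, the gravity rule) are made up for illustration and aren't any real engine's API.

```csharp
// A bare 2d game loop: the loop, timing, and render stub are the "engine"
// 90%; the few lines of physics inside the fixed-timestep update are the game.
using System;
using System.Diagnostics;
using System.Threading;

class World
{
    // Game-specific state: a single falling "sprite".
    public double Y, VelY;
}

class Program
{
    const double Gravity = 9.8;      // the "basic physics" part
    const double Step = 1.0 / 60.0;  // fixed 60 Hz physics timestep

    static void Main()
    {
        var world = new World();
        var clock = Stopwatch.StartNew();
        double last = 0, accumulator = 0;

        while (world.Y < 100)                 // engine: the main loop
        {
            double now = clock.Elapsed.TotalSeconds;
            accumulator += now - last;        // engine: frame timing
            last = now;

            while (accumulator >= Step)       // engine: fixed-timestep driver
            {
                world.VelY += Gravity * Step; // game: the actual rules
                world.Y += world.VelY * Step;
                accumulator -= Step;
            }

            Console.WriteLine($"y = {world.Y:F2}"); // engine: "render"
            Thread.Sleep(16);                 // stand-in for vsync pacing
        }
    }
}
```

The fixed timestep is also the cheap answer to the performance complaint above: the physics behaves identically on fast and slow machines, and only the rendering rate varies.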
It's because of things like this that i believe the person who uses a program to avoid learning layer X so they can do layer Y should still learn layer X, to appreciate and understand layer Y. So electricians should learn basic chemistry, assembly programmers should learn electronics, programmers should learn assembly, unity devs should learn programming, game players should learn either unity or programming, etc. It allows everyone to know what they can reasonably expect everyone else's stuff to do: not just so that their end product isn't a mess, but so they also treat their tool providers with respect when the tools don't meet unreasonable expectations, and they gain the benefit of knowing how to swim if their boat ends up with a hole in it and their life vest can't support their weight.
Anyway, got a nice game at 3.99, a visual novel written in unity of course (and very poorly optimised as usual), and it gets hot again: it consumes CPU like crazy, > 60%, which sets the cpu temp at 60-69 or more degrees Celsius.
That should be ok, but after googling it, the results say it isn't really recommended to run it that hot.
If your CPU is running too hot to run at 100% most of the time, you need to underclock it. 6 seconds of overheating 19 times over isn't as bad as a straight 60 seconds of overheating, but damage is damage. I forget what the actual recommendations are, but i remember i had a computer that would shut itself off at that temperature and considered 49 °C to be "critical." You have to understand that your CPU temperature will spike when you're not looking, so if you're worried about the longevity of your computer, you need to keep those values at safe levels under sustained max CPU.

Beware those blasted auto-overclocking CPUs like mine ("a 1.0 GHz CPU, but it has turbo boost to clock itself at 1.3 GHz"). You need to be realistic: you aren't going to babysit that CPU at all times, and trying to chase down whatever is eating too many cycles every time it makes your CPU hot is just going to turn into a nightmare. So you want to lock the clock in the BIOS or something ('cause, trust me, if a programmer can find a way to change it, they will, so they can offload the cost of optimization onto you, since that's how programmers these days are taught), so that you don't have to worry about it. Remember, the only person with an incentive to worry about your computer is YOU; everyone else has EULAs and disclaimers galore to protect them from blame, and hardware manufacturers themselves gain from you having to buy new hardware, since that's more business for them.
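If you'd rather watch those numbers than guess, here's a minimal sketch of a watcher that polls Linux's sysfs for temperature and current clock. The thermal_zone0 and cpu0 paths are assumptions: zone numbering (and which zone is actually the CPU) varies per machine, and none of this exists on Windows.

```csharp
// Minimal CPU temperature / clock watcher via Linux sysfs.
// thermal_zone0 and cpu0 are assumptions -- check your own machine,
// since zone numbering differs between boards. Ctrl+C to stop.
using System;
using System.IO;
using System.Threading;

class TempWatch
{
    // Temperature is reported in millidegrees Celsius,
    // the current clock in kHz.
    const string TempPath = "/sys/class/thermal/thermal_zone0/temp";
    const string FreqPath = "/sys/devices/system/cpu/cpu0/cpufreq/scaling_cur_freq";

    static void Main()
    {
        while (true)
        {
            double tempC = double.Parse(File.ReadAllText(TempPath).Trim()) / 1000.0;
            double ghz = double.Parse(File.ReadAllText(FreqPath).Trim()) / 1_000_000.0;
            Console.WriteLine($"{tempC:F1} C @ {ghz:F2} GHz");
            Thread.Sleep(2000); // poll every 2 seconds
        }
    }
}
```

On the locking side: the same cpufreq directory has a scaling_max_freq file, and the cpupower utility can cap the clock from userspace (cpupower frequency-set -u <freq>), which gets you most of the BIOS-lock effect without a reboot; on Windows, lowering the "maximum processor state" in the power plan does the same job.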