Try lowering your resolution one step down...
Sidewinder: ATI cards are nearly always shit for performance and glitches on release of most games. Make sure you update your drivers, or maybe revert 4 versions, usually fixes it.
Sadly, I remember the ATI driver dance when I used to have one. Great card when it worked, but every new driver release seemed to break as much as it fixed.
I beg to differ. I've always used ATI/AMD cards and NEVER had a problem. Even when using a card that is underpowered compared to what is mainstream at the time, there may be slight performance issues, but never glitches. Cards I've used include the old 3D Rage Pro chipset, the 9800, the HD4850 and currently the HD5770. None of them were bleeding edge when I bought them, and I've never encountered any issues in over 16 years of gaming.

In fact, I've consistently seen more Nvidia users mentioning glitches than ATI/AMD users. Granted, Nvidia may have a bigger share of the gaming market, but the ratio of Nvidia users reporting glitches to their ATI/AMD counterparts far exceeds the ratio of user counts. This holds even for AAA titles optimized for Nvidia cards, like Arkham Asylum; a quick browse through forums like Steampowered will confirm it. A lot of this seems to have to do with Nvidia's drivers - whereas ATI/AMD users are usually able to solve issues by upgrading to the latest drivers, Nvidia users need to upgrade or roll back their drivers depending on the game. Sometimes Nvidia users also report conflicts between two Nvidia cards, such as when they have two different cards (e.g. one for graphics, one for PhysX). Some users also report issues caused by certain Nvidia technologies, e.g. Nvidia 3D Vision.

I understand that not all ATI/AMD users will agree, and there's a chance there are also Nvidia users who have had experiences as good as mine, but I'm just saying that, by comparison, Nvidia isn't necessarily the "better" card. A lot seems to depend on the rest of the PC's components. Other than my first, I've always researched the individual components right down to the make, model and variant, as well as their compatibility with the other desired parts, and then built the damn rig myself.

One last observation is that people who encounter glitches are often using multi-GPU setups. On many AAA titles, there is usually a much higher count of Crossfire or SLI users reporting problems compared to single-GPU users. Oftentimes, I've also seen multi-GPU users complain of worse performance than their single-GPU counterparts, even when their multi-GPU cards were of a higher caliber. Sometimes less is more; I believe multi-GPU setups just aren't mainstream enough yet for developers to give them very serious thought and optimization.

Sidewinder: snip
darkwoof: I believe multi-GPU setups just aren't mainstream enough yet for developers to give them very serious thought and optimization.
Having written multi-GPU drivers years ago, I'm amazed when anything actually works on them. It's a horrendous kludge that can easily reduce performance if games aren't designed to take account of its limitations.
Every ATI card I've ever owned hasn't been worth the gold in its electronics compared to the competing-level nVidia card, and I've tried a couple dozen of them over the last 16 years. I've been involved in multiple betas and releases, and it's almost always an ATI card with a glitch. To claim otherwise shows a severe lack of actual experience with gaming and blind allegiance to a brand. Between my MX440 and my 8800GTX I tried 3 of the "nVidia killer" cards by ATI and always went back to my MX440 for better performance; it was asinine (and no, this isn't conjecture, but actual monitored in-game FPS numbers and playability levels).

Just because you personally have never had a problem doesn't mean you're in the majority, and while yes, there are SOME games that give nVidia cards a few issues, they are a minority compared to the issues ATI has had since the birth of the 3D accelerator. I still miss my Voodoo 2 SLI rig.

Hell, I was running an HD4850 in my media center PC... I've had to replace the fan on it 8 times in 3 years, and it still got outperformed by my GeForce 8800GTX 256MB card on the same board, processor and hard drive.

darkwoof: snip

Sidewinder: snip
Post edited May 18, 2011 by Sidewinder
Sidewinder: snip
Fanboys always claim they're not fanboys *shrug*
**edit I'd like to add that Catalyst 11.3 seems to work well for me.
Post edited May 18, 2011 by BrowncoatGR
Sidewinder: snip
BrowncoatGR: Fanboys always claim they're not fanboys *shrug*
Not claiming I'm not a fan of nVidia, but I also vastly preferred 3DFX when they were still alive. The Glide API was far and away superior to OpenGL, but proprietary licensing strangleholds (pricing) killed the company.

For every game that nVidia cards have an issue with, there are at least 3 that have issues with ATI.
Post edited May 18, 2011 by Sidewinder
True. Some recent examples: BC2, Crysis 2.
Sidewinder: snip
Have you even read my post before commenting? It's not just my experience; it's what I've observed in the gaming community as a whole, via forums like this one. In any thread on graphical issues there are usually three to five times more users with Nvidia cards than with ATI/AMD ones, which exceeds the Nvidia vs. ATI market-share ratio the last time I checked. The fact that many of these titles were supposedly optimized for Nvidia cards doesn't help one bit.

It also seems somewhat hypocritical that you accuse others of basing facts on their own experience (even though I've stated otherwise and gave you my source of info), and yet the first and largest paragraph of your reply talks about your good experience with Nvidia cards. I certainly don't mind people sharing their opinions, but for someone who then went on to pick faults with another person's experience, that's just weird, if not deliberate.

The HD4850, if you've done some reading of others' experiences, does run a little hotter than comparable cards. My friend and I, who both own the card, however, had no issues with multiple failures as you did. If you had to replace your fan 8 times, you're either buying the wrong brand (remember, cards with the same number aren't necessarily exactly the same), or, more likely, your PC isn't properly ventilated - especially when used for a non-heavy-duty role like a media centre PC. We do HD video rendering, 3D modelling/rendering, software development and of course gaming on ours, and have had no issues. I also have a TV card running in it and 3 HDDs, so I don't think it generates any less heat than yours. Check your setup.

Can't comment on the 8800GTX, though from what I've read it's a pretty good card even though it's old, beating many of the lower- and mid-end ATI cards and even some of the newer Nvidia ones. But performance does not equate to stability, and I stand by my statement that more Nvidia users seem to be having issues - until the real-life user accounts on forums change, of course. It should never be about brand loyalty or fanboyism; consumers should go for what works.
After years of headaches starting the day I opened its box, my 8800GTX 512MB finally died and I got an excuse to buy an ATI HD5770. Only AU$150; every game my old 8800 couldn't run due to incompatibility now works beautifully, and the performance increase over the GTX260 I borrowed (which had all the same problems the 8800 did) was fantastic. Even all the GOG games that wouldn't work on my nVidia started working. =]

It'll be interesting to see who does better in the next generation of cards, since this is my first non-nVidia card. Hopefully ATI will get their physics and 3D act together, and they'll be even more fantastic.
Hehe, should I even mention the misinformation, the lying, the cheating in benchmarks, the quality cuts and the bad driver-making process that nVidia still has? What, you forgot about that? Or that older games mostly fail to work after a while? Or that they sell you rushed, not-so-optimised hardware that eats power and overheats if it's not a mainstream model? Or their other drivers, like the motherboard ones where, at some point, you had to mix files from several versions to make things work? I could go on and on... And I tested HW for a living.

Overall, both have their faults, but sorry, generally speaking nVidia had more problems, lied more, never admitted to anything, didn't have a good connection with fans, and always left an underwhelming feeling - and I say that after having had a 9800 GTX for some time. It was OK, but nothing special.

The only thing they did right and on time was their "Meant to be Played" program: they spammed developers, basically buying them out to use their tech first and cripple the competition, as was and still is evident today. At that time ATI was working to make people happy, providing info on new drivers and exactly what had changed, and engaging with the community overall. So you're actually bashing the one who tried to deal with you, as opposed to the one just doing whatever suits them at the moment. Nice...
I love these ATI vs Nvidia arguments; it's like watching little kids argue which is better, chocolate or strawberry.

Since the 3DFX days, I have gone from Nvidia to ATI, back to Nvidia, and now ATI (6970), always getting the best bang-per-buck card in each generation. I have never had any problems with graphics; I've always kept my drivers up to date and never overclocked my GPU. Am I just a lucky bastardo, or is it just a matter of keeping your system clean and sticking to standard PC configs, nothing too fancy?

Now if only the devs can get Eyefinity working in The Witcher 2........
vivasawadee: snip
Exactly, a clean house is a must, as with a car or an actual house ;) A PC is not a console that only lets you press a few buttons, and that's it for your interaction with the hardware. It is a machine with infinite possibilities and uses, millions of programs and the largest platform on the planet, by far. But you have to use your brain, not your cock.

p.s. 3DFx rules forever! ;-)
Post edited May 19, 2011 by madant
vivasawadee: snip
+1.

There have only been a handful of times in the last 5-6 yrs where either "team" had any sort of advantage [and only for a couple of months max], and usually it just comes down to $$/FPS, whatever game you want to max out, noise, and heat. The X800 and 6800 were soooo damn close back when HL2 was being leaked. The 7900 vs. X1900XTX was basically dead even, with a shader advantage to the ATI and a vertex advantage to the 7900. The 8800GT outclassed the 2xxx series. The 3870/50 showed that ATI were still in the game. The 4850 clearly destroyed any fears of a return to $600 GPUs and forced nVidia to release a half-decent 260GTX. Now we've got the 550Ti vs. 6870 issue to resolve, which is again practically even, +/- 5%.

Competition is excellent as it means we get some damn fine punch for very little $$. Joining a team only reduces your chances of getting a good deal.
vivasawadee: snip
Exactly; it should always be about performance and stability at the right price. There's little purpose in brand loyalty in a market where the manufacturers' only goal is earning your hard-earned dough. It's not as if either of them is working toward something more noble, such as investing in an open technology for the greater good at their own expense.

I do believe it has a lot to do with choosing the right hardware alongside other compatible hardware, at the right time and at the right price. Not unduly overclocking, and keeping your system well maintained, ventilated and patched, is a must. Lastly, the latest is not always the greatest!