About 5 months ago I ordered a Gigabyte HD6850 card, and for those 5 months it worked great. Loved it. Then, about 2 weeks ago, I encountered a problem where my monitor would randomly lose signal while playing certain games. It only happens with certain games (Crysis 2, Far Cry 2), while with others it never seems to happen (Crysis, Warhead, Borderlands, etc.). It never happened when I wasn't gaming. I was CONVINCED it was a driver issue, both because the problem popped up after I'd installed some NVIDIA drivers for a different game, and because prior to that install I'd been able to play Crysis 2 and Far Cry 2 on this card with no problems whatsoever. But alas, driver scrubbing failed, so eventually I just did a clean install of Windows 7. Fresh install, first thing I did was load up Crysis 2 and BLORK, my monitor cuts out.
Now, I'm no computer genius. I still didn't think it was an issue with the card, because why would it only occur during some games and -never- pop up during others? But I'd recently ordered another Gigabyte HD6850 for Crossfire. It was just sitting in its box because my new Crossfire motherboard hadn't arrived yet, but I figured what the hell, I'll test Far Cry 2 and Crysis 2 on that card. And what do you know? No monitor cut-outs. Not one. Not after 5+ hours of gaming across both games. I even pulled out the new card and put the old card back in just to be certain, and sure enough, the monitor loses signal with the old card.
Conclusion: the card is at fault. I'm currently awaiting approval for an RMA. I thought I'd share this experience in case anybody has run into anything like it before (I've never heard of such a thing), and also to let any potential Gigabyte 6850 buyers know that it was a great card while it worked, but 5 months is not an acceptable lifespan. I have faith Gigabyte will come through with the RMA, though. They've always been good to me in the past.
*edit* Card was OC'd with MSI Afterburner, and hardly pushed to its limit: 850 MHz core clock, 1200 MHz memory clock, no voltage tweaking obvs.