
Video Cards... >.<

Kougar

Member
Yeah, video cards are pricey beasts, but there's a trick to them that many don't quite know. The sad thing is that video cards priced $160 and under are typically decent cards... at the time. However, they just lack the raw power to last, so anyone who buys a card for under $200 will typically need to buy another one within just a year or two to keep playing the newest games at high detail settings.

The trick is to save up, then when a new generation comes out, buy one of the top two model cards. Sure it costs $500, and that's a lot of money, but the card will offer enough power to run anything easily for the next several years. Ya can spend $400-500 once every 3-4 years, or spend $200 every year or two.
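The trade-off above is simple enough to sketch out. A quick illustration using the post's rough figures (the prices and upgrade intervals are just the assumptions stated above, not real market data):

```python
# Average yearly spend for two GPU upgrade strategies,
# using the rough figures from the post above.

def cost_per_year(price, years_between_upgrades):
    """Average yearly cost for a given upgrade cadence."""
    return price / years_between_upgrades

high_end = cost_per_year(500, 4)     # $500 flagship kept for 4 years
mid_range = cost_per_year(200, 1.5)  # $200 card replaced every ~18 months

print(f"High-end:  ${high_end:.2f}/year")   # $125.00/year
print(f"Mid-range: ${mid_range:.2f}/year")  # $133.33/year
```

By these numbers the flagship strategy is slightly cheaper per year, and that's before counting the higher detail settings you get along the way.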

The x850 XT PE series is a good example; it can still offer some power despite being an extremely old card by today's standards. The 8800GTX is another one. It costs a fortune, but it offers performance exceeding that of two x1950XTX cards in CrossFire. That card will likely have a usable lifespan of 3-5 years, as it also comes with Shader Model 4.0 and DX10 tech onboard, which won't be outdated anytime soon. This October/November Nvidia will be releasing the G92 refresh part... same card, smaller process size, so even higher clocks and lower power consumption than an XFX 8800 Ultra XXX Edition.

I learned this trick after buying a $160 9600XT... great card, but even then it could barely handle CS:S with HDR enabled at medium to high settings a year later. Buying the top end once it's released will let ya skip buying a card for the next generation of cards and their 2nd-generation refresh parts after that. A 7800GT of old offers the performance of today's 8600GT card. A 7600GT offers the performance of yesterday's top-end 6800GT card. Even the fastest overclocked 8600GTS out there is outgunned in raw performance by yesterday's 7950GT 512MB card.
 
I still own and use a GeForce 6600GT. I can still play a lot of the latest games with little issue. In some games I can afford to crank up the eye candy a little bit. I just avoid using the official download off nVidia's site and use some tweaked beta drivers. My video quality and performance are better than most people's using the same card due to the better drivers, plus I've had features meant for higher-end cards (7 series) on mine for the longest time with little to no performance hit when I use them. If you absolutely demand the highest level of eye candy in all your games, you're gonna need to spend $500 or more on a video card. If you aren't an eye candy freak, you can afford to use a lower-end card; you'll just have to learn what eye candy can and can't be used in certain games on your card.
 

Aden

Play from your ****ing HEART
A Radeon X1600 came with my laptop that I bought last year, and I have a feeling it won't be in the running for much longer than 2 years to come. But Half-Life 2 looks niiiiice...I can manage 1440x900 (no AA) and HDR with most of the settings on high with framerates above 30. I'm just worried about future, "next-gen" games - HL2 is a few years old already.
 

Janglur

Active Member
Paid $120 for my Radeon x550 when it was new. Still going strong!

Also overclocked from 400/550 to 500/687.
 
J

Jelly

Guest
Janglur said:
Paid $120 for my Radeon x550 when it was new. Still going strong!

Yup. I guess it's enough for me. I can play all the new games and crank up the graphics a bit. Still looks fine to me (not that there are many PC games that require that and are worth playing).
 

net-cat

Infernal Kitty
I've got a GeForce 7600 GT. It's sitting on a shelf, collecting dust. Apparently, for 7000 series video cards to be compatible with XP x64 SP2, you have to install from scratch using a slipstream disc. So, until I actually get around to doing that, I'll just be puttering along with my Radeon X300.

Coincidentally, this was the first nVidia card I've ever purchased. It'll probably be the last, too.
 
net-cat said:
I've got a GeForce 7600 GT. It's sitting on a shelf, collecting dust. Apparently, for 7000 series video cards to be compatible with XP x64 SP2, you have to install from scratch using a slipstream disc. So, until I actually get around to doing that, I'll just be puttering along with my Radeon X300.

Coincidentally, this was the first nVidia card I've ever purchased. It'll probably be the last, too.

PCI-e or AGP?
 

net-cat

Infernal Kitty
PCI-E, but I've heard of people having the same problems with AGP.
 

Janglur

Active Member
I've yet to find a game the x550 can't handle.

Especially the x550 overclocked! Both GPU and memory are OC'd 25%. I'd go higher, but I'm broke and not suicidal.
 

Zero_Point

Member
I've got two 7950GT KO cards from eVGA in SLI at the moment. I bought all the stuff for my current rig in the hopes that it will run Unreal Tournament 2007 nearly flawlessly. At least it runs Supreme Commander well enough. *thumbs up*
 

yak

Site Developer
Administrator
Radeon 1950PRO here; runs Oblivion at the highest settings with AA/AF at max, too.
I honestly was... disappointed... to see that I could play just about every game out there with no lag at all; I was expecting worse.
Cost me $200, and I don't intend to upgrade it for at least 2-3 years to come.
 

Dragoneer

Site Developer
Site Director
Administrator
yak said:
Radeon 1950PRO here, runs Oblivion at highest settings with AA/AF at max too.
That's an amazing overclocker, too. One of the best from the ATI family for cranking up the volume. =D
 

yak

Site Developer
Administrator
Gol22 said:
Does it REEAAALLY matter if its AGP or PCI?

I cant get that off my mind.... i just want a good video card to play games and stuff on!

Yes, it matters, but only if your card will be $200+.
Anything below that is *not* a good video card, honestly. PCIe x16 pumps more data through the data channel, but your card has to be powerful enough to make use of that transfer rate; otherwise AGP will be more than enough.
Also, AGP is a dying standard, and buying anything for it now is... not a forward-looking approach.
 

net-cat

Infernal Kitty
Unfortunately, enough stuff has changed in the last year and a half that most people who want to upgrade will have to get a new computer.

DDR2 in place of DDR
Socket 775 in place of Socket 478
Socket AM2 in place of Sockets 939 and 754
PCI-E in place of AGP
SATA in place of ATA
 

Kougar

Member
Gol22 said:
Does it REEAAALLY matter if its AGP or PCI?

I cant get that off my mind.... i just want a good video card to play games and stuff on!

In what way are you asking? Performance? It doesn't really matter yet...

...but you will pay up to twice as much for the same-performing card in AGP as you would with PCIe... you'll get much more bang for your buck with PCIe these days. If you're buying a "good" performing AGP card, for the same price you could afford a new motherboard plus the PCIe version of the same card.

net-cat said:
Unfortunately, enough stuff has changed in the last year and a half that most people who want to upgrade will have to get a new computer.

DDR2 in place of DDR
Socket 775 in place of Socket 478
Socket AM2 in place of Sockets 939 and 754
PCI-E in place of AGP
SATA in place of ATA

*coughs* Yer calling that new?? :twisted:

DDR2 -> DDR3
LGA775 -> Updated LGA775 VRMs that are Core 2 Duo / Penryn compatible.
Socket AM2 -> Socket AM2+ (and Socket AM3, due out next year for the 45nm K10 chips from AMD)
PCIe -> PCIe 2.0 (Okay, so that really doesn't matter quite yet, except for those SLIing 8800GTX / Ultra cards)
SATA -> SATA 3.0Gbps (hot-plugging is useful)

I did come close to buying a DDR3 version of my current motherboard, but forking out $480 for a 2GB DDR3 kit is not high on my list. Granted, that kit will overclock to 1,500MHz with 7-7-7-20 timings and will outperform any DDR2 RAM that's out there... sweet stuff, but too bleeding edge at the moment to be affordable.
 

net-cat

Infernal Kitty
I can't say this particularly surprises me...

But yes. That's all bleeding edge stuff. (Except the SATA 3.0.) Very few people will actually be getting that.
 

ADF

Member
The thing that bothers me the most about graphics cards is that developers simply don't utilize their power these days. They rely on users constantly upgrading their hardware to run new games instead of making the most of what they do have; you'd think game companies had something going on with the GPU companies.

Examples? Neverwinter Nights 2 and Gothic 3: here we have two games that look the same as, if not worse than, other games of their time, yet run far worse. Games like Prey appear every now and then that look good but run well on almost anything, but most developers don't bother because people can just upgrade.

Console ports are the worst at utilizing PC hardware; look at games like Lost Planet and Overlord for examples of poor optimization. The 8 series cards make the 360's GPU look like a joke, but you wouldn't know it playing the same games on superior hardware.
 

net-cat

Infernal Kitty
I seem to recall the PC and Xbox360 versions of FFXI are limited to 30 FPS so as not to give them an advantage over PS2 users. Or something.

However, part of the reason consoles can get away with using inferior GPUs is because there is less software between the game and the GPU. In a PC, you have GPU -> Driver -> HAL -> GDI -> DirectX/OpenGL -> Game. In a console, you generally have GPU -> Console's API -> Game.
 

Kougar

Member
Unless you use Vista, then

GPU -> Driver -> HAL -> GDI -> DirectX/OpenGL -> Game
becomes
GPU -> Driver -> WPF -> DirectX/OpenGL -> Game

They finally did away with the very old HAL layer in Vista. I gotta agree, though; it seems many of the newer game developers are taking advantage of the hardware power by cutting corners and not optimizing their own games, leaving it up to the GPU drivers to try to fix some of it.
 