
Call Of Duty 4 Graphics Drop With 9800 Gx2 ?

We did not experience the “Water Detail” bug with SLI like we did in our evaluation of the ATI Radeon HD 3870 X2 in CrossFireX. The Radeon HD 4870 X2 did not go quietly, defeating the GeForce GTX 280 by a single frame per second. The GX2 drops to a low of 12fps, while the GTX 280 drops only to 17fps, and with more settings turned up (one setting at “Medium” for the 280 versus six at “Medium” for the GX2).

When watching movies or browsing the internet, this beast will be off, not making heat or noise. Note I owned a 9700 and a 9800 Pro, and many AMD chips (only when they overclocked like crazy). Why would they do that if they weren't threatened by these cards? Meaning an Ultra should be easy if needed.

For most of these graphics cards the relatively low 1440x900 resolution did not provide much of a challenge, so now we are stepping up to the native resolution of a 22” monitor. NVIDIA is able to get more chips per wafer, and a higher percentage of those will be good compared to a large design. There is also an RPG element to the multiplayer, where you are able to unlock new weapons, add-ons to those weapons, and other rewards.

It is less expensive to use two chips, even if their combined size is larger than a monolithic one, because yields are so much better. But you have to remember AMD has to undo years of mismanagement at the hands of ATI's management. $600 cards keep showing up because they sell. Sure, you can give it to me, but you'd better also tell me what to expect at home, which is the MINIMUM I can expect. It's a problem of engineering rather than science: yes, faster hardware could be built, but it doesn't matter how fast your product is if the people who are interested can't afford it.
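The yield argument above can be made concrete with a simple sketch. This is a minimal illustration assuming a classic Poisson defect-density model and made-up numbers (300 mm wafer, 0.002 defects/mm², ~256 mm² small dies versus a ~576 mm² monolithic die); these are hypothetical figures for illustration, not actual AMD or NVIDIA data:

```python
import math

def good_dies_per_wafer(die_area_mm2, wafer_diameter_mm=300,
                        defect_density_per_mm2=0.002):
    """Rough count of good dies from one wafer (ignores edge losses)."""
    wafer_area = math.pi * (wafer_diameter_mm / 2) ** 2
    candidates = wafer_area / die_area_mm2
    # Poisson yield model: yield falls exponentially with die area.
    yield_fraction = math.exp(-defect_density_per_mm2 * die_area_mm2)
    return candidates * yield_fraction

small = good_dies_per_wafer(256)   # two small chips per card
large = good_dies_per_wafer(576)   # one monolithic chip per card
cards_from_small = small / 2       # each card consumes two small dies
```

Even spending two dies per card, the small-die approach can come out ahead, because the candidate count falls only linearly with area while yield falls exponentially.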

My current 8800GT can't do what I want in ALL games at 1920x1200 (native on Dell 24in). Maybe I heard it, but it sounded like distortion if I did, NOT WIND.

I know you guys wanted the X2 to trounce or remain competitive with this "bump". Inside we will compare an XFX GeForce 9800 GX2 with two SLI setups and the ATI Radeon HD 3870 X2 in Crysis, Clive Barker's Jericho, and COD 4. We used the very latest beta driver that ATI can provide.

The Radeon HD 3870 X2 cracks the 50fps barrier, making it almost twice as fast as a standard Radeon HD 3870 graphics card, which is nice to see. So after the dust cleared, the 9800GX2 has not outlived its usefulness as a performance video card. Sorry. Reply chizow - Tuesday, March 18, 2008 - link While this is true, it only perpetuates the performance myths Nvidia propagates with its misleading product differentiation.
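As a quick sanity check on the "almost twice as fast" claim, multi-GPU scaling efficiency can be computed from frame rates. The 27 fps figure for a single HD 3870 below is a hypothetical stand-in for illustration, not a number from the article:

```python
def scaling_efficiency(multi_gpu_fps, single_gpu_fps, num_gpus=2):
    """Fraction of ideal linear multi-GPU scaling actually achieved."""
    return multi_gpu_fps / (single_gpu_fps * num_gpus)

# 3870 X2 at 50 fps vs a hypothetical ~27 fps for a single HD 3870:
# roughly 93% of perfect 2x scaling, i.e. "almost twice as fast".
efficiency = scaling_efficiency(50, 27)
```

Anything above ~90% of linear scaling is unusually good for a dual-GPU card of this era, which is why the result is worth calling out.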

I've also tested at 1280x1024 with the lower-end graphics cards, since some of them struggled to deliver completely fluid frame rates at 1680x1050. But it definitely took the crown back from AMD with authority.

World in Conflict has always been accused of being too CPU-bound to be any good, but these results paint an interesting picture of the situation. World in Conflict, from page 13 at PC Perspective: "Wow, what a difference a generation makes." So I think it's reasonable to assume that people who buy screens capable of such resolutions will put some thought into using either SLI or CrossFire, or a dual-GPU solution. The 280 had EVERY detail maxed out, which the GX2/9800GTX SLI couldn't run playably.

Look at the apples-to-apples COD4 scores, where the 9800GTX SLI drops to 11fps with 4xAA+16xAF, while the 280 holds at 32fps.

It performs well and is available at a price point that more people can live with.

The age-old GeForce 8800 GTX is still going strong with 56fps, making it only slightly slower than the Radeon HD 3870 X2, which manages 60fps. Fortunately, the HD 4850 also scores on power, performance, features and price. The single-player campaign may be a little short for some, but the multiplayer more than makes up for it.

The article looks like it uses version 3.5. So until then we're talking the GTX 280 with no driver tricks vs. two of AMD's or two of Nvidia's previous-gen cards. The first version we can talk about today: that's the Radeon HD 4850.

It seems to match more with an 8800GTS 512MB, but with an underclocked core and shaders, paired with faster memory. At 2560x1600, though, we had to lower the texture setting options to “Normal” in order to achieve playable performance on the ATI Radeon HD 3870 X2.

Let's just say it's high, but not as high as GT200 :) Again, we're not allowed to go into the architectural details of the RV770, the basis for the Radeon HD 4800 series. Reply Final Destination II - Friday, June 20, 2008 - link If reality doesn't fit in his brain, the average Nvidia-cocksucker has to talk till bubbles come out of his mouth. To me, it smacks of laziness. With the Radeon HD 3870 X2 we did experience this problem again and had to leave the water detail on “Normal” for this evaluation.

I meant two 4870s, because you can't get the 4870 X2 until later (per AMD's own statement here at AnandTech, it will come in two months, IIRC).