
Mindblowing Graphics Coming Soon To A PC Near You


Suhit Gupta
04-15-2004, 07:00 PM
http://www.hardocp.com/article.html?art=NjA2

The new nVidia and ATI chips are finally here. The hype about the NV40 and R420 chips from the two respective companies has been building for several months now. We have all (ok, at least all of us gamers) been waiting anxiously for Doom 3 and Half-Life 2 for months/years, and id and Valve have been torturing us by delaying the games for what seems like forever. In any case, it was obvious that none of the current generation of cards (the ATI 9800 or the GeForce 5900) was going to be able to play either game smoothly, hence the crazy yearning for the next-gen cards from the two graphics giants. Well, nVidia finally announced (http://www.theregister.co.uk/2004/04/14/nvidia_6800_ultra/) the release of their GeForce 6800 cards with the NV40 chip. There have been several articles on the specs already.

"With a mind-boggling 222 million transistors on board - that's just short of 80 per cent more than Intel's latest Prescott CPU - nVidia claims that the GeForce 6800 Ultra will deliver up to eight times the pixel shading performance of its previous generation hardware. It should also deliver up to twice the vertex shader performance and close to twice the frame buffer bandwidth. On top of that it should also offer four times the shadow processing power and be up to four times more efficient at dealing with hidden surface removal. All this should have a massive impact on future games, with more advanced light and shadow effects than anything seen before." The new chips can handle GDDR3 SDRAM across a 256-bit memory interface. Memory runs at 1.1GHz on the 6800 Ultra for 35.2GB/s of bandwidth. The Ultra can churn out 6.4 billion texels per second and process 600 million vertices in the same time.

[Image: http://www.digitalmediathoughts.com/images/6800_board.jpg]

nVidia has also touted the part's on-chip programmable video processing engine. That already gives it support for MPEG 2 and WMV 9 with motion compensation, which can also be applied to other formats such as MPEG 4, H.264 and DivX. Its programmable nature makes it particularly suitable for pro tasks such as "3:2 pulldown", the process of converting a moving image encoded as interlaced fields into a frame-based picture sequence more suitable for playback on a computer display. It can also be put to gamma correction, colourspace conversion and a host of video effects. Apparently, people who have seen this card in action have said that watching a movie through it looked extremely cinematic.

Having said all this about nVidia's new card, on to the current market leader, ATI. They will be releasing a very similar card, the X800 Pro with the R420 chip, which is set to be a 6800 competitor, and the more we read about the two cards/chips, the more we realize how incredibly similar they are. ATI plans to make the announcement (http://www.theregister.co.uk/2004/04/07/ati_r420/) about their card on April 26th. ATI is also planning (http://www.theregister.co.uk/2004/04/15/ati_r420/) to release the X800 XT, the PCI Express version of their card, by June 14th.

Pricing and availability: this is what I can gather as of right now. Since the ATI card has not been announced yet, ATI pricing is not available, but I am guessing it will be exactly the same as nVidia's. The GeForce 6800 Ultra will be "priced at $499, contain 16 pipelines, require two Molex power connectors and in a two slot design clocked at 400/550. The GeForce 6800 will be priced at $299, contain 12 pipelines, require one Molex power connector, in a one slot design and clocks are yet to be decided."
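For anyone who wants to sanity-check the headline numbers, the fill rate and bandwidth fall straight out of the clocks and bus width quoted above. A quick Python sketch (variable names are mine, figures are from the post):

```python
# Back-of-the-envelope check of the GeForce 6800 Ultra numbers quoted above.
core_clock_hz = 400e6    # 400 MHz core clock (from the quoted specs)
pipelines = 16           # pixel pipelines on the Ultra
mem_clock_hz = 1.1e9     # 1.1 GHz effective GDDR3 clock
bus_width_bits = 256     # memory interface width

# Texels per second: one texel per pipeline per clock.
texel_fill = core_clock_hz * pipelines

# Bytes per second: bus width in bytes, once per effective memory clock.
bandwidth = mem_clock_hz * bus_width_bits / 8

print(f"Texel fill rate: {texel_fill / 1e9:.1f} Gtexels/s")  # 6.4
print(f"Memory bandwidth: {bandwidth / 1e9:.1f} GB/s")       # 35.2
```

The 6.4 billion texels/s figure in the article checks out exactly, and the bandwidth works out to 35.2GB/s.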

Lee Yuan Sheng
04-15-2004, 07:03 PM
450W PSUs recommended. Only for the serious.

Suhit Gupta
04-15-2004, 07:04 PM
450W PSUs recommended. Only for the serious.
Definitely. In fact, some people are even saying that 500W is the way to go.

Suhit

Tim Williamson
04-15-2004, 07:04 PM
I have no idea what most of those specs mean, but if it makes games/graphics faster, then I say W00T! 8O

Suhit Gupta
04-15-2004, 07:25 PM
I have no idea what most of those specs mean, but if it makes games/graphics faster, then I say W00T! 8O
Since there are so many stats, it is hard for me to tell which ones you don't understand, but here are some resources -

1) http://www.kuro5hin.org/story/2003/10/28/9853/1617

2) http://www.graphicdesignetc.com/3DVocabulary.html

3) http://www.tomshardware.com/graphic/20010227/geforce3-03.html

Hope these help
Suhit

Jason Dunn
04-15-2004, 10:31 PM
450W PSUs recommended. Only for the serious.

While I don't doubt that these cards draw a serious amount of power, you most certainly don't need 450W to drive this card. ;-) Those are always "worst case scenarios" - a user with a PCI soundcard, PCI ethernet card, two or more hard drives, two optical drives, and the video card on top of that. Unless you're running under heavy load, you won't need that much power.

As an example, a Shuttle with a 250W power supply can use the burliest video cards on the market today - but they only have one AGP and one PCI slot max, everything else is on board, and room for only one optical drive and one or two hard drives.

Lee Yuan Sheng
04-16-2004, 01:10 AM
Errm, Jason, these cards draw power not just from the AGP slot, but also require a Molex as well! The 6800 Ultra requires TWO Molex connectors! I'm guessing that will draw a good many watts all by itself. AGP delivers about 30W max, and a Molex connector delivers another 30W, so I'm guessing the 6800 itself will draw 50W, and the Ultra version probably another 20W? Nvidia isn't recommending 480W PSUs for nothing.

Mobos need about 50W, I recall, then each drive requires about 20-30W, the CPU itself takes about 70-80W (considering the target demographic), and say we toss in a few other things (fans, blinky lights, etc), and then on top of that toss in losses due to inefficiency, and I think 250W isn't quite enough. 300W isn't either. And for chappies (no blinky lights here though) like me, 350W is barely enough. Thankfully I'm not crazy enough to buy a card like that. My GF4 Ti 4200 is good enough (it plays Knights of the Old Republic well enough =D)

I just saw a pic of the 6800 Ultra, geez, it needs two slots worth of space.
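The kind of budget Lee is doing above can be sketched in a few lines of Python. To be clear, every wattage and the headroom factor here are rough assumptions in the spirit of the thread, not measured figures or vendor specs:

```python
# Rough PSU budget along the lines sketched above. All wattages are
# guesses for illustration, not measured or official numbers.
components = {
    "motherboard": 50,
    "cpu": 80,
    "video card (6800 Ultra, guess)": 70,  # AGP slot plus two Molex leads
    "hard drive": 25,
    "optical drive": 25,
    "fans, lights, misc": 20,
}

load_watts = sum(components.values())

# Leave headroom for drive spin-up surges and PSU derating;
# the 0.75 factor is an assumption, not a spec.
recommended = load_watts / 0.75

print(f"Estimated steady load: {load_watts}W")       # 270W
print(f"PSU rating to shop for: ~{recommended:.0f}W")  # ~360W
```

Even with these conservative guesses a modest gaming box lands well past a 250W supply once headroom is included, which is roughly Lee's point; nVidia's 480W recommendation presumably budgets for much more loaded systems.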

Jason Dunn
04-16-2004, 01:46 AM
Wow. Ok, I stand corrected. ;-)

Philip Colmer
04-16-2004, 10:35 AM
"With a mind-boggling 222 million transistors on-board - that's just short of 80 per cent more than Intel's latest Prescott CPU - nVidia claims that the GeForce 6800 Ultra will deliver up to eight times the pixel shading performance of its previous generation hardware."
Just in time for Longhorn to use it all up, eh? :D

ATI is also planning (http://www.theregister.co.uk/2004/04/15/ati_r420/) to release the X800 XT, the PCI Express version of their card, by June 14th.
Now this is good news - we need to start seeing more PCI Express products being announced. The poor old PCI bus just can't keep up any more with the demands of SATA and real-time video frames being thrown around. I'm hoping that Matrox are going to announce an Express version of their RT product in the next year, just in time for me to refresh my PC :wink:

--Philip

Suhit Gupta
04-16-2004, 02:23 PM
While I don't doubt that these cards draw a serious amount of power, you most certainly don't need 450W to drive this card. ;-) Those are always "worst case scenarios" - a user with a PCI soundcard, PCI ethernet card, two or more hard drives, two optical drives, and the video card on top of that. Unless you're running under heavy load, you won't need that much power.
Yeah, Lee is completely right: the new-gen cards will not only be drawing power from the AGP slot like all graphics cards, but will also be drawing power from two additional Molex connectors. Check out the following screenshots -

1) http://www.hardocp.com/image.html?image=MTA4MTc0NzQ0ODZxTE1PbWV1dFNfMV82X2wuanBn

2) http://www.hardocp.com/image.html?image=MTA4MTc0NzQ0ODZxTE1PbWV1dFNfMV83X2wuanBn

[H]ardOCP ran tests using a 431W PSU and their tests ran fine, but they say "Note that we don't have any other wild power draining devices though". Clearly for power users, this would become a problem.

Suhit

James Fee
04-16-2004, 03:08 PM
Seems like I need to upgrade my GeForce 2 card! 8O

Suhit Gupta
04-16-2004, 04:25 PM
Seems like I need to upgrade my GeForce 2 card! 8O
Wow, I didn't know people still used those ;-). Yeah, you should totally upgrade; you are almost five generations behind. BTW, be prepared for a headache at first (if you have never seen the results)... I have found that most people, when they upgrade their video card to any of the latest ones that can really push out a large number of frames per second in typical games (first person shooters), get a bit of a headache. The amount of visual data can be a bit disorienting at first :), but you will soon get used to it, and in fact love it.

Suhit

Jason Dunn
04-16-2004, 04:47 PM
Just in time for Longhorn to use it all up, eh? :D

Heh. We'll probably see one or two new generations of GPU before we see Longhorn, but that's a good thing, because it means that the GPU power needed for the Avalon interface will make its way down into less expensive cards.

Suhit Gupta
04-16-2004, 04:52 PM
Heh. We'll probably see one or two new generations of GPU before we see Longhorn, but that's a good thing, because it means that the GPU power needed for the Avalon interface will make its way down into less expensive cards.
Yeah, good point. In fact, ATI is rumoured (maybe) to be coming out soon, like late Q4 this year, with an All-In-Wonder version of the X800 with an HDTV tuner built in. ATI typically releases the next-gen card in the spring and the All-In-Wonder version of that card in the fall. Man o' man, can you imagine the serious graphics on that card? I am drooling already. (We seriously need a drool emoticon :-P)

Suhit

Jason Dunn
04-16-2004, 04:53 PM
[H]ardOCP ran tests using a 431W PSU and their tests ran fine, but they say "Note that we don't have any other wild power draining devices though". Clearly for power users, this would become a problem.

Jeez. I wonder how long until computers need dual power supplies - one just for the graphics card... this is getting totally insane. I honestly think these power requirements will slow down sales - most people will not be willing to re-configure their PCs just to get a damn video card. Only the hardcore gamers will buy this card, which is perhaps all that nVidia is hoping for.

Suhit Gupta
04-16-2004, 04:59 PM
Jeez. I wonder how long until computers need dual power supplies - one just for the graphics card... this is getting totally insane. I honestly think these power requirements will slow down sales - most people will not be willing to re-configure their PCs just to get a damn video card. Only the hardcore gamers will buy this card, which is perhaps all that nVidia is hoping for.
So many servers have already moved to this model. One of the servers that I administer had a ton of trouble starting up until I upgraded the PSU from 350W to 450W. Hard drives are usually the biggest culprits; they need a lot of juice to spin up for the first time. So often there will be two power supplies, one dedicated to the RAID array and another for the rest of the machine. This is completely different from having two PSUs for redundancy's sake. But the craziest I have seen is a four-PSU setup: two because the machine just needs that much power, and two more for backup. I am sure more hardcore stuff exists, I just haven't personally seen it, yet.

Suhit

Christian
04-16-2004, 05:08 PM
I'm mostly hoping that more production 6800 (Ultra) cards will feature dual DVI output. I've been running two LCDs for a while and would not consider any video card that didn't feature dual DVI. I'm currently running an Asus GeForce FX 5200 Video Suite, but the gaming performance leaves a lot to be desired. I'm also curious whether the non-Ultra 6800 could run in an SFF? :wink:

Suhit Gupta
04-16-2004, 05:21 PM
I'm mostly hoping that more production 6800 (Ultra)'s will feature dual DVI output. I've been running two LCDs for a while and would not consider any video card that didn't feature dual DVI. I'm currently running an Asus GeForce FX 5200 Video Suite, but the gaming performance leaves a lot to be desired.
Well you are in luck then because the 6800 Ultra does support dual DVI - proof is here (http://www.hardocp.com/image.html?image=MTA4MTc0NzQ0ODZxTE1PbWV1dFNfMV8xMF9sLmpwZw==).
I'm also curious whether the non-Ultra 6800 could run in an SFF? :wink:
Given how huge the card is, I don't think it would even fit :) (forgetting about power concerns even though it is the non-Ultra). I think the SFF PCs are going to have to stay with on-board graphics chips.

Suhit

Christian
04-16-2004, 05:28 PM
Well you are in luck then because the 6800 Ultra does support dual DVI - proof is here (http://www.hardocp.com/image.html?image=MTA4MTc0NzQ0ODZxTE1PbWV1dFNfMV8xMF9sLmpwZw==).
I saw that the reference design includes dual DVI - the FX reference designs did too. Unfortunately, 99% of the production FX cards then dropped the dual DVI for DVI/VGA.

Given how huge the card is, I don't think it would even fit :) (forgetting about power concerns even though it is the non-Ultra). I think the SFF PCs are going to have to stay with on-board graphics chips.

Given that certain SFF designs can already fit the AIW9800 and other high end video cards, I still have hopes that the (single slot) 6800 will fit in either current or future SFFs. Really my concern is more with the power consumption and cooling.

Suhit Gupta
04-16-2004, 05:41 PM
I saw that the reference design includes dual DVI - the FX reference designs did too. Unfortunately, 99% of the production FX cards then dropped the dual DVI for DVI/VGA.
Good point. I think the high-end cards will do the dual DVI. Typically, most manufacturers will make a single-DVI version with a high-end chip in order to sell it at a slightly lower cost.
Given that certain SFF designs can already fit the AIW9800 and other high end video cards, I still have hopes that the (single slot) 6800 will fit in either current or future SFFs. Really my concern is more with the power consumption and cooling.
Interesting, I did not know that the 9800 fit in an SFF. Is this the 9800XT? I doubt you can use any of your other slots if you put in one of the top-of-the-line video cards... on that note, how many PCI slots do you have? If you have only one, then the 6800 (Ultra or not) will go over in size. But if you have two PCI slots, then you will probably still have one to spare. One thing you could do is change the cooling device on the video card. Have you thought of liquid cooling?

Suhit

Christian
04-16-2004, 05:49 PM
Interesting, I did not know that the 9800 fit in an SFF. Is this the 9800XT? I doubt you can use any of your other slots if you put in one of the top-of-the-line video cards... on that note, how many PCI slots do you have? If you have only one, then the 6800 (Ultra or not) will go over in size. But if you have two PCI slots, then you will probably still have one to spare. One thing you could do is change the cooling device on the video card. Have you thought of liquid cooling?

I don't own an SFF myself yet, but have been planning to get one in the near future. I was looking at a case like the Antec Aria (mATX, AGP + 3 PCI slots), but most smaller SFFs like the Shuttle XPCs tend to have only 1 PCI slot. For good discussions concerning exactly which video cards fit inside different SFFs (and with what aftermarket cooling solutions) see www.sfftech.com. In terms of size, I'm mostly curious whether the non-Ultra 6800 could be used without taking up an adjacent PCI slot - perhaps with an aftermarket cooling solution. All I've heard is that the non-Ultra is intended to be a single-slot card.

Suhit Gupta
04-16-2004, 05:55 PM
In terms of size, I'm mostly curious if the non-ultra 6800 could be used without taking up an adjacent PCI slot - perhaps with an aftermarket cooling solution. All I've heard is that the non-ultra is intended to be a single-slot card.
So I am of the opinion that since the GPU dissipates a ton of heat, almost as much as a CPU, the PCI slot right next to the AGP slot should be left alone regardless. And in a SFF, I think it would be even more important because the air circulation is probably not as great. BTW, how many PCI cards do you need to put on? If you get an nForce3 mobo, you are set with sound and network, so once you pop in the video card, you really don't need anything else, do you? Maybe firewire card?

Suhit

Lee Yuan Sheng
04-16-2004, 06:05 PM
Hmm, SFF + high end gaming components = bad idea in my book.

Why not take a look at a generation back? I think the ATI 9600XTs are great cards for their price, and certainly don't place such heavy demands compared to the new cards.

Christian
04-16-2004, 06:11 PM
So I am of the opinion that since the GPU dissipates a ton of heat, almost as much as a CPU, the PCI slot right next to the AGP slot should be left alone regardless. And in a SFF, I think it would be even more important because the air circulation is probably not as great. BTW, how many PCI cards do you need to put on? If you get an nForce3 mobo, you are set with sound and network, so once you pop in the video card, you really don't need anything else, do you? Maybe firewire card?

You're certainly right that the heat is a big issue. It's probably too early to tell whether it is a solvable one. Concerning PCI cards, I would either add an Audigy2 Platinum Pro (for the wide range of connections), a PVR, or no PCI cards at all. With the Aria I could certainly keep 1 or 2 slots empty for better air circulation. It comes with a 300W PSU, which is high for an SFF but borderline for the 6800. We'll see :)

Either way, I'm really more excited at the prospect of 6 series midrange cards that I can actually afford. :wink:

Christian
04-16-2004, 06:13 PM
Hmm, SFF + high end gaming components = bad idea in my book.

Why not take a look at a generation back? I think the ATI 9600XTs are great cards for their price, and certainly don't place such heavy demands compared to the new cards.

I am certainly looking at the ATI 9600XT, except that finding one with dual DVI is quite the challenge. :wink:

omikron.sk
04-16-2004, 06:14 PM
Uff, 3-4 years ago I read an article about new graphics technologies, and there was a joke next to it (my weak translation):

"The most modern graphics technologies require three-phase electric current."

Man! This is slowly becoming true!

Suhit Gupta
04-16-2004, 06:22 PM
I am certainly looking at the ATI 9600XT, except that finding one with dual DVI is quite the challenge. :wink:
HIS makes a card that has gotten reasonable (not necessarily out of this world) reviews. It is the HIS Excalibur 9600XT TURBO 128MB / HIS Excalibur 9600 DUAL DVI-I 256MB. Cost is pretty good too. See if that helps.

Suhit

Jason Dunn
04-16-2004, 06:22 PM
So I am of the opinion that since the GPU dissipates a ton of heat, almost as much as a CPU, the PCI slot right next to the AGP slot should be left alone regardless. And in a SFF, I think it would be even more important because the air circulation is probably not as great.

It's less of a problem than you think. Because they're so small, there's less air to move, and thus you can get by with a single fan and 100% passive CPU cooling. I have an ATI 9600 Pro that has passive cooling (and takes up the PCI slot unfortunately), and have zero heat issues.

BTW, how many PCI cards do you need to put on? If you get an nForce3 mobo, you are set with sound and network, so once you pop in the video card, you really don't need anything else, do you? Maybe firewire card?

Nothing, and that's the point. ;-) All Shuttle PCs, for example, have onboard Firewire, USB, audio, LAN, etc. About the only thing you might need to add is a RAID card (although some have that on-board), or perhaps video capture if you don't want to go external.

James Fee
04-16-2004, 06:35 PM
Seems like I need to upgrade my GeForce 2 card! 8O
Wow, I didn't know people still used those ;-).
Well, it is a GeForce2 GTS, so it's a great card. I don't play games on the computer, so I've not run into any issues with what I use it for.

BUT, I'm thinking of building a new computer so at that point, I'm getting a new card.

Suhit Gupta
04-16-2004, 07:49 PM
Well it is a GeForce2 GTS so its a great card. I don't play games on the computer so I've not run into any issues with what I use it for.

BUT, I'm thinking of building a new computer so at that point, I'm getting a new card.
I was just pulling your leg there a bit :). No doubt that the GeForce2 GTS was an extremely solid card and if you don't game much then that should suffice for most applications.

Suhit

Suhit Gupta
04-17-2004, 01:24 AM
BTW, there is a nice article that [H]ardOCP has just posted on the technical details of the GeForce 6 series. It might answer some of the questions people had about the stats quoted earlier - http://www.hardocp.com/article.html?art=NjA1

Suhit