30 Comments
buzznut - Thursday, July 16, 2009 - link
But for me, one of the biggest shortcomings is the limited function of Hybrid CrossFire. There was next to no benefit to adding an HD 3450 to the 780G/790GX onboard graphics, at least not for gaming, and only two models of card ever supported it.

So I'm wondering: with the designation HD 4200, would any of the HD 4xxx series cards be capable of Hybrid CrossFire? If so, a low-profile HD 4650 or 4670 would be ideal for a living room media center / gaming box!

Anybody know why AMD hasn't developed Hybrid CrossFire further?
OblivionLord - Thursday, July 16, 2009 - link
What about non-H.264 formats like VC-1, and what about non-DXVA-compatible videos? Those are also important factors.

haplo602 - Thursday, July 16, 2009 - link
The board layout seems bad: two PCI slots and one PCIe instead of 2x PCIe + 1 PCI, and the PCIe slot seems to be blocked by the chipset heatsink anyway. I hope the full review will address the PCIe x1 slot clearance.

Also, the CPU HSF retention frame seems too close to the memory slots. Will a Scythe Ninja Mini fit in there?
span - Tuesday, July 28, 2009 - link
I actually got one of these boards from Newegg before they deactivated the item. Guess they let the cat out of the bag early.

I've installed an HD tuner card in the PCIe slot and it worked just fine.

Not sure about the CPU HSF retention frame. I'm using a 7780 with the stock fan and it seems to fit just fine. Send me a Ninja Mini and I'll let you know... :P

In the meantime, I am in driver hell: AMD does not officially support this chipset yet, and the current Catalyst drivers/utilities do not work with it. Anyone know where I can find some beta/pre-release drivers?
adnoto - Thursday, July 16, 2009 - link
"Before we could even try our hands at waterboarding Gigabyte's latest multimedia wonder, testing came to a grinding halt."

Disgusting.
faxon - Thursday, July 16, 2009 - link
The point wasn't to be funny (well, granted, it was, but not for that reason); the point was that they are putting the board through a round of torture tests to see if it breaks. The comparison was rather accurate, regardless of how "PC" it was.

dmv915 - Thursday, July 16, 2009 - link
I can't believe no one noticed this. I was reading through the article and thought, "Impossible, you can't unlock the Athlon II X2 because it's a true dual core." Then I realized he meant the Phenom II 550. Please make a correction to the article; I was completely thrown off.

johnsonx - Thursday, July 16, 2009 - link
I noticed that too. But everything in the article, including the screenshots, seems to indicate they did in fact unlock an Athlon II X2 550BE. Yes, I know, we were all led to believe Athlon IIs are a new true dual-core die based on the Phenom II architecture, so there should be nothing to unlock.

More explanation would be welcomed.
sprockkets - Wednesday, July 15, 2009 - link
In order to take advantage of nVidia's PureVideo HD, you need Media Player Classic Home Cinema or, say, a special build of mplayer on Linux.

Is there any such support for it in free programs, other than crappy WinDVD or CyberLink software, or on Linux?
Spoelie - Thursday, July 16, 2009 - link
MPC:HC is free.

sprockkets - Thursday, July 16, 2009 - link
Wait, OK, it seems that both ATi and nVidia work with DXVA, right?

Well, I'll probably still get the Zotac 9300 board w/ Pentium DC anyhow.
Spoelie - Friday, July 17, 2009 - link
It works with both, with the following caveats:

*ATi only supports *some* videos: in essence, those with 4 or fewer reference frames @1080p, or 9 or fewer @720p. Nvidia supports ALL videos, as long as the card has PureVideo 2 (I think; dunno about the 9300 chipset).
*ATi does provide VC-1 decoding support while Nvidia does not, but VC-1 decoding is lighter on processing requirements than H.264 anyway, so I don't know if that's a big deal.
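A plausible explanation for the 4-at-1080p / 9-at-720p split above is that the decoder caps its picture buffer at the H.264 Level 4.1 limit (the Blu-ray level). A quick sketch, using the MaxDpbMbs values from the H.264 spec's level table; the function name and the mapping to ATi's behavior are my assumption, not anything AMD has documented:

```python
# Max reference frames that fit in the decoded picture buffer (DPB)
# for a given H.264 level, per the spec's MaxDpbMbs limits (Table A-1).
MAX_DPB_MBS = {"4.1": 32768, "5.0": 110400, "5.1": 184320}

def max_ref_frames(width, height, level):
    # Frames occupy whole 16x16 macroblocks, so round dimensions up.
    frame_mbs = ((width + 15) // 16) * ((height + 15) // 16)
    # The spec also hard-caps the DPB at 16 frames regardless of level.
    return min(MAX_DPB_MBS[level] // frame_mbs, 16)

print(max_ref_frames(1920, 1080, "4.1"))  # 4  -> the 1080p limit above
print(max_ref_frames(1280,  720, "4.1"))  # 9  -> the 720p limit above
print(max_ref_frames(1920, 1080, "5.0"))  # 13 -> room for "scene release" encodes
```

If this model is right, a card that only honors L4.1 buffers will choke on exactly the high-ref-frame encodes the thread complains about.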
sprockkets - Thursday, July 16, 2009 - link
But does it work with ATi stuff? Same question goes for Linux. Otherwise, I'll have to stick with nVidia stuff.

psychobriggsy - Wednesday, July 15, 2009 - link
Good to hear that the integrated graphics is good enough for all but non-casual gaming. Surely this will be very popular in media PCs once it is released. Indeed, it is good enough to put off a purchase now if you are going to be building a media PC.

What was the performance like when it was overclocked to over 1GHz, as your article says you did?
Next step, AMD's southbridge...
(weird, this comment post page was just redirecting to Anandtech's search page just now)
mindless1 - Thursday, July 16, 2009 - link
It's not good enough for all but non-casual gaming: you don't have to be a hardcore gamer to plop down $20 on last year's title and find you can't play it at native resolution on a cheap 22" LCD.

There is no reason to think it will be particularly popular in HTPCs; several IGPs do HD decoding, and any small gains are irrelevant compared to driver support, a point also mentioned in the article.

It's just senseless to pretend performance matters in an IGP compared to basic HD and driver support, when systems cost at least $300 and a proper video card costs no more than a couple of discounted games or one new game.

I appreciate that IGPs are boring and an article has to try to find the differences, but it is also useful to see tiny evolutionary differences for what they are: IGPs almost always get a tiny bit faster, so the 785G is basically just another disappointment, a minor, industry-typical gain.

That's not to put down the IGP or IGP products in general; it is great they have the performance and capabilities they do now. But we have reached the point where gaming performance from them is still irrelevant, and HTPC support is already as mature as it needs to be for the next few generations. Ultimately their goal should be to lower price, so the chipset is more attractive in systems where an integrator would otherwise pick something else (like an Intel chipset) to cut costs.
Spoelie - Thursday, July 16, 2009 - link
*For games like The Sims that reach mass popularity, IGPs do a good-enough job, and the difference in performance matters here. Also, with the introduction of the 3D-accelerated desktop, the baseline performance expected of an IGP took a sudden jump, and Intel was caught with their pants down. Another such jump will occur as compute shaders/OpenCL become more and more prevalent (timeframe a big unknown though); performance will matter more, but by then we're probably talking about IGPs as part of the CPU.
*The differences in HD support are big enough that one chipset can be unsuitable while the other barely passes: 6ch vs 2ch LPCM, H.264 L4.1 vs L5.0 support, ... These differences still exist today, so I don't agree that HTPC has reached maturity; they still need to step up their game.
gipper - Wednesday, July 15, 2009 - link
Speaking of LPCM, has there been any progress with playing back Blu-ray discs or rips without the software downsampling the audio?brausekopf - Wednesday, July 15, 2009 - link
The chipset heatsink seems to take a lot of space. And if there is enough room, will a card be roasted by the heatsink?

Otherwise: I like seeing the floppy port, and I hope to see a PS/2 port.
mamisano - Wednesday, July 15, 2009 - link
Any word on this being used as the basis of a 795GX chipset? It would be a nice option to the 785G if it included higher clock speeds (binned chips?) and side-port memory.

Can't wait to see the full review next month :)
AlB80 - Wednesday, July 15, 2009 - link
The 760G/780V/780G/790GX all use the same die. If there is a new chip, then all of them will get it (maybe with the 760G excepted).

RadnorHarkonnen - Wednesday, July 15, 2009 - link
When will this land in laptop land? The HD 3200 had very low penetration; I wonder if this chipset will be associated with Turion CPUs. It was a nice way to have a decent GPU with HDMI and some bells and whistles.

Thanks
Spoelie - Wednesday, July 15, 2009 - link
Will AMD update their drivers to include support for AVC L5.0/5.1 spec decoding? Currently L4.1 is supported, as this is the spec Blu-ray uses.

http://forums.amd.com/game/messageview.cfm?catid=2...
One can argue the relevance of the L5.0 spec outside of "scene releases", since no broad consumer product like Blu-ray currently uses it, but there are two problems with that:
*As indicated, there are official trailers released in this spec, and I assume more material will use it as time passes.
*Regardless of its use, NVIDIA DOES support the L5.0 spec; this will drive HTPC nuts (and warezers) to their cards = lost sales.
The hardware should be capable, it's purely a driver issue.
R3MF - Wednesday, July 15, 2009 - link
It is rumoured to keep the 40 shaders of the previous generation. And yet we know that the smallest 4xxx-series chip used 80 shaders.

So, is the 785G a rebranded 3450 chip, or a cut-down 4350 chip?
And why did AMD decide to cede leadership of the IGP to nVidia/Intel after ruling the roost with the 780G for so long? This new IGP should have had a genuine 80 shader 4xxx series GPU.
AlB80 - Wednesday, July 15, 2009 - link
1. The RS785G is just a respin of the RS780G. It has the same 40 ALUs, but is DX10.1 capable. Additionally, it has UVD2 instead of UVD+. So it looks like the step from HD2400 to HD3400 in 3D, and from HD3xxx to HD4xxx in video acceleration.

2. Does anybody know whether the HD4xxx changes to the texture and ROP units made it in? If so, then the HD4200 is indeed an HD4xxx core, but the weakest of all.
mczak - Wednesday, July 15, 2009 - link
There are quite a few differences between rv6xx and rv7xx chips. The ROPs can do twice the number of z comparisons, plus handle MSAA natively without shader resolve.

The texture units no longer handle FP16 textures in one clock, but are much smaller and seem to have improved efficiency. More importantly, TMUs are no longer shared across shader clusters; each cluster has its own quad-TMU (thus a 40SP part with 4 TMUs would have 1 shader array of length 8, instead of an rv6xx part, where such a chip would have 2 arrays of length 4).

All rumours seem to say rv620 graphics core, though.
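The array/TMU bookkeeping in this thread can be sketched as a toy model. Everything here restates the rules as the commenters describe them, not official AMD documentation; in particular, the assumption that each "shader unit" is a 5-wide VLIW block (so 40 SPs = 8 units) comes from the thread:

```python
# Toy model of the rv6xx vs rv7xx shader-array/TMU organizations
# described in the thread. Each shader unit is assumed 5 ALUs wide.

def rv6xx_config(num_arrays, array_length):
    # rv6xx: TMUs are shared across arrays, so you need as many
    # TMUs as the array length.
    sps = num_arrays * array_length * 5
    tmus = array_length
    return sps, tmus

def rv7xx_config(num_arrays, array_length):
    # rv7xx: every array carries its own quad-TMU (4 texture units).
    sps = num_arrays * array_length * 5
    tmus = num_arrays * 4
    return sps, tmus

# A 40SP part, organized both ways:
print(rv6xx_config(2, 4))  # (40, 4) -> rv620-style: 2 arrays of 4, 4 shared TMUs
print(rv7xx_config(1, 8))  # (40, 4) -> rv7xx-style: 1 array of 8, one quad-TMU
print(rv7xx_config(2, 4))  # (40, 8) -> why 2 short rv7xx arrays force 8 TMUs
```

The last line is the crux of the later exchange: under rv7xx rules, two arrays can't share 4 TMUs, so a 40SP rv7xx derivative would be one array of length 8, whereas rv620 was two arrays of length 4.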
AlB80 - Wednesday, July 15, 2009 - link
1. That was a typo. I know about the changes inside HD4xxx; my question was whether these changes were applied to the HD4200. Without this information we can't say what it is: rv620 or a cut-down rv710.

2. rv730 has length 4. It looks like a very flexible architecture.
mczak - Wednesday, July 15, 2009 - link
Well, without these changes it wouldn't be an rv7xx part.

And rv730 has a shader array length of 8, just like rv710 (rv770/rv740 have array length 16). That's not to say an even more cut-down version couldn't have length 4, but this would mean two shader arrays (with 40 shader units), hence 2 quad-TMUs (8 texture units).
AlB80 - Thursday, July 16, 2009 - link
1. Note: rv670 has 4 arrays of length 16.

2. When I say "cut-down rv710" I mean that one array would be removed entirely. Then a cut-down rv710 would look architecturally like an rv620. Only information about the texture and ROP units can uncover the truth.
mczak - Thursday, July 16, 2009 - link
I don't follow you there. Sure, rv670 had 4 arrays of length 16, hence 16 TMUs (you need as many TMUs as the array length, since they are shared across arrays). But with rv7xx, you can't have 2 shader arrays with 4 TMUs, since each array has its own quad-TMU (I guess in theory it would be possible to have two dual-TMU blocks, but then you're talking about even more differences from rv7xx parts). Hence one array of length 8. The rv620, though, had two arrays of length 4.

But I think it really is rv620; there was a statement somewhere that it supports Hybrid CrossFire with the HD34xx, and I doubt that would be possible with an rv7xx part.
Anyway, looks like it will soon be on sale so someone will figure it out...
AlB80 - Thursday, July 16, 2009 - link
Understood.