  #1 (permalink)  
Old 19th December, 2004, 01:54 AM
Pitch
AOA Staff
 
Join Date: February 2004
Location: The cake is a lie.
Posts: 5,025

New VIA chipset to allow Dual ATI cards?

It seems that the VIA K8T890 chipset has the ability to use two graphics cards, rather like nVidia's SLi technology. However, this new approach seems to be driver-based rather than hardware-based.

Link
  #2 (permalink)  
Old 19th December, 2004, 06:47 AM
Member/Contributor/Resident Crystal Ball
 
Join Date: March 2004
Posts: 7,451

So you can imagine the overhead... damn it all... we need better CPUs first! And cheap GDDR3 RAM!
  #3 (permalink)  
Old 19th December, 2004, 03:18 PM
Chief Systems Administrator
 
Join Date: September 2001
Location: Europe
Posts: 13,075

I wonder how they get the rendered image from one card to the other? After all, that's pretty much what nVidia's SLi does - act as a high-speed bridge directly between the two GPUs.

This might be interesting, or it might just be a feature that doesn't work very well. The benchmarks should show either way!
  #4 (permalink)  
Old 19th December, 2004, 06:56 PM
DiabloAbogado
Member
 
Join Date: September 2004
Location: Kentucky
Posts: 345

Interesting question, Áedán, because it should (as SLI does) pass both processed images back through a single output. Hmm...

I heard ATI is coming out with something similar to SLI, just under a different name.
  #5 (permalink)  
Old 20th December, 2004, 02:51 AM
Pitch
AOA Staff
 
Join Date: February 2004
Location: The cake is a lie.
Posts: 5,025

I think my next card will be a Gigabyte 3D1.
  #6 (permalink)  
Old 20th December, 2004, 02:21 PM
Chief Systems Administrator
 
Join Date: September 2001
Location: Europe
Posts: 13,075

Quote:
Originally Posted by DiabloAbogado
Interesting question, Áedán, because it should (as SLI does) pass both processed images back through a single output. Hmm...
That's my concern - otherwise the motherboard will get involved in copying the image data across from the card on the 4X link to the one on the 16X link. At high frame rates, this might well equate to a lot of data!
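
Just to put a rough number on it (back-of-the-envelope only; the resolution, colour depth and frame rate below are example figures, not anything VIA has published):

Code:
# Rough estimate of the traffic generated if a finished frame has to be
# copied from the card on the 4X link to the card driving the display.
# Example figures only: 1600x1200, 32-bit colour, 60 frames per second.
width, height = 1600, 1200
bytes_per_pixel = 4                      # 32-bit colour
fps = 60

frame_bytes = width * height * bytes_per_pixel
per_second = frame_bytes * fps

print(f"One frame: {frame_bytes / 2**20:.1f} MB")       # ~7.3 MB
print(f"At {fps} fps: {per_second / 2**20:.0f} MB/s")    # ~439 MB/s

That's getting on for half of a PCI-E x4 link's nominal 1 GB/s in one direction, before any texture or geometry data has even been sent.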
  #7 (permalink)  
Old 20th December, 2004, 05:32 PM
Gizmo
Chief BBS Administrator
 
Join Date: May 2003
Location: Webb City, Mo
Posts: 16,178

I suspect that is exactly the case, Áedán. I understand that both ATI and nVidia are coming out with UMA video cards under the euphemism of 'turbo memory' or some such thing, where the card only has something like 32 MB of onboard memory but uses something like 256 MB of system memory as a frame buffer. With that kind of architecture you could implement a cheap type of SLI by sharing the memory between the two cards, but you'd sure hammer the system memory bus doing it.
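
A quick sketch of how badly that could hurt (all the figures below are illustrative assumptions, not published specs from ATI or nVidia):

Code:
# How much of the system memory bus two UMA-style cards might eat if
# each keeps most of its frame buffer and textures in system RAM.
# Illustrative assumptions: dual-channel DDR400 and 2 GB/s of GPU
# traffic per card.
ddr400_dual_channel = 6.4      # GB/s theoretical peak (2 x 3.2 GB/s)
per_card_traffic    = 2.0      # GB/s assumed per GPU
cards = 2

gpu_share = cards * per_card_traffic
print(f"GPU traffic: {gpu_share:.1f} GB/s of {ddr400_dual_channel} GB/s "
      f"peak ({gpu_share / ddr400_dual_channel:.0%}) before the CPU gets a look-in")

Even with fairly generous assumptions, the two cards are grabbing well over half the bus before the CPU touches memory at all.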
  #8 (permalink)  
Old 20th December, 2004, 06:56 PM
Member/Contributor/Resident Crystal Ball
 
Join Date: March 2004
Posts: 7,451

And that's why ATI's chipset has a GDDR chip next to it... I posted pics a few weeks back. They use the memory at the chipset as a buffer for passing the info back to the 4x slot, I believe.
  #9 (permalink)  
Old 21st December, 2004, 10:42 AM
Member/Contributor/Resident Crystal Ball
 
Join Date: March 2004
Posts: 7,451

Quote:
Originally Posted by Áedán
That's my concern - otherwise the motherboard will get involved in copying the image data across from the card on the 4X link to the one on the 16X link. At high frame rates, this might well equate to a lot of data!

What about the 6800 PCI-E that has to use a bridge chip to take advantage of the PCI-E slot? There is a lot of conversion going on there, and I think ATI may have something going for them, as their chips are natively PCI-E.
  #10 (permalink)  
Old 21st December, 2004, 11:55 AM
Chief Systems Administrator
 
Join Date: September 2001
Location: Europe
Posts: 13,075

Converting the protocol from PCI-E to AGP isn't nearly as much overhead as the VIA solution, as far as I can see. With nVidia's PCI-E to AGP bridge, it's a basic one-time conversion. Remember that AGP is actually very similar to PCI, just run at a higher speed. PCI-E is a serialised version of PCI, so the conversion does not increase the amount of data passing over the bus. The latency added by the PCI-E to AGP conversion is nothing compared to the length of time it takes to render an image, so there are no issues there either.

Once GPUs get to the point where they can outrun AGP 8x, the current nVidia bridging solution will have an issue. From my perspective, nVidia made the slightly better choice, in that they've given PCI-E time to bed down and have the bugs worked out of the chipsets before they integrate it on chip. ATi saw direct PCI-E compatibility as worth having, and integrated PCI-E directly on chip. As a result, ATi had some issues with signal setup times, while nVidia have a slight cost issue with the second bridge chip.

Ultimately, it doesn't matter, as the current generation of GPUs can't max out AGP 8x anyway. The next generation is where things become more interesting. ATi will have had time to work out the issues with their PCI-E implementation, and nVidia will have had time to implement PCI-E directly on their chips.

Going back towards the topic: the SLi bridge that nVidia uses between the two cards is there to avoid having to copy data via the PCI-E bus. This is where the VIA solution falls down, as it relies on the PCI-E bus to copy all the image data from one card to the other, as well as pushing all the polygon/texture data up to the cards. If ATi are sensible, they'll follow a similar concept to nVidia to avoid possible bottlenecks on the slower 4x PCI-E connection. Obviously it won't be called SLi though!
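
For reference, the nominal peak figures behind that argument (theoretical maximums; real-world throughput is lower):

Code:
# Nominal peak bandwidth of the links under discussion.
# PCI-E figures are per direction (PCI-E is full duplex, AGP is not).
links = {
    "AGP 8x":    2.1,   # GB/s, shared between both directions
    "PCI-E x16": 4.0,   # GB/s each way (16 lanes x 250 MB/s)
    "PCI-E x4":  1.0,   # GB/s each way (4 lanes x 250 MB/s)
}
for name, gb_per_s in links.items():
    print(f"{name:<10} ~{gb_per_s} GB/s")

So any frame data that has to travel card-to-card over the x4 slot is squeezing through a link a quarter the width of the x16 one - which is exactly the traffic a dedicated bridge keeps off the bus.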
  #11 (permalink)  
Old 21st December, 2004, 06:35 PM
Member/Contributor/Resident Crystal Ball
 
Join Date: March 2004
Posts: 7,451

But they can max out AGP 8x... in fact, they can run at 12x AGP - or at least that is what nVidia is trying to do with the 6800.
Quote:
Geforce 6800 Ultra and GT PCIe are based on the good old NV40 core paired with BR2. Nvidia calls it NV45. Nvidia was happy to tell us before that Nvidia managed to overclock NV40 AGP interface from 8X to 12 X or faster. With this overclock, Nvidia managed to match the AGP8X chip speed with 16X PCIe interface and make a chip which is almost as fast as the native PCIe one.
http://theinquirer.net/?article=20321
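
The arithmetic behind that overclock claim, using nominal figures (AGP 1x is 266 MB/s; this is just a sanity check, not anything from nVidia):

Code:
# AGP bandwidth scales linearly with the multiplier: 1x = 266 MB/s.
# How far would NV40's AGP interface need to be pushed to match a
# PCI-E x16 link (4 GB/s per direction)?
agp_1x   = 0.266      # GB/s
pcie_x16 = 4.0        # GB/s per direction

for mult in (8, 12, 16):
    print(f"AGP {mult:2d}x ~ {agp_1x * mult:.1f} GB/s")
print(f"Multiplier needed to match PCI-E x16: ~{pcie_x16 / agp_1x:.0f}x")

Which fits the article's "12X or faster": even 12x AGP (~3.2 GB/s) is still a little short of a full x16 link.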
  #12 (permalink)  
Old 22nd December, 2004, 08:22 AM
Member/Contributor/Resident Crystal Ball
 
Join Date: March 2004
Posts: 7,451

So it brings up this too... nVidia have quite a thing going: an AGP 8x-native chip overclocked to 12x, maybe 16x AGP, the BR2 bridge chip (thin, and under a separate heatsink on the 6600s), and the SLi chipset, all on PCI-E. I see eight parts: the two GPUs, the two BR2s, the chipset, the SLi "socket board", PCI-E, and one hell of a driver to get it all to work.

Great... nVidia has SM3.0, but it's not PCI-E native! ATI has 3Dc - possibly a comparable technology, visually different from SM3.0, but PCI-E native. Who cares if it's only 4x... nVidia has to overclock the AGP spec just to get it to work!
No wonder PCI-E 6800 Ultras are hard to find... they are overclocked in so many ways that they almost have to be perfect chips!

I bet when ATI gets it working... the tides will turn.

Guess what all the engineers at nVidia are working on right now? I think I know.