Performance numbers have been out since Friday. People have been posting benches all weekend, as some stores sold theirs early (most likely to staff at the store). Kinda kills having an NDA on it when the scores, and the cards themselves, are already public. What's best is that the first scores that emerged had the x1900xtx performing worse than the "Performance Edition" x1800xt, and from my own experience, far worse than even my own card, which hits 9256 in 3DMark05 at default clocks with the CPU at 3GHz. The dude posting the scores ended up having his card at 500MHz on the GPU and 600MHz on the mem, reporting 9042 3DMarks! Needless to say, this gives you a general idea of how powerful the x1900xt really is, as the stock speeds are 650/675.

I've been telling people here for the longest time to await R580, and there is a reason. Because of the 8/48/16 config of the R580 vs. the 8/16/16 config of the x1800xt (3x the shader/fragment processors for the x1900xt, while the raster and vertex processing remains the same), there are a lot of apps where the difference in performance between the two cards is negligible. Never mind the yield issues that surround the R520 chipset; those may have nothing to do with the chip at all, and more to do with the power-regulation silicon.

Most important to note is that when you mix the increase in fragment processing with the ring-bus memory controller, which can hand fully rendered scenes back to the GPU if needed, you end up with quite an incredible unit that can put anything Nvidia has to offer to shame in real-world work, regardless of what the current 3DMark app says. Did I mention the GPU can handle 512 threads simultaneously? And you thought dual cores were intense!
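To put rough numbers on that shader-config argument, here is a quick back-of-the-envelope sketch in Python. The unit counts are the commonly quoted R520/R580 configs; the shared clock and the per-frame workload figures are made-up placeholders, chosen only to show why a texture-bound app sees almost no gain from tripling the pixel-shader ALUs:

```python
# Back-of-the-envelope bottleneck model for R520 (X1800 XT) vs R580 (X1900 XT).
# Unit counts are the commonly quoted configs; the same nominal clock is used
# for both so that only the ALU count differs. Workload figures are invented
# placeholders, not measurements.

CARDS = {
    "X1800 XT (R520)": {"pixel_alus": 16, "tmus": 16, "core_mhz": 625},
    "X1900 XT (R580)": {"pixel_alus": 48, "tmus": 16, "core_mhz": 625},
}

# Hypothetical per-frame work, in billions of shader ops / texture fetches.
# Only the ratio of shader work to texture work matters here.
WORKLOADS = {
    "shader-heavy app": {"shader": 30.0, "texture": 5.0},
    "texture-heavy app": {"shader": 6.0, "texture": 12.0},
}

def frame_time(card, work):
    """Frame time (arbitrary units) set by whichever unit is the bottleneck."""
    shader_rate = card["pixel_alus"] * card["core_mhz"] / 1000.0   # Gops/s
    texture_rate = card["tmus"] * card["core_mhz"] / 1000.0        # Gfetches/s
    return max(work["shader"] / shader_rate, work["texture"] / texture_rate)

for wname, work in WORKLOADS.items():
    times = {name: frame_time(card, work) for name, card in CARDS.items()}
    slowest = max(times.values())
    for name, t in times.items():
        print(f"{wname:18s} {name}: {slowest / t:.2f}x relative performance")
```

With those toy numbers, the shader-heavy case comes out roughly 3x faster on R580, while the texture-heavy case is a dead heat, which is exactly the "negligible difference in a lot of apps" point.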
__________________
Last edited by cadaveca; 24th January, 2006 at 04:31 AM.
To be honest, I consider it a complete letdown. In all fairness, I expect Nvidia's G71 card to be equally underwhelming, only with lower image quality as well.
__________________
Of course, Dsio. These cards are just filler to keep consumers happy until unified shading can get some real support, as has been the case since before DX9 was a standard. ATI's unified shading core has been massaged and tweaked since then! Default scores match the 3DMark score in my sig.
__________________
I think the 6800GS (PCI-E) is the best mid-range card on the market. I think the X1900 doesn't even fill a space. I think we are well overdue for the next generation, and I'm getting tired of waiting. But that's just me. When progress stays this stagnant for this long, it bugs me.
__________________
The x1800 will be disappearing from the shelves shortly; the current generation of R520 GPUs stopped production last month. The x1900xt, with its greater number of fragment processors, is the x1800's successor. As for the x1800xl, it is by no means a bargain card: there are problems with the voltage regulation on these cards that can cause an untimely death.
__________________
All the better to RMA them with.
__________________
They're available at Newegg. Anybody feel nice enough to pick me up one as a birthday gift?? http://www.newegg.com/Product/Produc...&Go.x=0&Go.y=0
__________________
I'm going to have to point out that even with triple the shaders, the x1900xt barely beats the x1800xt. From that standpoint the series is kinda useless at this point, since nothing seems to fully use the 48 shaders. I'll keep the x1800xt till the G80 comes out, then I'll buy ATI's response to that (yes, I'm hopeless, but whatever).
__________________
Well, the reason for the config used in the x1900 has a lot to do with where ATI has invested its dollars as of late. You can be very sure that the list of software partners at the "Get in the Game" website will have software out, whether full games or patches to old ones, that supports ATI's design decision. At this point in the market, ATI can only hope that their driver will make up for the difference between their architecture and the competition's, and I myself am fairly confident that it will.
__________________
Truly, but I doubt it will until late Q2 or early Q3 '06. By then AMD's Socket M2 will debut, and ATI's RD580 chipset will probably delay its release until that time (since it's useless at this point to make a new 939 chipset). The x1900s should be pretty futureproof, but I'd liken them to the current AMD dual cores: nice, but premature. When the M2 is released, the new dual cores will have at least 4 major game titles out (probably more). I'd expect the G80 and ATI's response to it to be the same story.
__________________
http://www.gamespot.com/features/6142842/p-2.html

Dave and Dsio, would you please take a look at the above link? Are these tests real? As I see it, they are not matching the ATI against the latest 512MB Nvidia card. Why the hell are they not doing that? From this, anyone would think ATI has killed its competition.
The benchmarks he has run are fudged at the least. The thing that gives it away is Quake 4. In his tests, an X1900XT single card comes insanely close to a PAIR of 7800GTX 256 cards in SLI. Quake 4 is ATI's weak point, and Nvidia's real performance niche. For the X1900 to even come close to a pair of GTXs in SLI on that bench is complete rubbish. The man is a noob, with absolutely no clue about video cards in the slightest.

http://www.gamespot.com/features/6142842/index.html

"Architecturally, the X1900 is similar to the X1800, but ATI decided to throw in even more of the good stuff. Whereas the X1800 has 16 pixel pipelines, or pixel shader processors as ATI refers to them, the X1900 XTX comes with an astounding (and baffling) 48 pixel shader processors. It also has a core that's clocked at 650MHz and 512MB of GDDR3 RAM clocked at 1.55GHz. All these numbers boil down to a card that's very fast, in specific ways."

He doesn't know what a pixel shader is. That is clear. He doesn't know how many pixel pipelines (TMUs) the X1900 has; rather, he thinks ALUs are TMUs. I know I am not being completely accurate in describing the pixel pipeline, but this is close enough for the purpose of making my point.

"Whereas the X1800 has 16 pixel pipelines, or pixel shader processors as ATI refers to them"

ATI does not call them that, because ATI actually knows what a video card is. Basically, the X1900 is a 16/48/8 card, as opposed to the X1800, which is a 16/16/8 card. He thinks the X1900 is a 48/48/8 card because he doesn't understand it. Because he thinks it has 48 pixel pipelines with a normal 1:1 TMU/ALU ratio rather than an asymmetric 3:1 ratio card, he's fudged his figures.
__________________
Last edited by dsio; 25th January, 2006 at 04:37 PM.
Reading that post back, it's still a little misleading, to be honest. The point I was trying to make was that the X1900 is often made out to be a "scaled-up" R520, with 48 complete pixel "pipelines" in the same sense as they appear on the R520, which would imply that by tripling the number of pixel pipelines, the R580 is three times faster than the R520. This isn't the case, and that is what I was trying to explain, and what that reviewer seems to have missed, along with many others. The whole thing is rather misleading. I've read into this enough to fully understand what they did and how it works, yet I still get a headache trying to explain the situation. It's a marketing department's pipe dream.
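One way to see why tripling the ALUs doesn't triple the frame rate is to treat it like Amdahl's law: only the shader-bound fraction of a frame scales with the extra ALUs, while texture and ROP work still goes through the same 16 units. A minimal sketch in Python; the shader-bound fractions below are illustrative guesses, not profiled numbers:

```python
# Why 16 -> 48 pixel-shader ALUs is not an automatic 3x speedup:
# only the shader-limited share of frame time benefits from the extra ALUs.
# The shader_fraction values are hypothetical, purely for illustration.

def r580_speedup(shader_fraction: float, alu_ratio: float = 3.0) -> float:
    """Amdahl-style expected speedup over R520 when only shader-bound
    time scales with the ALU count."""
    return 1.0 / ((1.0 - shader_fraction) + shader_fraction / alu_ratio)

for frac in (0.1, 0.3, 0.5, 0.8):
    print(f"frame {frac:.0%} shader-bound -> ~{r580_speedup(frac):.2f}x over R520")
```

Even a frame that spends 80% of its time in the pixel shaders only comes out a bit over 2x faster, which is why the gap over the X1800 looks so small in today's texture-bound games.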
__________________
lol, that review was messed up, especially since the numbers seem made up to me (he probably got himself a nice free ATI card and decided to reward them for it, lol). But with Quake 4 being based on the Doom 3 engine, NV should pwn ATI, much like in all Source engine games, where ATI pwns NV. It's always been that way, though, even back in the Nvidia GF3 vs. ATI 7500 days: Direct3D has always been ATI's game and OpenGL has always been Nvidia's. Quite funny, actually.
__________________