An exclusive gaming industry community targeted to, and designed for, professionals, businesses
and students in the sectors and industries of gaming, new media and the web, all closely
related to its business and industry.
A rich, content-driven service including articles, contributed discussion, news, reviews,
networking, downloads, and debate.
We strive to cater for cultural influencers, technology decision makers, early adopters
and business leaders in the gaming industry.
A medium to share or contribute your ideas, experiences, questions and points of view,
or to network with other colleagues here at iVirtua Community.
DDR1 is used by the majority of graphics cards and by system RAM itself. DDR2 is used in high-powered graphics cards like the 9800XT, but sucks for normal RAM because its latencies are WAY too high to outweigh the speed advantage. GDDR3 is used by the 6600GT, X600/X700XT, X800/Pro/XT/XL, X850/Pro/XL/XT, and 6800/GT/Ultra/SLI.
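The card-to-memory breakdown above can be sketched as a small lookup table. The mapping below is taken straight from this post, so treat it as the poster's claim rather than an exhaustive spec, and the card names chosen are just a sample of those listed:

```python
# Card-to-memory-type mapping, as claimed in the post above.
GPU_MEMORY = {
    "9800XT": "DDR2",
    "6600GT": "GDDR3",
    "X700XT": "GDDR3",
    "X800XT": "GDDR3",
    "X850XT": "GDDR3",
    "6800 Ultra": "GDDR3",
}

def memory_type(card: str) -> str:
    """Look up the memory type a given card is said to use."""
    return GPU_MEMORY.get(card, "unknown")
```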
Contributed by Predator, Guest
Thanks... I knew that DDR2 was used in higher-end graphics cards and that, at this time, GDDR3 was strictly for graphics applications. But I thought someone had mentioned that DDR3 would make its way into main system RAM, and THAT is what I was wondering about. Since DDR2 has high latencies, it would seem to logically follow that DDR3 would have more of the same.
The only two graphics cards I know of that used DDR2 were the ATI Radeon 9800 Pro 256MB and the NVIDIA GeForce FX 5800 Ultra/Pro 256MB.
These cards were test pilots for DDR2, but they did not perform as well as expected and ran hotter than usual, so I guess that is why ATI and NVIDIA skipped DDR2 and went with DDR3.
The AMD64 is a genius of a design.
Check this out: AMD64 CPUs have three modes.
Long mode: In this mode, the operating system performs in pure 64-bit mode, and supports only 64-bit applications.
Compatibility mode: Here, it supports both 32 and 64-bit applications under a 64-bit operating system.
Legacy mode: The legacy mode can only run current 32-bit operating systems; 64-bit applications cannot be used.
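The three modes above can be summarised in a small sketch. The mode names and capabilities come straight from this post; the table layout and helper function are just illustrative:

```python
# AMD64 operating modes as described above: which OS bitness each mode
# requires, and which application bitnesses it can run.
AMD64_MODES = {
    "long":          {"os_bits": 64, "app_bits": (64,)},
    "compatibility": {"os_bits": 64, "app_bits": (32, 64)},
    "legacy":        {"os_bits": 32, "app_bits": (32,)},
}

def can_run(mode: str, app_bits: int) -> bool:
    """True if an application of the given bitness runs in the given mode."""
    return app_bits in AMD64_MODES[mode]["app_bits"]
```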
Last edited by Super XP on Wed Jan 19, 2005 6:06 pm; edited 1 time in total
I guess they're taking the extreme models and adding a dual core. But they have some new core technology they plan to release that will run 30% cooler. This won't happen till late 2005, so we will see. Keep in mind for the first year you're looking at a price tag of $800.00 to $1,200.00. lol.
Hey TK+2, that sig is rude; it just told me I don't belong here. You need to squash that ugly little Linux nerd... lol.
DDR1 smokes DDR2 right now. DDR2 is a waste of money; 4-4-4-8 timings, what is that? Looks like the combo for my bike lock.
DDR1 is the best right now. DDR2 is a lost cause (in my opinion) and DDR3 rules 100%. For DDR2 to take full effect, it has to reach at least 667 MHz+ just to see a performance improvement.
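The latency argument running through this thread can be made concrete with a little arithmetic: CAS latency is counted in memory-bus clock cycles, so the absolute latency in nanoseconds is cycles divided by the bus clock. A quick sketch (the example clocks and CAS values are typical parts of the era, not numbers from this thread):

```python
def cas_latency_ns(bus_clock_mhz: float, cas_cycles: float) -> float:
    """Absolute CAS latency in nanoseconds: cycles / clock."""
    return cas_cycles / bus_clock_mhz * 1000.0

# DDR400 at CL2.5 (200 MHz bus) vs DDR2-533 at CL4 (266 MHz bus):
ddr400 = cas_latency_ns(200, 2.5)   # 12.5 ns
ddr2_533 = cas_latency_ns(266, 4)   # ~15.0 ns, slower despite more bandwidth
ddr2_667 = cas_latency_ns(333, 4)   # ~12.0 ns, only here does DDR2 catch up
```

This is the point made above: DDR2's higher CAS counts roughly cancel its clock advantage until it reaches DDR2-667 territory.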
I have to agree on that. I think Intel moved too fast with DDR2 and now it's killing them. Intel is forcing companies like MSI, Asus, Abit, SIS, etc. to buy their DDR2 motherboards, but the companies do not want them because they are sitting in the dust; nobody wants them right now.
Intel says that if they don't buy DDR2 mobos then Intel will not sell them P4 mobos.
I like Intel, but at this crazy rate of theirs I may be moving into the world of AMD sooner than I thought. Though the 8-way Opterons are great. :)
Have you seen the latencies on DDR2? My DDR 4000 runs at about the same speeds with lower latencies. Why would you need DDR2 when the only platform to support it is Socket 775, and those CPUs are limited next to the AMD FX and 64?
At this point in time, I see no REAL advantage to DDR2.
But what is it about DDR3 that makes it so good? I really don't know and would like to be informed.
Steve
PS: And yes, Super XP... I followed you over here from the Steam forum!
DDR3 can be clocked at faster speeds with extremely tight timings compared to DDR2. Plus, I heard that DDR3 runs cooler on lower voltage than DDR2.
The ATI Radeon 9800 Pro 256MB DDR and the NVIDIA GeForce FX 5800 Ultra both used DDR2. I think they were testing DDR2 but were not satisfied with the performance, so they went to the more efficient DDR3.
I think this crazy industry should skip DDR2 and move to DDR3. But thanks to Intel, we are going backwards in technology.
Good article. DDR2 needs to start moving into the low-latency realm, and when DDR2 becomes faster with enhanced low latencies, then AMD will adopt it. Actually, they have already adopted DDR2, just not for release yet.
They shouldn't be running that hot in the first place. Though I think Intel should have gone with DDR3, just like the graphics card companies did. It seems like the more logical choice, unless they can't get memory speeds up to par with DDR3 standards.