642 results for nvidia
Don't buy Nvidia's GTX 200 cards now! Price cut on the way! in Gaming
Mountain House (CA) – It appears that AMD’s ATI Radeon 4800 GPU has turned out to be a much better chip than initially expected and AMD’s aggressive pricing puts enough pressure on Nvidia to prompt price adjustments. If you are planning on purchasing a GTX 260 or 280 card, you may want to delay until next Monday. Price cuts are on the way.

Over the past several days we have spent some time with several of Nvidia's partners in Silicon Valley and in Taiwan, and it became obvious that there are tensions between GPU manufacturers and add-in board companies. A raging price war has put five companies on the verge of bankruptcy. We cannot disclose who almost kicked the bucket, but we were told that three vendors are still walking on very thin ice.

With the debut of the GeForce GTX 200 series, Nvidia made some adjustments to protect its partners with greater margins, but the ATI Radeon 4800 series is changing that scenario again. AMD has received praise from the press for its ATI Radeon 4800 series, causing Nvidia partners to demand price adjustments. We were told that Nvidia finally stepped down from its pedestal and agreed to offer limited price protection for some products - as well as price cuts.

We contacted Nvidia to get more details on this information and were provided with the following statement by Bryan Del Rizzo and Ken Brown, spokespeople for Nvidia:

"We’re working with our partners on adjusting the prices for the GTX 280 and 260. The changes are being implemented over the next few days and will take effect sometime next week. Please obtain final retail pricing from the partners, because they set them for their products."

The third sentence has to be taken with a grain of salt, as most partners complained to us about Nvidia's Unilateral Minimum Advertised Price policy (UMAP for short) and the way it affects them. For the consumer, however, this is yet another example of how well competition works. The race between the Radeon 4800 series and the GTX 200 will ultimately drag prices down.

According to our sources, Nvidia cut the price of the GTX 280 by $90 and the GTX 260 by $30. Of course, that is a price cut Nvidia is handing down to its partners and does not reflect retail prices.

GeForce GTX 260 cards currently sell for $379.99 on Newegg ($399.99 minus $20 instant rebate on XFX, BFG and PNY cards), while GTX 280 cards sell for $619.99 ($649.99 minus $30 for the XFX board).

After Nvidia's adjustment, we should see Monday prices going down to $359.99 for GTX 260 cards and to $559.99 for GTX 280 cards. Please note that we are not including the possibility of additional rebates that may be offered. For example, if you purchase PNY's GTX 280 from Newegg.com, you currently pay $569.99. After the price cut, this might dip to $479-499, taking the price below the $500 mark.
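As a quick sanity check, the expected Monday prices work out to a $20 drop on the GTX 260 and a $60 drop on the GTX 280 at retail - smaller than the $90/$30 partner-level cuts, since board partners set their own shelf prices. A minimal sketch of the arithmetic, using only the figures quoted above:

```python
# Current Newegg prices vs. the prices expected after the adjustment,
# taken from the figures quoted in the article.
current = {"GTX 260": 379.99, "GTX 280": 619.99}
expected = {"GTX 260": 359.99, "GTX 280": 559.99}

for card in current:
    drop = current[card] - expected[card]
    pct = 100 * drop / current[card]
    print(f"{card}: -${drop:.2f} at retail ({pct:.1f}% cheaper)")
```

The percentage drop at the shelf (roughly 5 and 10 per cent) is noticeably smaller than the partner-level cut, which is consistent with Nvidia's statement that partners set final retail pricing.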

It appears that we might end up with permanent price brackets at $199, $299, $399 and $499. This would greatly simplify the search for the best possible graphics card at a certain price point. It also opens the battlefield between single and multi-GPU setups: could two $199 boards provide more value than a single $399 or $499 card?

We are sure hardware review websites are going to find out.
Posted by Editorial Team Tue Jul 08, 2008 7:48 am
Build a DX10 rig for under £176 in Hardware, Internet, Networking, Comms and Security
We like big explosions, the bigger the better, in fact.
We also like smoke effects, water ripples, dappled lighting filtering through jungle canopies and creeping up silently behind people, before murdering them with our bare hands. But enough about our weekend pastimes…
What we really like are the fantastic visuals that DX10 gaming offers.
If you listen to most people, they will tell you that you need a quad-core, DDR3, triple-SLI setup to play Crysis. The sort of setup that requires you to remortgage your house to own. These people are wrong, and we're going to show you why.
We've previously demonstrated how to build a DX10 setup for just under £300, a not unreasonable amount that should be within the reach of most people. But what if you just blew all your money on a sordid weekend in Amsterdam, and you've resorted to scrambling under the sofa for loose change? Would you believe us if we told you that it's possible to build a DX10-capable rig for well under £200? Well, it's true.
Of course you can't connect it to a 22-inch widescreen monitor without the frame rates plummeting, but if you're on that tight a budget, a big monitor is probably the least of your concerns.
Whenever you work to such a tight budget, something has to give, and this project will be no exception. We need to prioritise in certain areas, while others can be largely ignored.
Yes, a case is important to stop your gear being an untidy heap of electronics on the floor, but really you just need a metal box to screw things onto. Optical drives are dirt cheap, and with memory stick capacities being what they are, hardly anyone burns DVDs, so we only need a DVD-ROM drive.
It also means no quad-core and no SLI. But dual-core chips are surprisingly cheap, and we'll see just how well a budget DX10 card performs. Don't forget that if you have any parts available from an existing PC, such as cases and drives, you can reuse them and put the money towards a higher-end CPU or graphics card.


       
Case and PSU
If you want a well-designed case, with plenty of fans, numerous ports and plenty of upgradability, then it's easily possible to spend more than our entire budget on such a beast.
Likewise, if you want a 1kW PSU that supports the likes of SLI, then it's going to cost a fair amount of cash. At the other end of the spectrum is the all-in-one case and PSU deal. We found one for just £26, which includes a 400W PSU.
This may not sound like a lot of power, but it's more than enough to run our setup. When spending such a small amount of money on a case, you'd expect it to be quite horrific, but it's surprisingly well featured. It has a matt-black finish, which helps on the looks front, and the front panel has USB and audio ports.
Most of the internals can be fitted without the need for a screwdriver, and it even has a locking side panel. Sure, it isn't the best-looking or quietest case we've ever seen, but for this sort of money, we're not complaining.
The result?
As the most basic DX10 card available from NVIDIA, it comes as no surprise that the performance of the 8400 is not the best.
However, at £20 it still does pretty well, as long as you keep the resolution realistic. Okay, not everyone wants to play at 800x600, or even 1024x768, but then you shouldn't be so cheap, should you?
Surprisingly, Crysis gave some of the best results, although BioShock achieved the best framerates of all. Using the Optimal settings button, Crysis did set all the detail levels to low, but the results still looked pretty good. However, if you're going to be realistic about playing DirectX 10 games, then you are going to have to find a little more money in your budget.
Hooking the 9600GT up to our budget CPU worked absolute miracles, and at around £70 extra it is an absolute steal. Not only could we turn the detail right up, but we could run a higher resolution and still get twice the frame rates of the 8400GS.
Surprisingly, adding a high-end quad-core CPU doesn't give much of an increase with either the 8400 or 9600GT. In conjunction with the 8400GS, the Phenom 9550 does give you some extra fps over the Athlon X2 4400+, but when it comes to the 9600GT, the difference over the 4400+ setup is marginal.
So, if you want the best performance, and can spend a little extra, buying the 9600GT is the logical choice. You know it makes sense.
Posted by Editorial Team Tue Jul 08, 2008 7:47 am
I'm confused please respond in Hardware, Internet, Networking, Comms and Security
Because this isn't a mandatory question I'll keep it short.

But basically, what I've been seeing lately is that Nvidia and ATI are just about equal in performance. But I'm so confused as to what ATI is doing wrong that is so ineffective. For example, most Nvidia cards have a 600/1700MHz clock speed (core/memory) and no more than 128 streaming processors. ATI's competitive cards are more like 800/2000MHz with 240 streaming processors.

This is just an example off the top of my head, but it's about what things are like right now. So what I want to know is: what is ATI doing that makes everything they have significantly slower? Their latest card has 640 streaming pipelines, that's nuts. Yet it just barely competes with Nvidia's good stuff, and Nvidia's sure as hell doesn't have over 320.
Posted by schmidtbag Mon Jun 30, 2008 11:01 am
Nvidia releases PhysX code for latest GeForce GPUs in Hardware, Internet, Networking, Comms and Security
Nvidia has posted a version of its PhysX software that enables the physics-on-GPU technology on its GeForce GTX 200-series and 9800 GTX graphics chips.
The new release, version 8.06.12, was posted last night and builds on software Nvidia acquired when it bought physics chip specialist Ageia in February this year. Like past versions of the PhysX code, the new version also runs on Ageia PhysX chips.
But the crucial change is support for Nvidia GPUs, for which the latest version of the company's Forceware drivers is required: 177.39, an advance on the version currently available from Nvidia's Forceware download page, which is 177.35, released a couple of weeks ago.

Unreal Tournament 3: zapped, physically

The new drivers run on Windows XP and both the 32- and 64-bit incarnations of Vista.
Of course, you also need an app that can take advantage of the GPU and the physics code, and for that Nvidia offers a link to the Unreal Tournament 3 PhysX Mod Pack, which incorporates a couple of new arenas to show off the game's "maximum impact" physics effects. These include damage to the world in which the game is set, a whirlwind hazard that sweeps around the battlefield, and weapons that can pull debris towards the player.
The PhysX code is available from Nvidia's website here, while the Unreal Tournament add-on can be downloaded here.
Posted by Editorial Team Sun Jun 29, 2008 6:42 am
Fastest ever graphics card hits shops Nvidia’s Zotac GTX 280 in Hardware, Internet, Networking, Comms and Security
World's fastest core clock speed isn't something you would like to say ten times fast (unless you want funny looks), but this is the impressive boast from Nvidia about its latest graphics card, the Zotac GTX 280 AMP! Edition.
In tests the Zotac topped 700MHz straight out of the box – well, we presume the company plugged it in first – which, according to Nvidia, is "so-far unbeaten by any other manufacturer."
Performance outweighs price
The new graphics card offers a massive 12 per cent performance hike over its now-slower brethren, the GTX 280, but is only priced £20 to £30 higher. This equates to just a 5 per cent price rise – Nvidia's maths, not ours.
In a statement about the new graphics card, Carsten Berger, marketing director at Zotac, said: "Yet again Zotac has demonstrated its exceptional ability to design and engineer the fastest card on the market.
"Overclocking the GTX 280's core by almost 100MHz is a fantastic achievement, but to then mass produce it and maintain a five-year warranty is outstanding."
The Zotac GTX 280 AMP! Edition is out now for around £450.
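Zotac's maths can be checked with a little arithmetic. Assuming a stock GTX 280 at roughly £425 (a hypothetical base figure - the AMP! Edition itself sells for around £450), the quoted £20-30 premium does land near 5 per cent, against the claimed 12 per cent performance gain:

```python
# Hedged sketch: base_price is an ASSUMED stock GTX 280 price in GBP,
# not a quoted figure; the premium is the midpoint of the quoted £20-30.
base_price = 425.0
premium = 25.0
perf_gain_pct = 12.0  # Nvidia's claimed performance hike

price_rise_pct = 100 * premium / base_price
print(f"price +{price_rise_pct:.1f}%, performance +{perf_gain_pct:.0f}%")
```

On those assumptions the price rise comes out just under 6 per cent, so "12 per cent more performance for roughly 5 per cent more money" is in the right ballpark.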
Posted by Editorial Team Sun Jun 29, 2008 6:38 am
ATI releases new Radeon 4800 cards - Ends NVIDIA dominance? in Hardware, Internet, Networking, Comms and Security
NVIDIA's long-time hold on the graphics market is under threat from ATI, with the release of its new Radeon 4800 series cards today.
Initial tests on ATI's 4850 model are very positive, with the new graphics card said to easily match Nvidia's latest offerings in terms of performance, while blowing them out of the water on price.
High performance, good price
At £130, the ATI Radeon 4850 is quite a bargain. And with more advanced cards on the horizon, such as the Radeon 4870, threatening to set a performance benchmark, it does seem that NVIDIA's dominance of the high-end graphics market (currently the GeForce GTX 280 is the one to beat) could be nearing an end.
We will of course be sure to bring you further news on all new graphics card tech from NVIDIA and ATI as and when we get it.
Posted by Editorial Team Sun Jun 29, 2008 6:34 am
Nvidia turning its GPUs into 'PhysX Physics Processors' in Hardware, Internet, Networking, Comms and Security
3D card manufacturers shouldn't take this the wrong way, but it takes a lot to make us crawl out of the communal Eurogamer bed (yes, all the Eurogamer writers share a single large bed - we do it for frugality and communality, which remain our watchwords) and go to a hardware presentation. There's a nagging fear someone may talk maths at us and we'd come home clutching the local equivalent of magic beans. And then we'll be laughed at by our fellow writers and made to sleep in the chilly end where the covers are thin and Tom left dubious stains. That's no fun at all.
Then again, there are some things you can't help but go and have a gawk at. So when an invite claims, "All too often new hardware brings with it a small performance increase - maybe 5-10 per cent over the previous fastest thing. Wouldn't it be far more exciting to see a speed increase of x20 or even x100... well, we'll be happy to show just that on Friday," you have to wander along. Even though you suspect it may be a trap and they're going to attack you with ill-shaped blades, you have to find out what on earth they're talking about.
As we suspected, it wasn't quite what we were hoping for. Sure, there are programs which gain a x100 increase via the methods NVIDIA talked about on this particular Friday, but unless you're working in economics or astrophysics modelling, it's not exactly that relevant. However, something more quietly astounding was explained. Mainly that, despite the fact that no-one you know bought a PhysX card, if you're a PC gamer with a relatively recent NVIDIA card, you've already got one. Or, at least, you will soon. Spooks.

Get him!

The primary idea NVIDIA was trying to push was Optimised PC - the approach discussed in Rob Fahey's interview with Roy Taylor the other day. The idea being that the traditional PC approach, where you buy the fastest CPU you can, doesn't actually deliver the best results, at least in most situations. If you spend more on - predictably - a GPU-driven 3D card, then for an increasing number of areas you're going to get much higher performance. If the program is using the GPU in a meaningful way, anyway. NVIDIA highlights areas like image-processing and HD video-encoding, as well as - natch! - games. You lose in single-threaded activities - like, say, just booting up a program - but they argue a small loss in opening a Word document is less noticeable than frames in games or similar.
Where it starts getting interesting is NVIDIA's development language, CUDA. The problem with all the threaded programming methods is that they're radically different to single-threading (and, yes, we're getting into "why would anyone care about this but a programmer?" territory, but it's background for the key point later). It's hard to do, and CUDA is basically a way to make things more accessible.
NVIDIA claims anyone experienced in C or C++ will be able to get a grip on it (i.e. not us, but the aforementioned programmers). This means that anyone who codes in CUDA can program the GPU to do pretty much whatever they like; it's by turning the 3D card into a bank of processors that the financial analysts and the astrophysics guys are getting such impressive results. And impressive savings, as it's a lot cheaper to do it this way.

Now, NVIDIA claims that the fact GPU solutions are cheaper is going to push better GPUs into more business machines. This will help push the idea that an okay-CPU/good-GPU machine gives better performance than a good-CPU/okay-GPU one, leading to more machines with better GPUs... and so making more PCs abstractly available for gaming. Or, at least, raising the bottom level of hardware that you can expect people to have.
In terms of more general use, transcoding video can take hours. Later in July, all GeForce 8000+ cards will ship with Elemental HD, a program which manages to perform the odious task - in the words of NVIDIA - "in a matter of minutes". The software will also be available for people to download online, probably with a small fee a la Quicksave, if they already have a GeForce card.
Point being: this CUDA malarkey isn't something that's just for future NVIDIA technology. It's something that allows the hardware many PC gamers already have to be repurposed.
For example, PhysX. NVIDIA's physics 3D card system was only supported in a minor fashion, as no-one would buy a card just to make explosions fancier, but with CUDA it can run on one of the other GPUs. A proportion of the 3D card's power can be given over to running physics, giving those fancy PhysX-style interactions without actually having a specific card for it. CUDA's PhysX port will become available to the public in July, but developers already have the tools.

The Euphoria engine of Natural Motion. It's hard to illustrate this sort of thing.

You'll be able to - for example - manually, up front, decide to devote a proportion of your 3D card's power to PhysX. Alternatively, developers can commandeer it and do exactly the same thing. The new generation of cards which are about to be announced are able to deal with pretty much anything that exists on the highest setting with power left over, so that spare power can be given over to physics.
And it goes further. Where previously you'd have just thrown out your old 3D card when you upgraded your PC to a new one, if you have a GeForce 8000+ 3D card already, you can keep it and just set it to concentrate solely on PhysX tasks. This isn't an SLI situation where you need two of the same cards working in tandem - any post-8800 card, rather than being put out to digital pasture, can be given the job of deciding how bits of glass bounce off a skyscraper, or similar. NVIDIA claims it's talking to ATI to try and get them to use CUDA too, which... well, we'll see there, eh?
The potential is interesting. Demos shown included Natural Motion, whose Euphoria engine is heavily physics-dependent, allowing unique, convincing moments in games. A straight collision isn't enough, as straight ragdolls are ludicrous - the system involves AI (so the hit object will try to move limbs to protect itself and similar), leading to impressively naturalistic results. The first public sign of this was in Grand Theft Auto IV, but Natural Motion's own American football game, Backbreaker, is a fascinating example of what a physics-heavier approach to collisions can give games. And, with CUDA-esque use of GPUs to do this stuff, the PhysX-related boon is accessible to even more of us.
So they did talk some maths, then, but we survived.
Posted by Editorial Team Mon Jun 23, 2008 5:11 pm
No future for PC-exclusive games, says Nvidia in Business and Industry in Gaming, Media, Web, IT and Computing
Nvidia's VP of content business development, Roy Taylor, has said he believes the value of consoles means "no-one is going to make PC-exclusive games in the future".
Speaking exclusively to Eurogamer, he said he wasn't threatened by the machines from Sony, Microsoft and Nintendo; instead he sees an "exciting future" of co-existence.
"I think we have to face the facts - the value of consoles is such that no-one is going to make a PC-exclusive game in the future. Why would they? Why would they ignore consoles?" said Taylor.
"That said, PC gaming is changing - and consoles don't threaten PC gaming. They're just different. Adapting to that and understanding that is what I think is really, really important.
"Most PC gamers also own consoles - not all of them, but a lot of them. What we're seeing happen is that, yes, people are developing for Xbox 360, for PS3 - but they're also developing for PC," he added.
The reason Taylor is excited is that PC versions of games, which he says are generally "better", use console code as a baseline, and the better the baseline, the better the desktop conversion.
"The console is now a baseline. If you look at Gears of War or Assassin's Creed, they came out on console and they were great experiences - but the PC versions had additional aspects to them that also made them attractive, whether you owned the console version or not," continued Taylor.
"The PC version was better. That's something that people need to get their heads around - the console is a baseline, the PC is going to be an improved version. That's an exciting future, and that's why I don't see anything threatening about console at all."
Read Eurogamer's interview with Nvidia's VP of content business development, Roy Taylor, to see what he has to say about the future of graphics, why integrated solutions are ruining everything, and how the PC installed base means it will never disappear as a gaming platform.
Posted by Editorial Team Fri Jun 13, 2008 5:59 pm
AMD to pair CPUs, GPUs with Intel's physics tech in Hardware, Internet, Networking, Comms and Security
AMD is partnering with Intel to improve the way its graphics chips can handle physics and other scientific calculations.
Well, sort of. AMD's actually working with Intel subsidiary Havok, which the chip giant acquired last year. Havok operates separately from Intel to develop its Havok FX physics processing API, which allows developers to code such algorithms to run on GPUs rather than CPUs.
Its main rival was Ageia, developer of a similar API and a dedicated chip, PhysX, to run the calculations. Ageia, however, is now part of Nvidia, which is understandably playing down the PhysX chip while promoting Ageia's software technology as a way of running physics calculations on its own GPUs.

All this stuff is going to run on discrete graphics chips, so it makes more sense for AMD to partner with an Intel company, which isn't competing with it - yet - in the discrete GPU market.
The partnership will ensure that Havok FX can take full advantage of the idiosyncrasies of AMD's Radeon GPU architecture and of its x86 processors.
Games, in particular, are increasingly incorporating algorithms that can model complex interactions between players and the worlds they inhabit. Traditionally, these calculations have been handled by the CPU, but they're better suited to the GPU's parallel-processing design, which whizzes through them where the general-purpose CPU would struggle.
Posted by Editorial Team Fri Jun 13, 2008 4:57 pm
Spore's minimum specs are surprisingly low in Gaming
The latest news on Will Wright’s magnum opus, Spore, is that the minimum PC (and Mac) specifications required to run the game are surprisingly low.

This is great news for the majority of casual PC gamers who don’t tend to spend an inordinate amount of time and money constantly upgrading their PCs and graphics cards.

Maxis/EA has also released the Sporepedia on the official Spore.com site to whet the appetites of those PC and Mac gamers looking forward to Wright’s immense and hugely ambitious game.

The specs required to run Spore on your PC or Mac are as follows:

Windows XP
  • 2.0 GHz P4 processor or equivalent
  • 512 MB RAM
  • A 128 MB video card with support for Pixel Shader 2.0
  • At least 6 GB of hard drive space

Windows Vista
  • 2.0 GHz P4 processor or equivalent
  • 768 MB RAM
  • A 128 MB video card with support for Pixel Shader 2.0
  • At least 6 GB of hard drive space

Mac OS X
  • Mac OS X 10.5.3 Leopard or higher
  • Intel Core Duo processor
  • 1024 MB RAM
  • ATI X1600 or NVidia 7300 GT with 128 MB of Video RAM, or Intel Integrated GMA X3100
  • At least 4.7 GB of hard drive space for installation, plus additional space for creations
Posted by Editorial Team Fri Jun 13, 2008 4:51 pm
Nvidia GT200 successor out: GT200b: Nvidia faces tough summer in Hardware, Internet, Networking, Comms and Security
Last time Nvidia's CEO mouthed off about Intel, the firm followed it up with the stunning NV5800 'Dustbuster'. This time he has mouthed off again, and the successor to the GT200 has already taped out. NV is in deep trouble once again.
You heard that right: the successor to the GT200 chip has already taped out, and it too will be nothing special. Called the GT200b, it is nothing more than a 55nm shrink of the GT200. Don't expect miracles, but do expect the name to change.
There are several problems with the GT200, most of which are near fatal. The first is the die size, 576mm^2, bigger than most Itanics. One might trust Intel to make a chip that big with decent yields, especially if it is basically an island of logic in the middle of a sea of cache. Nvidia, using a foundry process, doesn't have a chance of pulling this off.
Word has come out of Satan Clara that the yields are laughable. No, make that abysmal: they are 40 per cent. To add insult to injury, that 40 per cent includes both the 280 and the 260 yield-salvage parts. With about 100 die candidates per wafer, that means 40 good dice per wafer. Doing the maths, a TSMC 300mm 65nm wafer runs about $5000, so each good die costs $125 before packaging, testing and the like. If they can get away with sub-$150 costs per GPU, we will be impressed.
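The die-cost maths above is simple enough to spell out. Taking the article's figures at face value (about $5000 per 300mm 65nm wafer, roughly 100 die candidates, 40 per cent yield):

```python
# Per-die cost from the article's reported figures.
wafer_cost = 5000.0   # TSMC 300mm 65nm wafer, USD
candidates = 100      # die candidates per wafer (576mm^2 die)
yield_rate = 0.40     # reported yield, 280 and 260 salvage combined

good_dice = candidates * yield_rate
cost_per_die = wafer_cost / good_dice
print(f"{good_dice:.0f} good dice per wafer -> ${cost_per_die:.0f} per die")
```

That gives 40 good dice and $125 per die, matching the figure in the text, before packaging and test costs are added.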
So, these parts cost about $150, and the boards will sell for $449 and $649 for the 260 and 280 respectively, so there is enough play room there to make money, right? Actually, most likely yes. There are costs, though - but not enough to kill profit for anyone touching these.
The biggest cost is memory. The 512-bit memory width means that they will have to use at least 16 chips. This ends up making the board very complex when you have to route all those high-speed signals, and that means more layers, more cost, and more defect fallout with the added steps. You also have to solder on eight more memory chips, which costs yet more.
To add insult to injury, the TDPs of the 260 and 280 are 182W and 236W respectively. This means big copper heatsinks, possibly heatpipes, and high-end fans. Those parts cost a lot of money to buy, assemble and ship. Not fatal, but not a good situation either. It also precludes a dual-GPU board without losing a lot of speed.
Basically, these boards are going to cost a lot of money to make, not just to buy. The $449 price is justified by the cost. The last round of GX2 boards cost OEMs about $425, meaning that NV charges OEMs about 70 per cent of retail for high-end parts. After packaging, shipping and add-ins, there is almost nothing left for the OEMs, quite possibly explaining why one of their biggest ones is teetering on the verge of bankruptcy, kept alive because NV won't call in its debt while still shipping to it. Watch for this to melt down once NV loses the high end.
So, you end up with an expensive chip on an expensive board that makes few if any people money. Fair enough: bleeding-edge parts mean bleeding-edge prices. The problem is that ATI is going to make a chip that competes with the GT200 and lines up with it point for point. NV wins big on Z fill, ATI crushes them on shader FLOPS. What this translates to in the real world is still up in the air, but it looks like the 770 and the 260 will be about equal for most things.
The GT200 is about six months late, blew out its die size estimates and missed clock targets by a lot. ATI didn't. This means that a GT260 board will cost about 50 per cent more than an R770 for equivalent performance. The GT280 will be about 25 per cent faster but cost more than twice as much. A month or so after the 770 comes the 700, basically two 770s on a slab. This will crush the GT280 in just about every conceivable benchmark and likely cost less. Why? Because.
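Put in price/performance terms, the claims above look like this. The figures are hypothetical normalised ratios derived from the article's rough claims (R770 at 1.0; GT260 about 50 per cent dearer at equal performance; GT280 about 25 per cent faster at more than twice the price), not quoted prices:

```python
# Hypothetical normalised figures from the article's claims.
cards = {
    "R770":  {"price": 1.0, "perf": 1.00},
    "GT260": {"price": 1.5, "perf": 1.00},  # ~50% dearer, equal perf
    "GT280": {"price": 2.2, "perf": 1.25},  # >2x price, ~25% faster
}
for name, c in cards.items():
    ratio = c["perf"] / c["price"]
    print(f"{name}: performance per unit price = {ratio:.2f}")
```

On those assumptions the R770 delivers roughly half again as much performance per unit price as either Nvidia part, which is the whole of the article's argument in one number.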
So, what is a company to do when it promised the financial world that ATI was lost, and that the GT200 would raise its margins by 100 basis points or so? Surely they knew what was coming a few weeks ago during their financial call, right? I mean, if word was leaking days later, the hierarchy surely was aware at the time, right?
The answer to that is to tape out the GT200b yesterday. It has taped out, and it is a little more than 400mm^2 on a TSMC 55nm process. Given that TSMC tends to price things so that, on an equivalent-area basis, the new process is marginally cheaper than the old, don't look for much cost saving there. Any decrease in defectivity due to the smaller area is almost assuredly going to be balanced out by the learning curve on the new process. Being overly generous, it is still hard to see how the GT200b will cost less than $100 per chip. Don't look for much cost savings there.
The new shrink will be a much better chip though, mainly because they might fix the crippling clock rate problems of the older part. This is most likely not a speed-path problem but a heat/power issue. If you get a better perf/watt number through better process tech, you can either keep performance the same and lower net power use, or keep power use the same and raise performance.
Given NV's woeful 933GFLOPS number, you can guess which way they are going to go. This means no saving on heatsinks, no savings on components, and a slightly cheaper die. For consumers, it will likely mean a $50 cheaper board, but no final prices have come my way yet. It will also mean a cheaper and faster board in a few months.
The GT200b will be out in late summer or early fall, instantly obsoleting the GT200. Anyone buying the 65nm version will end up with a lemon - a slow, hot and expensive lemon. Kind of like the 5800. It would suck for NV if word of this got out. Oops, sorry.
What are they going to do? Emails seen by the INQ indicate they are going to play the usual PR games to take advantage of sites that don't bother checking up on the 'facts' fed to them. They plan to release the GT200 in absurdly limited quantities, and only four AIBs are going to initially get parts.
There is also serious talk of announcing a price drop to put them head to head with the R770 and giving that number to reviewers. When the boards come out, the reviews are already posted with the lower numbers, and no reviewer ever updates their pricing or points out that the price/performance ratio was just blown out of the water. There is also internal debate about giving a few etailers a real price cut for a short time to 'back up' the 'MSRP'.
We would hope the reviewers are able to look at the numbers they were given on editors' day, $449 and $649, and take any $100+ last-minute price drop with the appropriate chunk of NaCl. Just given the component cost, there is no way NV can hit the same price point as the 770 boards. "We lose money on each one, but we make it up in volume" is not a good long-term strategy, nor is it a way to improve margins by 100 basis points.
In the end, NV is facing a tough summer in the GPU business. They are effectively out of the Montevina market, and they are going to lose the high end in a spectacular way. Nvidia has no effective countermeasure to the R770, the GT200 was quite simply botched, and now they are going to pay for it. When all you have is a hammer, everything looks like a 5800.
Posted by Editorial Team Wed Jun 04, 2008 5:09 am
First AMD Game PC reviewed in Gaming
THE FIRST AMD GAME PC has been tried and tested according to an Israeli news site, which says it will definitely give the competition a run for their money.

Ynet, the Israeli news site belonging to Yediot Aharanot, claims that Tech Data, an Israeli computer distributor, gave it a prototype of a new AMD Game! machine to play around with before anyone else got their grubby paws on it.

AMD announced it would be launching a new badge for computers called AMD Game! (sic), based on the Spider and Cartwheel gaming platforms. The new machines are supposed to fall within the $600-$1500 price range, which AMD is convinced will make its machines an attractive option to serious gamers and multimedia fanatics.

The prototype computer used AMD hybrid graphics and the review noted that the HD3200 (AMD 780G) offered double the power of Intel's X3500 GMA.

Tech Data told Ynet that it would be selling the machines with Vista pre-installed as the default operating system.

A Ynet reviewer wrote "Older games ran very well on it, Vista Ultimate ran on it....like XP". Whatever that means!

Ynet summed up by saying that AMD's hybrid graphics are very far from being just a novelty or sideline item and that the Cartwheel platform does deliver on the company's promises.

Price- and quality-wise, the Ynet reviewers reckon the new AMD Game computers would most definitely give both Intel and Nvidia a run for their money and represent very strong competition in the gaming market.

It added that AMD seemed to be much better placed with this new product than it had been a year ago, so there is indeed light at the end of the tunnel.
Posted by Editorial Team Sun Jun 01, 2008 5:37 pm
When Intel used to love Nvidia... In pictures in Hardware, Internet, Networking, Comms and Security
Nvidia is on the warpath, telling anyone that will listen that: a) Intel is really big, and really mean, and b) the CPU is dead, the GPU is taking over, death to the CPU. Except our new ones, obviously.

With all the animosity in the air, it's all too easy to forget a better time, a nicer time.


Posted by Editorial Team Tue May 27, 2008 4:17 pm
Age of Conan tops charts: finally a game that takes on WoW? in Gaming
After a long build up, including an eight-week delay to apply the final polish, Age of Conan: Hyborian Adventures (AoC) has been launched.

The game is widely seen as one that has a chance of taking on the current king of the online gaming heap - World of Warcraft (WoW).



That battle for a share of the global online gaming world is one that the mighty Conan himself would relish. At stake are fame, respect and untold riches.

The BBC News website got a chance to play through the early levels of the game and right from the opening moments it is obvious that the cosy world of WoW has been left far behind. It's not for nothing that the game is rated 18.

The game opens on a galley ship on which both male and female characters are slaves. Under attack, the ship sinks and the character is washed up on the beach of an island called Tortage naked but for a loin cloth and shackles.

From those opening moments the graphical detail of the game is a huge leap forward from the rather "cartoon-y" look of WoW. It even rivals Lord of The Rings Online in the graphical stakes. That detail comes at a price - the minimum specs are quite high.

Joe Best, associate producer at publishers Eidos, said: "We really want this to be full fat but also scalable to the PCs of the last few years."

The opening is worthy of a Conan story in which the hero is left to craft his, or her, destiny with their bare hands. The first quests involve finding a way to remove the shackles and then kill the man who enslaved you.

For the first 20 levels of the game, players will be pretty much alone, said Mr Best.

The "linear" nature of those early levels on Tortage is where players become familiar with the game world, the abilities of their character and how to play. After that they get to join the larger MMO world of AoC.


During those early levels the most important lessons learned are those that show how to fight.

Combat, bloody visceral combat, is at the heart of the Conan stories and the game is no exception. One of the first decisions made when the game was being drawn up, said Mr Best, was that the combat would be "ferocious".

"It's not about watching your character fight for you," he said. The team really wanted to break away from the "point and click" aspect of MMOs.

In games such as World of Warcraft characters attack automatically once they are directed to a target. In AoC the on-screen character only does what it is told. That's necessary as enemies adapt their fighting style to defend against the way they are attacked so that involvement is key.

And the combat is involving, much more so than WoW, where the same attacks and spells will despatch the same types of foe.

There is no doubt that it is fun to use combos and alter your attacks to beat a foe to the ground, or knock them back and then leap forward to finish them off with a panther-like grace that would win a nod of approval from the massive Cimmerian, Conan himself.

Quote:
AGE OF CONAN: SPECIFICATIONS
Minimum
Processor: 3GHz
Ram: 1GB
Video: Nvidia GeForce 5800 or ATI 9800
OS: Windows Vista/XP
Recommended
Processor: Intel Core 2 Duo 2.4 GHz or better
Ram: 2GB or better
Video: NVIDIA GeForce 7950GX2 or better
OS: Windows Vista/XP


It was odd, said Mr Best, that given combat's centrality to most MMOs, no-one had tried such a thing before.

AoC's distinctiveness does not stop with blood, gore and intense combat. At the higher levels players can get a mount (horse, mammoth or war rhino) that can be used for trampling enemies into the dust. Those on horseback can swing a weapon and use the momentum of a charge to inflict huge amounts of damage.

Those mounts are likely to be very useful in another of AoC's selling points: siege warfare. Guilds can build their own cities or battle keeps, once their members have gathered enough raw materials for the buildings. As Mr Best said, creating a city is a "very social experience".

But once built it may not be safe. Rival factions can gather their war mammoths, trebuchets and troops to lay waste to their enemies' homes and businesses. Pitched battles featuring huge groups of players are likely to become very popular.

Mr Best said many of the decisions that have driven the development of Age of Conan were taken to make it stand out.

Quote:
"If you are going up against World of Warcraft you cannot imitate it, you have to go your own path and do it your way," he said.


And that's something Conan would doubtless agree with. But it remains to be seen whether that list of features not seen in many other MMOs is a recipe for the one feature WoW has in spades: success.

Funcom has announced that over 1 million people have signed up to the Age of Conan beta test, a figure which the company believes is a record, and proves huge interest in the forthcoming adult-themed MMO.
Quote:

"Funcom has not been able to find any higher beta numbers for MMOs in the western world," said Morten Larssen, VP of sales and marketing. "We believe it represents the largest ever beta sign-up figure in the history of the genre."


The company also released additional statistics about interest in the game, commenting that almost 800,000 people have signed up for the newsletter, while last week the official site registered 725,000 unique users.

But while the official launch date of May 20 is still applicable in the US, the European date is now May 23, aligning it with the traditional Friday release for titles in the region.
Posted by Editorial Team Tue May 20, 2008 4:54 pm
Nvidia marketing at its worst: 9600GSO 768MB confusion in Hardware, Internet, Networking, Comms and Security
Now, reviewers are becoming pretty annoyed at what they see as Nvidia marketing doing its worst... and reviews are reflecting this. Well, this GSO comes with a non-standard dual-slot cooler for increased airflow. Being a Sonic version of the card, it's also slightly OC'd, which means, in the end, it performs almost as well as a 9600GT. Now Shane thinks the 384MB version will throw a wrench in the gears (or just confuse people even more), and rightly so. Nvidia gets the "Crap Naming Scheme" award of 2008, innit? Catch TT's review here.
In Win has a new case on the market, the Metal Suit GD. What makes the case unique is its "VGA cooling duct", which sucks air from slits cut into the side of the case and sprays it over the expansion bay area. Sleek design, nice blue LED, and the fans make very little noise. The cooling duct doesn't seem to do a very good job on actual cooling performance, but you should read OCC's review here.
Every graphics card brand comes up with a nice bit of marchitecture to win a little edge over the competition. Sapphire's own is the "Toxic" line, with Vapour-X technology that improves heat dissipation and allows the card to come factory overclocked... the Toxic design also has the advantage of keeping the card single-slot. Big Bruin has a Sapphire Toxic HD3870 on their bench and they're putting it to the tong and hammer. The specs are lower than the Asus HD3870 TOP, but the Toxic cooler made it an excellent overclocker. Jason took the GPU all the way up to 885MHz. Interesting lit, here.
After testing some compact CPU coolers, Xbit has moved on to compact watercoolers. Right. Well, these little coolers don't really make much of an impact on your system (cooling-wise) but look great and are pretty easy to install; however, they lose on most counts to standard air and water coolers. Sergey does say it's a first step towards better cooling. They're not likely to be great commercial successes; such are the pitfalls of pioneering. Catch the review here.
The mighty Church of CNET has published a review of the Synology Disk Station DS-107+. Yes, it's a NAS, but according to the review, it's also much more. It has PS3 and Xbox media center functionality, is relatively cheap and supports up to 1TB HDs. It's also smallish, as it's a single-HD unit. If you're a security freak, you can also network up to six cameras (although you pay extra for each camera license after the first) and use it as a video recording device. Gives us some ideas, doesn't it?
OCZ Reaper HPC DDR3 is one of many "supercar"-class kits available to enthusiasts with deep pockets. OCIA has tested the 2x1GB DDR3-10666 and taken it a bit beyond spec whilst on the bench. 1500MHz was the top speed, but you pay top dollar too, so it'd be a surprise if you couldn't play around with the speeds and timings. They gave it a big thumbs up, apparently.
Those zany Canucks that fiddle with hardware have benched the Asrock Penryn1600SLIx3-WiFi motherboard. Like everything Asrock, it's a mix-and-match of technologies that ends up spawning an equally odd name, and, as Asrock mobos go, it is quite expensive ($170). This is basically an Nvidia 680i tri-SLI chipset that can take Extreme Edition CPUs from Intel, but sticking to DDR2 memory; that alone sounds powerful enough to warrant a read. You can't fiddle around much with overclocking, but the system itself already runs pretty hot.
Posted by Editorial Team Tue May 20, 2008 4:33 pm
© 2006 - 2008 iVirtua Community (UK), Part of iVirtua Media Group, London (UK). Tel: 020 8144 7222