An exclusive gaming-industry community designed for professionals, businesses, and students in gaming, new media, and the web, and the business and industry closely related to them.
A rich, content-driven service including articles, contributed discussion, news, reviews, networking, downloads, and debate.
We strive to cater to cultural influencers, technology decision-makers, early adopters, and business leaders in the gaming industry.
A medium to share or contribute your ideas, experiences, questions, and point of view, or to network with other colleagues here at iVirtua Community.
OK, they are not the most objective source, but graphics processor manufacturer Nvidia does make a pretty convincing argument for spending more money on a computer's graphics card and less on the main processor, at least in certain conditions.
I met with them last week to hear their case, and today they launched a new site to help people calculate how much polygon muscle they need. The gist: often you can get better performance for the same amount of money if you spend more on the graphics processing unit (GPU) and less on the central processing unit (CPU).
Photo by PicLens
Now, the GPU is inconsequential if all you do are tasks like e-mailing, light Web surfing, and preparing documents in Word and Excel. But in some cases, the GPU makes a big difference. 3D video games are a given, but intensive graphics are creeping into many other applications. Google Earth, for example, now provides 3D views of urban landscapes, and my poor iMac nearly overheated trying to render a view of downtown Manhattan. A little less power-intensive, but maybe even more fun, is PicLens, a web browser plug-in that lets you navigate photo sites like Flickr or Google's image search in 3D, similar to how you flip through album covers in iTunes.
I agree with Nvidia: these days, graphics matter more. I have a 7900 GT, and at the time it was a nice card; today it is just average. My CPU is a dual-core 2.4 GHz AMD Opteron. When I play a game such as Call of Duty 4, I have most details maxed out but I still get some slowness. I monitor live CPU usage during the game, and the game never gets above 85% CPU usage. Today it seems all a CPU does in a game is physics/movement, scoring, input handling, and in-game maintenance such as whether an object is created or removed, or the environment (wind, sky, weather, etc.). Going quad-core on a gaming computer is plain stupid, a waste of money. Going quad SLI or CrossFire, well, that's a different story.
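The kind of live CPU monitoring described above can be sketched in a few lines. This is a rough, Linux-only illustration using the standard library (the `/proc/stat` parsing and the sampling interval are assumptions, not part of any vendor tool); it measures overall CPU utilization over a short window, which is enough to see whether a game is leaving CPU headroom on the table.

```python
import time

def read_cpu_times():
    # The first line of /proc/stat looks like:
    #   cpu  user nice system idle iowait irq softirq ...
    # We treat idle + iowait as "not busy" time.
    with open("/proc/stat") as f:
        values = [int(v) for v in f.readline().split()[1:]]
    idle = values[3] + values[4]
    total = sum(values)
    return idle, total

def cpu_percent(interval=1.0):
    """Overall CPU utilization (0-100) averaged over `interval` seconds."""
    idle1, total1 = read_cpu_times()
    time.sleep(interval)
    idle2, total2 = read_cpu_times()
    dt = total2 - total1
    if dt == 0:
        return 0.0
    return 100.0 * (1.0 - (idle2 - idle1) / dt)

if __name__ == "__main__":
    # A reading that stays well below 100% while a game runs suggests
    # the bottleneck is the GPU, not the CPU.
    print(f"CPU usage over 1 s: {cpu_percent(1.0):.1f}%")
```

Sampling once per second while the game runs, as described above, would show whether usage ever approaches 100%; in practice a tool like `top` or Task Manager reports the same figure.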