  LongeCity
              Advocacy & Research for Unlimited Lifespans





Pushing the PC gaming boundaries


11 replies to this topic

#1 Matt

  • Guest
  • 2,862 posts
  • 149
  • Location:United Kingdom

Posted 17 November 2007 - 04:14 PM


Pushing the PC gaming boundaries

http://news.bbc.co.u...ogy/7096891.stm

The PC remains a big player in the games market but in recent years its cutting edge has been blunted.

The vast majority of games are still played on PCs rather than consoles; typically casual games played on cheap desktop machines or online games, such as World of Warcraft.

The industry is currently experiencing a renaissance in innovation as the trinity of new hardware, developer ambition and tools comes together to improve experiences.

The introduction of chip technology with four cores, effectively quadrupling processing power, graphics cards using DirectX 10 tools and developers keen to push powerful machines to the limit are resulting in games which set new graphical benchmarks.

In some cases these machines are desktop behemoths; near supercomputers in a box that are delivering game experiences beyond the wildest dreams of console owners.

The latest games, like Crysis and Unreal Tournament 3, are taking advantage of quad core processors, and twin graphics cards. These are the playthings of hotrod PC gamers - the enthusiasts who see their machines as customisable dragsters delivering the pinnacle of performance.

"PC gamers see themselves as the elite gamers," said Michael O'Dell, who runs the professional gaming group Team Dignitas and manages Birmingham Salvo, a team in the Championship Gaming Series.

"High end games PCs are important to the professional players and hard core because the extra processing power can make that millisecond of difference between success and failure, and whether you win prize money or not."

For the hardcore, the extra grunt of the most powerful desktops improves the FPS (frames per second) in FPS (First Person Shooter) games.

"My gamers are always moaning about their FPS (frames per second). They always want more and some of the newest games are very demanding on the hardware."

For these gamers, whose reaction times put them in the superhuman category, more frames per second means a smoother experience.
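The link between frame rate and smoothness is simple arithmetic: each extra frame per second shrinks the time any single frame stays on screen. A quick sketch (the frame rates chosen here are just illustrative examples):

```python
# Frame time (ms) for common frame rates: higher FPS means each frame
# stays on screen for less time, so motion appears smoother.
def frame_time_ms(fps):
    return 1000.0 / fps

for fps in (30, 60, 100, 144):
    # e.g. 60 FPS -> ~16.7 ms per frame, 100 FPS -> 10.0 ms
    print(f"{fps:>3} FPS -> {frame_time_ms(fps):.1f} ms per frame")
```

Going from 60 to 100 FPS buys back roughly 6.7 ms per frame - the kind of margin the professional players above care about.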

So how much more powerful are these high-end PCs than the latest generation of consoles?

"It's absolute nonsense to think that consoles are at the cutting edge," said Roy Taylor, vice president of content relations at Nvidia, the world's biggest manufacturer of graphics cards.

"As good as consoles are, they are so far behind the PC gaming experience that there is no comparison.

"In terms of raw processing power, the high-end PCs are at least three times more powerful."

Nvidia provides the graphics grunt for the PlayStation 3, while rival ATI provides the imaging hardware for the Xbox 360.

Mr Taylor points out that the latest graphics cards can draw twice as many pixels, twice the screen resolution, as a PlayStation 3 or Xbox 360.
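Mr Taylor's "twice as many pixels" claim can be sanity-checked with pixel counts. Assuming a console render target of 1280x720 (a common 720p target of the era, not a figure from the article) against the 1920x1200 PC resolution the article cites for Crysis:

```python
# Pixel counts: a 1920x1200 PC display vs an assumed 720p console
# render target. The 720p figure is an assumption for illustration.
pc = 1920 * 1200        # 2,304,000 pixels
console = 1280 * 720    #   921,600 pixels
print(pc / console)     # the PC pushes 2.5x the pixels per frame
```

On those assumptions the PC is drawing about two and a half times the pixels every frame, broadly consistent with the quote.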

The latest games are employing DirectX 10 tools developed by Microsoft, which are used by developers to get the best out of the high-end and middle-range graphics cards.

Mr Taylor said the new tools and the new hardware had given developers a library of effects to play with.

Nvidia's latest high-end graphics cards, the 8800 series, can easily produce graphical effects that tax the Xbox 360 and PlayStation 3, such as motion blur, depth of field and volumetric smoke.

Mr Taylor said: "Fog, smoke or mist in games until now have been flat and don't respond to objects. Volumetric effects mean they are dynamic - a helicopter can now displace cloud or smoke, or a character can step through the fog realistically."

But these sorts of effects come at a price.

A quad-core Intel machine with twin graphics cards and four gigabytes of RAM - at the high end of the PC gaming experience - can cost more than £2,000, six times the price of an Xbox 360.

Nvidia's flagship graphics card, the 8800 Ultra, costs more than £400 although a cut-down version, the 8800 GT, costs from £120, about the same price as a Nintendo Wii.

Rival ATI also sells a high-end graphics card which supports DirectX 10, costing from about £120.

Hardcore PC gamers also specialise in customising their "rigs", with unique cases and intricate cooling systems.

The gaming experience they deliver can be exceptional.

Playing Crysis with the screen resolution set at 1920x1200 with all effects switched up to very high and anti-aliasing turned on, the game is breathtaking to look at and puts console titles like Gears of War and Call of Duty 4 into the shade.

"We worked really closely with Intel and Nvidia and even had engineers from Nvidia on site for the last year," said Bernd Diemer, a producer on Crysis at developers Crytek.

"We wanted to be an early adopter. When we started Crysis the current hardware wasn't available or being planned. There was no DX10 or the latest graphic cards. They were not even on the drawing board."

They went to a special effects company in Hollywood to create a render movie of how Crysis could look - and that movie has been the benchmark for the firm.

"We got pretty close. In some areas we even surpassed it," said Mr Diemer.

He said PCs gave gamers the "best possible experience".

Crysis boasts realistic breakable environments - a goal of developers for many years.

"In some areas we have managed to set a new standard. We've managed to push it a bit further," he said.

Crysis is at the forefront of a wave that is delivering blockbuster titles to PCs and making console owners envious of their PC gaming friends.

"The PC is finally back up where it belongs," said Mr Diemer.

He added: "The innovation is happening on the PC; but that's always been the case."

#2 Ghostrider

  • Guest
  • 1,996 posts
  • 56
  • Location:USA

Posted 18 November 2007 - 12:11 AM

The latest games, like Crysis and Unreal Tournament 3, are taking advantage of quad core processors, and twin graphics cards. These are the playthings of hotrod PC gamers - the enthusiasts who see their machines as customisable dragsters delivering the pinnacle of performance.


I looked at some screen shots from Crysis. The images are pretty amazing and very life-like. I am amazed at how quickly gaming technology has progressed. The system requirements for Crysis are pretty demanding, I have never seen a game that required a Core2 processor, and not a low-end one either. I also heard that Crysis runs about 36% faster on a Quad Core vs. dual core system. So it must make very good use of multithreading. I am awaiting Intel's Larrabee, it looks like it might become the greatest thing to ever happen to Folding@Home.

http://www.theregist...larrabee_gpgpu/
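Taking the poster's 36% quad-vs-dual figure at face value, Amdahl's law lets you back out roughly what fraction of the game's frame work is parallelised. This is a rough sketch on hearsay numbers, not a measured result:

```python
# Estimate the parallel fraction p of a workload from an observed
# dual-core -> quad-core speedup s, using Amdahl's law:
#   t(n) = (1 - p) + p/n     (time relative to one core)
# Setting t(2) / t(4) = s and solving:
#   1 - p/2 = s * (1 - 3p/4)   =>   p = 4(s - 1) / (3s - 2)
def parallel_fraction(s):
    return 4 * (s - 1) / (3 * s - 2)

p = parallel_fraction(1.36)
print(f"estimated parallel fraction: {p:.0%}")  # roughly 69%
```

If the estimate holds, about two thirds of the frame scales with core count, which would indeed be unusually good multithreading for a 2007 game.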


#3 niner

  • Guest
  • 16,276 posts
  • 2,000
  • Location:Philadelphia

Posted 18 November 2007 - 04:59 AM

I am awaiting Intel's Larrabee, it looks like it might become the greatest thing to ever happen to Folding@Home.


Gelsinger hesitated to elaborate more on the product other than to add that it will reach at least one teraflop.

Wow. This kind of makes me want to get back into writing molecular dynamics code. That's really a poopload of processing power.
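For a sense of where a teraflop would go in molecular dynamics: the flops are spent almost entirely in pairwise force evaluations. A toy version of that inner loop, using the Lennard-Jones potential in reduced units (this is a generic textbook form, not code from any particular MD package):

```python
# Toy inner loop of a molecular dynamics code: the Lennard-Jones
# force magnitude between two particles at separation r, with well
# depth eps and zero-crossing distance sigma (reduced units).
#   F(r) = 24*eps * (2*(sigma/r)^12 - (sigma/r)^6) / r
def lj_force(r, eps=1.0, sigma=1.0):
    sr6 = (sigma / r) ** 6
    return 24.0 * eps * (2.0 * sr6 * sr6 - sr6) / r

# The force vanishes at the potential minimum, r = 2^(1/6) * sigma:
print(lj_force(2 ** (1 / 6)))  # ~0
```

Evaluating this for every pair, every femtosecond timestep, is why folding a protein eats teraflops for breakfast.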

#4 HellKaiserRyo

  • Guest
  • 41 posts
  • 0

Posted 27 November 2007 - 08:59 AM

I am awaiting Intel's Larrabee, it looks like it might become the greatest thing to ever happen to Folding@Home.


Gelsinger hesitated to elaborate more on the product other than to add that it will reach at least one teraflop.

Wow. This kind of makes me want to get back into writing molecular dynamics code. That's really a poopload of processing power.



The limits of 2D graphics are impressive, but volumetric displays will require several orders of magnitude more computing power. Also, we would need to have information travel at gigabits per second regularly. I have a 5 megabit per second connection and I am content with it, but I think I would need more bandwidth if we want to see volumetric displays over the internet.
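The "orders of magnitude" intuition checks out with back-of-envelope arithmetic. Assuming a hypothetical volumetric display that is simply a 1080p panel extended to 1,000 depth layers (a made-up figure for illustration), both uncompressed:

```python
# Back-of-envelope bandwidth: a flat 1080p stream vs a hypothetical
# volumetric display with 1000 depth layers, both uncompressed at
# 24 bits per element and 60 updates per second.
BITS, HZ, DEPTH = 24, 60, 1000
flat = 1920 * 1080 * BITS * HZ          # ~3 Gbit/s
volumetric = flat * DEPTH               # ~3 Tbit/s
print(f"flat:       {flat / 1e9:.1f} Gbit/s")
print(f"volumetric: {volumetric / 1e12:.1f} Tbit/s")
```

Even the flat uncompressed stream is ~600x a 5 Mbit/s line; the volumetric one is a thousand times that again, so compression or local rendering would have to do almost all the work.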

#5 maestro949

  • Guest
  • 2,350 posts
  • 4
  • Location:Rhode Island, USA

Posted 27 November 2007 - 12:54 PM

I am awaiting Intel's Larrabee, it looks like it might become the greatest thing to ever happen to Folding@Home.


Gelsinger hesitated to elaborate more on the product other than to add that it will reach at least one teraflop.

Wow. This kind of makes me want to get back into writing molecular dynamics code. That's really a poopload of processing power.



The limits of 2D graphics are impressive, but volumetric displays will require several orders of magnitude more computing power. Also, we would need to have information travel at gigabits per second regularly. I have a 5 megabit per second connection and I am content with it, but I think I would need more bandwidth if we want to see volumetric displays over the internet.


I think we'll have our several orders of magnitude fairly soon. AMD's FireStream 9170 GPU card will pack 500 gigaflops of computing power and 2GB RAM, and will be the first to support double-precision floating point. It starts at $1,999 and will be out in Q1 of 2008. A gigaflop cost $30,000 ten years ago. After that we'll just need a good bundle of fiber-optic cables run to our homes.

AMD FireStream 9170
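The poster's two price points imply an annualised rate of improvement. A quick sketch using only the figures quoted above:

```python
# Annualised price/performance gain implied by the figures above:
# $30,000 per gigaflop ten years ago vs a $1,999 card rated at
# 500 gigaflops (~$4 per gigaflop).
old_cost = 30_000.0            # $/gigaflop, ~1997
new_cost = 1_999.0 / 500.0     # $/gigaflop, FireStream 9170
ratio = old_cost / new_cost    # ~7,500x cheaper over 10 years
annual = ratio ** (1 / 10)
print(f"~{annual:.1f}x cheaper per gigaflop each year")
```

That works out to flops getting roughly 2.4x cheaper every year - faster than a naive reading of Moore's law, since GPUs were scaling both transistor count and parallelism.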

#6 FunkOdyssey

  • Guest
  • 3,443 posts
  • 166
  • Location:Manchester, CT USA

Posted 27 November 2007 - 04:25 PM

I looked at some screen shots from Crysis. The images are pretty amazing and very life-like. I am amazed at how quickly gaming technology has progressed. The system requirements for Crysis are pretty demanding, I have never seen a game that required a Core2 processor, and not a low-end one either. I also heard that Crysis runs about 36% faster on a Quad Core vs. dual core system.

The video card is the bottleneck in Crysis and other modern games -- this becomes more and more true the higher you push the resolution. In order to test the CPU's performance as a limiting factor, you need to turn down the resolution to 800x600 or even 640x480 in some games. Even with two 8800 GTX Ultras in SLI, the current fastest possible video configuration, Crysis will not run at acceptable framerates at 1920x1200 resolution with maximum quality settings. This year we will see the first quad-SLI and quad-CrossFire enabled motherboards and video cards that will attempt to address these shortcomings.
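The resolution-drop test described above can be stated as a rule of thumb: if the frame rate barely moves when you cut the resolution, the GPU was never the limiter. A sketch with hypothetical frame rates (the 10% threshold is an arbitrary choice, not an established benchmark convention):

```python
# The resolution-drop test as a decision rule: lowering resolution
# slashes GPU work but leaves CPU work (AI, physics, draw calls)
# untouched. Numbers and threshold are hypothetical.
def bottleneck(fps_high_res, fps_low_res, tolerance=0.10):
    gain = (fps_low_res - fps_high_res) / fps_high_res
    return "CPU-bound" if gain < tolerance else "GPU-bound"

print(bottleneck(25, 70))   # big gain at 640x480 -> GPU-bound
print(bottleneck(60, 63))   # barely moved       -> CPU-bound
```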

Intel's newest Core 2 Duo and quad-core processors offer more power than any current games know what to do with. The GPU is where we need to see some innovation and progress.

Edited by FunkOdyssey, 27 November 2007 - 04:26 PM.


#7 jaydfox

  • Guest
  • 6,214 posts
  • 1
  • Location:Atlanta, Georgia

Posted 27 November 2007 - 06:11 PM

I looked at some screen shots from Crysis. The images are pretty amazing and very life-like. I am amazed at how quickly gaming technology has progressed. The system requirements for Crysis are pretty demanding, I have never seen a game that required a Core2 processor, and not a low-end one either. I also heard that Crysis runs about 36% faster on a Quad Core vs. dual core system.

The video card is the bottleneck in Crysis and other modern games -- this becomes more and more true the higher you push the resolution. In order to test the CPU's performance as a limiting factor, you need to turn down the resolution to 800x600 or even 640x480 in some games. Even with two 8800 GTX Ultras in SLI, the current fastest possible video configuration, Crysis will not run at acceptable framerates at 1920x1200 resolution with maximum quality settings. This year we will see the first quad-SLI and quad-CrossFire enabled motherboards and video cards that will attempt to address these shortcomings.

Intel's newest Core 2 Duo and quad-core processors offer more power than any current games know what to do with. The GPU is where we need to see some innovation and progress.


Actually, I think the CPU power is going to waste because too much emphasis is on graphics, and not enough on powerful gameplay. Physics engines have come a long way since I was a kid (as a teenager I could single-handedly have written a better physics engine than many of the games at the time had), but there's always room for improvement. More importantly, AI doesn't just have room for improvement; it's barely passable in many instances.

I applaud the graphics engine designers for pushing the limits of graphics performance and creating engines that will continue to mature after release as hardware capacity increases. But we need some serious research on the AI side, and if the CPU is idling along at 20%-50%, then perhaps the physics engines need to be beefed up relative to the graphics engines.
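The kind of per-object work those idle cores could absorb is cheap and embarrassingly parallel. A minimal semi-implicit Euler integration step, the core of a simple physics engine (a generic textbook scheme, not any particular engine's code):

```python
# A minimal physics-engine integration step (semi-implicit Euler):
# update velocity from acceleration first, then position from the
# new velocity. One such step per object, per frame.
def step(pos, vel, accel, dt):
    vel = [v + a * dt for v, a in zip(vel, accel)]
    pos = [p + v * dt for p, v in zip(pos, vel)]
    return pos, vel

# A projectile under gravity, advanced one 60 Hz frame:
pos, vel = step([0.0, 10.0], [5.0, 0.0], [0.0, -9.81], 1 / 60)
```

Thousands of these per frame still barely register on a quad-core, which is the poster's point: the spare cycles could buy far richer physics than most games ship with.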

#8 Ghostrider

  • Guest
  • 1,996 posts
  • 56
  • Location:USA

Posted 30 November 2007 - 07:56 AM

Crysis makes good use of a quad-core CPU. It is well multi-threaded; more games will probably follow this trend.

#9 forever freedom

  • Guest
  • 2,362 posts
  • 67

Posted 05 December 2007 - 07:44 PM

I remember games from just a few years ago. Damn, they are evolving pretty fast. Imagine what they're gonna look like in 15 to 20 years.

#10 knite

  • Guest
  • 296 posts
  • 0
  • Location:Los Angeles, California

Posted 07 December 2007 - 09:29 PM

Yeah, 7 years ago you could probably have hand-counted the polygons on models; now they are nearly perfectly curved.

#11 Liquidus

  • Guest
  • 446 posts
  • 2
  • Location:Earth

Posted 07 December 2007 - 10:14 PM

[Screenshot: Unreal Tournament, released Nov 99]

[Screenshot: Unreal Tournament 3, released Oct 07]

8 years difference right there.


#12 maestro949

  • Guest
  • 2,350 posts
  • 4
  • Location:Rhode Island, USA

Posted 08 December 2007 - 06:36 PM

Unreal Tournament, released Nov 99


Unreal Tournament 3, released Oct 07.

8 years difference right there.


Improving fast. It'll be great when this quality of graphics works its way into virtual operating rooms for virtual surgery. This article in Scientific American suggests that it may be only twenty years from the date that the hardware and software make it possible.

Edited by maestro949, 08 December 2007 - 06:37 PM.




