Photo-realistic graphics
Discussion
Clue's in the title, people... when do you think we'll be playing games with graphics so realistic that you can't tell the difference between a game and a film?
I know it's all related to processing power etc, which, if I remember rightly, doubles about every 18 months or so, but what kind of power/spec will it take for photo-realistic graphics? Luckily for us, being pistonheads, I think racing games will be one of the first genres to really reach that level of graphical output, as some cars in some games are almost there already; there are no facial expressions, explosions etc. to have to deal with.
If you can answer that then you'll probably have a rough idea of how long it will take before we're playing games that look like the Final Fantasy film, which to be honest wasn't far off photo-realism.
I'd say that HL2 looks like the computer-generated stuff in films did about 3 or 4 years ago... so if things continue to move that fast, then maybe we're between 3 and 5 years away? Of course, what you see in films by then will be even better.
The thing is that as graphics engines get better and better the amount of work you have to do to generate the content for the game gets bigger and bigger...
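As a rough sanity check on that 3-to-5-year guess, assuming the "doubles about every 18 months" figure from the first post actually holds, the sums look something like this (a back-of-the-envelope sketch, nothing more):

```python
# Back-of-the-envelope sketch: how much more processing power you get after a
# given number of years, assuming power doubles every 18 months.  The 3-5 year
# window is just the guess from the post above.

DOUBLING_PERIOD_YEARS = 1.5   # "doubles about every 18 months or so"

def growth_factor(years: float) -> float:
    """Multiple of today's processing power after `years` years."""
    return 2 ** (years / DOUBLING_PERIOD_YEARS)

if __name__ == "__main__":
    for years in (3, 4, 5):
        print(f"after {years} years: ~{growth_factor(years):.1f}x today's power")
    # after 3 years: ~4.0x, after 4 years: ~6.3x, after 5 years: ~10.1x
```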
Another game to look out for is Stalker: Shadow of Chernobyl. Not only is there a great premise and a superb setting for the game itself, but the engine looks brilliant. Unfortunately, it has been delayed to Q1 2005 for further development, including, IIRC, Pixel Shader 3 paths for the engine.
However, I think the next big thing will not be a new game or graphics engine, but hardware. Specifically, nVidia's SLI technology, which allows two PCI-Express graphics cards to be hooked together on a single mobo. The sheer pixel-pushing power of an SLI rig will mean playable framerates at high resolutions with all graphical options turned up... and that means near photo-realistic graphics in the games we are all waiting for. Plus an empty wallet.
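For a sense of what "pixel-pushing power" means in numbers, here's a quick fill-rate sum (the resolutions and the 60fps target are just my own example figures, not anything from the post):

```python
# Rough arithmetic behind "playable framerates at high resolutions": how many
# pixels per second the card(s) have to shade.  Example figures only.

def pixels_per_second(width: int, height: int, fps: int) -> int:
    return width * height * fps

if __name__ == "__main__":
    for w, h in ((1024, 768), (1600, 1200)):
        rate = pixels_per_second(w, h, 60)
        print(f"{w}x{h} @ 60fps: ~{rate / 1e6:.0f} million pixels/s")
    # ~47 million vs ~115 million -- and two cards sharing the frame roughly
    # halves what each one has to shade.
```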
[k]
>> Edited by [k]ar| on Monday 30th August 00:35
D_Mike said:
oo, that sounds cool. I already have an ATI PCI Express video card in my machine (it's pretty good). I will have to see if I have two PCIe slots. It's not really anything new though, is it? Anybody remember having two 12MB Voodoo 2s?
Well remembered, D_Mike. "SLI" has indeed been resurrected by nVidia from when they bought out the remains of 3dfx. However, back then it stood for "Scan Line Interleave", whereas now it is "Scalable Link Interface". No doubt an astute marketing decision to take advantage of people's recollection. Essentially, it is a refinement of the older technology that allows dynamic load balancing between the cards (rather like SMP with dual CPUs) instead of simply dividing the screen up equally.
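To picture that load-balancing point, here's a toy sketch (my own illustration, nothing to do with nVidia's actual drivers): a fixed 50/50 screen split versus a dividing line that moves so both cards end up with roughly equal work.

```python
# Toy illustration only -- not nVidia's driver logic.  Shows why a dynamic
# split beats always handing the top half to card 0 and the bottom half to
# card 1.  (The old 3dfx Scan Line Interleave simply alternated lines.)

def fixed_split(line_costs):
    """Naive split: top half of the frame to card 0, bottom half to card 1."""
    half = len(line_costs) // 2
    return [0 if i < half else 1 for i in range(len(line_costs))]

def dynamic_split(line_costs):
    """Move the boundary so the rendering work is split ~equally."""
    total, running, boundary = sum(line_costs), 0.0, 0
    for i, cost in enumerate(line_costs):
        running += cost
        if running >= total / 2:
            boundary = i + 1
            break
    return [0 if i < boundary else 1 for i in range(len(line_costs))]

def work_per_card(split, line_costs):
    return [sum(c for s, c in zip(split, line_costs) if s == card)
            for card in (0, 1)]

if __name__ == "__main__":
    # Pretend the sky at the top of the frame is cheap to render and the
    # detailed track/cars at the bottom are expensive.
    costs = [1.0] * 300 + [3.0] * 300            # a 600-line frame
    print("fixed 50/50 split:", work_per_card(fixed_split(costs), costs))    # [300.0, 900.0]
    print("dynamic split:    ", work_per_card(dynamic_split(costs), costs))  # [600.0, 600.0]
```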
Now for the bad news - it's nVidia proprietary technology, so you won't be able to SLI your ATI card unless they come up with something similar of their own, and there's no indication that they have an answer to SLI. Furthermore, as far as I know, there are no mobos out at the moment which support dual PCI-e graphics, although there have been rumours that of the 3 or 4 different versions of the nForce4 chipset, one will be specifically optimised for SLI.
I'm waiting to upgrade my current system until SLI is ready to go, probably around Christmas this year or Q1 next year. There's some more info on SLI, if you're interested, on this page:-
www.nvidia.com/page/sli.html
[k]
[edit]
You can see some screenshots of Stalker: Shadow of Chernobyl on the dev team's website here:
www.stalker-game.com/index.php?t=gallery&page=1
[/edit]
>> Edited by [k]ar| on Monday 30th August 15:59
D_Mike said:
oo, that sounds cool. I already have an ATI PCI Express video card in my machine (it's pretty good). I will have to see if I have two PCIe slots. It's not really anything new though, is it? Anybody remember having two 12MB Voodoo 2s?
I was never rich enough in my student years but I remember the facility.
centurion07 said:
I know it's all related to processing power etc, which, if I remember rightly, doubles about every 18 months or so
It certainly used to. We (the chip industry) are struggling a bit now as we hit physical problems, like trying to use feature sizes that are vastly smaller than the wavelength of visible light!
D_Mike said:
Don't you just use UV-sensitive masks instead? Or go even further... I guess it's up to us chemists to develop those
The fab boys are playing all sorts of games with UV, interference patterns and other dodgy ideas that work in labs but yield poorly in production. Mask production costs are rocketing because of the number of masks needed and the complexity of getting accurate features onto wafers at 65nm.
That ain't the only problem: leakage currents are rising as gate-oxide thickness drops (and channel lengths fall), killing battery life and causing huge grief for design teams doing mobile work like mine. MTBF numbers are starting to look like they'll drop too, for all sorts of reasons.
Sheepy
(I can't believe I'm actually posting this stuff in the computer games section of a car website!!)
>> Edited by sheepy on Friday 3rd September 16:43
sheepy said:
It certainly used to. We (the chip industry) are struggling a bit now as we hit physical problems, like trying to use feature sizes that are vastly smaller than the wavelength of visible light!
Since eventually the inability of electrical signals to travel at more than about 2.5x10^8 m/s will restrict the speed of CPUs, there will be more of a shift to parallel processing, won't there? We're already seeing this in the form of 3D accelerators and hardware sound cards, which take some of the processing load off the CPU.
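To put that signal-speed point in perspective, here's a quick sum using the ~2.5x10^8 m/s figure quoted above (the clock speeds are just example values):

```python
# How far an electrical signal can travel in one clock cycle, using the
# ~2.5e8 m/s figure from the post above.  Clock speeds are example values.

SIGNAL_SPEED_M_PER_S = 2.5e8

def distance_per_cycle_cm(clock_hz: float) -> float:
    """Distance a signal covers during a single clock period, in cm."""
    return SIGNAL_SPEED_M_PER_S / clock_hz * 100

if __name__ == "__main__":
    for ghz in (1, 3, 10):
        print(f"{ghz} GHz: ~{distance_per_cycle_cm(ghz * 1e9):.1f} cm per cycle")
    # ~25 cm, ~8.3 cm, ~2.5 cm -- and a real signal has to pass through many
    # gates within a cycle, so the usable distance is far smaller still.
```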
FourWheelDrift said:
Maybe if someone could come up with a compatible way of losing the 640k memory limit that has been imposed on PCs since the early days, maybe that would help?
>> Edited by FourWheelDrift on Friday 10th September 16:32
Wasn't it Bill Gates who came up with that figure on the basis that nobody could possibly want more memory?
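If memory serves, the 640k figure actually falls out of the original PC's 20-bit address space rather than anyone's prediction about how much memory people would want; a quick sum (the 384KB reserved at the top for video memory, adapter ROMs and the BIOS is the usual breakdown):

```python
# Where the 640k figure comes from: a 20-bit address bus gives 1MB of
# addressable memory, and the top 384KB was reserved for video memory,
# adapter ROMs and the BIOS, leaving 640KB of "conventional" memory.

ADDRESS_BITS = 20
RESERVED_KB = 384          # upper memory area: video, ROMs, BIOS

total_kb = (2 ** ADDRESS_BITS) // 1024
conventional_kb = total_kb - RESERVED_KB

print(f"addressable: {total_kb} KB, conventional: {conventional_kb} KB")
# addressable: 1024 KB, conventional: 640 KB
```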
rsvmilly said:
Does make me laugh, all these guys running round designing GHz processors etc, only to stick them in a system with a MHz memory i/f and KB memory limits!
Bit like the car industry developing whopping great V12s, fitting them into Minis and using a Coke can for a fuel tank!!
You wouldn't believe the processing power and memory requirements we're utilising just so that some scruffy oik can annoy people on a train with a stupid ringtone!!!
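To put rough numbers on rsvmilly's GHz-versus-MHz point (example figures of mine, not anyone's actual spec):

```python
# Rough sketch of the CPU/memory mismatch: how many CPU clock cycles tick by
# during a single access to main memory.  Example values: a ~3 GHz CPU and
# DRAM with ~50 ns access latency.

CPU_CLOCK_HZ = 3.0e9
MEMORY_LATENCY_S = 50e-9   # ~50 ns to fetch from main memory

cycles_waiting = CPU_CLOCK_HZ * MEMORY_LATENCY_S
print(f"~{cycles_waiting:.0f} CPU cycles per main-memory access")
# ~150 cycles -- which is why caches and faster memory interfaces matter.
```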