The Photo-Realism Challenge: Light

In this edition of The Photo-Realism Challenge, we will be dealing with light.  The topic will be covered chronologically, and for the sake of simplicity, I will follow DirectX releases (OpenGL had a really big spell in the middle where it just wasn’t very good).

Before I begin, however, I wanted to address a question from Johnmiceter in the previous article: “So how do we get real life graphics . . .?  Could the PS4 achieve close to real live images with its power?” The answer to the second question is “no.”  The most powerful PC build cannot, using present techniques, produce fully photo-realistic images in real time.  The PS4 is less powerful than this.  So no, the PS4 will not be photo-realistic.  It will be better than the PS3 (arguably much better), but it won’t be photo-realistic.

As to the first question, the answer is “change the underlying technology.”  Current 3D games use a method called “rasterization” to generate scenes.  In a raster engine, everything is built out of triangles (called polygons), and these polygon models are converted into pixels for display on your screen.  If you can, go and pick a flower.  Look at it really closely.  How many tiny triangles do you suppose it would take to completely imitate it?  Hundreds of thousands?  Millions?  More?  It simply isn’t possible, and since a flower is rarely the focus of a scene, a good artist will “dumb down” the detail of the flowers (and the trees and rocks and what have you) to give more polygons to the main actors.  There are promising new techniques, such as voxel engines, that could conceivably approach true photo-realism, but these are so computationally intensive as to be impractical for the foreseeable future.  Sorry.  Unless there is a radical shift in computing technology, we will forever approach reality, but we will never achieve it.
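To make “converting triangles into pixels” concrete, here is a toy software rasterizer in Python.  This is a minimal sketch of the classic edge-function test, not any real engine’s code, and all the names are mine: for each pixel, we check which side of each of the triangle’s three edges the pixel center falls on, and if all three tests agree, the pixel is inside the triangle.

```python
def edge(ax, ay, bx, by, px, py):
    # Signed-area test: which side of the edge (a -> b) is point p on?
    return (bx - ax) * (py - ay) - (by - ay) * (px - ax)

def rasterize_triangle(v0, v1, v2, width, height):
    """Return the set of (x, y) pixel coords covered by the triangle."""
    pixels = set()
    for y in range(height):
        for x in range(width):
            # Sample at the pixel center
            px, py = x + 0.5, y + 0.5
            w0 = edge(*v1, *v2, px, py)
            w1 = edge(*v2, *v0, px, py)
            w2 = edge(*v0, *v1, px, py)
            # Inside if all three edge functions agree in sign
            if (w0 >= 0 and w1 >= 0 and w2 >= 0) or \
               (w0 <= 0 and w1 <= 0 and w2 <= 0):
                pixels.add((x, y))
    return pixels
```

A real GPU does essentially this (plus depth testing, interpolation, and shading) in massively parallel hardware, millions of triangles per frame; the per-pixel cost is why polygon budgets exist at all.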

[Image: voxel]

This is what voxel rendering looks like.  No, it is probably not running in real time.

On to light!  Waaaay back in the day (pre-DirectX 6 generation), games didn’t really have “light.”  If you play one of these games, you will notice that everything is just kind of “lit up,” not bright and not dark.  DirectX 6 brought us light in a “cheating” sort of way via lightmaps (lightmaps were around before DirectX, notably in the Glide API, but who’s keeping track?).  A lightmap is a second texture applied to a model that imitates light falling on it.  These look pretty good at first blush, but pretty soon you start to realize that the light never moves (lightmaps are static) and that it’s just a texture.  DirectX 7 brought us hardware transform and lighting (T&L) support, a fixed-function technique for shading objects with light that has largely gone away as better methods were developed.
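The “cheat” in lightmapping is just a per-texel multiply: darken the base texture wherever the precomputed lightmap says the light is dim.  A minimal sketch in Python (the function name and data layout are mine, purely for illustration):

```python
def apply_lightmap(base, lightmap):
    """Modulate a base texture by a precomputed lightmap.

    base:     2-D grid of RGB texels, channels 0..255
    lightmap: 2-D grid of brightness values, 0.0 (dark) .. 1.0 (lit)
    """
    return [
        [tuple(int(c * l) for c in texel) for texel, l in zip(row, lrow)]
        for row, lrow in zip(base, lightmap)
    ]
```

Because the multiply is baked against a fixed lightmap, the result is exactly the static look described above: move the in-game light source and nothing on the wall changes.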

[Image: dx6light]

This is what lightmapping looks like.  Just in case you felt nostalgic.

By better methods, I mean shaders (DirectX 8)!  In my previous article, I emphasized how shaders revolutionized 3D gaming, but the focus there was on their role in simulating additional polygons.  Their primary purpose, however, was lighting.  Shaders have performed numerous lighting functions.  Some of these include light “shading” objects with its color (hence the name “shader”), volumetric lighting/crepuscular rays (light rays from a single bright source, like the sun behind a cloud or Helios in God of War III), and light beam effects (Bioshock Infinite is a good example of this).
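The simplest of those lighting functions, the light “shading” an object with its color, is classic Lambertian diffuse shading.  Real shaders are written in GPU languages like HLSL and run once per vertex or pixel, but the math per pixel is just a dot product; here is that math as a hedged Python sketch (names are mine):

```python
import math

def normalize(v):
    length = math.sqrt(sum(c * c for c in v))
    return tuple(c / length for c in v)

def lambert_shade(normal, light_dir, light_color, surface_color):
    """What a basic diffuse pixel shader computes: the surface color is
    tinted by the light's color and scaled by the angle of incidence."""
    n, l = normalize(normal), normalize(light_dir)
    # Dot product = cosine of the angle between surface normal and light;
    # clamp at zero so surfaces facing away from the light go dark.
    intensity = max(0.0, sum(a * b for a, b in zip(n, l)))
    return tuple(lc * sc * intensity
                 for lc, sc in zip(light_color, surface_color))
```

A red light over a white wall yields a red wall, which is exactly the “shading objects with its color” effect that gave shaders their name.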

[Image: crepuscular]

Crepuscular rays.  I wanted a screenshot of Helios from GoW III, but I couldn’t find one that showcased it.  Sorry!

In DirectX 9.0c, a major change was made to the lighting model.  Previously, light sources were assigned a numerical value for their brightness, with 0 being fully dark and 1.0 being fully bright.  So a really bright light (like the sun) would be given a value of 1.0, while a 40-watt lightbulb might be given a value of, say, 0.2.  Reflective surfaces (water, mirrors, wet rocks, etc.) were given a value for how much light they reflected.  A mirror might reflect almost all light, a lake maybe 80%, and a wet rock perhaps 50%.  But is this realistic?  If the sun shines on a wet rock in real life, maybe the reflection loses 50% of its brightness.  But 50% of really freaking bright is still really freaking bright.  In a game, however, that wet rock is going to reduce the sun from 1.0 to 0.5.  It isn’t very bright anymore!

To accommodate this, DirectX was adjusted to allow for much higher brightness levels.  The sun could now be given a stupid brightness like 100,000,000.  It will still display at full brightness on a monitor, but when reflections and the like are taken into account, they will display properly.  This is called High Dynamic Range Rendering (HDRR).
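The two halves of the idea are easy to show in a few lines: keep scene brightness unbounded while reflecting, then squash the result into the monitor’s 0..1 range at the very end (a step called tone mapping).  This sketch uses the simple Reinhard curve `x / (1 + x)` as the tone mapper; that specific choice is mine, not something the article specifies:

```python
def reflect(radiance, reflectance):
    # In HDR, brightness is unbounded: a 50% reflection of a very
    # bright source is still very bright.
    return radiance * reflectance

def reinhard_tonemap(radiance):
    """Map an unbounded HDR value into the 0..1 range a monitor can show."""
    return radiance / (1.0 + radiance)
```

With the old 0..1 model, the sun on a wet rock displayed at half brightness; here, half of 100,000,000 still tone-maps to essentially full white, which matches how the scene actually looks.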

[Image: hdrr]

A decent before and after of HDRR.  I like the example for bloom below better, though.

Another addition in DirectX 9 was bloom lighting (although it was simulated in Ico much earlier).  When you take a picture with a bright light coming through a window, say, the light will “bleed” around the edges.  This effect is caused by light scattering and diffracting inside the camera.  It isn’t really important why it happens; the point is, it happens.  Bloom lighting enables game developers to simulate this light bleed.
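Real bloom implementations typically extract the over-bright parts of the frame, blur them, and add the blur back on top.  Here is a deliberately tiny 1-D version of that idea in Python (the threshold and spread values are arbitrary assumptions of mine):

```python
def bloom(pixels, threshold=1.0, spread=0.25):
    """Toy 1-D bloom: brightness above the threshold bleeds into neighbors."""
    # Bright pass: keep only the light in excess of the threshold
    bright = [max(0.0, p - threshold) for p in pixels]
    out = list(pixels)
    for i, b in enumerate(bright):
        if b > 0.0:
            # Bleed a fraction of the excess into adjacent pixels
            for j in (i - 1, i + 1):
                if 0 <= j < len(out):
                    out[j] += b * spread
    return out
```

In a real renderer the “bleed” step is a wide 2-D Gaussian blur on the GPU, but the effect is the same: dark pixels next to a very bright one get brightened, producing the halo around a sunlit window.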

[Image: bloom]

Bloom, plus some really nice HDRR!

So what is the future?  The first and biggest change to look forward to is particles.  Lots and lots of particles.  These small, light-emitting points will allow developers to create more realistic-looking fire, and will allow for neat light effects like light shining on dust particles.
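A particle system is conceptually simple: spawn lots of tiny points, move each one every frame, and kill the ones that have burned out.  This is a generic sketch of that loop in Python, with made-up velocity ranges tuned vaguely toward “fire,” not any specific engine’s API:

```python
import random

def spawn_particle(origin):
    """One light-emitting point with a random upward drift (fire-like)."""
    return {
        "pos": list(origin),
        "vel": [random.uniform(-0.5, 0.5), random.uniform(1.0, 2.0)],
        "life": 1.0,  # brightness fades as the particle ages
    }

def update(particles, dt):
    """Advance every particle by one timestep; drop the burned-out ones."""
    for p in particles:
        p["pos"][0] += p["vel"][0] * dt
        p["pos"][1] += p["vel"][1] * dt
        p["life"] -= dt
    return [p for p in particles if p["life"] > 0.0]
```

The reason particles are a “future” feature is volume: a convincing fire or dust cloud needs this update running for tens of thousands of points per frame, each one potentially emitting or catching light.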

[Image: particles]

Particles.  Wow.  Just wow.

Another change applies to shadows.  I could write an entire article on shadow rendering (maybe I will!), but for now, it is enough to say that the biggest problem with shadow rendering is the way light tends to “bleed” around things.  Shadows in general are very hardware intensive, so newer hardware is always better.  Newer video cards allow for Multi-View Soft Shadows, which more realistically simulate how light casts and influences shadows.
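Multi-View Soft Shadows is a specific vendor technique, but the core idea behind soft shadows is older and simpler: instead of a single binary “in shadow or not” depth test against the shadow map, average several nearby tests so the shadow edge comes out as a gradient.  That classic approach is called percentage-closer filtering (PCF); here is a hedged sketch of it in Python (the grid-of-depths representation is mine):

```python
def pcf_shadow(shadow_map, x, y, pixel_depth, radius=1):
    """Percentage-closer filtering: average several shadow-map depth tests
    around (x, y) so shadow edges come out soft instead of hard."""
    hits, total = 0, 0
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            sx, sy = x + dx, y + dy
            if 0 <= sy < len(shadow_map) and 0 <= sx < len(shadow_map[0]):
                total += 1
                # Lit if nothing in the map is closer to the light than us
                if pixel_depth <= shadow_map[sy][sx]:
                    hits += 1
    return hits / total  # 0.0 = fully shadowed, 1.0 = fully lit
```

A pixel near a shadow boundary gets a fractional value like 0.66 instead of 0 or 1, which is exactly the diffuse edge visible in the screenshot below.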

[Image: shadow]

Notice how the shadows diffuse.

Groan inducing pun time: The future of 3D gaming is looking bright!

I’m so sorry.

  1. Minimur12:

    Nice one Tstrauss, :D what’s next?!?

  2. Harski:

    Thanks, I really love these articles! You could’ve mentioned ray tracing though ;)

    1. TStrauss:

      Ray tracing is gorgeous, but falls into the “never gonna happen” category. I’ve seen some tech demos, but these are always running on high-end hardware and rendering relatively simple scenes.

      1. hgoel0974:

        I wouldn’t say never going to happen. These days high-end hardware could easily play a raytraced game, and today’s high-end is tomorrow’s low-end ;)

        1. TStrauss:

          The problem isn’t brute-force power, it’s architecture and hardware acceleration. CUDA/compute cores are being used to do raytracing on GPUs today, but that isn’t hardware acceleration, it’s parallelization of tasks previously performed on the CPU.

          If some future GPU architecture sports a built-in raytracing pipeline with proper hardware acceleration (it could happen, I guess), it might happen. But given the current direction of 3D, rasterization is probably going to be around for the long haul.

      2. Yuki:

        Have you seen

        1. Yuki:

          *http://www.youtube.com/watch?v=aLPAHHb8j20

          1. jason:

            There are some programs that use raytracers today, like KeyShot. Hell, I built my own raytracer using C++. However, it’s not practical for real time: it needs to render an image every frame, and there’s a lot of noise in the engine (when used in real time). I do think that at some point in the future we’ll have raytracers as our game engines, though. Nvidia’s CUDA cores seem to give us a glimpse of it.

          2. TStrauss:

            Yeah, that video is raytraced. I tried to run the executable on my computer to see if it was real time, but got some error. Looking through the comments, it appears that those who did get it to run did not enjoy a high framerate. Also, the geometry is very simple in this video. Imagine raytracing a human being.

            So, er, thanks for proving my point?

  3. JiachengWeng:

    I think light is the best and most effective way to make video games more realistic. I also hope developers find some way to simplify the functions for creating shadows and ray effects, so that those perfect effects can be fully used on portable devices like the PS Vita! Haha!

  4. ZacUAX:

    It’s funny. Even though it’s less realistic, I much prefer the side of the picture without HDR on. I prefer bright, vibrant colors in my video games.

  5. foxman:

    I dunno that much about game engines, but I did watch the 2013 GDC FoxEngine demo. I’m disappointed that it was not mentioned, because a giant part of the FoxEngine segment was reflective surfaces, lighting effects, and so on. In fact, the trailer for Metal Gear Solid: Ground Zeroes was even redone in daytime to show the effect during the conference. It’s on YouTube: http://www.youtube.com/watch?v=17nje72VnPE (search “FoxEngine GDC 2013” if the link fails). I just feel that it may be a significant detail to add, as it is supposed to be for current gen, running on PS3-spec hardware, though it will be multiplatform.

  6. Bullshiters:

    Nice article as always!

    Talk about BOOB physics next ;)

  7. gunblade:

    What about photoshopping a bunch of 3D HD pictures onto a wireframe… 3D holograms… (cool 3D views, but the color…)

  8. gunblade:

    Ooh, on the lighting thing, I guess it makes sense now, the difference between light on a rock and light on grass: the grass might give a little green glow through the light that passes through it, whereas light on a rock would be more shiny, like on granite…

  9. gunblade:

    Like even that picture with the fire, the lighting would be more see-through…

  10. svenn:

    Why are voxel parts calculated on the CPU? As far as I understand, the GPU is good at doing repetitive tasks that can be run in parallel, e.g. rendering images. I ask because you seem to know something about it :) Nice article!

