Long time no post. I've been busy with my indie game project for the past year, so Project Eden is taking a back seat. However, I recently had to update my old scene files for someone, so I decided to take a closer look at how I created our Earth's natural satellite - the Moon.
This is not quite a "making-of" post, but I do want to bring up a few points. Firstly - gamma correction (and, indirectly, linear workflow). When I started playing with CG from the mid-90s to early 2000s, I had no concept of non-linearity or gamma. I lit my 3D scenes within whatever 3D program I used (mostly LightWave back then) and the render output was usually the final product.
Here's an image of a typical Lambert shaded surface. The image on the left is the original rendered output. However, if those calculated pixel intensity values are meant to correspond to real world intensities, the image on the right is more visually correct:
Consider a simple gradient from 0 to 1, where 0 is black and 1 is full-intensity white. Intuitively, 0.5 should appear as 50% gray. This is indeed how it appears to computer users, because all display devices have been "corrected" to work this way.
We call it a "correction" because in reality all display devices are non-linear (a 50% input signal will not give 50% output brightness), and our eyes are also non-linear (we are more sensitive to changes at low light levels and less so at brighter ones). We perceive a large intensity jump between the values 0 and 0.1, while a linear value of 0.5 appears to us more like a 0.7 or 0.8.
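As a quick worked example of those numbers (plain Python, using the common power-law approximation of a 2.2 gamma curve):

```python
# Display non-linearity: a 50% input signal gives well under 50% light output.
print(0.5 ** 2.2)        # ~0.22

# Perceptual side: a linear intensity of 0.5 reads as roughly 0.73 once encoded,
# which is why it looks like a fairly bright gray rather than middle gray.
print(0.5 ** (1 / 2.2))  # ~0.73
```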
The problem only really becomes important when we are trying to create photorealistic renders. We need to compensate for any gamma-corrected input to the CG pipeline by removing its existing gamma correction (generally 2.2 for PC displays and JPEG images, and 1.8 for Mac displays), so that all working data are properly linear.
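As a sketch of that compensation step, assuming an 8-bit, gamma-2.2-encoded texture value and a simple power-law curve (a proper sRGB decode has a short linear toe, but the idea is the same):

```python
def texel_to_linear(value_8bit, gamma=2.2):
    """Remove the encoding gamma from an 8-bit texel, returning linear light."""
    return (value_8bit / 255.0) ** gamma

# A painted mid-gray of 128 is only about 22% of full intensity in linear light.
print(texel_to_linear(128))  # ~0.22
```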
In summary: in the real world, light works linearly. Our eyes, however, perceive light intensity in a non-linear fashion. Display devices also output intensity non-linearly relative to their input. Hence the whole bunch of gamma corrections working behind the scenes of every computer screen, digital camera and JPEG image - all so that we humans can use Photoshop and RGB values in an intuitive and consistent manner. But when we are simulating physical light for photorealistic 3D renders, we need to remove such gamma correction while working within the CG lighting environment, and add it back at the final stage, just before outputting to a display device.
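Put together, a hypothetical end-to-end sketch of that workflow might look like this (the albedo and light values here are placeholders, not the actual Moon setup):

```python
GAMMA = 2.2

def to_linear(encoded, gamma=GAMMA):
    """Remove gamma from painted or photographed inputs (JPEG textures, colour swatches)."""
    return encoded ** gamma

def to_display(linear, gamma=GAMMA):
    """Re-apply gamma once, at the very end, just before the image hits a screen."""
    return linear ** (1.0 / gamma)

# 1. Linearize the inputs.
albedo = to_linear(0.5)              # mid-gray as picked in Photoshop

# 2. Do all the lighting math in linear space, where light adds up physically.
lit = albedo * 0.8 + albedo * 0.3    # contributions from two lights simply sum

# 3. Gamma-correct the final value for the display.
pixel = to_display(min(lit, 1.0))
print(pixel)                         # ~0.52
```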
In short: all computer-rendered images should have gamma correction added if the CG lighting is to behave anything like its real-world counterpart. While this is true, the workflow is not quite as simple as that.
If you find what I said confusing, here's a good explanation.
Looking back at the Lambert spheres above: given the calculated output intensities, the image on the right is what our eyes would see if those intensities corresponded to real brightness values, even though the image on the left may look more aesthetically pleasing.
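To put numbers on that, here is a minimal Python sketch (not taken from the original scene files) of the Lambert falloff across a sphere, printed once raw and once pushed through a 2.2 display gamma:

```python
import math

def lambert(n_dot_l):
    """Lambertian diffuse term: linear-light intensity for a given N.L."""
    return max(n_dot_l, 0.0)

def to_display(linear, gamma=2.2):
    """Encode a linear intensity for a gamma-2.2 display."""
    return linear ** (1.0 / gamma)

# Angle between the surface normal and the light, from facing the light (0 deg)
# out to the terminator (90 deg).
for angle_deg in (0, 30, 60, 80, 90):
    raw = lambert(math.cos(math.radians(angle_deg)))
    print(f"{angle_deg:3d} deg   linear {raw:.3f}   corrected {to_display(raw):.3f}")
```

The mid-range values brighten considerably after correction: the corrected sphere holds its brightness further around the limb before falling off, which is the difference between the two images.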
Next, I would like to consider diffuse shaders. Traditionally, the diffuse shading aspect of a typical CG material would be a Lambert shader, and the specular shading aspect would be Phong or Blinn. In the ongoing quest for photorealism, the industry defaults are now Oren-Nayar for diffuse and Cook-Torrance for specular. Both are more computationally intensive but give results that better match real-world materials.
The Oren-Nayar shader has a roughness component which allows us to emulate matte materials such as chalk and clay much better than Lambert can. Below are some comparison images between Lambert and Oren-Nayar at different gamma values:
So what does all this technical stuff have to do with the Moon? Well, firstly, the Moon's surface is not simply Lambertian. When was the last time you saw darkening at the edges on a full-moon night? Never! We know the Moon is actually spherical, but it still appears quite flat because of the way its surface scatters sunlight. We can simulate its roughness and light response better using an Oren-Nayar shader.
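For the curious, here is a rough Python sketch of the widely used simplified ("qualitative") form of the Oren-Nayar diffuse term. It is only an illustration of the model, not the shader network from the actual scene, and it reduces to plain Lambert when the roughness is zero:

```python
import math

def _dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def _normalize(v):
    length = math.sqrt(_dot(v, v))
    return tuple(x / length for x in v) if length > 0 else v

def oren_nayar(normal, light_dir, view_dir, albedo, roughness):
    """Qualitative Oren-Nayar diffuse term; roughness 0 gives plain Lambert."""
    n, l, v = _normalize(normal), _normalize(light_dir), _normalize(view_dir)
    cos_i, cos_r = _dot(n, l), _dot(n, v)
    if cos_i <= 0.0:
        return 0.0
    sigma2 = roughness * roughness
    a = 1.0 - 0.5 * sigma2 / (sigma2 + 0.33)
    b = 0.45 * sigma2 / (sigma2 + 0.09)
    theta_i = math.acos(max(-1.0, min(1.0, cos_i)))
    theta_r = math.acos(max(-1.0, min(1.0, cos_r)))
    alpha, beta = max(theta_i, theta_r), min(theta_i, theta_r)
    # Azimuthal difference between light and view, measured in the tangent plane.
    l_proj = _normalize(tuple(l[i] - n[i] * cos_i for i in range(3)))
    v_proj = _normalize(tuple(v[i] - n[i] * cos_r for i in range(3)))
    cos_phi_diff = max(0.0, _dot(l_proj, v_proj))
    return albedo * cos_i * (a + b * cos_phi_diff * math.sin(alpha) * math.tan(beta))

# Near the limb of a sphere lit and viewed from the same direction (a full moon),
# Lambert darkens sharply while a rough Oren-Nayar surface stays comparatively bright.
limb_normal = (math.sin(math.radians(80)), 0.0, math.cos(math.radians(80)))
light = view = (0.0, 0.0, 1.0)
print("Lambert:    ", oren_nayar(limb_normal, light, view, 1.0, 0.0))  # ~0.17
print("Oren-Nayar: ", oren_nayar(limb_normal, light, view, 1.0, 0.9))  # ~0.50
```

That retained brightness towards the edge is exactly the flat, even look of the full moon described above.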
Secondly, we need to add gamma correction to the rendered output so that the CG-rendered intensities appear to our eyes the way real ones would. Thankfully, Houdini's built-in image display utility "mplay" defaults to a gamma of 2.2, so if you are using Houdini you should already be seeing gamma-corrected output.
Thanks for reading. Here's the CG Moon:
Enjoy!