
Sunday, December 7, 2014

Wanderers

An epic short film called "Wanderers" by Erik Wernquist:


Wanderers - a short film by Erik Wernquist from Erik Wernquist on Vimeo.

Words and voice by Carl Sagan:
For all its material advantages, the sedentary life has left us edgy, unfulfilled. Even after 400 generations in villages and cities, we haven’t forgotten. The open road still softly calls, like a nearly forgotten song of childhood. We invest far-off places with a certain romance. This appeal, I suspect, has been meticulously crafted by natural selection as an essential element in our survival. Long summers, mild winters, rich harvests, plentiful game—none of them lasts forever. It is beyond our powers to predict the future.
Catastrophic events have a way of sneaking up on us, of catching us unaware. Your own life, or your band’s, or even your species’ might be owed to a restless few—drawn, by a craving they can hardly articulate or understand, to undiscovered lands and new worlds.
Herman Melville, in Moby Dick, spoke for wanderers in all epochs and meridians: “I am tormented with an everlasting itch for things remote. I love to sail forbidden seas…”
Maybe it’s a little early. Maybe the time is not quite yet. But those other worlds—promising untold opportunities—beckon.
Silently, they orbit the Sun, waiting.

Thursday, July 3, 2014

More Things in Heaven and Earth...

Lightning is a fascinating natural phenomenon, and recent research is giving us more insight into less commonly seen upper-atmospheric lightning phenomena such as sprites, ELVES and blue jets.



Shakespeare rings true once again in this quote from Hamlet: "There are more things in heaven and earth, Horatio, Than are dreamt of in your philosophy."

Friday, May 9, 2014

Watch the Earth from orbit in real-time!

Anyone with an internet connection can now observe the Earth from space, live from the International Space Station!

The High Definition Earth Viewing (HDEV) experiment aboard the ISS was activated on April 30, 2014. The experiment includes several commercial HD video cameras mounted on the External Payload Facility of the European Space Agency's Columbus module.


Check out the streaming HDEV imagery with a real-time ISS location display. There are three cameras: one pointing forward, one backward, and one downward. Note that when the ISS is on the night side of the Earth we see a black scene, and when the cameras are switching or the communications downlink is unavailable we see a gray scene. Otherwise, it's a glorious live HD video of our beautiful planet from space.

When I started Project Eden in 2011, I tried to use scientific facts and artistic imagination to visualize how the Earth looks from space. Now I'm so happy to be able to see the real thing, despite the limited resolution of the video stream.

Even though I don't know any of them, I'm very grateful to the people at NASA and ESA, the high-school students who participated in the HUNCH program (High Schools United with NASA to Create Hardware), and everyone else involved in making this experiment possible. Perhaps a higher perspective will make more of us appreciate our planet and environment better, realize we are all part of a bigger existence, and lessen the illusion that our individual agendas are as important as we make them out to be.

Enjoy your view!

Sunday, March 16, 2014

Fly Me to the Moon

Long time no post. I've been busy with my indie game project for the past year, so Project Eden is taking a back seat. However, I recently had to update my old scene files for someone, so I decided to take a closer look at how I created our Earth's natural satellite - the Moon.

This is not quite a "making-of" post, but I do want to bring up a few points. Firstly - gamma correction (and, indirectly, linear workflow). When I was playing with CG from the mid-90s to the early 2000s, I had no concept of non-linearity or gamma. I lit my 3D scenes within whatever 3D program I used (mostly LightWave back then) and the render output was usually the final product.

Here's an image of a typical Lambert shaded surface. The image on the left is the original rendered output. However, if those calculated pixel intensity values are meant to correspond to real world intensities, the image on the right is more visually correct:


Consider a simple gradient from 0 to 1, where 0 is black and 1 is full-intensity white. Intuitively, 0.5 should appear as 50% gray. This is indeed the case for computer users, because all display devices have been "corrected" to work this way.

We call it a "correction" because in reality all display devices are non-linear (a 50% input signal will not give 50% output), and our eyes are also non-linear (we are more sensitive at low light levels and less so at brighter ones). We perceive a large intensity jump between values of 0 and 0.1, and an input value of 0.5 looks to our eyes more like a 0.7 or 0.8.

The problem only really becomes important when we are trying to create photorealistic renders. We need to compensate for any gamma corrected input into the CG pipeline by removing any existing gamma correction (generally 2.2 for PC displays and JPEG images, and 1.8 for Mac displays), so that all working data are properly linear.
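To make the two gamma steps concrete, here's a minimal Python sketch. The function names are just for illustration; real pipelines use proper sRGB/color-management transforms rather than a bare power curve:

```python
# A minimal sketch of the two gamma steps in a linear workflow.
DISPLAY_GAMMA = 2.2  # the PC/JPEG figure mentioned above

def to_linear(value, gamma=DISPLAY_GAMMA):
    """Remove display gamma from an input (texture/JPEG) value in 0..1."""
    return value ** gamma

def to_display(value, gamma=DISPLAY_GAMMA):
    """Re-apply display gamma to a linear render value before viewing."""
    return value ** (1.0 / gamma)

print(to_linear(0.5))   # ~0.22: a "mid-gray" pixel is much darker in linear light
print(to_display(0.5))  # ~0.73: a linear 0.5 displays as roughly 70-80% gray
```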

In summary: in the real world, light works linearly. Our eyes, however, perceive light intensity in a non-linear fashion. Display devices also output intensity non-linearly relative to their input. Hence the whole bunch of gamma corrections working behind the scenes of every computer screen, digital camera and JPEG image - all so that we humans can use Photoshop and RGB values in an intuitive and consistent manner. But when we are simulating physical light for photorealistic 3D renders, we need to remove such gamma correction while working within the CG lighting environment, and add it back at the final stage, just before outputting to a display device.

In short: all computer-rendered images should have gamma correction added at output if the CG lighting is to behave anything like real-world light. While this is true, the workflow is not quite as simple as that.

If you find what I said confusing, here's a good explanation.

Looking back at the Lambert spheres above, given the calculated output intensities, the image on the right is what our eyes would see if those intensities correspond to brightness values in reality, even though the image on the left may look more aesthetically pleasing.

Next, I would like to consider diffuse shaders. Traditionally, the diffuse shading aspect of a typical CG material would be a Lambert shader, and the specular shading aspect would be Phong or Blinn. In the ongoing quest for photorealism, the industry defaults are now Oren-Nayar for diffuse and Cook-Torrance for specular. Both are more computationally intensive but give results that better match real-world materials.

The Oren-Nayar shader has a roughness component which lets us emulate matte materials such as chalk and clay much better than Lambert can. Below are some comparison images of Lambert and Oren-Nayar at different gamma values (a small code sketch of both diffuse terms follows the images):



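For readers who like to see the math, here is a rough Python sketch of the two diffuse terms, using the standard two-term Oren-Nayar approximation. This is a qualitative illustration only, not Houdini's (or any renderer's) exact shader code:

```python
import numpy as np

def lambert(n, l, albedo=1.0):
    """Lambert diffuse: albedo * max(0, N.L). Directions are unit vectors."""
    return albedo * max(0.0, float(np.dot(n, l)))

def oren_nayar(n, l, v, albedo=1.0, sigma=0.3):
    """Two-term Oren-Nayar approximation; sigma is surface roughness (radians).

    With sigma = 0 this reduces to Lambert. A qualitative sketch only;
    production shaders may differ in normalization and details.
    """
    n, l, v = (np.asarray(x, dtype=float) for x in (n, l, v))
    n, l, v = (x / np.linalg.norm(x) for x in (n, l, v))
    cos_i = float(np.dot(n, l))   # cos(theta_i), angle to the light
    cos_r = float(np.dot(n, v))   # cos(theta_r), angle to the viewer
    if cos_i <= 0.0:
        return 0.0
    theta_i = np.arccos(np.clip(cos_i, -1.0, 1.0))
    theta_r = np.arccos(np.clip(cos_r, -1.0, 1.0))
    alpha, beta = max(theta_i, theta_r), min(theta_i, theta_r)
    # Azimuthal difference between light and view, projected onto the surface plane.
    l_proj = l - n * cos_i
    v_proj = v - n * cos_r
    denom = np.linalg.norm(l_proj) * np.linalg.norm(v_proj)
    cos_phi_diff = float(np.dot(l_proj, v_proj)) / denom if denom > 1e-8 else 0.0
    s2 = sigma * sigma
    A = 1.0 - 0.5 * s2 / (s2 + 0.33)
    B = 0.45 * s2 / (s2 + 0.09)
    return albedo * cos_i * (A + B * max(0.0, cos_phi_diff) * np.sin(alpha) * np.tan(beta))
```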
So what does all this technical stuff have to do with the Moon? Well, firstly, the Moon's surface is not simply Lambertian. When was the last time you saw darkening at the edges on a full-moon night? Never! We know the Moon is actually spherical, but it still appears quite flat because of the way its surface scatters sunlight. We can simulate its roughness and light response better using an Oren-Nayar shader.
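To see that flattening qualitatively, we can reuse the sketch above with full-moon geometry - the light and view directions nearly coincide, and near the limb the surface normal is almost perpendicular to both. The vectors here are made-up toy values, not real Sun/Moon positions:

```python
import numpy as np

# Full-moon geometry: the Sun sits behind the viewer, so light and view
# directions are (nearly) the same. Toy vectors for illustration only.
limb_normal = np.array([0.2, 0.98, 0.0])     # a point near the Moon's limb
to_sun = to_eye = np.array([1.0, 0.0, 0.0])

print(lambert(limb_normal / np.linalg.norm(limb_normal), to_sun))  # ~0.2: Lambert darkens toward the limb
print(oren_nayar(limb_normal, to_sun, to_eye, sigma=0.5))          # noticeably higher: the rough surface stays bright
```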

Secondly, we need to add gamma correction to the rendered output so we can see the CG rendered intensities as our eyes would. Thankfully, Houdini's built-in image display utility "mplay" defaults to a gamma of 2.2, so if you are using Houdini you should already be seeing gamma corrected output.

Thanks for reading. Here's the CG Moon:


Enjoy!

Monday, February 10, 2014

Leap of Faith

Félix brought to my attention a very nice visualization of the Earth made by Robert Hodgin. As I understand it, this was a real-time display created as a backdrop for a cello performance at the World Economic Forum in Davos, Switzerland. There is a good breakdown of some of the techniques used to create the visuals on Robert Hodgin's website. Check out his Oculus Rift: Gravity project too ^^

 

Friday, February 7, 2014

Welcome to the Anthropocene

I recently got in touch with Félix Pharand-Deschênes from GLOBAÏA. They create excellent 3D visualizations and graphics, many of which feature the Earth. I'm a nut for seeing the big picture and understanding the world from a higher perspective, and I respect the work they do.

As a race, we humans are reshaping the surface of our Earth far faster and more drastically than nature does. If we squeeze the Earth's 4.5-billion-year history into 24 hours, dinosaurs only appeared around 11pm (and ruled the Earth for 160 million years), and the human race has existed for less than a minute (modern humans, with say 10,000 years of recorded history, account for only about 0.2 seconds). Yet in this comparatively brief existence, an estimated 75% of the Earth's land surface has been reshaped by humans. Such significant global impact on Earth's ecosystems has led scientists to coin the term "Anthropocene" to mark our current geological epoch - the Human Epoch.
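If you want to check the 24-hour analogy yourself, the scaling is a few lines of Python (the ages below are rough, rounded figures, not precise geological dates):

```python
# Map "years before present" onto a 24-hour clock of Earth's history.
SECONDS_PER_DAY = 24 * 60 * 60
EARTH_AGE_YEARS = 4.5e9

def seconds_before_midnight(years_ago):
    """How long before 'midnight' (the present) an event falls on the 24-hour clock."""
    return years_ago / EARTH_AGE_YEARS * SECONDS_PER_DAY

print(seconds_before_midnight(230e6) / 3600)  # dinosaurs appear ~1.2 hours before midnight (around 10:45 pm)
print(seconds_before_midnight(2.5e6))         # the genus Homo: roughly 48 seconds
print(seconds_before_midnight(10_000))        # recorded history: about 0.2 seconds
```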

Check out GLOBAÏA's cool visualization of how we as a race have been affecting our planet from the Industrial Age to the present day: