

17/07/2009

HDR images on Linux (part 1)

First of all: this is not a mathematically rigorous article about HDR, so bear with me.

There’s been a trend in photography around HDR images, how to produce them, and how to work with them. Step by step:

HDR ?

HDR stands for High Dynamic Range. A picture shot with a DSLR (or a compact camera, for that matter) usually has a quite limited dynamic range: you get at most about 11 stops between the brightest part of the image and the darkest, which means a contrast ratio of roughly 2^11 ≈ 2000:1.

This is a limitation that’s hard to overcome. Imagine you shoot an image of the sky, including the sun. You’ll get a burned-out image, since the sun is very bright, and when you check the image the sun is just white (255, 255, 255). Now go and shoot anything else that’s white. You’ll probably end up with the same white pixels.

Does this mean that those pixels are as bright as the sun? Of course not.

The problem lies in the dynamic range that your camera is able to capture. The sun “exceeds” it by far, and your image gets clipped.

Now, imagine that there’s a method to obtain, store and represent images with more dynamic range. The sun would be, for example, 100% bright, the sky would be 9%, and your white pixels 0.001%. That’s an HDR image.
An HDR image is nothing more than a 32-, 48- or even 96-bit image (whereas a normal computer image is just 24-bit).
With such a bit depth we can store floating-point values per channel, and with those we can represent HDR images.
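To make the numbers concrete, here is a tiny sketch in Python with NumPy (the luminance values are made up purely for illustration) showing how an 8-bit image clips all three “whites” to the same value, while a floating-point image keeps their real proportions:

    import numpy as np

    # Made-up relative luminances, just for illustration:
    # the sun, the sky, and a sheet of white paper.
    sun, sky, paper = 100000.0, 9000.0, 1.0

    # 8-bit image: scale so the paper is "white", then clip.
    # All three end up at 255 and look identical.
    ldr = np.clip(np.array([sun, sky, paper]) / paper * 255, 0, 255).astype(np.uint8)
    print(ldr)                 # [255 255 255]

    # 32-bit float HDR image: the real proportions survive.
    hdr = np.array([sun, sky, paper], dtype=np.float32)
    print(hdr / hdr.max())     # [1.0  0.09  1e-05]  -> 100%, 9%, 0.001%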

Creating HDR with a normal camera

If a camera has a limited dynamic range, the question is how to create an HDR image at all. The answer is to take several shots at different exposures and let software combine and interpolate that information into an HDR image. As simple and as complex as that.
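I’ll use GIMP and dedicated tools later on, but just to sketch what such software does internally, here is roughly how the Debevec merge in OpenCV’s Python bindings would combine three bracketed shots (the file names and exposure times below are placeholders, not real data):

    import cv2
    import numpy as np

    # Three hypothetical bracketed shots and their exposure times in seconds.
    files = ["under.jpg", "normal.jpg", "over.jpg"]
    times = np.array([1 / 500, 1 / 125, 1 / 30], dtype=np.float32)
    images = [cv2.imread(f) for f in files]

    # Recover the camera response curve, then merge the shots into a single
    # floating-point radiance map (Debevec & Malik's method).
    response = cv2.createCalibrateDebevec().process(images, times)
    hdr = cv2.createMergeDebevec().process(images, times, response)

    # Save as a Radiance .hdr file: 32-bit float per channel, no clipping.
    cv2.imwrite("merged.hdr", hdr)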

Most DSLRs have bracketing options. With this feature you can take several (usually 3) different shots quite fast. The first is taken 1 or 2 stops underexposed, the second is correctly exposed (or whatever the camera thinks is correct), and the third one is 1 or 2 stops overexposed.
I usually set up my trusty D70 in “sequence mode”, so that these 3 shots are taken without having to press and release the button 3 times, just by holding the button down.

In any case, a tripod is nearly a must; otherwise, be prepared to fight ghosting. Also, if you set each shot manually, try to adjust only the exposure time, since changing the aperture gives a different depth of field and the images will not match.

Where does The Gimp fit in?

The greatest limitation you’ll find in The Gimp (IMO) is that it is still unable to handle images with more than 8 bits per channel (24 bits per pixel). While this is just fine for most users, pros and advanced amateurs will want to squeeze the maximum out of their images.
In those cases, more than 8 bits per channel is a must.

Of course, this also excludes HDR.

There are ways around this, though.

Exposure Blend

This is a plugin you’ll find under the “Filters > Photo” menu:

The usage is simple: specify the light, normal and dark images:

Leaving the defaults (I have not played with the options), you’ll get:

The result could be better. Clearly, in this specific case, I’d need more than 3 shots, probably 5 or even 7, since there is quite a lot of contrast between the sky and the walls.
Anyway, you get the idea.

In any case, make no mistake: this is not HDR. It’s just a way to merge 3 differently exposed pics into a nicely exposed final result, but it is NOT HDR.
This plugin takes the lighter and darker images and blends them with the correctly exposed one. Nothing else.
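I don’t know the plugin’s exact algorithm, but exposure fusion (Mertens et al.), available for instance in OpenCV’s Python bindings, does a comparable blend and illustrates the point: no radiance map is ever built, so the output is an ordinary low-dynamic-range picture (file names are placeholders again):

    import cv2

    # The same three bracketed shots (placeholder file names).
    images = [cv2.imread(f) for f in ["under.jpg", "normal.jpg", "over.jpg"]]

    # Mertens exposure fusion: weight each pixel by contrast, saturation and
    # well-exposedness, then blend the three shots. The result is a normal
    # 8-bit-per-channel picture, not an HDR radiance map.
    fusion = cv2.createMergeMertens().process(images)   # float image, roughly in [0, 1]
    cv2.imwrite("blended.jpg", (fusion * 255).clip(0, 255).astype("uint8"))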

Real HDR

I’ll talk about working with real HDR images on Linux, and about more advanced topics such as Tone Mapping, The Film Gimp, etc… in the next entry.

