asterroc ([personal profile] asterroc) wrote 2007-08-14 07:21 pm

Photography

I have heard it's possible to take multiple photos at different exposure levels and combine them, and thereby get an image closer to what the human eye sees (since we can perceive both bright and dark areas at once, while a photo is metered for only one or the other). I've seen this done by others, the prime example being the inside of an unlit church w/ light streaming in through stained glass windows; let me know if you know the artist's name.

Anyone have a suggestion on how to do this in P-shop or a similar program? What I have is a series of three photos that my camera took automatically, bracketing one f-stop (or shutter speed) up and down, one right after the other. I'm thinking along the lines of starting with a background that's solid white (or black, or gray), adding three semi-transparent layers, one for each image, and then adjusting the amount of transparency for each until I get something pretty. To make the issue more complicated, some of the images I'm thinking of playing with are of flowers that were swaying a bit in a breeze, so a good combined image might not be possible at all, or might require offsetting the three images so that the flower lines up even though the background doesn't.
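
Something along these lines might work as a starting point for that layering idea. This is only a rough sketch, assuming Python with Pillow and NumPy installed; the filenames and opacity weights are made up, so tweak them until the result looks pretty, just as you would with layer opacity sliders.

```python
import numpy as np
from PIL import Image

# The three bracketed exposures (hypothetical filenames): under, normal, over.
files = ["flower_-1ev.jpg", "flower_0ev.jpg", "flower_+1ev.jpg"]
weights = [0.3, 0.4, 0.3]  # per-layer "opacity"; keep the sum at 1.0

# Load as floating point so intermediate values aren't clipped to 8 bits.
layers = [np.asarray(Image.open(f), dtype=np.float64) for f in files]

# A weighted average of the frames is equivalent to stacking
# semi-transparent layers over a neutral background.
blend = sum(w * layer for w, layer in zip(weights, layers))

# Clip back to the displayable 0-255 range and save.
Image.fromarray(np.clip(blend, 0, 255).astype(np.uint8)).save("flower_blend.jpg")
```

Doing the blend in floating point and only converting back to 8 bits at the end avoids losing detail partway through; for the swaying flowers you would still need to shift or crop the frames by hand before blending so the flower itself lines up.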

[identity profile] zandperl.livejournal.com 2007-08-15 11:50 pm (UTC)
Hm, unfortunately that looks like a very specific tool in Photoshop. I should've specified that I'm actually using Gimp. I will google that though.

[identity profile] kelsin.livejournal.com 2007-08-16 02:03 am (UTC)
Yeah, Gimp's engine is only 8 bits per channel, so it can't handle true HDR. There isn't much on Linux (or open source in general) that can handle HDR, unfortunately.
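
That said, a scripted approach can sidestep the 8-bit editor limit by doing the combination in floating point outside GIMP. Here is a minimal sketch assuming a recent OpenCV build with Python bindings; it uses Mertens exposure fusion rather than a true HDR radiance map, and the filenames are hypothetical.

```python
import cv2
import numpy as np

files = ["flower_-1ev.jpg", "flower_0ev.jpg", "flower_+1ev.jpg"]
images = [cv2.imread(f) for f in files]

# Roughly align the frames first (handles small camera shifts,
# not the flowers moving within the scene).
cv2.createAlignMTB().process(images, images)

# Mertens exposure fusion: keeps the well-exposed parts of each frame,
# working in float internally, with no tone-mapping step needed.
fused = cv2.createMergeMertens().process(images)

# The result is a float image roughly in 0-1; scale back to 8 bits to save.
cv2.imwrite("flower_fused.jpg", np.clip(fused * 255, 0, 255).astype("uint8"))
```

Because the fusion blends the well-exposed regions of each frame directly, the output stays in a normal 8-bit file at the end, so it can still be touched up in GIMP afterward.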