Photography
Aug. 14th, 2007 07:21 pm
I have heard it's possible to take multiple photos at different exposure levels and combine them, and thereby get an image closer to what the human eye sees (since we can perceive both the bright and dark areas at once, while a photo meters for only one or the other). I've seen this done by others, the prime example being the inside of an unlit church with light streaming in through stained glass windows; let me know if you know the artist's name.
Anyone have a suggestion on how to do this in P-shop or a similar program? What I have is a series of three photos that my camera took automatically, bracketing one f-stop (or shutter speed) up and down, one right after the other. I'm thinking along the lines of having a background that's solid white (or black, or gray), then adding three semi-transparent layers, one for each image, and adjusting the amount of transparency for each until I get something pretty. To make the issue more complicated, some of the images I'm thinking of playing with are of flowers that were swaying a bit in a breeze, so a good combined image might not be possible at all, or will require offsetting the three images so that the flower looks good but the background does not.
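(What's being described is essentially HDR merging, or "exposure fusion," which Photoshop can automate with its Merge to HDR feature. For comparison, here is a minimal sketch of the same idea in Python with OpenCV; the filenames are hypothetical stand-ins for the three bracketed frames, and note that the alignment step only compensates for small camera movement, not for a swaying subject, which can still ghost.)

    import cv2
    import numpy as np

    # Hypothetical filenames for the three auto-bracketed shots
    # (underexposed, metered, overexposed).
    files = ["flower_under.jpg", "flower_normal.jpg", "flower_over.jpg"]
    imgs = [cv2.imread(f) for f in files]

    # Roughly align the frames first -- this helps with small camera
    # movements between shots, though not with subject motion.
    cv2.createAlignMTB().process(imgs, imgs)

    # Mertens exposure fusion: blends the well-exposed parts of each
    # frame directly, with no separate tone-mapping step.
    fused = cv2.createMergeMertens().process(imgs)

    # The result is float32 in roughly 0..1; scale to 8-bit to save.
    out = np.clip(fused * 255, 0, 255).astype("uint8")
    cv2.imwrite("flower_fused.jpg", out)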
no subject
Date: 2007-08-15 12:22 am (UTC)
for example, i suspect that our excellent subjective depth of field is in large part an illusion. after all, you notice that something is out of focus mainly when you give it your attention, and when we give something our attention our eyes refocus on it. a shot that's already been exposed doesn't have that luxury, so when our attention shifts within it we notice how much is out of focus.
could our subjectively good dynamic range be in part due to something like that? when we're attending to the darker parts of our field of view we can enlarge our pupils, and we can shrink them again when we attend to the brighter parts.
sorry i don't have any actual help. i have heard of that sort of thing but have never tried it. i suspect it's more in demand these days, as most digital sensors have significantly worse dynamic range than film did.
no subject
Date: 2007-08-15 01:58 am (UTC)
I'm sure you're right that our apparent depth of field is mostly due to how we change focus so quickly; I'm not sure about our dynamic range, but your guess seems reasonable.
no subject
Date: 2007-08-15 02:13 am (UTC)
you are right about the linearity, for pretty much all digital photosensor technologies used in cameras.
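(Linearity is what makes a simple merge possible: for a linear sensor, pixel value is roughly proportional to scene radiance times exposure time, so each bracketed frame can be divided by its exposure time and the results averaged on a common radiance scale. A minimal sketch, assuming the frames are linear raw data already scaled to the range 0..1 rather than gamma-encoded JPEGs:)

    import numpy as np

    def merge_linear(frames, exposure_times):
        # frames: list of float arrays in 0..1 (linear raw data);
        # exposure_times: matching shutter times in seconds.
        # Dividing each frame by its exposure time puts them on a common
        # radiance scale; a hat-shaped weight trusts mid-tones and
        # down-weights clipped highlights and noisy shadows.
        acc = np.zeros_like(frames[0], dtype=np.float64)
        wsum = np.zeros_like(acc)
        for z, t in zip(frames, exposure_times):
            w = 1.0 - np.abs(2.0 * z - 1.0)
            acc += w * (z / t)
            wsum += w
        return acc / np.maximum(wsum, 1e-6)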
no subject
Date: 2007-08-15 03:09 am (UTC)
If you Google HDR and Photoshop, I'm sure you can find tutorials or info.
no subject
Date: 2007-08-15 06:35 am (UTC)
As for Photoshop or similar, I suspect that in addition to adjusting transparencies, you'll need to make a series of mattes and play a bit with the curves -- but you probably already knew that.
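(The mattes in question are essentially luminosity masks: greyscale masks derived from the image's own brightness that control where each exposure shows through. A rough sketch of the same idea outside Photoshop, with hypothetical filenames -- "normal.jpg" exposed for the shadows and "under.jpg" exposed for the highlights:)

    import cv2
    import numpy as np

    base = cv2.imread("normal.jpg").astype(np.float32) / 255
    dark = cv2.imread("under.jpg").astype(np.float32) / 255

    # The matte: near 1 where the base frame is nearly blown out,
    # near 0 elsewhere, with a blur to soften the transition.
    lum = cv2.cvtColor(base, cv2.COLOR_BGR2GRAY)
    matte = np.clip((lum - 0.7) / 0.3, 0, 1)
    matte = cv2.GaussianBlur(matte, (51, 51), 0)
    matte = matte[..., None]          # broadcast over the color channels

    # Layer-style blend: use the darker exposure wherever the matte is white.
    out = (1 - matte) * base + matte * dark
    cv2.imwrite("blended.jpg", (out * 255).astype("uint8"))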
no subject
Date: 2007-08-15 11:44 pm (UTC)
Um, no. Why will I have to make mattes? And I really hope I don't have to play with curves -- that's a pain.
no subject
Date: 2007-08-15 11:49 pm (UTC)
http://en.wikipedia.org/wiki/Photoelectric_effect
http://en.wikipedia.org/wiki/Charge-coupled_device
In short, each pixel is made of a material where, when a photon hits it, an electron pops out (the photoelectric effect -- the only one of Einstein's 1905 papers to actually win him the Nobel Prize). Put together a bunch of them and you get a CCD (astronomy) or a CMOS sensor (commercial digital cameras). Everything other than the sensor is identical in film and digital.
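(A toy model of a single pixel makes the dynamic-range connection concrete: photons free electrons, the pixel's well eventually fills up and clips, and readout adds a noise floor. The numbers below are made up, just order-of-magnitude plausible:)

    import numpy as np

    QE = 0.4            # quantum efficiency: electrons per incident photon
    FULL_WELL = 30000   # electrons the pixel can hold before it clips
    READ_NOISE = 10     # electrons of noise added on readout

    def pixel_signal(photons, rng=np.random.default_rng(0)):
        electrons = rng.poisson(QE * photons)         # shot noise
        electrons = min(electrons, FULL_WELL)         # saturation
        return electrons + rng.normal(0, READ_NOISE)  # read noise

    # Usable dynamic range is roughly full well / read noise:
    print("dynamic range ~", np.log2(FULL_WELL / READ_NOISE), "stops")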
no subject
Date: 2007-08-16 06:59 am (UTC)
> Everything other than the sensor is identical in film and digital.
Ahh -- but it was the nature of the sensor that I wanted to understand better, and I'm afraid I don't have enough physics in my pocket to be able to get quick explanations. :(
Someday -- sometimes I already look forward to retirement...
no subject
Date: 2007-08-16 02:18 pm (UTC)
In B&W film, a photon hits the emulsion and causes a chemical change in the silver halide crystals that makes them turn dark once more chemicals (the developer) are applied. In a CCD, the photon knocks an electron off of the atoms in the material, and a moving electron == current.
no subject
Date: 2007-08-15 02:14 pm (UTC)
http://www.cambridgeincolour.com/tutorials/high-dynamic-range.htm
That looks like a good tutorial. Just Google HDR and Photoshop and you'll find more.