Hello and welcome to yet another topic which was quite a revelation for me as I understood a bit more of digital photography. For many years I was using a DSLR as little more than a glorified point and shoot. While I did know a little bit about the "reciprocity" of ISO, shutter speed and aperture (though I didn't know the term until very recently), I was not quite sure why the camera didn't see a scene the way my eyes saw it. My grey matter was obviously processing it differently from the camera's grey matter.
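(For anyone curious, "reciprocity" simply means the three settings trade off against each other: open the aperture by a stop and you can halve the shutter time, or double the ISO, and you record the same brightness. The little sketch below is only my own illustration of that idea; equivalent_exposure is a made-up helper, and the f-numbers are the usual nominal markings.)

```python
import math

def equivalent_exposure(f_number, shutter_s, iso):
    """Returns a number that stays the same for any combination of aperture,
    shutter speed and ISO that yields the same image brightness:
    log2(N^2 / t) - log2(ISO / 100)."""
    return math.log2(f_number ** 2 / shutter_s) - math.log2(iso / 100)

# Open up one stop and halve the shutter time, or double the ISO and
# halve the shutter time: the value does not change.
print(round(equivalent_exposure(8.0, 1 / 125, 100), 2))    # ~12.97
print(round(equivalent_exposure(5.657, 1 / 250, 100), 2))  # ~12.97 (f/5.6 nominal)
print(round(equivalent_exposure(8.0, 1 / 250, 200), 2))    # ~12.97
```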
A few years later I came to understand that a digital camera loves grey and cannot really handle a pure black or a pure white. I learned this through an experiment. The images below are of a black sheet, a white sheet and one with black and white interlaced.
[Images: the black sheet, the white sheet, and the black and white interlaced sheets]
The first shot was of a black sheet in bright daylight. The second shot was of a plain white sheet with the same settings as before, and the camera captured it as black. When I interlaced the black and white sheets, the camera knew something was happening there and it distinguished the colour differences to some extent. All three shots were taken at the same time with the same camera settings, with the paper filling the entire frame. The point of this post is not to get too technical and explain why this happens, but to highlight the fact that it happens with a digital camera sensor. This experiment was a huge eye opener for me, simply because the camera does not see exactly what you see.
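Without getting too deep into the why, the core idea is that a reflected-light meter tries to drag the average of the frame towards a middle grey tone (the often quoted 18% grey). The sketch below is only a toy model of that idea, not any real camera's metering algorithm, and the reflectance numbers are made up; but it shows how a frame that is all black and a frame that is all white can both end up as a similar grey, while a mixed frame keeps some of its contrast.

```python
# Toy model of reflected-light metering: the meter scales exposure so the
# frame's average lands near middle grey. (Illustration only, not how any
# particular camera actually meters.)

MIDDLE_GREY = 0.18  # the commonly quoted 18% reflectance target

def meter_and_expose(scene):
    """scene: list of reflectance values in 0..1; returns clipped output tones."""
    average = sum(scene) / len(scene)
    gain = MIDDLE_GREY / average           # exposure chosen from the frame average
    return [round(min(1.0, r * gain), 3) for r in scene]

black_sheet = [0.04] * 8                   # near-black paper
white_sheet = [0.90] * 8                   # near-white paper
interlaced  = [0.04, 0.90] * 4             # alternating black and white strips

print(meter_and_expose(black_sheet))  # every value ~0.18 -> rendered as grey
print(meter_and_expose(white_sheet))  # every value ~0.18 -> rendered as grey too
print(meter_and_expose(interlaced))   # dark strips stay dark relative to light ones
```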
Now that you understand that the camera thinks differently, how does that change your approach to photography? There are several things one can do, such as:
a) Adjusting exposure metering using a grey card
b) Adjusting white balance
c) Post processing using software
More about these in later posts.
Until later...
C