
Behind the Infrared

In this post I aim to explore the science behind infrared photography. This will be updated as I learn more - taking more photographs and experimenting with techniques will inevitably lead me to ask more questions, which I will aim to answer here.


The electromagnetic spectrum


Electromagnetic radiation travels in waves. The type of radiation is determined by how long those waves are, from the tiniest of gamma rays measured in picometers to giant radio waves measured in kilometers. When the waves are between approximately 380 nanometers (nm) and 700nm, we can see them. This is known as visible light, which is further split into different categories - the colours of the rainbow. Red light has longer wavelengths than violet light, but what happens when you get "redder than red"? After approximately 700nm, this light is no longer visible to the naked eye and is now called infrared.
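To make those ranges concrete, here is a minimal Python sketch that sorts a wavelength into the bands used in this post (including the near-/far-infrared split I describe further down). The cutoff values are the approximate figures quoted here, not hard physical boundaries:

```python
# Rough classification of a wavelength (in nanometres) into the bands
# described above. Boundaries are approximate - transitions are gradual.
def classify_wavelength(nm: float) -> str:
    if nm < 380:
        return "ultraviolet (or shorter)"
    elif nm < 700:
        return "visible light"
    elif nm < 900:
        return "near-infrared (used for infrared photography)"
    elif nm < 15000:
        return "far-infrared (used for thermal imaging)"
    else:
        return "microwaves/radio (or longer)"

for nm in [450, 650, 720, 850, 10000]:
    print(f"{nm}nm -> {classify_wavelength(nm)}")
```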



Why does this matter in photography?


The sensors in digital cameras are not just sensitive to visible light, but also to a small amount of ultraviolet light and a portion of infrared light. Our images, however, only display the visible light captured on the sensor, because there are filters in front of the sensor to block those UV and IR rays. With these filters in place, the camera can only record visible light, much like our eyes do, but when the filters are removed, the camera can record a wider range of light.



I tend to use four filters for my infrared photography. My camera has been converted to have a built-in 590nm filter, so it blocks all wavelengths shorter than 590nm. As you can see in the images above, this still allows orange and red light to pass through. My 720nm filter also lets through a small amount of visible light, but only in the red wavelengths. My 850nm filter blocks all visible light so that only infrared is recorded, and my 950nm filter does the same while also blocking the shorter wavelengths of infrared.
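As a rough sketch of how these long-pass filters behave, the snippet below models each one as a hard cutoff and reports which sample wavelengths would reach the sensor. This is an idealisation - real filters have gradual transition curves, which is why a 720nm filter still leaks a little deep red - and the sample wavelengths are values I've picked for illustration:

```python
# Idealised model of the four long-pass filters: each one simply blocks
# every wavelength below its cutoff.
SAMPLES_NM = {
    "blue": 450,
    "green": 550,
    "orange": 600,
    "red": 650,
    "near-infrared": 850,
    "deeper near-infrared": 1000,
}
SENSOR_LIMIT_NM = 1200  # my converted camera records nothing beyond ~1200nm

for cutoff in [590, 720, 850, 950]:
    passed = [name for name, nm in SAMPLES_NM.items()
              if cutoff <= nm <= SENSOR_LIMIT_NM]
    print(f"{cutoff}nm filter passes: {', '.join(passed)}")
```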


It is important to note that infrared is often split into two categories - near-infrared (approx. 700nm to 900nm) and far-infrared (approx. 900nm to 15,000nm). Near-infrared is used primarily for photography, while far-infrared is used for thermal imaging. This is because, just as visible wavelengths can be seen as light, far-infrared wavelengths can be felt as heat. A smouldering coal may not emit any visible light, but the heat you can feel is infrared radiation, and it can be recorded by a thermal imaging camera that is sensitive to longer wavelengths of infrared. My 950nm filter doesn't quite give me the capability to see heat, as the sensor in my camera can only record up to 1200nm, but I may be able to pick up very slight tonal differences within that small range.


What about infrared film?


While I will be focusing mostly on digital infrared photography, I felt it would be interesting to briefly touch on analogue photography as well. It is the film that determines whether you will capture infrared light or not. Most film is only sensitive to visible light, but infrared film can be sensitised to both visible and infrared light, or even to infrared alone. These films must be used in conjunction with an appropriate filter on the lens to help eliminate or reduce the unwanted wavelengths of light.


How is infrared light recorded on a digital camera?


Digital cameras record data about the colour and intensity of light that reflects off objects into the lens. The more intense the light, the brighter the representation in the image. Infrared light does the same thing and, although it is invisible to us, the camera's sensor records it and interprets that information just as with visible light. The more infrared light reflected onto the sensor, the brighter that part of the image will be.


This is why, in pure infrared photography, the images come out as black and white. It is simply a record of how much infrared light is reflected into the camera - surfaces that reflect more infrared will appear brighter and surfaces that absorb more infrared will appear darker.
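As a toy illustration of that principle, the sketch below maps a surface's infrared reflectance straight to a greyscale pixel value. The reflectance figures are invented for illustration, and a real camera pipeline adds exposure, tone curves and noise on top:

```python
# Toy model: map a surface's infrared reflectance (0.0 = absorbs all IR,
# 1.0 = reflects all IR) straight to an 8-bit greyscale pixel value.
surfaces = {
    "healthy green leaves": 0.90,
    "dry autumn leaves": 0.45,
    "tarmac path": 0.10,
    "clear sky": 0.05,
}

for surface, reflectance in surfaces.items():
    pixel = round(reflectance * 255)
    print(f"{surface}: greyscale {pixel}/255")
```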



Why isn't the sky blue?


If a camera is recording both infrared light and part of the visible range, this can create some interesting images. The 590nm filter I use, for example, still allows orange and red tones through. I had wondered why the sky in my images appears more red - surely if blue light is blocked by the filter, the sky would just be colourless, right? Not exactly. All wavelengths of light emitted by the sun travel through our atmosphere, from red, to green, to blue. The atmosphere scatters this light and, because blue light has a shorter wavelength than red light, it is scattered far more strongly - scattered blue light reaches our eyes from every direction in the sky, which is why the sky looks blue. Blocking those blue wavelengths with a camera filter doesn't remove the other colours: orange and red light is scattered by the sky too, just more weakly, and my 590nm filter allows it through. This is why, in those images, the sky appears orange or red.



Conversely, infrared light is not scattered by the earth's atmosphere nearly as much as visible light (due to the longer wavelengths) so when using filters that block all visible light, the sky usually looks very dark, sometimes even black. There is simply not much infrared light coming into the lens directly from the sky.
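The physics behind both effects is Rayleigh scattering, whose strength scales with the inverse fourth power of the wavelength. A quick back-of-the-envelope calculation, using representative wavelengths I've picked for illustration, shows just how lopsided this is:

```python
# Rayleigh scattering strength is proportional to 1 / wavelength^4:
# halving the wavelength means sixteen times more scattering.
def scattering_relative_to_blue(nm: float, blue_nm: float = 450) -> float:
    return (blue_nm / nm) ** 4

for name, nm in [("blue", 450), ("red", 650), ("near-infrared", 850)]:
    ratio = scattering_relative_to_blue(nm)
    print(f"{name} ({nm}nm) is scattered {ratio:.2f}x as much as blue")

# Red comes out around 0.23x and near-infrared around 0.08x - very little
# scattered infrared arrives from the sky, hence the near-black skies.
```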


Why are trees white? What happens in the autumn?


We are taught at school that leaves are green because they contain chlorophyll. To explore this idea further, we must remember that an object is a certain colour because it reflects that wavelength of light and absorbs the others. A leaf is green because it reflects green light and absorbs blue and red - or, more accurately, the chlorophyll within the leaf does, using the absorbed energy to fuel photosynthesis. A healthy leaf also reflects infrared wavelengths very strongly - this reflection comes mainly from the leaf's internal cell structure rather than the chlorophyll itself, and healthy, chlorophyll-rich leaves are exactly the ones with that structure intact. As we know that reflected infrared is shown as white on the camera, this is why healthy green leaves appear white in infrared photography.



During autumn (and also when a leaf is unhealthy) many leaves no longer create chlorophyll for photosynthesis. This is why they turn yellow, red and brown - there is less chlorophyll to reflect green light and, as the leaf's internal structure dries out and breaks down, less infrared is reflected too. With less infrared reflected into the camera, it is no surprise that autumn leaves look much less bright in infrared images. This is actually a very useful feature for assessing the health of plants, especially over a wide area. If one tree is less white than those surrounding it, it is likely to be unhealthy.
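This is essentially what remote-sensing scientists formalise as the Normalised Difference Vegetation Index (NDVI), which compares near-infrared and red reflectance to score plant health. A minimal sketch, with made-up reflectance values for illustration:

```python
def ndvi(nir: float, red: float) -> float:
    """Normalised Difference Vegetation Index: compares near-infrared and
    red reflectance. Healthy vegetation scores close to +1."""
    return (nir - red) / (nir + red)

# Illustrative reflectance values between 0 and 1, not measurements.
print(f"healthy leaf:  {ndvi(nir=0.90, red=0.08):+.2f}")
print(f"stressed leaf: {ndvi(nir=0.50, red=0.30):+.2f}")
print(f"bare ground:   {ndvi(nir=0.25, red=0.20):+.2f}")
```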



This effect can be particularly exciting when using infrared photography creatively. Using my 590nm camera (which allows oranges and reds through), I found that the autumnal leaves were particularly vibrant. This is not because they were reflecting infrared, but because the lack of chlorophyll allowed them to reflect more red light, creating those enchanting colours.



Both of these images were taken with the same filter. On the left are the white trees in Tehidy, taken in the early autumn when most trees were still green. On the right is the orange tree in Trewidden, taken in the late autumn when all of its leaves were a warm orange, many of them having already fallen to the ground. Interestingly, the other trees in the background of the right image were still green, but less vibrant than in the summer - the infrared has captured this, rendering them a duller grey, their leaves no longer healthy enough to reflect large amounts of infrared.


 

