
What colour is the sky on Mars?

[Images: mars_original and mars_red]

The camera never lies. Or does it? Recently I had to take a photo for a medical case and, before submitting it, I had to sign to say that the photo had not been modified. I did this – but it was ridiculous of course. Many people have the idea that the camera faithfully captures what the scene looks like and that, unless we intentionally manipulate the images (in Photoshop, for example), we have captured the truth. Nothing could be further from the truth – as the recent image of #TheDress showed.

The top photo above was taken and released by NASA in 1976 and shows a Martian landscape. The sky is blue. However, at the time, Carl Sagan said “Despite the impression on these images, the sky is not blue…The sky is in fact pink.”

You see, the original image had not been colour corrected. Colour correction is a process that takes place on most cameras these days without the user being aware of it, but in 1976 it was not automatic. The process can compensate for the spectral sensitivities of the camera sensors (which may differ from one camera to another) or for the colour of the light source. The second picture (above) shows the colour-corrected image. Some people are now arguing, however, that the amount of colour correction applied by NASA is wrong and that the sky should not be as red as it appears in the second photograph. For the full story, including some other nice images of Mars, see here.
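To give a flavour of what this kind of correction actually does, here is a minimal sketch in Python (using NumPy). The matrix values and the function name are made up for illustration – in a real pipeline the matrix would come from calibrating the camera's sensor responses against a colour chart under a known illuminant – but the operation itself is just a per-pixel 3×3 matrix multiply.

```python
import numpy as np

# Hypothetical 3x3 colour correction matrix mapping this camera's raw RGB
# towards a standard colour space. Real values come from calibrating the
# sensor against a colour chart under a known light source.
CORRECTION_MATRIX = np.array([
    [ 1.60, -0.45, -0.15],
    [-0.30,  1.50, -0.20],
    [-0.05, -0.40,  1.45],
])

def colour_correct(image: np.ndarray) -> np.ndarray:
    """Apply a linear colour correction to an H x W x 3 float image in [0, 1]."""
    corrected = image @ CORRECTION_MATRIX.T   # multiply each pixel's RGB by the matrix
    return np.clip(corrected, 0.0, 1.0)       # keep the result in displayable range
```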

Colour correction for the iPhone

Of course, one of the reasons (though far from the only one) that the iPhone has been so successful is the quality of its built-in camera. It was certainly one of the features that made me switch from Nokia about 3 years ago after more than 15 years of loyalty to the Finnish brand. So I was interested to read recently that the next iPhone may feature advanced colour correction methods and promises to be even better than its predecessors. You can read about the story here.

Colour correction is necessary because different cameras use different RGB primaries and because the activation of the RGB sensors when taking an image depends upon the quantity and quality of the ambient illumination. So, for example, if the light were very red, the R channel of the camera would be more strongly activated than if the light were whiter. However, our visual systems are able to compensate for this, so that most of the time we don’t notice objects changing colour when we move from one room to another or from inside to outside. Colour correction is inspired by human colour constancy and attempts to correct the images so that the objects in the scene retain their daylight appearance.

However, colour correction is difficult; that is, it is very difficult to get it right all of the time. One frustration I have is taking a photo of my band (I play drums in a covers band) under very colourful lighting. Often the images are very disappointing and lack the intensity of the original scene. That is because human colour constancy is only partial, and under extreme lighting things really do change colour markedly – such as under our intense LED stage lighting. In these cases I think the automatic colour correction is sometimes actually too much, and I have found that I have to modify the images I capture on my Mac to try to recreate what I think the original scene looked like. So auto colour correction – the state of the art – is certainly not perfect. Let’s hope this story about an advance made by Apple is true.
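For the illumination side of the problem, one classic heuristic – and I should stress this is only a sketch of one simple approach, not what the iPhone or any particular camera actually does – is the grey-world assumption: assume the average colour of the scene is neutral grey, and scale each channel until that is true.

```python
import numpy as np

def grey_world_white_balance(image: np.ndarray) -> np.ndarray:
    """Simple automatic white balance using the grey-world assumption.

    image: H x W x 3 float array with values in [0, 1].
    """
    channel_means = image.reshape(-1, 3).mean(axis=0)  # average R, G and B over the whole scene
    gains = channel_means.mean() / channel_means       # per-channel scale factors
    balanced = image * gains                           # von Kries-style diagonal correction
    return np.clip(balanced, 0.0, 1.0)                 # keep the result displayable
```

You can see immediately why this sort of correction struggles at a gig: under intense red or blue LED stage lighting the average colour of the scene genuinely isn’t grey, so the algorithm “corrects” away exactly the colour cast you wanted to keep.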