Q & A: The Lowdown on RAW vs JPEG

I cannot recall the number of times a photographer has asked me to explain the whole RAW vs JPEG thing.  Since I started this Q&A offering, it's come in a couple of times, and the latest comes from Joss.  Here's the email; I don't know Joss' gender, but the message is pretty darn clear. "Hi.  I heard about this question service and want to ask a question that I can't get a straight answer to.  My camera is a Nikon D7100.  I just got it.  I bought it to replace my D5000.  I always used to take pictures using the settings that came in the camera.  I bought the new camera online and went into a couple of stores to ask questions about it.  I want to learn more about taking pictures and the store people gave me different information.  One man said I should use JPEG.  A lady in a different store said to use RAW.  Neither one could tell me why.  I feel like people tell me things without knowing why.  Can you help me?"

In fairness, I edited the email a bit to remove some duplication, because Joss got some really crappy guidance, and more than once.

Let's start at the beginning.  When a sensor captures an image, it doesn't care about the image.  It doesn't even "know" it's an image.  What it records is an electrical representation of a luminance value at each photo site.  The Bayer colour filter array used in most sensors puts a red, green or blue filter over each photo site, so each site's luminance value belongs to one of R, G or B.  Complex algorithms then build the full colour values from the luminance values the photo sites collect.  That data stream is what we call the RAW data.  Each manufacturer writes the data in its own format.  In most cases, the formats are proprietary, but some manufacturers choose the open standard DNG format.  The format is not that important so long as software can decode it.  Each camera adds a unique set of information to the file, and that's why a RAW file from the D5000 doesn't look like a RAW file from the D7100.  So even within one manufacturer, you still need a RAW decoder for each camera model.  Apple and Adobe produce new decoders pretty quickly, but that's also why older software might not be able to decode RAW files from newer cameras.
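
If you want to see what that colour reconstruction looks like in principle, here's a toy sketch in Python.  It builds a fake RGGB mosaic and fills in the missing colours by simple averaging; the names and the method are mine for illustration, and real cameras use far more sophisticated demosaicing than this.

```python
# Toy illustration only: build a fake RGGB Bayer mosaic and fill in the
# missing colours by averaging neighbours.  Real cameras use far more
# sophisticated demosaicing than this.
import numpy as np

def naive_demosaic(mosaic):
    """mosaic: 2-D array of luminance values laid out in an RGGB pattern."""
    h, w = mosaic.shape
    rgb = np.zeros((h, w, 3))
    # Masks marking which photo sites sit under red, green and blue filters.
    r_mask = np.zeros((h, w), dtype=bool)
    r_mask[0::2, 0::2] = True
    b_mask = np.zeros((h, w), dtype=bool)
    b_mask[1::2, 1::2] = True
    g_mask = ~(r_mask | b_mask)
    for channel, mask in enumerate((r_mask, g_mask, b_mask)):
        known = np.where(mask, mosaic, 0.0)
        count = mask.astype(float)
        # Average whatever known values of this colour fall in a 3x3 window.
        total = sum(np.roll(np.roll(known, dy, axis=0), dx, axis=1)
                    for dy in (-1, 0, 1) for dx in (-1, 0, 1))
        hits = sum(np.roll(np.roll(count, dy, axis=0), dx, axis=1)
                   for dy in (-1, 0, 1) for dx in (-1, 0, 1))
        rgb[..., channel] = total / np.maximum(hits, 1)
    return rgb

# A tiny fake 4x4 sensor readout, just to show the shape of the problem.
print(naive_demosaic(np.arange(16, dtype=float).reshape(4, 4)))
```

The point isn't the numbers it prints; it's that every pixel's full colour has to be estimated from its neighbours, and each manufacturer guards its own way of doing that.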

JPEG is a standard format designed to turn those camera-specific data streams into a generic image format that is widely understood, without needing a specific decoder for each camera model.  In order to do this, a number of algorithms get used in the conversion.  They turn the data stream into an image.  Colours are adjusted, contrast is altered and sharpening is applied, among other image alterations.  To simplify file movement, compression algorithms are used to reduce the file size and make transmission easier.  The JPEG compression model is known as "lossy compression".  In a lossy compression, when duplicate or near-duplicate data values are found, some get replaced with pointers to other image points and adjacent duplicates get deleted entirely.  In the general JPEG model, the quality setting controls how much gets tossed and, with it, how much the file size shrinks.

For example, a quality setting of 100 still uses a compression ratio of about 2.6:1.  That's about 38% of the original file size.  This makes it easier to email the file around because it is smaller.  Unfortunately, the file is also missing a lot of data, roughly a 60% loss.  A quality setting of 50 means a loss of over 90% of the actual data.  The default quality settings are typically 75-80, which works out to a loss of around 70% of the data.  Simply put, there is no way any JPEG can ever offer the level of data that RAW can.
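
If you want to see the size side of this for yourself, here's a rough sketch in Python using the Pillow library.  It saves the same image at a few quality settings and compares the file sizes against the uncompressed pixel data.  "sample.tif" is just a placeholder for whatever source image you have handy, and the exact percentages will vary with the image content.

```python
# Rough sketch: save the same image at several JPEG quality settings and
# compare the file sizes against the uncompressed 8-bit RGB data.
# "sample.tif" is a placeholder for any source image you have on hand.
import os
from PIL import Image

src = Image.open("sample.tif").convert("RGB")
uncompressed = src.width * src.height * 3  # bytes of raw 8-bit RGB pixels

for quality in (100, 80, 50):
    out = f"sample_q{quality}.jpg"
    src.save(out, "JPEG", quality=quality)
    size = os.path.getsize(out)
    print(f"quality {quality:3d}: {size:>10,} bytes "
          f"({size / uncompressed:.0%} of the uncompressed pixel data)")
```

Run it on one of your own shots and watch how quickly the file shrinks as the quality slider comes down.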

There's a popular misconception that the only difference between RAW and JPEG is that you cannot adjust the white balance after the fact in JPEG.  This is complete crap.  Software may not offer the same settings for RAW and JPEG, but both are adjustable.  Since there is so much less data in JPEG, the results after shifting the colour balance tend to look like crap pretty quickly, but that doesn't mean it's not doable.
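
Here's a toy illustration of why those shifts fall apart faster on JPEG.  A white balance tweak is essentially scaling the red and blue channels, and doing that to 8-bit JPEG-style data throws away distinct tonal levels much faster than doing it to 14-bit RAW-style data.  The numbers below are purely synthetic, not pulled from any real camera.

```python
# Toy numbers only: a white balance tweak is basically scaling the red and
# blue channels.  Scaling an 8-bit JPEG-style channel throws away distinct
# tonal levels much faster than scaling a 14-bit RAW-style channel.
import numpy as np

scale = 0.7  # pretend warm-to-cool correction applied to the blue channel

jpeg_blue = np.arange(256)       # 8-bit channel: 256 possible levels
raw_blue = np.arange(2 ** 14)    # 14-bit channel: 16,384 possible levels

jpeg_shifted = np.clip(jpeg_blue * scale, 0, 255).astype(np.uint8)
raw_shifted = np.clip(raw_blue * scale, 0, 2 ** 14 - 1).astype(np.uint16)

print("8-bit levels left after the shift:", len(np.unique(jpeg_shifted)))
print("14-bit levels left after the shift:", len(np.unique(raw_shifted)))
# The 8-bit channel drops to roughly 180 levels before you even display it;
# the 14-bit channel still has thousands to map back down to 8 bits.
```

That's why a big colour correction on a JPEG starts to band and posterize while the same correction on a RAW file still looks smooth.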

There's another issue with JPEG that is not well known.  Every time you open, change and save a JPEG, the compression algorithm gets applied again.  That means that every open / save event results in further degradation.  By the way, saving the first time at 70% and the second time at 90% doesn't restore the loss from the first save.  The losses compound.  It only gets worse.
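
If you're skeptical, here's a quick-and-dirty demo in Python using Pillow and numpy.  It re-saves the same JPEG twenty times and tracks how far the pixels drift from the original.  "sample.tif" is again just a placeholder filename, and the exact numbers depend on the image and the quality setting.

```python
# Quick-and-dirty demo of generation loss: re-save the same JPEG twenty times
# and watch the pixels drift away from the original.  Assumes Pillow and
# numpy; "sample.tif" is again just a placeholder filename.
import numpy as np
from PIL import Image

SRC = "sample.tif"
WORK = "generation.jpg"

original = np.asarray(Image.open(SRC).convert("RGB"), dtype=float)
Image.open(SRC).convert("RGB").save(WORK, "JPEG", quality=75)

for generation in range(1, 21):
    # Open the current JPEG and save it right back out: one open/save cycle.
    with Image.open(WORK) as img:
        pixels = img.convert("RGB")
    pixels.save(WORK, "JPEG", quality=75)
    resaved = np.asarray(Image.open(WORK).convert("RGB"), dtype=float)
    drift = np.abs(resaved - original).mean()
    if generation % 5 == 0:
        print(f"after {generation:2d} re-saves, mean pixel drift: {drift:.2f}")
```

The drift climbs fastest in the first few generations, which is exactly the scenario of opening, tweaking and re-saving the same JPEG over and over.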

RAW is lossless.  If you edit a processed RAW file and you don't want to lose quality, you need to save it in another lossless format such as TIFF.  JPEG is great for sharing on the web, but it should be the LAST step and only saved in that format ONCE.
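
For anyone who wants to script that hand-off, here's a minimal sketch of the RAW-to-lossless step in Python.  It leans on the rawpy (LibRaw) and imageio packages, which are not what Adobe or Apple use internally, and "DSC_0001.NEF" is just a placeholder filename.

```python
# Minimal sketch of the RAW-to-lossless hand-off: decode a RAW file and save
# the result as a 16-bit TIFF instead of a JPEG.  Uses the rawpy (LibRaw) and
# imageio packages; "DSC_0001.NEF" is just a placeholder filename.
import rawpy
import imageio.v3 as iio

with rawpy.imread("DSC_0001.NEF") as raw:
    # postprocess() runs the demosaic and colour conversion; output_bps=16
    # keeps 16 bits per channel instead of squeezing everything into 8.
    rgb = raw.postprocess(output_bps=16)

iio.imwrite("DSC_0001.tif", rgb)
print("saved", rgb.shape, rgb.dtype)  # e.g. (4000, 6000, 3) uint16
```

The TIFF will be big, but nothing gets thrown away until you deliberately export a JPEG at the very end.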

RAW does require processing to convert the data stream to an image.  That doesn't mean you need to do significant editing.  Adobe Camera Raw and Adobe Lightroom both offer presets that apply the camera's JPEG "looks" without the JPEG loss.

None of us will get every element of the image right in the camera every time.  RAW gives you all the data, all the time.  There's really no reason to shoot anything else.

I hope that this helps Joss and everyone else who is confused.  I have heard some folks say that they can explain the differences without getting technical.  Since the entire process is completely technical, that sounds spurious to me.  The net is: shoot RAW.  You're always ahead of the game.