Does processing an image twice reduce the quality twice?

For example, the default image quality seems to be 75%. If I process an image X using a crop (all other options being the default), and then generate a small thumbnail from X, will the thumbnail have a quality of 56.25% (75% * 75%)?

If so, then is the only way for the thumbnail to have a final quality of 75% to generate the thumbnail from an alternate version of cropping X with the quality set to 100%?
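In Hugo-template terms, the two scenarios in the question might look like this (a sketch; the image path, crop dimensions, and widths are made up):

{{/* Sketch of the two workflows; image path, dimensions, and widths are illustrative. */}}
{{ $original := resources.Get "images/photo.jpg" }}

{{/* Double lossy encode: crop at the default q75, then thumbnail at q75 again. */}}
{{ $cropped := $original.Crop "1200x800" }}
{{ $thumb := $cropped.Resize "300x" }}

{{/* Workaround: keep the intermediate at q100 so almost no quality is lost before the final resize. */}}
{{ $croppedHQ := $original.Crop "1200x800 q100" }}
{{ $thumbHQ := $croppedHQ.Resize "300x q75" }}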

It will depend on the file format you're using, as each has its own compression algorithm.

If you're using a file format with lossy compression, like JPG, then you're more or less correct: each time the image is compressed, more data (quality) is lost.

Though, if you’re generating thumbnail images, the artefacts from compression are not likely to be noticeable.


@funkydan2 It’s webp, so it’s lossy. I simplified the situation in the explanation; I’m actually generating a smaller version for every Bootstrap breakpoint width, so the quality would be noticeable. Thanks!

@funkydan2 If I don’t refer to the .RelPermalink on the intermediate image, do you think Hugo still puts it into the public directory?

Some discussion on intermediate images here:

Your final quality maths won’t be as simple for two “quality 75” operations.

For best quality for multiple operations you could use a tiff intermediate or a higher quality jpg.
To resize to multiple sizes, you might as well start from the original source for each case.
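For example, something along these lines (a sketch; the breakpoint widths and option string are placeholders):

{{/* Each size is generated straight from the original, so quality is only reduced once per output. */}}
{{ $original := resources.Get "images/photo.jpg" }}
{{ range $width := slice 576 768 992 1200 }}
  {{ $variant := $original.Resize (printf "%dx webp q75" $width) }}
  <source srcset="{{ $variant.RelPermalink }}" media="(max-width: {{ $width }}px)">
{{ end }}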


If you repeatedly compress a WebP image using the same q value, the image size will decrease each time.

If you repeatedly compress a JPEG image using the same q value, the image size will not decrease each time.

This is true when using Hugo, ImageMagick, or cwebp.
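A rough way to see this in a Hugo template (a sketch, not the exact code from the test repository linked below; the image and loop count are arbitrary):

{{/* Re-encode the same image five times at the same quality and log the resulting size.
     With webp the size keeps dropping; drop the "webp" token to test JPEG instead. */}}
{{ $img := resources.Get "images/sample.jpg" }}
{{ range seq 5 }}
  {{ $img = $img.Resize (printf "%dx webp q75" $img.Width) }}
  {{ warnf "#%d: %d bytes" . (len $img.Content) }}
{{ end }}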


Some discussion on intermediate images here:

@andrewd72 Thanks. Looks like intermediate resources aren’t meant to be published, but they currently are due to a bug. I wanted to avoid bloating public files with useless resources, but I’ll just assume this will be fixed in the future.

Your final quality maths won’t be as simple for two “quality 75” operations.

If you mean that the final quality won’t exactly be 75% * 75%, but roughly that, then I understand. I meant to illustrate with that math that the quality was reduced twice.

For best quality for multiple operations you could use a tiff intermediate or a higher quality jpg.
To resize to multiple sizes, you might as well start from the original source for each case.

I'm writing an image helper that processes an image using whatever options you want, then optionally generates breakpoint-sized versions of it. When the only processing is a resize, I generate the breakpoint versions from the original image. The problem is that the user can choose processing methods other than resize, such as crop, in which case the breakpoint versions have to be generated from the processed image. In that case, it seems I have to process the original image again with the same options but with the quality changed to 100%, then generate the breakpoint images with the original quality applied.
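In Hugo-template terms, the logic I'm describing looks roughly like this (a sketch; the crop spec, breakpoint widths, and quality values are made up, and the real helper would reuse the user's own options):

{{/* Hypothetical sketch of the helper logic; parameters and values are illustrative. */}}
{{ $original := resources.Get "images/photo.jpg" }}
{{ $method := "crop" }}  {{/* the user's chosen processing method */}}

{{ $breakpointSource := $original }}
{{ if ne $method "resize" }}
  {{/* A crop changes the framing, so breakpoint versions must come from the processed
       image. Run that step at q100 (hard-coded here; the real helper would reuse the
       user's options with the quality swapped) so the lossy encode happens only once,
       at the final resize. */}}
  {{ $breakpointSource = $original.Crop "1200x800 q100" }}
{{ end }}

{{ range $width := slice 576 768 992 1200 }}
  {{ $variant := $breakpointSource.Resize (printf "%dx webp q75" $width) }}
  <img src="{{ $variant.RelPermalink }}" width="{{ $variant.Width }}" height="{{ $variant.Height }}" alt="">
{{ end }}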

If you repeatedly compress a WebP image using the same q value, the image size will decrease each time.

@jmooring Thanks, good to have confirmation!

If you repeatedly compress a JPEG image using the same q value, the image size will not decrease each time.

By size, I assume you mean file size. But the quality will still go down, correct? (E.g. more pixelated, loss of detail.)

No. Content checksum remains the same.

git clone --single-branch -b hugo-forum-topic-43754 https://github.com/jmooring/hugo-testing hugo-forum-topic-43754
cd hugo-forum-topic-43754
hugo

Then examine the console log.

I see, thanks for the demonstration!

For reference for others:

Start building sites … 
hugo v0.111.3+extended darwin/amd64 BuildDate=unknown
WARN 2023/04/05 13:10:30 Compress a JPEG image 5 times...
WARN 2023/04/05 13:10:30 # 1: 15040 bytes (sha1: 5511625a16bb8b0aa4d76228ee0c2903050999f8)
WARN 2023/04/05 13:10:30 # 2: 15034 bytes (sha1: 77748214172872cb236798df969d945ef6941819)
WARN 2023/04/05 13:10:30 # 3: 15034 bytes (sha1: 77748214172872cb236798df969d945ef6941819)
WARN 2023/04/05 13:10:30 # 4: 15034 bytes (sha1: 77748214172872cb236798df969d945ef6941819)
WARN 2023/04/05 13:10:30 # 5: 15034 bytes (sha1: 77748214172872cb236798df969d945ef6941819)
WARN 2023/04/05 13:10:30 Compress a WebP image 5 times...
WARN 2023/04/05 13:10:30 # 1: 7020 bytes (sha1: a8f374575595fa5a93b21dc51121037bcd51505a)
WARN 2023/04/05 13:10:30 # 2: 6022 bytes (sha1: c506190b9f621382daf7f2d4f10ed6810f8e1667)
WARN 2023/04/05 13:10:30 # 3: 5496 bytes (sha1: 0d2cbb0e61b0aa3697c8d7294471a376348b7787)
WARN 2023/04/05 13:10:30 # 4: 5036 bytes (sha1: 5099ce6689c1827ef80c74ff9059c2698ddfe325)
WARN 2023/04/05 13:10:30 # 5: 4738 bytes (sha1: 7069288a70e1c1f7d4409e9c53131a50c91b579d)

Great to know. I'm curious whether that's a property of how Hugo processes JPG, or a property of JPG in general.

The image helper is meant to work with any file format, and convert to any other file format, so unfortunately I have to assume the worst (webp, in this case).

Or any other tool/lib that I have used.

