fix: Handle grayscale data properly when encoding AVIFs #278
AVIF Grayscale Handling
This change fixes an issue I noticed when testing my other fix for #277. The AVIF encoder currently mishandles single-channel (monochrome) input data, accidentally tiling it 3x horizontally in the resulting image.
For instance, this 300x300 source image:

[source image omitted here]

would transcode to this output image when run through the transcoder example program (invoked with `example -input grayscale.png -output grayscale.avif -width 300 -height 300 -stretch`):

[output image omitted here; converted back to a PNG for previewing compatibility]
To fix this, this change skips the RGB-to-YUV conversion for single-channel images during AVIF encoding, since they aren't RGB to begin with. Instead, for grayscale source data, it sets the AVIF output format to YUV400 (i.e. luminance only, with no chroma subsampling) and points the Y plane directly at the grayscale data. This requires no conversion, so it is exact, needs no extra memory allocations, and runs quite a bit faster than before in addition to being more correct. Hooray for a win-win!
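
For reference, here is a minimal sketch of that approach using the libavif C API, assuming the project encodes through libavif; the helper name `encodeGrayscaleAvif` and the `gray` buffer are hypothetical and only illustrate wrapping an existing 8-bit single-channel buffer as a YUV400 image without copying it:

```c
#include <stdint.h>
#include "avif/avif.h"

// Hypothetical helper: encode an 8-bit grayscale buffer as a YUV400 AVIF.
// 'gray' holds width * height bytes, one luma sample per pixel.
static avifResult encodeGrayscaleAvif(const uint8_t * gray, uint32_t width,
                                      uint32_t height, avifRWData * output) {
    // YUV400 = luma only, no chroma planes, so no RGB->YUV conversion is needed.
    avifImage * image = avifImageCreate(width, height, 8, AVIF_PIXEL_FORMAT_YUV400);
    if (!image) {
        return AVIF_RESULT_OUT_OF_MEMORY;
    }

    // Point the Y plane directly at the caller's grayscale data (no copy).
    // imageOwnsYUVPlanes stays false so avifImageDestroy() won't free it.
    image->yuvPlanes[AVIF_CHAN_Y] = (uint8_t *)gray;
    image->yuvRowBytes[AVIF_CHAN_Y] = width;
    image->imageOwnsYUVPlanes = AVIF_FALSE;

    avifEncoder * encoder = avifEncoderCreate();
    if (!encoder) {
        avifImageDestroy(image);
        return AVIF_RESULT_OUT_OF_MEMORY;
    }
    encoder->maxThreads = 4;

    avifResult result = avifEncoderWrite(encoder, image, output);

    avifEncoderDestroy(encoder);
    avifImageDestroy(image);
    return result;
}
```

A caller would initialize `avifRWData output = AVIF_DATA_EMPTY;`, invoke the helper, write `output.data`/`output.size` to disk, and release the buffer with `avifRWDataFree(&output)`.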