Git Inbox Mirror of the ffmpeg-devel mailing list - see https://ffmpeg.org/mailman/listinfo/ffmpeg-devel
From: "Clément Bœsch" <u@pkh.me>
To: FFmpeg development discussions and patches <ffmpeg-devel@ffmpeg.org>
Subject: Re: [FFmpeg-devel] [PATCH] Revert "avfilter/vf_palette(gen|use): support palettes with alpha"
Date: Tue, 1 Nov 2022 11:18:14 +0100
Message-ID: <Y2DyZmHrlq96usYA@ssq0.pkh.me> (raw)
In-Reply-To: <20221031215857.GF1814017@pb2>

On Mon, Oct 31, 2022 at 10:58:57PM +0100, Michael Niedermayer wrote:
[...]
> > You have to make a call between whether you want to preserve the
> > transparency or the color while constructing the palette, but when
> > choosing a color you must absolutely not choose a color with a different
> > transparency, you must pick amongst the closest alpha, with a particular
> > attention to extreme alphas: an opaque color must stay opaque, and a fully
> > transparent one as well:
> > - rounding a color with 43% alpha into 50% alpha is acceptable
> > - rounding a color with 100% alpha into a 99% alpha is not acceptable in
> >   any way because you're starting to make transparent areas that weren't
> > - rounding a color with 0% alpha into a 1% alpha is not acceptable because
> >   some areas of the image would start to blend into an area that was
> >   supposedly non-existent
> 
> really?
> so if I have all shades of green available for all transparencies from 1% to 99%,
> I "must" make my plants all use 0% transparency even if I only have a single color and
> that is bright pink

I believe so, because you don't know how the alpha channel is going to be
used in the user's pipeline. The goal of the palette filters is to quantize
colors, not to mess up the alpha channel. It's better for these filters to
be bad at quantizing colors while preserving the alpha as faithfully as
possible than to give the illusion that the colors are great while
massively corrupting the alpha channel (which is what currently happens).

BTW, the colors are not even premultiplied, so in the current state it
just makes no sense at all: we are comparing colors with different alpha
values even though we have no idea how they will look once blended.

> There are perceptual differences between the different areas of the RGBA hypercube
> though. Hardly anyone would notice the difference between a 255 and 254 blue but
> having some slight transparency might be noticeable.

It's noticed late: only when your asset reaches the blending stage, which
is the worst user experience you can provide.

Just imagine: the user quantizes their files, thinking transparency is
going to be preserved. Blended against a black background it appears to be
somewhat OK (see softworkz's screenshot), so the user starts using it on
their website. Everything looks fine. Then a few months later, the user
decides to change the black background to a brighter color: all the images
suddenly reveal their destroyed alpha channel, with artifacts everywhere.

> These different weights in different areas could maybe be considered in palette*
> and elbg, it likely would improve things. OTOH heuristics like always and never
> feels like that might become a lot of work to tune. I think it's better to attempt
> to achieve a similar goal with less hard and more perceptual scoring

Perceptual scoring can only work on colors; you cannot jam the alpha into
it, it's another dimension entirely. So you first have to work with
premultiplied data, and then score the alpha separately.


-- 
Clément B.
