From: Michael Niedermayer <michael@niedermayer.cc>
To: FFmpeg development discussions and patches <ffmpeg-devel@ffmpeg.org>
Date: Mon, 31 Oct 2022 22:58:57 +0100
Message-ID: <20221031215857.GF1814017@pb2>
References: <20221030175813.1497020-1-u@pkh.me>
Subject: Re: [FFmpeg-devel] [PATCH] Revert "avfilter/vf_palette(gen|use): support palettes with alpha"

On Mon, Oct 31, 2022 at 11:57:16AM +0100, Clément Bœsch wrote:
> On Mon, Oct 31, 2022 at 01:43:11AM +0000, Soft Works wrote:
> [...]
> > > > > The patch I had submitted doesn't change the previous behavior
> > > > > without the use_alpha parameter.
> > >
> > > Yes I noticed, but unfortunately I'm reworking the color distance to
> > > work in perceptual color space, and the way that alpha is mixed up in
> > > the equation just doesn't make any sense at all and prevents me from
> > > doing these changes.
> >
> > If you want to implement a new color distance algorithm, it should
> > be either a new filter or a new (switchable) mode for the existing
> > filter.
>
> Why?
>
> > Photoshop has these different modes as well and it would
> > surely be useful, but I don't think it should replace the
> > existing behavior.
>
> There is no point in keeping a ton of complexity exposed as user options
> for something implementation specific. We offer no guarantee over how the
> quantization is expected to run.
>
> > When it turns out that the use_alpha implementation doesn't fit
> > with your new color distance calculation and you add it as
> > an additional mode, then it would be fine IMO for the filter to
> > error out when that mode is used in combination with use_alpha.
>
> IMO the use_alpha option shouldn't exist in the first place: honoring the
> alpha is the correct thing to do, so it should be the default behaviour.
> That's not what the option is currently doing, though.
>
> > > > Do you think it might make sense to put more weight on the
> > > > alpha value by tripling it? So it would be weighted equally to the
> > > > RGB values?
> > >
> > > You cannot mix alpha with colors at all; they are separate domains
> > > and you need to treat them as such.
> >
> > What's interesting is that I followed the same (simplified)
> > approach when adding a use_alpha option to vf_elbg, and it provides
> > excellent results without treating alpha separately.
>
> I don't know how that filter works or what it's supposed to do, but if
> it's indeed using the same approach as the palette ones, it cannot work.
>
> > > From the paletteuse perspective, what you need to do is first choose
> > > the colors in the palette that match the alpha exactly (or at least
> > > the closest ones, if and only if there is no exact match). Then
> > > within that set, and only within that one, you pick the closest
> > > color.
> > >
> > > From the palettegen perspective, you need to split the colors into
> > > different transparency domains (a first dimensional quantization),
> > > then quantize the colors within each quantized alpha level. And when
> > > you have all your quantized palettes for each level of alpha, you
> > > find an algorithm to reduce the number of transparency levels or the
> > > number of colors per level, to make everything fit inside a single
> > > palette. But you can't just do the alpha and the colors at the same
> > > time; it cannot work, whatever weights you choose.
> >
> > I would be curious to see how well that would work, especially
> > in cases where the target palette has just a small number of colors.
>
> You have to make a call between whether you want to preserve the
> transparency or the color while constructing the palette, but when
> choosing a color you must absolutely not choose one with a different
> transparency: you must pick amongst the closest alphas, with particular
> attention to the extreme alphas (an opaque color must stay opaque, and a
> fully transparent one as well):
> - rounding a color with 43% alpha to 50% alpha is acceptable
> - rounding a color with 100% alpha to 99% alpha is not acceptable in any
>   way, because you start making areas transparent that weren't
> - rounding a color with 0% alpha to 1% alpha is not acceptable, because
>   some areas of the image start to blend into an area that was
>   supposedly non-existent

Really? So if I have all shades of green available at all transparencies
from 1% to 99%, I "must" make my plants all use 0% transparency even if I
only have a single color at that level and it is bright pink?
I don't think that is the best choice.

There are perceptual differences between the different areas of the RGBA
hypercube, though. Hardly anyone would notice the difference between a 255
and a 254 blue, but having some slight transparency might be noticeable.
These different weights in different areas could maybe be considered in
palette* and elbg; it likely would improve things. OTOH, heuristics like
"always" and "never" feel like they might become a lot of work to tune. I
think it's better to attempt to achieve a similar goal with less hard and
more perceptual scoring.

thx

[...]
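[Editor's note: the two-step paletteuse lookup Clément describes above (restrict to the palette entries with the closest alpha first, then minimize the color distance only within that subset) can be sketched as follows. This is a hypothetical standalone illustration, not FFmpeg code; `closest_palette_entry` and the plain squared-RGB distance are assumptions for the sake of the example.]

```python
def closest_palette_entry(palette, r, g, b, a):
    """Two-step lookup: alpha first, then color.

    palette is a list of (r, g, b, a) tuples. Step 1 keeps only the
    entries whose alpha is nearest to the target (exact match when one
    exists). Step 2 picks, within that subset only, the entry with the
    smallest color distance (plain squared RGB here; a perceptual
    metric could be substituted without changing the structure).
    """
    best_alpha_diff = min(abs(pa - a) for (_, _, _, pa) in palette)
    candidates = [e for e in palette if abs(e[3] - a) == best_alpha_diff]
    return min(candidates,
               key=lambda e: (e[0] - r) ** 2 + (e[1] - g) ** 2 + (e[2] - b) ** 2)

palette = [(255, 0, 0, 255), (0, 255, 0, 255), (0, 0, 255, 0)]
# A fully transparent pixel maps to the transparent entry, never to an
# opaque one, however far its color is: that is the point of step 1.
print(closest_palette_entry(palette, 255, 0, 0, 0))
```

Note that this structure guarantees the extreme-alpha rule from the bullet list: an opaque input can only ever be matched against opaque palette entries when any exist.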
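[Editor's note: the palettegen side described above (quantize alpha into a few levels first, then quantize the colors independently inside each alpha bucket) might look like this rough sketch. All names and parameters here are made up for illustration, and a toy popularity count stands in for a real color quantizer such as median cut.]

```python
from collections import Counter

def build_alpha_aware_palette(pixels, n_alpha_levels=4, colors_per_level=64):
    """First dimension: quantize alpha to n_alpha_levels representative
    values (0 and 255 are always representable, so extreme alphas are
    preserved exactly). Second dimension: quantize the colors of each
    alpha bucket independently, here by simply keeping the most
    frequent colors in the bucket."""
    buckets = {}
    for r, g, b, a in pixels:
        # Map a in [0, 255] to the nearest of n_alpha_levels levels.
        level = round(a * (n_alpha_levels - 1) / 255) * 255 // (n_alpha_levels - 1)
        buckets.setdefault(level, []).append((r, g, b))
    palette = []
    for level, colors in buckets.items():
        for (r, g, b), _count in Counter(colors).most_common(colors_per_level):
            palette.append((r, g, b, level))
    return palette
```

A real implementation would still need the reduction step the mail mentions (shrinking the number of alpha levels or colors per level until the whole thing fits in one 256-entry palette); that trade-off is exactly where the "preserve transparency vs. preserve color" call gets made.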
--
Michael     GnuPG fingerprint: 9FF2128B147EF6730BADF133611EC787040B0FAB

Its not that you shouldnt use gotos but rather that you should write
readable code and code with gotos often but not always is less readable