render/color: replace COLOR_TRANSFORM_LUT_3D with COLOR_TRANSFORM_LCMS2

Converting the LCMS2 transform to a 3D LUT early causes issues:

- It's a lossy process, and the consumer cannot pick a 3D LUT
  size on their own.
- It requires unnecessary conversions and allocations: an intermediate
  3D LUT is allocated even though the renderer already allocates its own.
- It makes it harder to support arbitrary color transforms in the
  renderer, because each type needs to be handled differently.

Instead, expose a function to evaluate a color transform, and use
that to build the 3D LUT in the renderer.
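
A minimal sketch of the renderer side of this approach, assuming a
hypothetical wlr_color_transform_eval() hook that maps one normalized
RGB triplet through the transform (the name and signature are
illustrative, not necessarily the API this commit adds):

#include <stddef.h>

struct wlr_color_transform;

/* Assumed evaluation hook: maps a normalized RGB triplet through the
 * transform. Name and signature are illustrative assumptions. */
void wlr_color_transform_eval(struct wlr_color_transform *tr,
	float out[3], const float in[3]);

/* The renderer picks its own LUT dimension (dim >= 2) and fills the
 * LUT it already allocates by sampling the transform on a regular
 * grid, so no intermediate 3D LUT or fixed size is imposed on it. */
static void fill_lut_3d(float *lut, size_t dim,
		struct wlr_color_transform *tr) {
	for (size_t b = 0; b < dim; b++) {
		for (size_t g = 0; g < dim; g++) {
			for (size_t r = 0; r < dim; r++) {
				const float in[3] = {
					(float)r / (dim - 1),
					(float)g / (dim - 1),
					(float)b / (dim - 1),
				};
				size_t i = 3 * (r + dim * (g + dim * b));
				wlr_color_transform_eval(tr, &lut[i], in);
			}
		}
	}
}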
Author: Simon Ser, 2025-05-24 13:04:45 +02:00 (committed by Kenny Levinsen)
Parent: 9b97e2607d
Commit: 3665b53e29
6 changed files with 109 additions and 111 deletions


@@ -42,10 +42,8 @@ static void color_transform_destroy(struct wlr_color_transform *tr) {
 	switch (tr->type) {
 	case COLOR_TRANSFORM_SRGB:
 		break;
-	case COLOR_TRANSFORM_LUT_3D:;
-		struct wlr_color_transform_lut3d *lut3d =
-			wlr_color_transform_lut3d_from_base(tr);
-		free(lut3d->lut_3d);
+	case COLOR_TRANSFORM_LCMS2:
+		color_transform_lcms2_finish(color_transform_lcms2_from_base(tr));
 		break;
 	}
 	wlr_addon_set_finish(&tr->addons);
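
The color_transform_lcms2_from_base() call above presumably mirrors the
lut3d downcast removed below; a sketch under that assumption, using the
same convention of embedding the generic transform as a 'base' member
(internal types as in the surrounding file):

#include <assert.h>

/* Assumed LCMS2 counterpart of the removed downcast helper. */
static struct wlr_color_transform_lcms2 *color_transform_lcms2_from_base(
		struct wlr_color_transform *tr) {
	assert(tr->type == COLOR_TRANSFORM_LCMS2);
	struct wlr_color_transform_lcms2 *lcms2 = wl_container_of(tr, lcms2, base);
	return lcms2;
}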
@@ -68,13 +66,6 @@ void wlr_color_transform_unref(struct wlr_color_transform *tr) {
 	}
 }
 
-struct wlr_color_transform_lut3d *wlr_color_transform_lut3d_from_base(
-		struct wlr_color_transform *tr) {
-	assert(tr->type == COLOR_TRANSFORM_LUT_3D);
-	struct wlr_color_transform_lut3d *lut3d = wl_container_of(tr, lut3d, base);
-	return lut3d;
-}
-
 void wlr_color_primaries_from_named(struct wlr_color_primaries *out,
 		enum wlr_color_named_primaries named) {
 	switch (named) {
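
For the COLOR_TRANSFORM_LCMS2 case, evaluating the transform would
presumably boil down to Little CMS's cmsDoTransform(); a sketch, where
the wrapper name and the 'lcms2' transform-handle field are assumptions:

#include <lcms2.h>

/* Sketch: one pixel through the LCMS2 transform. cmsDoTransform() is
 * the real Little CMS API; the wrapper name and struct field are
 * assumed, and the handle is assumed to have been created with float
 * in/out formats (e.g. TYPE_RGB_FLT). */
static void color_transform_lcms2_eval(struct wlr_color_transform_lcms2 *tr,
		float out[3], const float in[3]) {
	cmsDoTransform(tr->lcms2, in, out, 1);
}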