Git Inbox Mirror of the ffmpeg-devel mailing list - see https://ffmpeg.org/mailman/listinfo/ffmpeg-devel
 help / color / mirror / Atom feed
* [FFmpeg-devel] [PATCH v2] libavfilter: add PipeWire-based grab
@ 2024-05-10 21:12 François-Simon Fauteux-Chapleau
  2024-05-11 14:08 ` Andrew Sayers
  0 siblings, 1 reply; 13+ messages in thread
From: François-Simon Fauteux-Chapleau @ 2024-05-10 21:12 UTC (permalink / raw)
  To: ffmpeg-devel

This is a revised version of the "pipewiregrab" patch submitted by
Abhishek Ojha a few months ago:
https://patchwork.ffmpeg.org/project/ffmpeg/patch/20231227162504.690730-1-abhishek.ojha@savoirfairelinux.com/
https://patchwork.ffmpeg.org/project/ffmpeg/patch/20231227162504.690730-2-abhishek.ojha@savoirfairelinux.com/

The main change is that the patch is now implemented as a libavfilter
source filter instead of a libavdevice input device, as was requested in
a comment on the previous version. This version also adds support for
DMA buffer sharing and uses sd-bus instead of GDBus.

There are also several small changes meant to fix bugs or simplify the
code, but the overall structure remains the same as before: we use the
ScreenCast interface provided by XDG Desktop Portal to obtain a file
descriptor, which is then used to create a PipeWire stream. The data from
that stream can then be used to generate frames for FFmpeg.

Example usage:
ffmpeg -f lavfi -i pipewiregrab \
       -vf 'hwmap=derive_device=vaapi,scale_vaapi=format=nv12' \
       -c:v h264_vaapi -t 10 output.mp4

Signed-off-by: François-Simon Fauteux-Chapleau <francois-simon.fauteux-chapleau@savoirfairelinux.com>
---
 configure                       |   16 +
 libavfilter/Makefile            |    1 +
 libavfilter/allfilters.c        |    1 +
 libavfilter/vsrc_pipewiregrab.c | 1433 +++++++++++++++++++++++++++++++
 4 files changed, 1451 insertions(+)
 create mode 100644 libavfilter/vsrc_pipewiregrab.c

diff --git a/configure b/configure
index beb1fa6d3c..028020455e 100755
--- a/configure
+++ b/configure
@@ -304,6 +304,7 @@ External library support:
   --enable-libxcb-shm      enable X11 grabbing shm communication [autodetect]
   --enable-libxcb-xfixes   enable X11 grabbing mouse rendering [autodetect]
   --enable-libxcb-shape    enable X11 grabbing shape rendering [autodetect]
+  --enable-libpipewire     enable screen grabbing using PipeWire [autodetect]
   --enable-libxvid         enable Xvid encoding via xvidcore,
                            native MPEG-4/Xvid encoder exists [no]
   --enable-libxml2         enable XML parsing using the C library libxml2, needed
@@ -1845,6 +1846,8 @@ EXTERNAL_AUTODETECT_LIBRARY_LIST="
     libxcb_shm
     libxcb_shape
     libxcb_xfixes
+    libpipewire
+    libsystemd
     lzma
     mediafoundation
     metal
@@ -3895,6 +3898,7 @@ pad_opencl_filter_deps="opencl"
 pan_filter_deps="swresample"
 perspective_filter_deps="gpl"
 phase_filter_deps="gpl"
+pipewiregrab_filter_deps="libpipewire libsystemd pthreads"
 pp7_filter_deps="gpl"
 pp_filter_deps="gpl postproc"
 prewitt_opencl_filter_deps="opencl"
@@ -7230,6 +7234,18 @@ if enabled libxcb; then
     enabled libxcb_xfixes && check_pkg_config libxcb_xfixes xcb-xfixes xcb/xfixes.h xcb_xfixes_get_cursor_image
 fi
 
+# Starting with version 0.3.52, PipeWire's spa library uses the __LOCALE_C_ONLY macro to determine
+# whether the locale_t type (introduced in POSIX.1-2008) and some related functions are available (see
+# https://gitlab.freedesktop.org/pipewire/pipewire/-/issues/2390 for more information).
+# Unfortunately, this macro is specific to uclibc, which can cause build issues on systems that use a
+# different implementation of libc if POSIX 2008 support isn't enabled (which is the case for FFmpeg currently).
+# As a workaround for this problem, we add a compilation flag to ensure that __LOCALE_C_ONLY is always defined.
+add_cppflags -D__LOCALE_C_ONLY
+enabled libpipewire && check_pkg_config libpipewire "libpipewire-0.3 >= 0.3.40" pipewire/pipewire.h pw_init
+if enabled libpipewire; then
+    enabled libsystemd  && check_pkg_config libsystemd "libsystemd >= 246" systemd/sd-bus.h sd_bus_call_method
+fi
+
 check_func_headers "windows.h" CreateDIBSection "$gdigrab_indev_extralibs"
 
 # check if building for desktop or uwp
diff --git a/libavfilter/Makefile b/libavfilter/Makefile
index 5992fd161f..6352e91586 100644
--- a/libavfilter/Makefile
+++ b/libavfilter/Makefile
@@ -603,6 +603,7 @@ OBJS-$(CONFIG_NULLSRC_FILTER)                += vsrc_testsrc.o
 OBJS-$(CONFIG_OPENCLSRC_FILTER)              += vf_program_opencl.o opencl.o
 OBJS-$(CONFIG_PAL75BARS_FILTER)              += vsrc_testsrc.o
 OBJS-$(CONFIG_PAL100BARS_FILTER)             += vsrc_testsrc.o
+OBJS-$(CONFIG_PIPEWIREGRAB_FILTER)           += vsrc_pipewiregrab.o
 OBJS-$(CONFIG_QRENCODE_FILTER)               += qrencode.o textutils.o
 OBJS-$(CONFIG_QRENCODESRC_FILTER)            += qrencode.o textutils.o
 OBJS-$(CONFIG_RGBTESTSRC_FILTER)             += vsrc_testsrc.o
diff --git a/libavfilter/allfilters.c b/libavfilter/allfilters.c
index c532682fc2..3670a6d7e7 100644
--- a/libavfilter/allfilters.c
+++ b/libavfilter/allfilters.c
@@ -569,6 +569,7 @@ extern const AVFilter ff_vsrc_openclsrc;
 extern const AVFilter ff_vsrc_qrencodesrc;
 extern const AVFilter ff_vsrc_pal75bars;
 extern const AVFilter ff_vsrc_pal100bars;
+extern const AVFilter ff_vsrc_pipewiregrab;
 extern const AVFilter ff_vsrc_rgbtestsrc;
 extern const AVFilter ff_vsrc_sierpinski;
 extern const AVFilter ff_vsrc_smptebars;
diff --git a/libavfilter/vsrc_pipewiregrab.c b/libavfilter/vsrc_pipewiregrab.c
new file mode 100644
index 0000000000..51073c22b1
--- /dev/null
+++ b/libavfilter/vsrc_pipewiregrab.c
@@ -0,0 +1,1433 @@
+/*
+ * PipeWire input grabber (ScreenCast)
+ * Copyright (C) 2024 Savoir-faire Linux, Inc.
+ *
+ * This file is part of FFmpeg.
+ *
+ * FFmpeg is free software; you can redistribute it and/or
+ * modify it under the terms of the GNU Lesser General Public
+ * License as published by the Free Software Foundation; either
+ * version 2.1 of the License, or (at your option) any later version.
+ *
+ * FFmpeg is distributed in the hope that it will be useful,
+ * but WITHOUT ANY WARRANTY; without even the implied warranty of
+ * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU
+ * Lesser General Public License for more details.
+ *
+ * You should have received a copy of the GNU Lesser General Public
+ * License along with FFmpeg; if not, write to the Free Software
+ * Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
+ */
+
+/**
+ * @file
+ * PipeWireGrab video source
+ * @author Firas Ashkar <firas.ashkar at savoirfairelinux.com>
+ * @author Abhishek Ojha <abhishek.ojha at savoirfairelinux.com>
+ * @author François-Simon Fauteux-Chapleau <francois-simon.fauteux-chapleau at savoirfairelinux.com>
+ */
+
+#include <drm_fourcc.h>
+#include <fcntl.h>
+#include <pipewire/pipewire.h>
+#include <pipewire/thread-loop.h>
+#include <pthread.h>
+#include <spa/debug/types.h>
+#include <spa/param/video/format-utils.h>
+#include <spa/param/video/raw.h>
+#include <spa/param/video/type-info.h>
+#include <stdatomic.h>
+#include <stdlib.h>
+#include <string.h>
+#include <sys/mman.h>
+#include <systemd/sd-bus.h>
+#include <systemd/sd-bus-protocol.h>
+#include <unistd.h>
+
+#include "avfilter.h"
+#include "formats.h"
+#include "video.h"
+
+#include "libavformat/avformat.h"
+#include "libavutil/avassert.h"
+#include "libavutil/avstring.h"
+#include "libavutil/hwcontext.h"
+#include "libavutil/hwcontext_drm.h"
+#include "libavutil/mem.h"
+#include "libavutil/opt.h"
+#include "libavutil/time.h"
+
+#ifndef __USE_XOPEN2K8
+#define F_DUPFD_CLOEXEC                                                        \
+    1030 /* Duplicate file descriptor with close-on-exit set.  */
+#endif
+
+#define DESTINATION "org.freedesktop.portal.Desktop"
+#define SENDER DESTINATION
+#define OBJECT_PATH "/org/freedesktop/portal/desktop"
+#define INTERFACE "org.freedesktop.portal.ScreenCast"
+#define REQUEST_PATH "/org/freedesktop/portal/desktop/request/%s/%s"
+
+#define BYTES_PER_PIXEL 4 /* currently all formats assume 4 bytes per pixel */
+#define MAX_SPA_PARAM 4 /* max number of params for spa pod */
+
+/**
+ * PipeWire capture types
+ */
+typedef enum {
+    DESKTOP_CAPTURE = 1,
+    WINDOW_CAPTURE = 2,
+} pw_capture_type;
+
+/**
+ * XDG Desktop Portal supported cursor modes
+ */
+enum PortalCursorMode {
+    PORTAL_CURSOR_MODE_HIDDEN = 1 << 0,
+    PORTAL_CURSOR_MODE_EMBEDDED = 1 << 1,
+};
+
+typedef struct PipewireGrabContext {
+    const AVClass *class;
+
+    sd_bus *connection;
+    atomic_int dbus_event_loop_running;
+    char *sender_name;
+    char *session_handle;
+
+    uint64_t pipewire_node;
+    int pipewire_fd;
+
+    pthread_cond_t pipewire_initialization_cond_var;
+    pthread_mutex_t pipewire_initialization_mutex;
+    atomic_int pipewire_initialization_over;
+    int pw_init_called;
+    struct pw_thread_loop *thread_loop;
+    struct pw_context *context;
+    struct pw_core *core;
+    struct spa_hook core_listener;
+    struct pw_stream *stream;
+    struct spa_hook stream_listener;
+    struct spa_video_info format;
+
+    uint32_t available_cursor_modes;
+    pw_capture_type capture_type;
+    int draw_mouse;
+
+    uint32_t width, height;
+    size_t frame_size;
+    uint8_t Bpp;
+    enum AVPixelFormat av_pxl_format;
+
+    int64_t time_frame;
+    int64_t frame_duration;
+    AVRational framerate;
+    pthread_mutex_t current_frame_mutex;
+    AVFrame *current_frame;
+    AVBufferRef *hw_device_ref;
+    AVBufferRef *hw_frames_ref;
+    int enable_dmabuf;
+    const char *device_path;
+
+    int portal_error;
+    int pipewire_error;
+} PipewireGrabContext;
+
+/**
+ * Data for DBus signals callbacks
+ */
+struct DbusSignalData {
+    AVFilterContext *ctx;
+    sd_bus_slot *slot;
+};
+
+#define OFFSET(x) offsetof(PipewireGrabContext, x)
+#define FLAGS AV_OPT_FLAG_FILTERING_PARAM|AV_OPT_FLAG_VIDEO_PARAM
+static const AVOption pipewiregrab_options[] = {
+    { "framerate", "set video frame rate", OFFSET(framerate), AV_OPT_TYPE_VIDEO_RATE, { .str = "ntsc" }, 0, INT_MAX, FLAGS },
+    { "draw_mouse", "draw the mouse pointer", OFFSET(draw_mouse), AV_OPT_TYPE_BOOL, { .i64 = 1 }, 0, 1, FLAGS },
+    { "capture_type", "set the capture type (1 for screen, 2 for window)", OFFSET(capture_type), AV_OPT_TYPE_INT, { .i64 = 1 }, 1, 2, FLAGS },
+    { "fd", "set file descriptor to be used by PipeWire", OFFSET(pipewire_fd), AV_OPT_TYPE_INT, { .i64 = 0 }, 0, INT_MAX, FLAGS },
+    { "node", "set PipeWire node (required when using the 'fd' option)", OFFSET(pipewire_node), AV_OPT_TYPE_UINT64, { .i64 = 0 }, 0, 0xffffffff, FLAGS },
+    { "enable_dmabuf", "enable DMA-BUF sharing", OFFSET(enable_dmabuf), AV_OPT_TYPE_BOOL, { .i64 = 1 }, 0, 1, FLAGS },
+    { "device", "DRM device path", OFFSET(device_path), AV_OPT_TYPE_STRING, { .str = "/dev/dri/card0" }, 0, 0, FLAGS },
+    { NULL },
+};
+
+AVFILTER_DEFINE_CLASS(pipewiregrab);
+
+/**
+ * Helper function to allow portal_init_screencast to stop and return an error
+ * code if a DBus operation/callback fails.
+ *
+ * @param ctx
+ * @param error AVERROR code (negative)
+ * @param message error message
+ */
+static void portal_abort(AVFilterContext *ctx, int error, const char *message)
+{
+    PipewireGrabContext *pw_ctx = ctx->priv;
+
+    pw_ctx->portal_error = error;
+    av_log(ctx, AV_LOG_ERROR, "Aborting: %s\n", message);
+
+    atomic_store(&pw_ctx->dbus_event_loop_running, 0);
+}
+
+/**
+ * Callback to handle PipeWire core info events
+ *
+ * @param user_data pointer to AVFilterContext
+ * @param info pw_core_info
+ */
+static void on_core_info_callback(void *user_data, const struct pw_core_info *info)
+{
+    AVFilterContext *ctx = user_data;
+    av_log(ctx, AV_LOG_DEBUG, "Server version: %s\n", info->version);
+    av_log(ctx, AV_LOG_INFO, "Library version: %s\n", pw_get_library_version());
+    av_log(ctx, AV_LOG_DEBUG, "Header version: %s\n", pw_get_headers_version());
+}
+
+/**
+ * Callback to handle PipeWire core done events
+ *
+ * @param user_data pointer to AVFilterContext
+ * @param id PipeWire object id of calling
+ * @param seq PipeWire object sequence
+ */
+static void on_core_done_callback(void *user_data, uint32_t id, int seq)
+{
+    AVFilterContext *ctx = user_data;
+    PipewireGrabContext *pw_ctx;
+
+    if (!ctx || !ctx->priv)
+        return;
+    pw_ctx = ctx->priv;
+
+    if (id == PW_ID_CORE)
+        pw_thread_loop_signal(pw_ctx->thread_loop, false);
+}
+
+/**
+ * Callback to handle Pipewire core error events
+ *
+ * @param user_data pointer to AVFilterContext
+ * @param id id of PipeWire proxy object where the error occured
+ * @param seq PipeWire sequence number which produced the error
+ * @param res error number
+ * @param message error message
+ */
+static void on_core_error_callback(void *user_data, uint32_t id, int seq,
+                                   int res, const char *message)
+{
+    AVFilterContext *ctx = user_data;
+    PipewireGrabContext *pw_ctx;
+
+    if (!ctx)
+        return;
+
+    av_log(ctx, AV_LOG_ERROR,
+           "PipeWire core error: %s (id=%u, seq=%d, res=%d: %s)\n",
+           message, id, seq, res, strerror(-res));
+
+    pw_ctx = ctx->priv;
+    if (!pw_ctx)
+        return;
+
+    pw_thread_loop_signal(pw_ctx->thread_loop, false);
+    pw_ctx->pipewire_error = res;
+    atomic_store(&pw_ctx->pipewire_initialization_over, 1);
+    pthread_cond_signal(&pw_ctx->pipewire_initialization_cond_var);
+}
+
+/**
+ * PipeWire core events callbacks
+ */
+static const struct pw_core_events core_events = {
+    PW_VERSION_CORE_EVENTS,
+    .info = on_core_info_callback,
+    .done = on_core_done_callback,
+    .error = on_core_error_callback,
+};
+
+/**
+ * Helper function: convert spa video format to AVPixelFormat
+ *
+ * @param video_format spa video format to convert
+ * @return the corresponding AVPixelFormat
+ */
+static enum AVPixelFormat
+spa_video_format_to_av_pixel_format(enum spa_video_format video_format)
+{
+    switch (video_format) {
+    case SPA_VIDEO_FORMAT_RGBA:
+    case SPA_VIDEO_FORMAT_RGBx:
+        return AV_PIX_FMT_RGBA;
+
+    case SPA_VIDEO_FORMAT_BGRA:
+    case SPA_VIDEO_FORMAT_BGRx:
+        return AV_PIX_FMT_BGRA;
+
+    default:
+        return AV_PIX_FMT_NONE;
+    }
+}
+
+static uint32_t spa_video_format_to_drm_format(enum spa_video_format video_format)
+{
+    switch (video_format) {
+    case SPA_VIDEO_FORMAT_RGBA:
+        return DRM_FORMAT_ABGR8888;
+    case SPA_VIDEO_FORMAT_RGBx:
+        return DRM_FORMAT_XBGR8888;
+    case SPA_VIDEO_FORMAT_BGRA:
+        return DRM_FORMAT_ARGB8888;
+    case SPA_VIDEO_FORMAT_BGRx:
+        return DRM_FORMAT_XRGB8888;
+    default:
+        return DRM_FORMAT_INVALID;
+    }
+}
+
+static const uint32_t pipewiregrab_formats[] = {
+    SPA_VIDEO_FORMAT_RGBA,
+    SPA_VIDEO_FORMAT_RGBx,
+    SPA_VIDEO_FORMAT_BGRx,
+    SPA_VIDEO_FORMAT_BGRA,
+};
+
+static const uint64_t pipewiregrab_default_modifiers[] = {
+    DRM_FORMAT_MOD_LINEAR,
+    DRM_FORMAT_MOD_INVALID,
+};
+
+/**
+ * PipeWire callback of parameters changed events
+ *
+ * @param user_data pointer to AVFilterContext
+ * @param id type of changed param
+ * @param param pointer to changed param structure
+ */
+static void on_stream_param_changed_callback(void *user_data, uint32_t id,
+                                             const struct spa_pod *param)
+{
+    struct spa_pod_builder pod_builder;
+    const struct spa_pod *params[MAX_SPA_PARAM];
+    uint32_t n_params = 0;
+    uint32_t buffer_types;
+    uint8_t params_buffer[4096];
+    int result;
+    int err;
+    PipewireGrabContext *pw_ctx;
+    AVFilterContext *ctx = user_data;
+    AVHWFramesContext *frames_ctx = NULL;
+
+    if (!ctx || !ctx->priv || !param)
+        return;
+
+    if (id != SPA_PARAM_Format) {
+        av_log(ctx, AV_LOG_WARNING,
+               "Ignoring non-Format param change\n");
+        return;
+    }
+
+    pw_ctx = ctx->priv;
+
+    result = spa_format_parse(param, &pw_ctx->format.media_type,
+                              &pw_ctx->format.media_subtype);
+    if (result < 0) {
+        av_log(ctx, AV_LOG_ERROR, "Unable to parse media type\n");
+        pw_ctx->pipewire_error = AVERROR(EINVAL);
+        goto end;
+    }
+
+    if (pw_ctx->format.media_type != SPA_MEDIA_TYPE_video ||
+        pw_ctx->format.media_subtype != SPA_MEDIA_SUBTYPE_raw) {
+        av_log(ctx, AV_LOG_ERROR, "Unexpected media type\n");
+        pw_ctx->pipewire_error = AVERROR(EINVAL);
+        goto end;
+    }
+
+    spa_format_video_raw_parse(param, &pw_ctx->format.info.raw);
+
+    av_log(ctx, AV_LOG_INFO, "Negotiated format:\n");
+
+    av_log(ctx, AV_LOG_INFO, "Format: %d (%s)\n",
+           pw_ctx->format.info.raw.format,
+           spa_debug_type_find_name(spa_type_video_format,
+                                    pw_ctx->format.info.raw.format));
+    av_log(ctx, AV_LOG_INFO, "Size: %dx%d\n",
+           pw_ctx->format.info.raw.size.width,
+           pw_ctx->format.info.raw.size.height);
+    av_log(ctx, AV_LOG_INFO, "Framerate: %d/%d\n",
+           pw_ctx->format.info.raw.framerate.num,
+           pw_ctx->format.info.raw.framerate.denom);
+
+    pw_ctx->width = pw_ctx->format.info.raw.size.width;
+    pw_ctx->height = pw_ctx->format.info.raw.size.height;
+    pw_ctx->Bpp = BYTES_PER_PIXEL;
+    pw_ctx->frame_size = pw_ctx->width * pw_ctx->height * pw_ctx->Bpp;
+    if (pw_ctx->frame_size + AV_INPUT_BUFFER_PADDING_SIZE > INT_MAX) {
+        av_log(ctx, AV_LOG_ERROR, "Captured area is too large\n");
+        pw_ctx->pipewire_error = AVERROR(EINVAL);
+        goto end;
+    }
+
+    pw_ctx->av_pxl_format =
+        spa_video_format_to_av_pixel_format(pw_ctx->format.info.raw.format);
+    if (pw_ctx->av_pxl_format == AV_PIX_FMT_NONE) {
+        av_log(ctx, AV_LOG_ERROR,
+               "Unsupported buffer format: %d\n", pw_ctx->format.info.raw.format);
+        pw_ctx->pipewire_error = AVERROR(EINVAL);
+        goto end;
+    }
+
+    /* Video crop */
+    pod_builder = SPA_POD_BUILDER_INIT(params_buffer, sizeof(params_buffer));
+    params[n_params++] = spa_pod_builder_add_object(
+        &pod_builder, SPA_TYPE_OBJECT_ParamMeta, SPA_PARAM_Meta,
+        SPA_PARAM_META_type, SPA_POD_Id(SPA_META_VideoCrop),
+        SPA_PARAM_META_size, SPA_POD_Int(sizeof(struct spa_meta_region)));
+
+    /* Buffer options */
+    buffer_types = (1 << SPA_DATA_MemPtr) | (1 << SPA_DATA_MemFd);
+    if (spa_pod_find_prop(param, NULL, SPA_FORMAT_VIDEO_modifier)) {
+        err = av_hwdevice_ctx_create(&pw_ctx->hw_device_ref, AV_HWDEVICE_TYPE_DRM,
+                                     pw_ctx->device_path, NULL, 0);
+        if (err < 0)
+            goto hw_fail;
+
+        pw_ctx->hw_frames_ref = av_hwframe_ctx_alloc(pw_ctx->hw_device_ref);
+        if (!pw_ctx->hw_frames_ref) {
+            err = AVERROR(ENOMEM);
+            goto hw_fail;
+        }
+        frames_ctx = (AVHWFramesContext*)pw_ctx->hw_frames_ref->data;
+        frames_ctx->format    = AV_PIX_FMT_DRM_PRIME;
+        frames_ctx->sw_format = pw_ctx->av_pxl_format;
+        frames_ctx->width     = pw_ctx->width;
+        frames_ctx->height    = pw_ctx->height;
+        err = av_hwframe_ctx_init(pw_ctx->hw_frames_ref);
+hw_fail:
+        if (!err) {
+            buffer_types |= 1 << SPA_DATA_DmaBuf;
+        } else {
+            av_log(ctx, AV_LOG_WARNING,
+                   "Failed to initialize hardware frames context: %s. "
+                   "Falling back to shared memory\n", av_err2str(err));
+        }
+    }
+
+    params[n_params++] = spa_pod_builder_add_object(
+        &pod_builder, SPA_TYPE_OBJECT_ParamBuffers, SPA_PARAM_Buffers,
+        SPA_PARAM_BUFFERS_dataType,
+        SPA_POD_Int(buffer_types));
+
+    /* Meta header */
+    params[n_params++] = spa_pod_builder_add_object(
+        &pod_builder, SPA_TYPE_OBJECT_ParamMeta, SPA_PARAM_Meta,
+        SPA_PARAM_META_type, SPA_POD_Id(SPA_META_Header),
+        SPA_PARAM_META_size,
+        SPA_POD_Int(sizeof(struct spa_meta_header)));
+
+    pw_stream_update_params(pw_ctx->stream, params, n_params);
+
+end:
+    // Signal pipewiregrab_init that PipeWire initialization is over (either
+    // because it was completed successfully or because there was an error, in
+    // which case pw_ctx->pipewire_error will have been set to a nonzero value).
+    atomic_store(&pw_ctx->pipewire_initialization_over, 1);
+    pthread_cond_signal(&pw_ctx->pipewire_initialization_cond_var);
+}
+
+/**
+ * PipeWire callback of state changed events
+ *
+ * @param user_data pointer to AVFilterContext
+ * @param old old PipeWire stream state
+ * @param state current PipeWire stream state
+ * @param error received error information
+ */
+static void on_stream_state_changed_callback(void *user_data,
+                                             enum pw_stream_state old,
+                                             enum pw_stream_state state,
+                                             const char *error)
+{
+    AVFilterContext *ctx = user_data;
+    if (!ctx)
+        return;
+
+    av_log(ctx, AV_LOG_INFO, "stream state: \"%s\"\n",
+           pw_stream_state_as_string(state));
+}
+
+/**
+ * Find most recent buffer received in a PipeWire stream
+ *
+ * @param stream stream to get buffer from
+ * @return most recent buffer in the stream
+ */
+static struct pw_buffer *find_most_recent_buffer_and_recycle_olders(struct pw_stream *stream)
+{
+    struct pw_buffer *pw_buf = NULL;
+    while (1) {
+        struct pw_buffer *aux = pw_stream_dequeue_buffer(stream);
+        if (!aux)
+            break;
+        if (pw_buf)
+            pw_stream_queue_buffer(stream, pw_buf);
+        pw_buf = aux;
+    }
+    return pw_buf;
+}
+
+static void free_frame_desc(void *opaque, uint8_t *data)
+{
+    AVDRMFrameDescriptor *frame_desc = (AVDRMFrameDescriptor *)data;
+
+    for (int i = 0; i < frame_desc->nb_objects; i++)
+        close(frame_desc->objects[i].fd);
+    av_free(frame_desc);
+}
+
+static void process_dma_buffer(AVFilterContext *ctx, struct spa_buffer *spa_buf)
+{
+    AVFrame *frame = NULL;
+    AVDRMFrameDescriptor *frame_desc = NULL;
+    int ret;
+    int n_planes;
+    size_t size;
+    uint32_t offset, pitch;
+    PipewireGrabContext *pw_ctx = ctx->priv;
+
+    n_planes = spa_buf->n_datas;
+    av_assert0(n_planes <= AV_DRM_MAX_PLANES);
+
+    // Create frame descriptor
+    frame_desc = av_mallocz(sizeof(*frame_desc));
+    if (!frame_desc) {
+        av_log(ctx, AV_LOG_ERROR, "Failed to allocate frame descriptor\n");
+        goto fail;
+    }
+    *frame_desc = (AVDRMFrameDescriptor) {
+        .nb_objects = n_planes,
+        .nb_layers = 1,
+        .layers[0] = {
+            .format = spa_video_format_to_drm_format(pw_ctx->format.info.raw.format),
+            .nb_planes = n_planes,
+        },
+    };
+    for (int i = 0; i < n_planes; i++) {
+        offset = spa_buf->datas[i].chunk->offset;
+        pitch = spa_buf->datas[i].chunk->stride;
+        size = offset + pitch * pw_ctx->height;
+
+        frame_desc->objects[i] = (AVDRMObjectDescriptor) {
+            .fd              = spa_buf->datas[i].fd,
+            .size            = size,
+            .format_modifier = pw_ctx->format.info.raw.modifier,
+        };
+        frame_desc->layers[0].planes[i] = (AVDRMPlaneDescriptor) {
+            .object_index = i,
+            .offset       = offset,
+            .pitch        = pitch,
+        };
+    }
+
+    // Create frame
+    frame = av_frame_alloc();
+    if (!frame) {
+        av_log(ctx, AV_LOG_ERROR, "Failed to allocate frame\n");
+        goto fail;
+    }
+    frame->hw_frames_ctx = av_buffer_ref(pw_ctx->hw_frames_ref);
+    if (!frame->hw_frames_ctx) {
+        av_log(ctx, AV_LOG_ERROR, "Failed to create buffer reference\n");
+        goto fail;
+    }
+    frame->buf[0] = av_buffer_create((uint8_t *)frame_desc, sizeof(*frame_desc),
+                                     free_frame_desc, NULL, 0);
+    if (!frame->buf[0]) {
+        av_log(ctx, AV_LOG_ERROR, "Failed to create buffer\n");
+        goto fail;
+    }
+    frame->data[0] = (uint8_t *)frame_desc;
+    frame->format  = AV_PIX_FMT_DRM_PRIME;
+    frame->width = pw_ctx->width;
+    frame->height = pw_ctx->height;
+
+    // Update current_frame
+    pthread_mutex_lock(&pw_ctx->current_frame_mutex);
+    av_frame_unref(pw_ctx->current_frame);
+    ret = av_frame_ref(pw_ctx->current_frame, frame);
+    pthread_mutex_unlock(&pw_ctx->current_frame_mutex);
+    if (ret < 0) {
+        av_log(ctx, AV_LOG_ERROR, "Failed to create frame reference\n");
+        av_frame_free(&frame);
+    }
+    return;
+
+fail:
+    av_freep(&frame_desc);
+    av_frame_free(&frame);
+}
+
+static void process_shm_buffer(AVFilterContext *ctx, struct spa_buffer *spa_buf)
+{
+    uint8_t *map = NULL;
+    void *sdata = NULL;
+    struct spa_meta_region *region;
+    int crop_left, crop_right, crop_top, crop_bottom;
+    PipewireGrabContext *pw_ctx = ctx->priv;
+
+    // Get data
+    if (spa_buf->datas[0].type == SPA_DATA_MemFd ) {
+        map = mmap(NULL, spa_buf->datas[0].maxsize + spa_buf->datas[0].mapoffset,
+                   PROT_READ, MAP_PRIVATE, spa_buf->datas[0].fd, 0);
+        if (map == MAP_FAILED) {
+            av_log(ctx, AV_LOG_ERROR, "mmap failed: %s\n", strerror(errno));
+            return;
+        }
+        sdata = SPA_PTROFF(map, spa_buf->datas[0].mapoffset, uint8_t);
+    } else if (spa_buf->datas[0].type == SPA_DATA_MemPtr) {
+        if (spa_buf->datas[0].data == NULL) {
+            av_log(ctx, AV_LOG_ERROR, "No data in buffer\n");
+            return;
+        }
+        sdata = spa_buf->datas[0].data;
+    } else {
+        av_log(ctx, AV_LOG_ERROR, "Buffer is not valid\n");
+        return;
+    }
+
+    region = spa_buffer_find_meta_data(spa_buf, SPA_META_VideoCrop, sizeof(*region));
+    if (region && spa_meta_region_is_valid(region)) {
+        crop_left = region->region.position.x;
+        crop_top = region->region.position.y;
+        crop_right = pw_ctx->width - crop_left - region->region.size.width;
+        crop_bottom = pw_ctx->height - crop_top - region->region.size.height;
+    }
+
+    // Update current_frame with the new data
+    pthread_mutex_lock(&pw_ctx->current_frame_mutex);
+    memcpy(pw_ctx->current_frame->data[0], sdata, spa_buf->datas[0].chunk->size);
+    pw_ctx->current_frame->crop_top = crop_top;
+    pw_ctx->current_frame->crop_bottom = crop_bottom;
+    pw_ctx->current_frame->crop_left = crop_left;
+    pw_ctx->current_frame->crop_right = crop_right;
+    pthread_mutex_unlock(&pw_ctx->current_frame_mutex);
+
+    // Cleanup
+    if (spa_buf->datas[0].type == SPA_DATA_MemFd)
+        munmap(map, spa_buf->datas[0].maxsize + spa_buf->datas[0].mapoffset);
+}
+
+/**
+ * This function is called by PipeWire when a buffer
+ * is ready to be dequeued and processed.
+ *
+ * @param user_data pointer to AVFilterContext
+ */
+static void on_stream_process_callback(void *user_data)
+{
+    struct spa_buffer *spa_buf;
+    struct pw_buffer *pw_buf = NULL;
+    struct spa_meta_header *header = NULL;
+
+    AVFilterContext *ctx = user_data;
+    PipewireGrabContext *pw_ctx;
+    if (!ctx || !ctx->priv)
+        return;
+    pw_ctx = ctx->priv;
+
+    // We need to wait for pw_ctx->current_frame to have been allocated before
+    // we can use it to get frames from the PipeWire thread to FFmpeg
+    pthread_mutex_lock(&pw_ctx->current_frame_mutex);
+    if (!pw_ctx->current_frame) {
+        pthread_mutex_unlock(&pw_ctx->current_frame_mutex);
+        return;
+    }
+    pthread_mutex_unlock(&pw_ctx->current_frame_mutex);
+
+    pw_buf = find_most_recent_buffer_and_recycle_olders(pw_ctx->stream);
+    if (!pw_buf) {
+        av_log(ctx, AV_LOG_ERROR, "Out of buffers\n");
+        return;
+    }
+
+    spa_buf = pw_buf->buffer;
+    header = spa_buffer_find_meta_data(spa_buf, SPA_META_Header, sizeof(*header));
+    if (header && (header->flags & SPA_META_HEADER_FLAG_CORRUPTED)) {
+        av_log(ctx, AV_LOG_ERROR, "Corrupted PipeWire buffer\n");
+        goto end;
+    }
+
+    if (spa_buf->datas[0].type == SPA_DATA_DmaBuf)
+        process_dma_buffer(ctx, spa_buf);
+    else
+        process_shm_buffer(ctx, spa_buf);
+
+end:
+    pw_stream_queue_buffer(pw_ctx->stream, pw_buf);
+}
+
+static const struct pw_stream_events stream_events = {
+    PW_VERSION_STREAM_EVENTS,
+    .state_changed = on_stream_state_changed_callback,
+    .param_changed = on_stream_param_changed_callback,
+    .process = on_stream_process_callback,
+};
+
+static int subscribe_to_signal(AVFilterContext *ctx,
+                               const char *sender_name,
+                               const char *request_token,
+                               sd_bus_message_handler_t callback)
+{
+    int ret;
+    char *request_path;
+    struct DbusSignalData *dbus_signal_data;
+    PipewireGrabContext *pw_ctx = ctx->priv;
+
+    dbus_signal_data = (struct DbusSignalData *)av_mallocz(sizeof(struct DbusSignalData));
+    if (!dbus_signal_data)
+        return AVERROR(ENOMEM);
+
+    dbus_signal_data->ctx = ctx;
+    request_path = av_asprintf(REQUEST_PATH, sender_name, request_token);
+
+    ret = sd_bus_match_signal(pw_ctx->connection,
+                              &dbus_signal_data->slot,
+                              SENDER,
+                              request_path,
+                              "org.freedesktop.portal.Request",
+                              "Response",
+                              callback,
+                              dbus_signal_data);
+    av_free(request_path);
+    return (ret < 0) ? ret : 0;
+}
+
+static struct spa_pod *build_format(PipewireGrabContext *pw_ctx,
+                                    struct spa_pod_builder *builder,
+                                    uint32_t format,
+                                    const uint64_t *modifiers,
+                                    int n_modifiers)
+{
+    struct spa_pod_frame format_frame;
+    struct spa_pod_frame modifier_frame;
+
+    spa_pod_builder_push_object(builder, &format_frame,
+                                SPA_TYPE_OBJECT_Format, SPA_PARAM_EnumFormat);
+    spa_pod_builder_add(builder, SPA_FORMAT_mediaType,
+                        SPA_POD_Id(SPA_MEDIA_TYPE_video), 0);
+    spa_pod_builder_add(builder, SPA_FORMAT_mediaSubtype,
+                        SPA_POD_Id(SPA_MEDIA_SUBTYPE_raw), 0);
+    spa_pod_builder_add(builder, SPA_FORMAT_VIDEO_format,
+                        SPA_POD_Id(format), 0);
+    spa_pod_builder_add(builder, SPA_FORMAT_VIDEO_size,
+                        SPA_POD_CHOICE_RANGE_Rectangle(
+                            &SPA_RECTANGLE(320, 240),
+                            &SPA_RECTANGLE(1, 1),
+                            &SPA_RECTANGLE(4096, 4096)
+                        ), 0);
+    spa_pod_builder_add(builder, SPA_FORMAT_VIDEO_framerate,
+                        SPA_POD_CHOICE_RANGE_Fraction(
+                            &SPA_FRACTION(pw_ctx->framerate.num, pw_ctx->framerate.den),
+                            &SPA_FRACTION(0, 1),
+                            &SPA_FRACTION(144, 1)
+                        ), 0);
+    if (n_modifiers > 0) {
+        spa_pod_builder_prop(builder, SPA_FORMAT_VIDEO_modifier,
+                             SPA_POD_PROP_FLAG_MANDATORY | SPA_POD_PROP_FLAG_DONT_FIXATE);
+        spa_pod_builder_push_choice(builder, &modifier_frame, SPA_CHOICE_Enum, 0);
+
+        // A choice POD consists of a "default" value followed by the list of
+        // all possible values (https://docs.pipewire.org/page_spa_pod.html)
+        // This is why we need to add one of the modifiers twice.
+        spa_pod_builder_long(builder, modifiers[0]);
+        for (int i = 0; i < n_modifiers; i++)
+            spa_pod_builder_long(builder, modifiers[i]);
+
+        spa_pod_builder_pop(builder, &modifier_frame);
+    }
+    return spa_pod_builder_pop(builder, &format_frame);
+}
+
+static int play_pipewire_stream(AVFilterContext *ctx)
+{
+    int ret;
+    uint8_t buffer[4096];
+    struct spa_pod_builder pod_builder;
+    const struct spa_pod **params;
+    uint32_t n_params;
+
+    PipewireGrabContext *pw_ctx = ctx->priv;
+
+    pw_init(NULL, NULL);
+    pw_ctx->pw_init_called = 1;
+
+    pw_ctx->thread_loop =
+        pw_thread_loop_new("thread loop", NULL);
+    if (!pw_ctx->thread_loop) {
+        av_log(ctx, AV_LOG_ERROR, "pw_thread_loop_new failed\n");
+        return AVERROR(ENOMEM);
+    }
+
+    pw_ctx->context =
+        pw_context_new(pw_thread_loop_get_loop(pw_ctx->thread_loop), NULL, 0);
+    if (!pw_ctx->context) {
+        av_log(ctx, AV_LOG_ERROR, "pw_context_new failed\n");
+        ret = AVERROR(ENOMEM);
+        goto fail;
+    }
+
+    if (pw_thread_loop_start(pw_ctx->thread_loop) < 0) {
+        av_log(ctx, AV_LOG_ERROR, "pw_thread_loop_start failed\n");
+        ret = AVERROR(EFAULT);
+        goto fail;
+    }
+
+    pw_thread_loop_lock(pw_ctx->thread_loop);
+
+    // Core
+    pw_ctx->core =
+        pw_context_connect_fd(pw_ctx->context,
+                              fcntl(pw_ctx->pipewire_fd, F_DUPFD_CLOEXEC, 3),
+                              NULL, 0);
+    if (!pw_ctx->core) {
+        ret = AVERROR(errno);
+        av_log(ctx, AV_LOG_ERROR, "pw_context_connect_fd failed\n");
+        pw_thread_loop_unlock(pw_ctx->thread_loop);
+        goto fail;
+    }
+
+    pw_core_add_listener(pw_ctx->core, &pw_ctx->core_listener, &core_events,
+                         ctx /* user_data */);
+
+    // Stream
+    pw_ctx->stream = pw_stream_new(
+        pw_ctx->core, "wayland grab",
+        pw_properties_new(PW_KEY_MEDIA_TYPE, "Video", PW_KEY_MEDIA_CATEGORY,
+                          "Capture", PW_KEY_MEDIA_ROLE, "Screen", NULL));
+
+    if (!pw_ctx->stream) {
+        av_log(ctx, AV_LOG_ERROR, "pw_stream_new failed\n");
+        ret = AVERROR(ENOMEM);
+        pw_thread_loop_unlock(pw_ctx->thread_loop);
+        goto fail;
+    }
+
+    pw_stream_add_listener(pw_ctx->stream, &pw_ctx->stream_listener,
+                           &stream_events, ctx /* user_data */);
+
+    // Stream parameters
+    pod_builder = SPA_POD_BUILDER_INIT(buffer, sizeof(buffer));
+    params = av_mallocz(2 * FF_ARRAY_ELEMS(pipewiregrab_formats) * sizeof(*params));
+    n_params = 0;
+
+    for (int i = 0; i < FF_ARRAY_ELEMS(pipewiregrab_formats); i++) {
+        if (pw_ctx->enable_dmabuf)
+            params[n_params++] = build_format(pw_ctx, &pod_builder, pipewiregrab_formats[i],
+                                              pipewiregrab_default_modifiers,
+                                              FF_ARRAY_ELEMS(pipewiregrab_default_modifiers));
+        params[n_params++] = build_format(pw_ctx, &pod_builder, pipewiregrab_formats[i],
+                                          NULL, 0);
+    }
+
+    ret = pw_stream_connect(
+        pw_ctx->stream, PW_DIRECTION_INPUT, (uint32_t)pw_ctx->pipewire_node,
+        PW_STREAM_FLAG_AUTOCONNECT | PW_STREAM_FLAG_MAP_BUFFERS, params, n_params);
+    if (ret != 0) {
+        av_log(ctx, AV_LOG_ERROR, "pw_stream_connect failed\n");
+        pw_thread_loop_unlock(pw_ctx->thread_loop);
+        goto fail;
+    }
+
+    av_log(ctx, AV_LOG_INFO, "Starting screen capture ...\n");
+    pw_thread_loop_unlock(pw_ctx->thread_loop);
+    return 0;
+
+fail:
+    if (pw_ctx->core) {
+        pw_core_disconnect(pw_ctx->core);
+        pw_ctx->core = NULL;
+    }
+    if (pw_ctx->context) {
+        pw_context_destroy(pw_ctx->context);
+        pw_ctx->context = NULL;
+    }
+    if (pw_ctx->thread_loop) {
+        pw_thread_loop_destroy(pw_ctx->thread_loop);
+        pw_ctx->thread_loop = NULL;
+    }
+
+    return ret;
+}
+
+static void portal_open_pipewire_remote(AVFilterContext *ctx)
+{
+    int ret;
+    int fd;
+    sd_bus_message *reply = NULL;
+    sd_bus_error err = SD_BUS_ERROR_NULL;
+    PipewireGrabContext *pw_ctx = ctx->priv;
+
+    const char *method_name = "OpenPipeWireRemote";
+    ret = sd_bus_call_method(pw_ctx->connection,
+                             DESTINATION,
+                             OBJECT_PATH,
+                             INTERFACE,
+                             method_name,
+                             &err,
+                             &reply,
+                             "oa{sv}",
+                             pw_ctx->session_handle,
+                             0);
+    if (ret < 0) {
+        av_log(ctx, AV_LOG_ERROR,
+               "Call to DBus method '%s' failed: %s\n",
+               method_name, err.message);
+        sd_bus_error_free(&err);
+        portal_abort(ctx, ret, "Failed to open PipeWire remote");
+        return;
+    }
+
+    ret = sd_bus_message_read(reply, "h", &fd);
+    if (ret < 0) {
+        portal_abort(ctx, ret, "Failed to read file descriptor");
+        return;
+    } else
+        av_log(ctx, AV_LOG_DEBUG, "PipeWire fd: %d\n", fd);
+
+    pw_ctx->pipewire_fd = fd;
+    atomic_store(&pw_ctx->dbus_event_loop_running, 0);
+}
+
+static void dbus_signal_data_free(struct DbusSignalData *dbus_signal_data)
+{
+    sd_bus_slot_unref(dbus_signal_data->slot);
+    av_free(dbus_signal_data);
+}
+
+static int on_start_response_received_callback(
+    sd_bus_message *message, void *user_data, sd_bus_error *err)
+{
+    int ret;
+    uint32_t response;
+    uint32_t node;
+    struct DbusSignalData *dbus_signal_data = user_data;
+    AVFilterContext *ctx = dbus_signal_data->ctx;
+    PipewireGrabContext *pw_ctx = ctx->priv;
+
+    dbus_signal_data_free(dbus_signal_data);
+
+    ret = sd_bus_message_read(message, "u", &response);
+    if (ret < 0) {
+        portal_abort(ctx, ret, "Failed to read DBus response");
+        return -1;
+    }
+    if (response != 0) {
+        portal_abort(ctx, AVERROR(EACCES),
+                     "Failed to start screen cast, denied or cancelled by user");
+        return -1;
+    }
+
+    sd_bus_message_enter_container(message, SD_BUS_TYPE_ARRAY, "{sv}");
+    sd_bus_message_enter_container(message, SD_BUS_TYPE_DICT_ENTRY, "sv");
+    sd_bus_message_skip(message, "s");
+    sd_bus_message_enter_container(message, SD_BUS_TYPE_VARIANT, "a(ua{sv})");
+    sd_bus_message_enter_container(message, SD_BUS_TYPE_ARRAY, "(ua{sv})");
+    sd_bus_message_enter_container(message, SD_BUS_TYPE_STRUCT, "ua{sv}");
+
+    ret = sd_bus_message_read(message, "u", &node);
+    if (ret < 0) {
+        portal_abort(ctx, ret, "Failed to read PipeWire node: %s");
+        return -1;
+    }
+    pw_ctx->pipewire_node = node;
+
+    av_log(ctx, AV_LOG_DEBUG, "PipeWire node: %"PRIu64"\n", pw_ctx->pipewire_node);
+    av_log(ctx, AV_LOG_INFO, "Monitor selected, setting up screen cast\n\n");
+
+    portal_open_pipewire_remote(ctx);
+    return 0;
+}
+
+static void portal_start(AVFilterContext *ctx)
+{
+    int ret;
+    sd_bus_error err = SD_BUS_ERROR_NULL;
+    PipewireGrabContext *pw_ctx = ctx->priv;
+
+    const char *method_name = "Start";
+    const char *request_token = "pipewiregrabStart";
+
+    ret = subscribe_to_signal(ctx, pw_ctx->sender_name, request_token,
+                              on_start_response_received_callback);
+    if (ret < 0) {
+        portal_abort(ctx, ret, "Failed to subscribe to DBus signal");
+        return;
+    }
+
+    av_log(ctx, AV_LOG_INFO, "Asking for monitor…\n");
+    ret = sd_bus_call_method(pw_ctx->connection,
+                             DESTINATION,
+                             OBJECT_PATH,
+                             INTERFACE,
+                             method_name,
+                             &err,
+                             NULL,
+                             "osa{sv}",
+                             pw_ctx->session_handle,
+                             "",
+                             1,
+                             "handle_token", "s", request_token);
+    if (ret < 0) {
+        av_log(ctx, AV_LOG_ERROR,
+               "Call to DBus method '%s' failed: %s\n",
+               method_name, err.message);
+        sd_bus_error_free(&err);
+        portal_abort(ctx, ret, "Failed to start screen cast session");
+    }
+}
+
+static int on_select_sources_response_received_callback(
+    sd_bus_message *message, void *user_data, sd_bus_error *err)
+{
+    int ret;
+    uint32_t response;
+    struct DbusSignalData *dbus_signal_data = user_data;
+    AVFilterContext *ctx = dbus_signal_data->ctx;
+
+    dbus_signal_data_free(dbus_signal_data);
+
+    ret = sd_bus_message_read(message, "u", &response);
+    if (ret < 0) {
+        portal_abort(ctx, ret, "Failed to read DBus response");
+        return -1;
+    }
+    if (response != 0) {
+        portal_abort(ctx, AVERROR(EACCES),
+                     "Failed to select screen cast sources");
+        return -1;
+    }
+
+    portal_start(ctx);
+    return 0;
+}
+
+static void portal_select_sources(AVFilterContext *ctx)
+{
+    int ret;
+    uint32_t cursor_mode;
+    sd_bus_error err = SD_BUS_ERROR_NULL;
+    PipewireGrabContext *pw_ctx = ctx->priv;
+
+    const char *method_name = "SelectSources";
+    const char *request_token = "pipewiregrabSelectSources";
+
+    ret = subscribe_to_signal(ctx, pw_ctx->sender_name, request_token,
+                              on_select_sources_response_received_callback);
+    if (ret < 0) {
+        portal_abort(ctx, ret, "Failed to subscribe to DBus signal");
+        return;
+    }
+
+    if ((pw_ctx->available_cursor_modes & PORTAL_CURSOR_MODE_EMBEDDED)
+             && pw_ctx->draw_mouse)
+        cursor_mode = PORTAL_CURSOR_MODE_EMBEDDED;
+    else
+        cursor_mode = PORTAL_CURSOR_MODE_HIDDEN;
+
+    ret = sd_bus_call_method(pw_ctx->connection,
+                             DESTINATION,
+                             OBJECT_PATH,
+                             INTERFACE,
+                             method_name,
+                             &err,
+                             NULL,
+                             "oa{sv}",
+                             pw_ctx->session_handle,
+                             4,
+                             "types", "u", pw_ctx->capture_type,
+                             "multiple", "b", 0,
+                             "handle_token", "s", request_token,
+                             "cursor_mode", "u", cursor_mode);
+    if (ret < 0) {
+        av_log(ctx, AV_LOG_ERROR,
+               "Call to DBus method '%s' failed: %s\n",
+               method_name, err.message);
+        sd_bus_error_free(&err);
+        portal_abort(ctx, ret, "Failed to select sources for screen cast session");
+    }
+}
+
+static int on_create_session_response_received_callback(
+    sd_bus_message *message, void *user_data, sd_bus_error *err)
+{
+    int ret;
+    uint32_t response;
+    const char *session_handle;
+    const char *type;
+    struct DbusSignalData *dbus_signal_data = user_data;
+    AVFilterContext *ctx = dbus_signal_data->ctx;
+    PipewireGrabContext *pw_ctx = ctx->priv;
+
+    dbus_signal_data_free(dbus_signal_data);
+
+    ret = sd_bus_message_read(message, "u", &response);
+    if (ret < 0) {
+        portal_abort(ctx, ret, "Failed to read DBus response");
+        return -1;
+    }
+    if (response != 0) {
+        portal_abort(ctx, AVERROR(EACCES),
+                     "Failed to create screen cast session");
+        return -1;
+    }
+
+    sd_bus_message_enter_container(message, SD_BUS_TYPE_ARRAY, "{sv}");
+    sd_bus_message_enter_container(message, SD_BUS_TYPE_DICT_ENTRY, "sv");
+    sd_bus_message_skip(message, "s");
+    // The XDG Desktop Portal documentation says that the type of `session_handle`
+    // is "o" (object path), but at least on some systems it's actually "s" (string),
+    // so we need to check to make sure we're using the right one.
+    sd_bus_message_peek_type(message, NULL, &type);
+    ret = sd_bus_message_read(message, "v", type, &session_handle);
+    if (ret < 0) {
+        portal_abort(ctx, ret, "Failed to read session handle");
+        return -1;
+    }
+    pw_ctx->session_handle = av_strdup(session_handle);
+
+    portal_select_sources(ctx);
+    return 0;
+}
+
+/**
+ * Function to create a screen cast session
+ *
+ * @param ctx
+ */
+static void portal_create_session(AVFilterContext *ctx)
+{
+    int ret;
+    sd_bus_error err = SD_BUS_ERROR_NULL;
+    PipewireGrabContext *pw_ctx = ctx->priv;
+
+    const char *method_name = "CreateSession";
+    const char *request_token = "pipewiregrabCreateSession";
+
+    ret = subscribe_to_signal(ctx, pw_ctx->sender_name, request_token,
+                              on_create_session_response_received_callback);
+    if (ret < 0) {
+        portal_abort(ctx, ret, "Failed to subscribe to DBus signal");
+        return;
+    }
+
+    ret = sd_bus_call_method(pw_ctx->connection,
+                             DESTINATION,
+                             OBJECT_PATH,
+                             INTERFACE,
+                             method_name,
+                             &err,
+                             NULL,
+                             "a{sv}",
+                             2,
+                             "handle_token", "s", request_token,
+                             "session_handle_token", "s", "pipewiregrab");
+    if (ret < 0) {
+        av_log(ctx, AV_LOG_ERROR,
+               "Call to DBus method '%s' failed: %s\n",
+               method_name, err.message);
+        sd_bus_error_free(&err);
+        portal_abort(ctx, ret, "Failed to create screen cast session");
+    }
+}
+
+/**
+ * Helper function: get available cursor modes and update the
+ *                  PipewireGrabContext accordingly
+ *
+ * @param ctx
+ */
+static int portal_update_available_cursor_modes(AVFilterContext *ctx)
+{
+    int ret;
+    sd_bus_error err = SD_BUS_ERROR_NULL;
+    PipewireGrabContext *pw_ctx = ctx->priv;
+
+    ret = sd_bus_get_property_trivial(pw_ctx->connection,
+                                      DESTINATION,
+                                      OBJECT_PATH,
+                                      INTERFACE,
+                                      "AvailableCursorModes",
+                                      &err,
+                                      'u',
+                                      &pw_ctx->available_cursor_modes);
+    if (ret < 0)
+        av_log(ctx, AV_LOG_ERROR,
+               "Couldn't retrieve available cursor modes: %s\n", err.message);
+
+    sd_bus_error_free(&err);
+    return ret;
+}
+
+static int create_dbus_connection(AVFilterContext *ctx)
+{
+    const char *aux;
+    int ret;
+    PipewireGrabContext *pw_ctx = ctx->priv;
+
+    ret = sd_bus_open_user(&pw_ctx->connection);
+    if (ret < 0) {
+        av_log(ctx, AV_LOG_ERROR,
+               "Failed to create DBus connection: %s\n", strerror(-ret));
+        return ret;
+    }
+
+    ret = sd_bus_get_unique_name(pw_ctx->connection, &aux);
+    if (ret < 0) {
+        av_log(ctx, AV_LOG_ERROR,
+               "Failed to get bus name: %s\n", strerror(-ret));
+        return ret;
+    }
+    // From https://flatpak.github.io/xdg-desktop-portal/docs/doc-org.freedesktop.portal.Request.html:
+    // "SENDER is the caller's unique name, with the initial ':' removed and all '.' replaced by '_'"
+    pw_ctx->sender_name = av_strireplace(aux + 1, ".", "_");
+    av_log(ctx, AV_LOG_DEBUG,
+           "DBus connection created (sender name: %s)\n", pw_ctx->sender_name);
+    return 0;
+}
+
+
+/**
+ * Use XDG Desktop Portal's ScreenCast interface to open a file descriptor that
+ * can be used by PipeWire to access the screen cast streams.
+ * (https://flatpak.github.io/xdg-desktop-portal/docs/doc-org.freedesktop.portal.ScreenCast.html)
+ *
+ * @param ctx
+ */
+static int portal_init_screencast(AVFilterContext *ctx)
+{
+    int ret;
+    PipewireGrabContext *pw_ctx = ctx->priv;
+
+    ret = create_dbus_connection(ctx);
+    if (ret < 0)
+        return ret;
+
+    ret = portal_update_available_cursor_modes(ctx);
+    if (ret < 0)
+        return ret;
+
+    portal_create_session(ctx);
+    if (pw_ctx->portal_error)
+        return pw_ctx->portal_error;
+
+    // The event loop will run until it's stopped by portal_open_pipewire_remote (if
+    // all DBus method calls completed successfully) or portal_abort (in case of error).
+    // In the latter case, pw_ctx->portal_error gets set to a negative value.
+    atomic_store(&pw_ctx->dbus_event_loop_running, 1);
+    while(atomic_load(&pw_ctx->dbus_event_loop_running)) {
+        ret = sd_bus_process(pw_ctx->connection, NULL);
+        if (ret < 0) {
+            av_log(ctx, AV_LOG_ERROR,
+                   "Failed to process DBus event: %s\n", strerror(-ret));
+            return ret;
+        }
+
+        ret = sd_bus_wait(pw_ctx->connection, 2000);
+        if (ret < 0) {
+            av_log(ctx, AV_LOG_ERROR,
+                   "Error while waiting on bus: %s\n", strerror(-ret));
+            return ret;
+        }
+    }
+    return pw_ctx->portal_error;
+}
+
+static av_cold int pipewiregrab_init(AVFilterContext *ctx)
+{
+    int ret;
+    PipewireGrabContext *pw_ctx = ctx->priv;
+    if (!pw_ctx) {
+        av_log(ctx, AV_LOG_ERROR,
+               "Invalid private context data\n");
+        return AVERROR(EINVAL);
+    }
+
+    atomic_init(&pw_ctx->dbus_event_loop_running, 0);
+    atomic_init(&pw_ctx->pipewire_initialization_over, 0);
+    pthread_cond_init(&pw_ctx->pipewire_initialization_cond_var, NULL);
+    pthread_mutex_init(&pw_ctx->pipewire_initialization_mutex, NULL);
+    pthread_mutex_init(&pw_ctx->current_frame_mutex, NULL);
+
+    if (pw_ctx->pipewire_fd == 0) {
+        ret = portal_init_screencast(ctx);
+        if (ret != 0) {
+            av_log(ctx, AV_LOG_ERROR, "Couldn't init screen cast\n");
+            return ret;
+        }
+    }
+
+    ret = play_pipewire_stream(ctx);
+    if (ret != 0)
+        return ret;
+
+    // Wait until PipeWire initialization is over
+    pthread_mutex_lock(&pw_ctx->pipewire_initialization_mutex);
+    while (!atomic_load(&pw_ctx->pipewire_initialization_over)) {
+        pthread_cond_wait(&pw_ctx->pipewire_initialization_cond_var,
+                          &pw_ctx->pipewire_initialization_mutex);
+    }
+    pthread_mutex_unlock(&pw_ctx->pipewire_initialization_mutex);
+
+    return pw_ctx->pipewire_error;
+}
+
+static void pipewiregrab_uninit(AVFilterContext *ctx)
+{
+    int ret;
+    PipewireGrabContext *pw_ctx = ctx->priv;
+    if (!pw_ctx)
+        return;
+
+    // PipeWire cleanup
+    if (pw_ctx->thread_loop) {
+        pw_thread_loop_signal(pw_ctx->thread_loop, false);
+        pw_thread_loop_unlock(pw_ctx->thread_loop);
+        pw_thread_loop_stop(pw_ctx->thread_loop);
+    }
+    if (pw_ctx->stream) {
+        pw_stream_disconnect(pw_ctx->stream);
+        pw_stream_destroy(pw_ctx->stream);
+        pw_ctx->stream = NULL;
+    }
+    if (pw_ctx->core){
+        pw_core_disconnect(pw_ctx->core);
+        pw_ctx->core = NULL;
+    }
+    if (pw_ctx->context) {
+        pw_context_destroy(pw_ctx->context);
+        pw_ctx->context = NULL;
+    }
+    if (pw_ctx->thread_loop) {
+        pw_thread_loop_destroy(pw_ctx->thread_loop);
+        pw_ctx->thread_loop = NULL;
+    }
+    if (pw_ctx->pw_init_called) {
+        pw_deinit();
+        pw_ctx->pw_init_called = 0;
+    }
+    if (pw_ctx->pipewire_fd > 0) {
+        close(pw_ctx->pipewire_fd);
+        pw_ctx->pipewire_fd = 0;
+    }
+    av_frame_free(&pw_ctx->current_frame);
+    av_buffer_unref(&pw_ctx->hw_frames_ref);
+    av_buffer_unref(&pw_ctx->hw_device_ref);
+
+    // DBus cleanup
+    if (pw_ctx->session_handle) {
+        ret = sd_bus_call_method(pw_ctx->connection,
+                                 DESTINATION,
+                                 pw_ctx->session_handle,
+                                 "org.freedesktop.portal.Session",
+                                 "Close",
+                                 NULL, NULL, NULL);
+        if (ret < 0)
+            av_log(ctx, AV_LOG_DEBUG,
+                   "Failed to close portal session: %s\n", strerror(-ret));
+
+        av_freep(&pw_ctx->session_handle);
+    }
+    sd_bus_flush_close_unref(pw_ctx->connection);
+    av_freep(&pw_ctx->sender_name);
+}
+
+static int pipewiregrab_config_props(AVFilterLink *outlink)
+{
+    AVFrame *frame;
+    PipewireGrabContext *pw_ctx = outlink->src->priv;
+
+    AVRational time_base = av_inv_q(pw_ctx->framerate);
+    pw_ctx->frame_duration = av_rescale_q(1, time_base, AV_TIME_BASE_Q);
+    pw_ctx->time_frame = av_gettime_relative();
+
+    outlink->w = pw_ctx->width;
+    outlink->h = pw_ctx->height;
+    outlink->time_base = AV_TIME_BASE_Q;
+    outlink->frame_rate = pw_ctx->framerate;
+
+    frame = ff_get_video_buffer(outlink, pw_ctx->width, pw_ctx->height);
+    if (!frame)
+        return AVERROR(ENOMEM);
+    pthread_mutex_lock(&pw_ctx->current_frame_mutex);
+    pw_ctx->current_frame = frame;
+    pthread_mutex_unlock(&pw_ctx->current_frame_mutex);
+
+    return 0;
+}
+
+static int pipewiregrab_request_frame(AVFilterLink *outlink)
+{
+    int ret;
+    int64_t curtime, delay;
+    PipewireGrabContext *pw_ctx = outlink->src->priv;
+    AVFrame *frame = av_frame_alloc();
+    if (!frame)
+        return AVERROR(ENOMEM);
+
+    pw_ctx->time_frame += pw_ctx->frame_duration;
+    while (1) {
+        curtime = av_gettime_relative();
+        delay   = pw_ctx->time_frame - curtime;
+        if (delay <= 0)
+            break;
+        av_usleep(delay);
+    }
+
+    pthread_mutex_lock(&pw_ctx->current_frame_mutex);
+    ret = av_frame_ref(frame, pw_ctx->current_frame);
+    pthread_mutex_unlock(&pw_ctx->current_frame_mutex);
+    if (ret < 0) {
+        av_frame_free(&frame);
+        return ret;
+    }
+
+    frame->pts = av_gettime();
+    frame->duration = pw_ctx->frame_duration;
+    frame->sample_aspect_ratio = (AVRational) {1, 1};
+
+    return ff_filter_frame(outlink, frame);
+}
+
+static int pipewiregrab_query_formats(AVFilterContext *ctx)
+{
+    PipewireGrabContext *pw_ctx = ctx->priv;
+    enum AVPixelFormat pix_fmts[] = {pw_ctx->av_pxl_format, AV_PIX_FMT_NONE};
+
+    return ff_set_common_formats_from_list(ctx, pix_fmts);
+}
+
+static const AVFilterPad pipewiregrab_outputs[] = {
+    {
+        .name          = "default",
+        .type          = AVMEDIA_TYPE_VIDEO,
+        .request_frame = pipewiregrab_request_frame,
+        .config_props  = pipewiregrab_config_props,
+    },
+};
+
+const AVFilter ff_vsrc_pipewiregrab= {
+    .name = "pipewiregrab",
+    .description = NULL_IF_CONFIG_SMALL("Capture screen or window using PipeWire."),
+    .priv_size = sizeof(struct PipewireGrabContext),
+    .priv_class = &pipewiregrab_class,
+    .init = pipewiregrab_init,
+    .uninit = pipewiregrab_uninit,
+    .inputs = NULL,
+    FILTER_OUTPUTS(pipewiregrab_outputs),
+    FILTER_QUERY_FUNC(pipewiregrab_query_formats),
+};
-- 
2.34.1

_______________________________________________
ffmpeg-devel mailing list
ffmpeg-devel@ffmpeg.org
https://ffmpeg.org/mailman/listinfo/ffmpeg-devel

To unsubscribe, visit link above, or email
ffmpeg-devel-request@ffmpeg.org with subject "unsubscribe".

^ permalink raw reply	[flat|nested] 13+ messages in thread

* Re: [FFmpeg-devel] [PATCH v2] libavfilter: add PipeWire-based grab
  2024-05-10 21:12 [FFmpeg-devel] [PATCH v2] libavfilter: add PipeWire-based grab François-Simon Fauteux-Chapleau
@ 2024-05-11 14:08 ` Andrew Sayers
  0 siblings, 0 replies; 13+ messages in thread
From: Andrew Sayers @ 2024-05-11 14:08 UTC (permalink / raw)
  To: FFmpeg development discussions and patches

(only reviewing the documentation, not the code itself)

On Fri, May 10, 2024 at 05:12:19PM -0400, François-Simon Fauteux-Chapleau wrote:
> This is a revised version of the "pipewiregrab" patch submitted by
> Abhishek Ojha a few months ago:
> https://patchwork.ffmpeg.org/project/ffmpeg/patch/20231227162504.690730-1-abhishek.ojha@savoirfairelinux.com/
> https://patchwork.ffmpeg.org/project/ffmpeg/patch/20231227162504.690730-2-abhishek.ojha@savoirfairelinux.com/
> 
> The main change is that the patch is now implemented as a libavfilter
> source filter instead of a libavdevice input device, as was requested in
> a comment on the previous version. This version also adds support for
> DMA buffer sharing and uses sd-bus instead of GDBus.
> 
> There are also several small changes meant to fix bugs or simplify the
> code, but the overall structure remains the same as before: we use the
> ScreenCast interface provided by XDG Desktop Portal to obtain a file
> descriptor, which is then used to create a PipeWire stream. The data from
> that stream can then be used to generate frames for FFmpeg.
> 
> Example usage:
> ffmpeg -f lavfi -i pipewiregrab \
>        -vf 'hwmap=derive_device=vaapi,scale_vaapi=format=nv12' \
>        -c:v h264_vaapi -t 10 output.mp4
> 
> Signed-off-by: François-Simon Fauteux-Chapleau <francois-simon.fauteux-chapleau@savoirfairelinux.com>
> ---
>  configure                       |   16 +
>  libavfilter/Makefile            |    1 +
>  libavfilter/allfilters.c        |    1 +
>  libavfilter/vsrc_pipewiregrab.c | 1433 +++++++++++++++++++++++++++++++
>  4 files changed, 1451 insertions(+)
>  create mode 100644 libavfilter/vsrc_pipewiregrab.c
> 
> diff --git a/configure b/configure
> index beb1fa6d3c..028020455e 100755
> --- a/configure
> +++ b/configure
> @@ -304,6 +304,7 @@ External library support:
>    --enable-libxcb-shm      enable X11 grabbing shm communication [autodetect]
>    --enable-libxcb-xfixes   enable X11 grabbing mouse rendering [autodetect]
>    --enable-libxcb-shape    enable X11 grabbing shape rendering [autodetect]
> +  --enable-libpipewire     enable screen grabbing using PipeWire [autodetect]
>    --enable-libxvid         enable Xvid encoding via xvidcore,
>                             native MPEG-4/Xvid encoder exists [no]
>    --enable-libxml2         enable XML parsing using the C library libxml2, needed
> @@ -1845,6 +1846,8 @@ EXTERNAL_AUTODETECT_LIBRARY_LIST="
>      libxcb_shm
>      libxcb_shape
>      libxcb_xfixes
> +    libpipewire
> +    libsystemd
>      lzma
>      mediafoundation
>      metal
> @@ -3895,6 +3898,7 @@ pad_opencl_filter_deps="opencl"
>  pan_filter_deps="swresample"
>  perspective_filter_deps="gpl"
>  phase_filter_deps="gpl"
> +pipewiregrab_filter_deps="libpipewire libsystemd pthreads"
>  pp7_filter_deps="gpl"
>  pp_filter_deps="gpl postproc"
>  prewitt_opencl_filter_deps="opencl"
> @@ -7230,6 +7234,18 @@ if enabled libxcb; then
>      enabled libxcb_xfixes && check_pkg_config libxcb_xfixes xcb-xfixes xcb/xfixes.h xcb_xfixes_get_cursor_image
>  fi
>  
> +# Starting with version 0.3.52, PipeWire's spa library uses the __LOCALE_C_ONLY macro to determine
> +# whether the locale_t type (introduced in POSIX.1-2008) and some related functions are available (see
> +# https://gitlab.freedesktop.org/pipewire/pipewire/-/issues/2390 for more information).
> +# Unfortunately, this macro is specific to uclibc, which can cause build issues on systems that use a
> +# different implementation of libc if POSIX 2008 support isn't enabled (which is the case for FFmpeg currently).
> +# As a workaround for this problem, we add a compilation flag to ensure that __LOCALE_C_ONLY is always defined.
> +add_cppflags -D__LOCALE_C_ONLY
> +enabled libpipewire && check_pkg_config libpipewire "libpipewire-0.3 >= 0.3.40" pipewire/pipewire.h pw_init
> +if enabled libpipewire; then
> +    enabled libsystemd  && check_pkg_config libsystemd "libsystemd >= 246" systemd/sd-bus.h sd_bus_call_method
> +fi
> +
>  check_func_headers "windows.h" CreateDIBSection "$gdigrab_indev_extralibs"
>  
>  # check if building for desktop or uwp
> diff --git a/libavfilter/Makefile b/libavfilter/Makefile
> index 5992fd161f..6352e91586 100644
> --- a/libavfilter/Makefile
> +++ b/libavfilter/Makefile
> @@ -603,6 +603,7 @@ OBJS-$(CONFIG_NULLSRC_FILTER)                += vsrc_testsrc.o
>  OBJS-$(CONFIG_OPENCLSRC_FILTER)              += vf_program_opencl.o opencl.o
>  OBJS-$(CONFIG_PAL75BARS_FILTER)              += vsrc_testsrc.o
>  OBJS-$(CONFIG_PAL100BARS_FILTER)             += vsrc_testsrc.o
> +OBJS-$(CONFIG_PIPEWIREGRAB_FILTER)           += vsrc_pipewiregrab.o
>  OBJS-$(CONFIG_QRENCODE_FILTER)               += qrencode.o textutils.o
>  OBJS-$(CONFIG_QRENCODESRC_FILTER)            += qrencode.o textutils.o
>  OBJS-$(CONFIG_RGBTESTSRC_FILTER)             += vsrc_testsrc.o
> diff --git a/libavfilter/allfilters.c b/libavfilter/allfilters.c
> index c532682fc2..3670a6d7e7 100644
> --- a/libavfilter/allfilters.c
> +++ b/libavfilter/allfilters.c
> @@ -569,6 +569,7 @@ extern const AVFilter ff_vsrc_openclsrc;
>  extern const AVFilter ff_vsrc_qrencodesrc;
>  extern const AVFilter ff_vsrc_pal75bars;
>  extern const AVFilter ff_vsrc_pal100bars;
> +extern const AVFilter ff_vsrc_pipewiregrab;
>  extern const AVFilter ff_vsrc_rgbtestsrc;
>  extern const AVFilter ff_vsrc_sierpinski;
>  extern const AVFilter ff_vsrc_smptebars;
> diff --git a/libavfilter/vsrc_pipewiregrab.c b/libavfilter/vsrc_pipewiregrab.c
> new file mode 100644
> index 0000000000..51073c22b1
> --- /dev/null
> +++ b/libavfilter/vsrc_pipewiregrab.c
> @@ -0,0 +1,1433 @@
> +/*
> + * PipeWire input grabber (ScreenCast)
> + * Copyright (C) 2024 Savoir-faire Linux, Inc.
> + *
> + * This file is part of FFmpeg.
> + *
> + * FFmpeg is free software; you can redistribute it and/or
> + * modify it under the terms of the GNU Lesser General Public
> + * License as published by the Free Software Foundation; either
> + * version 2.1 of the License, or (at your option) any later version.
> + *
> + * FFmpeg is distributed in the hope that it will be useful,
> + * but WITHOUT ANY WARRANTY; without even the implied warranty of
> + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU
> + * Lesser General Public License for more details.
> + *
> + * You should have received a copy of the GNU Lesser General Public
> + * License along with FFmpeg; if not, write to the Free Software
> + * Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
> + */
> +
> +/**
> + * @file
> + * PipeWireGrab video source
> + * @author Firas Ashkar <firas.ashkar at savoirfairelinux.com>
> + * @author Abhishek Ojha <abhishek.ojha at savoirfairelinux.com>
> + * @author François-Simon Fauteux-Chapleau <francois-simon.fauteux-chapleau at savoirfairelinux.com>
> + */
> +
> +#include <drm_fourcc.h>
> +#include <fcntl.h>
> +#include <pipewire/pipewire.h>
> +#include <pipewire/thread-loop.h>
> +#include <pthread.h>
> +#include <spa/debug/types.h>
> +#include <spa/param/video/format-utils.h>
> +#include <spa/param/video/raw.h>
> +#include <spa/param/video/type-info.h>
> +#include <stdatomic.h>
> +#include <stdlib.h>
> +#include <string.h>
> +#include <sys/mman.h>
> +#include <systemd/sd-bus.h>
> +#include <systemd/sd-bus-protocol.h>
> +#include <unistd.h>
> +
> +#include "avfilter.h"
> +#include "formats.h"
> +#include "video.h"
> +
> +#include "libavformat/avformat.h"
> +#include "libavutil/avassert.h"
> +#include "libavutil/avstring.h"
> +#include "libavutil/hwcontext.h"
> +#include "libavutil/hwcontext_drm.h"
> +#include "libavutil/mem.h"
> +#include "libavutil/opt.h"
> +#include "libavutil/time.h"
> +
> +#ifndef __USE_XOPEN2K8
> +#define F_DUPFD_CLOEXEC                                                        \
> +    1030 /* Duplicate file descriptor with close-on-exit set.  */
> +#endif
> +
> +#define DESTINATION "org.freedesktop.portal.Desktop"
> +#define SENDER DESTINATION
> +#define OBJECT_PATH "/org/freedesktop/portal/desktop"
> +#define INTERFACE "org.freedesktop.portal.ScreenCast"
> +#define REQUEST_PATH "/org/freedesktop/portal/desktop/request/%s/%s"
> +
> +#define BYTES_PER_PIXEL 4 /* currently all formats assume 4 bytes per pixel */
> +#define MAX_SPA_PARAM 4 /* max number of params for spa pod */
> +
> +/**
> + * PipeWire capture types
> + */
> +typedef enum {
> +    DESKTOP_CAPTURE = 1,
> +    WINDOW_CAPTURE = 2,
> +} pw_capture_type;
> +
> +/**
> + * XDG Desktop Portal supported cursor modes
> + */
> +enum PortalCursorMode {
> +    PORTAL_CURSOR_MODE_HIDDEN = 1 << 0,
> +    PORTAL_CURSOR_MODE_EMBEDDED = 1 << 1,
> +};
> +
> +typedef struct PipewireGrabContext {
> +    const AVClass *class;
> +
> +    sd_bus *connection;
> +    atomic_int dbus_event_loop_running;
> +    char *sender_name;
> +    char *session_handle;
> +
> +    uint64_t pipewire_node;
> +    int pipewire_fd;
> +
> +    pthread_cond_t pipewire_initialization_cond_var;
> +    pthread_mutex_t pipewire_initialization_mutex;
> +    atomic_int pipewire_initialization_over;
> +    int pw_init_called;
> +    struct pw_thread_loop *thread_loop;
> +    struct pw_context *context;
> +    struct pw_core *core;
> +    struct spa_hook core_listener;
> +    struct pw_stream *stream;
> +    struct spa_hook stream_listener;
> +    struct spa_video_info format;
> +
> +    uint32_t available_cursor_modes;
> +    pw_capture_type capture_type;
> +    int draw_mouse;
> +
> +    uint32_t width, height;
> +    size_t frame_size;
> +    uint8_t Bpp;
> +    enum AVPixelFormat av_pxl_format;
> +
> +    int64_t time_frame;
> +    int64_t frame_duration;
> +    AVRational framerate;
> +    pthread_mutex_t current_frame_mutex;
> +    AVFrame *current_frame;
> +    AVBufferRef *hw_device_ref;
> +    AVBufferRef *hw_frames_ref;
> +    int enable_dmabuf;
> +    const char *device_path;
> +
> +    int portal_error;
> +    int pipewire_error;
> +} PipewireGrabContext;
> +
> +/**
> + * Data for DBus signals callbacks
> + */
> +struct DbusSignalData {
> +    AVFilterContext *ctx;
> +    sd_bus_slot *slot;
> +};
> +
> +#define OFFSET(x) offsetof(PipewireGrabContext, x)
> +#define FLAGS AV_OPT_FLAG_FILTERING_PARAM|AV_OPT_FLAG_VIDEO_PARAM
> +static const AVOption pipewiregrab_options[] = {
> +    { "framerate", "set video frame rate", OFFSET(framerate), AV_OPT_TYPE_VIDEO_RATE, { .str = "ntsc" }, 0, INT_MAX, FLAGS },
> +    { "draw_mouse", "draw the mouse pointer", OFFSET(draw_mouse), AV_OPT_TYPE_BOOL, { .i64 = 1 }, 0, 1, FLAGS },
> +    { "capture_type", "set the capture type (1 for screen, 2 for window)", OFFSET(capture_type), AV_OPT_TYPE_INT, { .i64 = 1 }, 1, 2, FLAGS },
> +    { "fd", "set file descriptor to be used by PipeWire", OFFSET(pipewire_fd), AV_OPT_TYPE_INT, { .i64 = 0 }, 0, INT_MAX, FLAGS },
> +    { "node", "set PipeWire node (required when using the 'fd' option)", OFFSET(pipewire_node), AV_OPT_TYPE_UINT64, { .i64 = 0 }, 0, 0xffffffff, FLAGS },
> +    { "enable_dmabuf", "enable DMA-BUF sharing", OFFSET(enable_dmabuf), AV_OPT_TYPE_BOOL, { .i64 = 1 }, 0, 1, FLAGS },
> +    { "device", "DRM device path", OFFSET(device_path), AV_OPT_TYPE_STRING, { .str = "/dev/dri/card0" }, 0, 0, FLAGS },
> +    { NULL },
> +};
> +
> +AVFILTER_DEFINE_CLASS(pipewiregrab);
> +
> +/**
> + * Helper function to allow portal_init_screencast to stop and return an error
> + * code if a DBus operation/callback fails.

Here and below, several comments begin with "Helper function...", "function is
called by..." etc.  The reader already knows this a function from the context,
so the documentation would be more efficient without these redundant prefixes.

> + *
> + * @param ctx
> + * @param error AVERROR code (negative)
> + * @param message error message
> + */
> +static void portal_abort(AVFilterContext *ctx, int error, const char *message)
> +{
> +    PipewireGrabContext *pw_ctx = ctx->priv;
> +
> +    pw_ctx->portal_error = error;
> +    av_log(ctx, AV_LOG_ERROR, "Aborting: %s\n", message);
> +
> +    atomic_store(&pw_ctx->dbus_event_loop_running, 0);
> +}
> +
> +/**
> + * Callback to handle PipeWire core info events

I'd argue "Callback to" is redundant here in the same way as "Function",
but not quite such a problem.

> + *
> + * @param user_data pointer to AVFilterContext
> + * @param info pw_core_info
> + */
> +static void on_core_info_callback(void *user_data, const struct pw_core_info *info)
> +{
> +    AVFilterContext *ctx = user_data;
> +    av_log(ctx, AV_LOG_DEBUG, "Server version: %s\n", info->version);
> +    av_log(ctx, AV_LOG_INFO, "Library version: %s\n", pw_get_library_version());
> +    av_log(ctx, AV_LOG_DEBUG, "Header version: %s\n", pw_get_headers_version());
> +}
> +
> +/**
> + * Callback to handle PipeWire core done events
> + *
> + * @param user_data pointer to AVFilterContext
> + * @param id PipeWire object id of calling
> + * @param seq PipeWire object sequence
> + */
> +static void on_core_done_callback(void *user_data, uint32_t id, int seq)
> +{
> +    AVFilterContext *ctx = user_data;
> +    PipewireGrabContext *pw_ctx;
> +
> +    if (!ctx || !ctx->priv)
> +        return;
> +    pw_ctx = ctx->priv;
> +
> +    if (id == PW_ID_CORE)
> +        pw_thread_loop_signal(pw_ctx->thread_loop, false);
> +}
> +
> +/**
> + * Callback to handle Pipewire core error events
> + *
> + * @param user_data pointer to AVFilterContext
> + * @param id id of PipeWire proxy object where the error occured
> + * @param seq PipeWire sequence number which produced the error
> + * @param res error number

Nitpick: "error number" might be better as "PipeWire error number" here?
Just in case someone skip-reads past and assumes this is an AVError.

> + * @param message error message
> + */
> +static void on_core_error_callback(void *user_data, uint32_t id, int seq,
> +                                   int res, const char *message)
> +{
> +    AVFilterContext *ctx = user_data;
> +    PipewireGrabContext *pw_ctx;
> +
> +    if (!ctx)
> +        return;
> +
> +    av_log(ctx, AV_LOG_ERROR,
> +           "PipeWire core error: %s (id=%u, seq=%d, res=%d: %s)\n",
> +           message, id, seq, res, strerror(-res));
> +
> +    pw_ctx = ctx->priv;
> +    if (!pw_ctx)
> +        return;
> +
> +    pw_thread_loop_signal(pw_ctx->thread_loop, false);
> +    pw_ctx->pipewire_error = res;
> +    atomic_store(&pw_ctx->pipewire_initialization_over, 1);
> +    pthread_cond_signal(&pw_ctx->pipewire_initialization_cond_var);
> +}
> +
> +/**
> + * PipeWire core events callbacks
> + */
> +static const struct pw_core_events core_events = {
> +    PW_VERSION_CORE_EVENTS,
> +    .info = on_core_info_callback,
> +    .done = on_core_done_callback,
> +    .error = on_core_error_callback,
> +};
> +
> +/**
> + * Helper function: convert spa video format to AVPixelFormat
> + *
> + * @param video_format spa video format to convert
> + * @return the corresponding AVPixelFormat
> + */
> +static enum AVPixelFormat
> +spa_video_format_to_av_pixel_format(enum spa_video_format video_format)
> +{
> +    switch (video_format) {
> +    case SPA_VIDEO_FORMAT_RGBA:
> +    case SPA_VIDEO_FORMAT_RGBx:
> +        return AV_PIX_FMT_RGBA;
> +
> +    case SPA_VIDEO_FORMAT_BGRA:
> +    case SPA_VIDEO_FORMAT_BGRx:
> +        return AV_PIX_FMT_BGRA;
> +
> +    default:
> +        return AV_PIX_FMT_NONE;
> +    }
> +}
> +
> +static uint32_t spa_video_format_to_drm_format(enum spa_video_format video_format)
> +{
> +    switch (video_format) {
> +    case SPA_VIDEO_FORMAT_RGBA:
> +        return DRM_FORMAT_ABGR8888;
> +    case SPA_VIDEO_FORMAT_RGBx:
> +        return DRM_FORMAT_XBGR8888;
> +    case SPA_VIDEO_FORMAT_BGRA:
> +        return DRM_FORMAT_ARGB8888;
> +    case SPA_VIDEO_FORMAT_BGRx:
> +        return DRM_FORMAT_XRGB8888;
> +    default:
> +        return DRM_FORMAT_INVALID;
> +    }
> +}
> +
> +static const uint32_t pipewiregrab_formats[] = {
> +    SPA_VIDEO_FORMAT_RGBA,
> +    SPA_VIDEO_FORMAT_RGBx,
> +    SPA_VIDEO_FORMAT_BGRx,
> +    SPA_VIDEO_FORMAT_BGRA,
> +};
> +
> +static const uint64_t pipewiregrab_default_modifiers[] = {
> +    DRM_FORMAT_MOD_LINEAR,
> +    DRM_FORMAT_MOD_INVALID,
> +};
> +
> +/**
> + * PipeWire callback of parameters changed events
> + *
> + * @param user_data pointer to AVFilterContext
> + * @param id type of changed param
> + * @param param pointer to changed param structure
> + */
> +static void on_stream_param_changed_callback(void *user_data, uint32_t id,
> +                                             const struct spa_pod *param)
> +{
> +    struct spa_pod_builder pod_builder;
> +    const struct spa_pod *params[MAX_SPA_PARAM];
> +    uint32_t n_params = 0;
> +    uint32_t buffer_types;
> +    uint8_t params_buffer[4096];
> +    int result;
> +    int err;
> +    PipewireGrabContext *pw_ctx;
> +    AVFilterContext *ctx = user_data;
> +    AVHWFramesContext *frames_ctx = NULL;
> +
> +    if (!ctx || !ctx->priv || !param)
> +        return;
> +
> +    if (id != SPA_PARAM_Format) {
> +        av_log(ctx, AV_LOG_WARNING,
> +               "Ignoring non-Format param change\n");
> +        return;
> +    }
> +
> +    pw_ctx = ctx->priv;
> +
> +    result = spa_format_parse(param, &pw_ctx->format.media_type,
> +                              &pw_ctx->format.media_subtype);
> +    if (result < 0) {
> +        av_log(ctx, AV_LOG_ERROR, "Unable to parse media type\n");
> +        pw_ctx->pipewire_error = AVERROR(EINVAL);
> +        goto end;
> +    }
> +
> +    if (pw_ctx->format.media_type != SPA_MEDIA_TYPE_video ||
> +        pw_ctx->format.media_subtype != SPA_MEDIA_SUBTYPE_raw) {
> +        av_log(ctx, AV_LOG_ERROR, "Unexpected media type\n");
> +        pw_ctx->pipewire_error = AVERROR(EINVAL);
> +        goto end;
> +    }
> +
> +    spa_format_video_raw_parse(param, &pw_ctx->format.info.raw);
> +
> +    av_log(ctx, AV_LOG_INFO, "Negotiated format:\n");
> +
> +    av_log(ctx, AV_LOG_INFO, "Format: %d (%s)\n",
> +           pw_ctx->format.info.raw.format,
> +           spa_debug_type_find_name(spa_type_video_format,
> +                                    pw_ctx->format.info.raw.format));
> +    av_log(ctx, AV_LOG_INFO, "Size: %dx%d\n",
> +           pw_ctx->format.info.raw.size.width,
> +           pw_ctx->format.info.raw.size.height);
> +    av_log(ctx, AV_LOG_INFO, "Framerate: %d/%d\n",
> +           pw_ctx->format.info.raw.framerate.num,
> +           pw_ctx->format.info.raw.framerate.denom);
> +
> +    pw_ctx->width = pw_ctx->format.info.raw.size.width;
> +    pw_ctx->height = pw_ctx->format.info.raw.size.height;
> +    pw_ctx->Bpp = BYTES_PER_PIXEL;
> +    pw_ctx->frame_size = pw_ctx->width * pw_ctx->height * pw_ctx->Bpp;
> +    if (pw_ctx->frame_size + AV_INPUT_BUFFER_PADDING_SIZE > INT_MAX) {
> +        av_log(ctx, AV_LOG_ERROR, "Captured area is too large\n");
> +        pw_ctx->pipewire_error = AVERROR(EINVAL);
> +        goto end;
> +    }
> +
> +    pw_ctx->av_pxl_format =
> +        spa_video_format_to_av_pixel_format(pw_ctx->format.info.raw.format);
> +    if (pw_ctx->av_pxl_format == AV_PIX_FMT_NONE) {
> +        av_log(ctx, AV_LOG_ERROR,
> +               "Unsupported buffer format: %d\n", pw_ctx->format.info.raw.format);
> +        pw_ctx->pipewire_error = AVERROR(EINVAL);
> +        goto end;
> +    }
> +
> +    /* Video crop */
> +    pod_builder = SPA_POD_BUILDER_INIT(params_buffer, sizeof(params_buffer));
> +    params[n_params++] = spa_pod_builder_add_object(
> +        &pod_builder, SPA_TYPE_OBJECT_ParamMeta, SPA_PARAM_Meta,
> +        SPA_PARAM_META_type, SPA_POD_Id(SPA_META_VideoCrop),
> +        SPA_PARAM_META_size, SPA_POD_Int(sizeof(struct spa_meta_region)));
> +
> +    /* Buffer options */
> +    buffer_types = (1 << SPA_DATA_MemPtr) | (1 << SPA_DATA_MemFd);
> +    if (spa_pod_find_prop(param, NULL, SPA_FORMAT_VIDEO_modifier)) {
> +        err = av_hwdevice_ctx_create(&pw_ctx->hw_device_ref, AV_HWDEVICE_TYPE_DRM,
> +                                     pw_ctx->device_path, NULL, 0);
> +        if (err < 0)
> +            goto hw_fail;
> +
> +        pw_ctx->hw_frames_ref = av_hwframe_ctx_alloc(pw_ctx->hw_device_ref);
> +        if (!pw_ctx->hw_frames_ref) {
> +            err = AVERROR(ENOMEM);
> +            goto hw_fail;
> +        }
> +        frames_ctx = (AVHWFramesContext*)pw_ctx->hw_frames_ref->data;
> +        frames_ctx->format    = AV_PIX_FMT_DRM_PRIME;
> +        frames_ctx->sw_format = pw_ctx->av_pxl_format;
> +        frames_ctx->width     = pw_ctx->width;
> +        frames_ctx->height    = pw_ctx->height;
> +        err = av_hwframe_ctx_init(pw_ctx->hw_frames_ref);
> +hw_fail:
> +        if (!err) {
> +            buffer_types |= 1 << SPA_DATA_DmaBuf;
> +        } else {
> +            av_log(ctx, AV_LOG_WARNING,
> +                   "Failed to initialize hardware frames context: %s. "
> +                   "Falling back to shared memory\n", av_err2str(err));
> +        }
> +    }
> +
> +    params[n_params++] = spa_pod_builder_add_object(
> +        &pod_builder, SPA_TYPE_OBJECT_ParamBuffers, SPA_PARAM_Buffers,
> +        SPA_PARAM_BUFFERS_dataType,
> +        SPA_POD_Int(buffer_types));
> +
> +    /* Meta header */
> +    params[n_params++] = spa_pod_builder_add_object(
> +        &pod_builder, SPA_TYPE_OBJECT_ParamMeta, SPA_PARAM_Meta,
> +        SPA_PARAM_META_type, SPA_POD_Id(SPA_META_Header),
> +        SPA_PARAM_META_size,
> +        SPA_POD_Int(sizeof(struct spa_meta_header)));
> +
> +    pw_stream_update_params(pw_ctx->stream, params, n_params);
> +
> +end:
> +    // Signal pipewiregrab_init that PipeWire initialization is over (either
> +    // because it was completed successfully or because there was an error, in
> +    // which case pw_ctx->pipewire_error will have been set to a nonzero value).
> +    atomic_store(&pw_ctx->pipewire_initialization_over, 1);
> +    pthread_cond_signal(&pw_ctx->pipewire_initialization_cond_var);
> +}
> +
> +/**
> + * PipeWire callback of state changed events
> + *
> + * @param user_data pointer to AVFilterContext
> + * @param old old PipeWire stream state
> + * @param state current PipeWire stream state
> + * @param error received error information
> + */
> +static void on_stream_state_changed_callback(void *user_data,
> +                                             enum pw_stream_state old,
> +                                             enum pw_stream_state state,
> +                                             const char *error)
> +{
> +    AVFilterContext *ctx = user_data;
> +    if (!ctx)
> +        return;
> +
> +    av_log(ctx, AV_LOG_INFO, "stream state: \"%s\"\n",
> +           pw_stream_state_as_string(state));
> +}
> +
> +/**
> + * Find most recent buffer received in a PipeWire stream
> + *
> + * @param stream stream to get buffer from
> + * @return most recent buffer in the stream
> + */
> +static struct pw_buffer *find_most_recent_buffer_and_recycle_olders(struct pw_stream *stream)
> +{
> +    struct pw_buffer *pw_buf = NULL;
> +    while (1) {
> +        struct pw_buffer *aux = pw_stream_dequeue_buffer(stream);
> +        if (!aux)
> +            break;
> +        if (pw_buf)
> +            pw_stream_queue_buffer(stream, pw_buf);
> +        pw_buf = aux;
> +    }
> +    return pw_buf;
> +}
> +
> +static void free_frame_desc(void *opaque, uint8_t *data)
> +{
> +    AVDRMFrameDescriptor *frame_desc = (AVDRMFrameDescriptor *)data;
> +
> +    for (int i = 0; i < frame_desc->nb_objects; i++)
> +        close(frame_desc->objects[i].fd);
> +    av_free(frame_desc);
> +}
> +
> +static void process_dma_buffer(AVFilterContext *ctx, struct spa_buffer *spa_buf)
> +{
> +    AVFrame *frame = NULL;
> +    AVDRMFrameDescriptor *frame_desc = NULL;
> +    int ret;
> +    int n_planes;
> +    size_t size;
> +    uint32_t offset, pitch;
> +    PipewireGrabContext *pw_ctx = ctx->priv;
> +
> +    n_planes = spa_buf->n_datas;
> +    av_assert0(n_planes <= AV_DRM_MAX_PLANES);
> +
> +    // Create frame descriptor
> +    frame_desc = av_mallocz(sizeof(*frame_desc));
> +    if (!frame_desc) {
> +        av_log(ctx, AV_LOG_ERROR, "Failed to allocate frame descriptor\n");
> +        goto fail;
> +    }
> +    *frame_desc = (AVDRMFrameDescriptor) {
> +        .nb_objects = n_planes,
> +        .nb_layers = 1,
> +        .layers[0] = {
> +            .format = spa_video_format_to_drm_format(pw_ctx->format.info.raw.format),
> +            .nb_planes = n_planes,
> +        },
> +    };
> +    for (int i = 0; i < n_planes; i++) {
> +        offset = spa_buf->datas[i].chunk->offset;
> +        pitch = spa_buf->datas[i].chunk->stride;
> +        size = offset + pitch * pw_ctx->height;
> +
> +        frame_desc->objects[i] = (AVDRMObjectDescriptor) {
> +            .fd              = spa_buf->datas[i].fd,
> +            .size            = size,
> +            .format_modifier = pw_ctx->format.info.raw.modifier,
> +        };
> +        frame_desc->layers[0].planes[i] = (AVDRMPlaneDescriptor) {
> +            .object_index = i,
> +            .offset       = offset,
> +            .pitch        = pitch,
> +        };
> +    }
> +
> +    // Create frame
> +    frame = av_frame_alloc();
> +    if (!frame) {
> +        av_log(ctx, AV_LOG_ERROR, "Failed to allocate frame\n");
> +        goto fail;
> +    }
> +    frame->hw_frames_ctx = av_buffer_ref(pw_ctx->hw_frames_ref);
> +    if (!frame->hw_frames_ctx) {
> +        av_log(ctx, AV_LOG_ERROR, "Failed to create buffer reference\n");
> +        goto fail;
> +    }
> +    frame->buf[0] = av_buffer_create((uint8_t *)frame_desc, sizeof(*frame_desc),
> +                                     free_frame_desc, NULL, 0);
> +    if (!frame->buf[0]) {
> +        av_log(ctx, AV_LOG_ERROR, "Failed to create buffer\n");
> +        goto fail;
> +    }
> +    frame->data[0] = (uint8_t *)frame_desc;
> +    frame->format  = AV_PIX_FMT_DRM_PRIME;
> +    frame->width = pw_ctx->width;
> +    frame->height = pw_ctx->height;
> +
> +    // Update current_frame
> +    pthread_mutex_lock(&pw_ctx->current_frame_mutex);
> +    av_frame_unref(pw_ctx->current_frame);
> +    ret = av_frame_ref(pw_ctx->current_frame, frame);
> +    pthread_mutex_unlock(&pw_ctx->current_frame_mutex);
> +    if (ret < 0) {
> +        av_log(ctx, AV_LOG_ERROR, "Failed to create frame reference\n");
> +        av_frame_free(&frame);
> +    }
> +    return;
> +
> +fail:
> +    av_freep(&frame_desc);
> +    av_frame_free(&frame);
> +}
> +
> +static void process_shm_buffer(AVFilterContext *ctx, struct spa_buffer *spa_buf)
> +{
> +    uint8_t *map = NULL;
> +    void *sdata = NULL;
> +    struct spa_meta_region *region;
> +    int crop_left, crop_right, crop_top, crop_bottom;
> +    PipewireGrabContext *pw_ctx = ctx->priv;
> +
> +    // Get data
> +    if (spa_buf->datas[0].type == SPA_DATA_MemFd ) {
> +        map = mmap(NULL, spa_buf->datas[0].maxsize + spa_buf->datas[0].mapoffset,
> +                   PROT_READ, MAP_PRIVATE, spa_buf->datas[0].fd, 0);
> +        if (map == MAP_FAILED) {
> +            av_log(ctx, AV_LOG_ERROR, "mmap failed: %s\n", strerror(errno));
> +            return;
> +        }
> +        sdata = SPA_PTROFF(map, spa_buf->datas[0].mapoffset, uint8_t);
> +    } else if (spa_buf->datas[0].type == SPA_DATA_MemPtr) {
> +        if (spa_buf->datas[0].data == NULL) {
> +            av_log(ctx, AV_LOG_ERROR, "No data in buffer\n");
> +            return;
> +        }
> +        sdata = spa_buf->datas[0].data;
> +    } else {
> +        av_log(ctx, AV_LOG_ERROR, "Buffer is not valid\n");
> +        return;
> +    }
> +
> +    region = spa_buffer_find_meta_data(spa_buf, SPA_META_VideoCrop, sizeof(*region));
> +    if (region && spa_meta_region_is_valid(region)) {
> +        crop_left = region->region.position.x;
> +        crop_top = region->region.position.y;
> +        crop_right = pw_ctx->width - crop_left - region->region.size.width;
> +        crop_bottom = pw_ctx->height - crop_top - region->region.size.height;
> +    }
> +
> +    // Update current_frame with the new data
> +    pthread_mutex_lock(&pw_ctx->current_frame_mutex);
> +    memcpy(pw_ctx->current_frame->data[0], sdata, spa_buf->datas[0].chunk->size);
> +    pw_ctx->current_frame->crop_top = crop_top;
> +    pw_ctx->current_frame->crop_bottom = crop_bottom;
> +    pw_ctx->current_frame->crop_left = crop_left;
> +    pw_ctx->current_frame->crop_right = crop_right;
> +    pthread_mutex_unlock(&pw_ctx->current_frame_mutex);
> +
> +    // Cleanup
> +    if (spa_buf->datas[0].type == SPA_DATA_MemFd)
> +        munmap(map, spa_buf->datas[0].maxsize + spa_buf->datas[0].mapoffset);
> +}
> +
> +/**
> + * This function is called by PipeWire when a buffer
> + * is ready to be dequeued and processed.
> + *
> + * @param user_data pointer to AVFilterContext
> + */
> +static void on_stream_process_callback(void *user_data)
> +{
> +    struct spa_buffer *spa_buf;
> +    struct pw_buffer *pw_buf = NULL;
> +    struct spa_meta_header *header = NULL;
> +
> +    AVFilterContext *ctx = user_data;
> +    PipewireGrabContext *pw_ctx;
> +    if (!ctx || !ctx->priv)
> +        return;
> +    pw_ctx = ctx->priv;
> +
> +    // We need to wait for pw_ctx->current_frame to have been allocated before
> +    // we can use it to get frames from the PipeWire thread to FFmpeg
> +    pthread_mutex_lock(&pw_ctx->current_frame_mutex);
> +    if (!pw_ctx->current_frame) {
> +        pthread_mutex_unlock(&pw_ctx->current_frame_mutex);
> +        return;
> +    }
> +    pthread_mutex_unlock(&pw_ctx->current_frame_mutex);
> +
> +    pw_buf = find_most_recent_buffer_and_recycle_olders(pw_ctx->stream);
> +    if (!pw_buf) {
> +        av_log(ctx, AV_LOG_ERROR, "Out of buffers\n");
> +        return;
> +    }
> +
> +    spa_buf = pw_buf->buffer;
> +    header = spa_buffer_find_meta_data(spa_buf, SPA_META_Header, sizeof(*header));
> +    if (header && (header->flags & SPA_META_HEADER_FLAG_CORRUPTED)) {
> +        av_log(ctx, AV_LOG_ERROR, "Corrupted PipeWire buffer\n");
> +        goto end;
> +    }
> +
> +    if (spa_buf->datas[0].type == SPA_DATA_DmaBuf)
> +        process_dma_buffer(ctx, spa_buf);
> +    else
> +        process_shm_buffer(ctx, spa_buf);
> +
> +end:
> +    pw_stream_queue_buffer(pw_ctx->stream, pw_buf);
> +}
> +
> +static const struct pw_stream_events stream_events = {
> +    PW_VERSION_STREAM_EVENTS,
> +    .state_changed = on_stream_state_changed_callback,
> +    .param_changed = on_stream_param_changed_callback,
> +    .process = on_stream_process_callback,
> +};
> +
> +static int subscribe_to_signal(AVFilterContext *ctx,
> +                               const char *sender_name,
> +                               const char *request_token,
> +                               sd_bus_message_handler_t callback)
> +{
> +    int ret;
> +    char *request_path;
> +    struct DbusSignalData *dbus_signal_data;
> +    PipewireGrabContext *pw_ctx = ctx->priv;
> +
> +    dbus_signal_data = (struct DbusSignalData *)av_mallocz(sizeof(struct DbusSignalData));
> +    if (!dbus_signal_data)
> +        return AVERROR(ENOMEM);
> +
> +    dbus_signal_data->ctx = ctx;
> +    request_path = av_asprintf(REQUEST_PATH, sender_name, request_token);
> +
> +    ret = sd_bus_match_signal(pw_ctx->connection,
> +                              &dbus_signal_data->slot,
> +                              SENDER,
> +                              request_path,
> +                              "org.freedesktop.portal.Request",
> +                              "Response",
> +                              callback,
> +                              dbus_signal_data);
> +    av_free(request_path);
> +    return (ret < 0) ? ret : 0;
> +}
> +
> +static struct spa_pod *build_format(PipewireGrabContext *pw_ctx,
> +                                    struct spa_pod_builder *builder,
> +                                    uint32_t format,
> +                                    const uint64_t *modifiers,
> +                                    int n_modifiers)
> +{
> +    struct spa_pod_frame format_frame;
> +    struct spa_pod_frame modifier_frame;
> +
> +    spa_pod_builder_push_object(builder, &format_frame,
> +                                SPA_TYPE_OBJECT_Format, SPA_PARAM_EnumFormat);
> +    spa_pod_builder_add(builder, SPA_FORMAT_mediaType,
> +                        SPA_POD_Id(SPA_MEDIA_TYPE_video), 0);
> +    spa_pod_builder_add(builder, SPA_FORMAT_mediaSubtype,
> +                        SPA_POD_Id(SPA_MEDIA_SUBTYPE_raw), 0);
> +    spa_pod_builder_add(builder, SPA_FORMAT_VIDEO_format,
> +                        SPA_POD_Id(format), 0);
> +    spa_pod_builder_add(builder, SPA_FORMAT_VIDEO_size,
> +                        SPA_POD_CHOICE_RANGE_Rectangle(
> +                            &SPA_RECTANGLE(320, 240),
> +                            &SPA_RECTANGLE(1, 1),
> +                            &SPA_RECTANGLE(4096, 4096)
> +                        ), 0);
> +    spa_pod_builder_add(builder, SPA_FORMAT_VIDEO_framerate,
> +                        SPA_POD_CHOICE_RANGE_Fraction(
> +                            &SPA_FRACTION(pw_ctx->framerate.num, pw_ctx->framerate.den),
> +                            &SPA_FRACTION(0, 1),
> +                            &SPA_FRACTION(144, 1)
> +                        ), 0);
> +    if (n_modifiers > 0) {
> +        spa_pod_builder_prop(builder, SPA_FORMAT_VIDEO_modifier,
> +                             SPA_POD_PROP_FLAG_MANDATORY | SPA_POD_PROP_FLAG_DONT_FIXATE);
> +        spa_pod_builder_push_choice(builder, &modifier_frame, SPA_CHOICE_Enum, 0);
> +
> +        // A choice POD consists of a "default" value followed by the list of
> +        // all possible values (https://docs.pipewire.org/page_spa_pod.html)
> +        // This is why we need to add one of the modifiers twice.
> +        spa_pod_builder_long(builder, modifiers[0]);
> +        for (int i = 0; i < n_modifiers; i++)
> +            spa_pod_builder_long(builder, modifiers[i]);
> +
> +        spa_pod_builder_pop(builder, &modifier_frame);
> +    }
> +    return spa_pod_builder_pop(builder, &format_frame);
> +}
> +
> +static int play_pipewire_stream(AVFilterContext *ctx)
> +{
> +    int ret;
> +    uint8_t buffer[4096];
> +    struct spa_pod_builder pod_builder;
> +    const struct spa_pod **params;
> +    uint32_t n_params;
> +
> +    PipewireGrabContext *pw_ctx = ctx->priv;
> +
> +    pw_init(NULL, NULL);
> +    pw_ctx->pw_init_called = 1;
> +
> +    pw_ctx->thread_loop =
> +        pw_thread_loop_new("thread loop", NULL);
> +    if (!pw_ctx->thread_loop) {
> +        av_log(ctx, AV_LOG_ERROR, "pw_thread_loop_new failed\n");
> +        return AVERROR(ENOMEM);
> +    }
> +
> +    pw_ctx->context =
> +        pw_context_new(pw_thread_loop_get_loop(pw_ctx->thread_loop), NULL, 0);
> +    if (!pw_ctx->context) {
> +        av_log(ctx, AV_LOG_ERROR, "pw_context_new failed\n");
> +        ret = AVERROR(ENOMEM);
> +        goto fail;
> +    }
> +
> +    if (pw_thread_loop_start(pw_ctx->thread_loop) < 0) {
> +        av_log(ctx, AV_LOG_ERROR, "pw_thread_loop_start failed\n");
> +        ret = AVERROR(EFAULT);
> +        goto fail;
> +    }
> +
> +    pw_thread_loop_lock(pw_ctx->thread_loop);
> +
> +    // Core
> +    pw_ctx->core =
> +        pw_context_connect_fd(pw_ctx->context,
> +                              fcntl(pw_ctx->pipewire_fd, F_DUPFD_CLOEXEC, 3),
> +                              NULL, 0);
> +    if (!pw_ctx->core) {
> +        ret = AVERROR(errno);
> +        av_log(ctx, AV_LOG_ERROR, "pw_context_connect_fd failed\n");
> +        pw_thread_loop_unlock(pw_ctx->thread_loop);
> +        goto fail;
> +    }
> +
> +    pw_core_add_listener(pw_ctx->core, &pw_ctx->core_listener, &core_events,
> +                         ctx /* user_data */);
> +
> +    // Stream
> +    pw_ctx->stream = pw_stream_new(
> +        pw_ctx->core, "wayland grab",
> +        pw_properties_new(PW_KEY_MEDIA_TYPE, "Video", PW_KEY_MEDIA_CATEGORY,
> +                          "Capture", PW_KEY_MEDIA_ROLE, "Screen", NULL));
> +
> +    if (!pw_ctx->stream) {
> +        av_log(ctx, AV_LOG_ERROR, "pw_stream_new failed\n");
> +        ret = AVERROR(ENOMEM);
> +        pw_thread_loop_unlock(pw_ctx->thread_loop);
> +        goto fail;
> +    }
> +
> +    pw_stream_add_listener(pw_ctx->stream, &pw_ctx->stream_listener,
> +                           &stream_events, ctx /* user_data */);
> +
> +    // Stream parameters
> +    pod_builder = SPA_POD_BUILDER_INIT(buffer, sizeof(buffer));
> +    params = av_mallocz(2 * FF_ARRAY_ELEMS(pipewiregrab_formats) * sizeof(*params));
> +    n_params = 0;
> +
> +    for (int i = 0; i < FF_ARRAY_ELEMS(pipewiregrab_formats); i++) {
> +        if (pw_ctx->enable_dmabuf)
> +            params[n_params++] = build_format(pw_ctx, &pod_builder, pipewiregrab_formats[i],
> +                                              pipewiregrab_default_modifiers,
> +                                              FF_ARRAY_ELEMS(pipewiregrab_default_modifiers));
> +        params[n_params++] = build_format(pw_ctx, &pod_builder, pipewiregrab_formats[i],
> +                                          NULL, 0);
> +    }
> +
> +    ret = pw_stream_connect(
> +        pw_ctx->stream, PW_DIRECTION_INPUT, (uint32_t)pw_ctx->pipewire_node,
> +        PW_STREAM_FLAG_AUTOCONNECT | PW_STREAM_FLAG_MAP_BUFFERS, params, n_params);
> +    if (ret != 0) {
> +        av_log(ctx, AV_LOG_ERROR, "pw_stream_connect failed\n");
> +        pw_thread_loop_unlock(pw_ctx->thread_loop);
> +        goto fail;
> +    }
> +
> +    av_log(ctx, AV_LOG_INFO, "Starting screen capture ...\n");
> +    pw_thread_loop_unlock(pw_ctx->thread_loop);
> +    return 0;
> +
> +fail:
> +    if (pw_ctx->core) {
> +        pw_core_disconnect(pw_ctx->core);
> +        pw_ctx->core = NULL;
> +    }
> +    if (pw_ctx->context) {
> +        pw_context_destroy(pw_ctx->context);
> +        pw_ctx->context = NULL;
> +    }
> +    if (pw_ctx->thread_loop) {
> +        pw_thread_loop_destroy(pw_ctx->thread_loop);
> +        pw_ctx->thread_loop = NULL;
> +    }
> +
> +    return ret;
> +}
> +
> +static void portal_open_pipewire_remote(AVFilterContext *ctx)
> +{
> +    int ret;
> +    int fd;
> +    sd_bus_message *reply = NULL;
> +    sd_bus_error err = SD_BUS_ERROR_NULL;
> +    PipewireGrabContext *pw_ctx = ctx->priv;
> +
> +    const char *method_name = "OpenPipeWireRemote";
> +    ret = sd_bus_call_method(pw_ctx->connection,
> +                             DESTINATION,
> +                             OBJECT_PATH,
> +                             INTERFACE,
> +                             method_name,
> +                             &err,
> +                             &reply,
> +                             "oa{sv}",
> +                             pw_ctx->session_handle,
> +                             0);
> +    if (ret < 0) {
> +        av_log(ctx, AV_LOG_ERROR,
> +               "Call to DBus method '%s' failed: %s\n",
> +               method_name, err.message);
> +        sd_bus_error_free(&err);
> +        portal_abort(ctx, ret, "Failed to open PipeWire remote");
> +        return;
> +    }
> +
> +    ret = sd_bus_message_read(reply, "h", &fd);
> +    if (ret < 0) {
> +        portal_abort(ctx, ret, "Failed to read file descriptor");
> +        return;
> +    } else
> +        av_log(ctx, AV_LOG_DEBUG, "PipeWire fd: %d\n", fd);
> +
> +    pw_ctx->pipewire_fd = fd;
> +    atomic_store(&pw_ctx->dbus_event_loop_running, 0);
> +}
> +
> +static void dbus_signal_data_free(struct DbusSignalData *dbus_signal_data)
> +{
> +    sd_bus_slot_unref(dbus_signal_data->slot);
> +    av_free(dbus_signal_data);
> +}
> +
> +static int on_start_response_received_callback(
> +    sd_bus_message *message, void *user_data, sd_bus_error *err)
> +{
> +    int ret;
> +    uint32_t response;
> +    uint32_t node;
> +    struct DbusSignalData *dbus_signal_data = user_data;
> +    AVFilterContext *ctx = dbus_signal_data->ctx;
> +    PipewireGrabContext *pw_ctx = ctx->priv;
> +
> +    dbus_signal_data_free(dbus_signal_data);
> +
> +    ret = sd_bus_message_read(message, "u", &response);
> +    if (ret < 0) {
> +        portal_abort(ctx, ret, "Failed to read DBus response");
> +        return -1;
> +    }
> +    if (response != 0) {
> +        portal_abort(ctx, AVERROR(EACCES),
> +                     "Failed to start screen cast, denied or cancelled by user");
> +        return -1;
> +    }
> +
> +    sd_bus_message_enter_container(message, SD_BUS_TYPE_ARRAY, "{sv}");
> +    sd_bus_message_enter_container(message, SD_BUS_TYPE_DICT_ENTRY, "sv");
> +    sd_bus_message_skip(message, "s");
> +    sd_bus_message_enter_container(message, SD_BUS_TYPE_VARIANT, "a(ua{sv})");
> +    sd_bus_message_enter_container(message, SD_BUS_TYPE_ARRAY, "(ua{sv})");
> +    sd_bus_message_enter_container(message, SD_BUS_TYPE_STRUCT, "ua{sv}");
> +
> +    ret = sd_bus_message_read(message, "u", &node);
> +    if (ret < 0) {
> +        portal_abort(ctx, ret, "Failed to read PipeWire node: %s");
> +        return -1;
> +    }
> +    pw_ctx->pipewire_node = node;
> +
> +    av_log(ctx, AV_LOG_DEBUG, "PipeWire node: %"PRIu64"\n", pw_ctx->pipewire_node);
> +    av_log(ctx, AV_LOG_INFO, "Monitor selected, setting up screen cast\n\n");
> +
> +    portal_open_pipewire_remote(ctx);
> +    return 0;
> +}
> +
> +static void portal_start(AVFilterContext *ctx)
> +{
> +    int ret;
> +    sd_bus_error err = SD_BUS_ERROR_NULL;
> +    PipewireGrabContext *pw_ctx = ctx->priv;
> +
> +    const char *method_name = "Start";
> +    const char *request_token = "pipewiregrabStart";
> +
> +    ret = subscribe_to_signal(ctx, pw_ctx->sender_name, request_token,
> +                              on_start_response_received_callback);
> +    if (ret < 0) {
> +        portal_abort(ctx, ret, "Failed to subscribe to DBus signal");
> +        return;
> +    }
> +
> +    av_log(ctx, AV_LOG_INFO, "Asking for monitor…\n");
> +    ret = sd_bus_call_method(pw_ctx->connection,
> +                             DESTINATION,
> +                             OBJECT_PATH,
> +                             INTERFACE,
> +                             method_name,
> +                             &err,
> +                             NULL,
> +                             "osa{sv}",
> +                             pw_ctx->session_handle,
> +                             "",
> +                             1,
> +                             "handle_token", "s", request_token);
> +    if (ret < 0) {
> +        av_log(ctx, AV_LOG_ERROR,
> +               "Call to DBus method '%s' failed: %s\n",
> +               method_name, err.message);
> +        sd_bus_error_free(&err);
> +        portal_abort(ctx, ret, "Failed to start screen cast session");
> +    }
> +}
> +
> +static int on_select_sources_response_received_callback(
> +    sd_bus_message *message, void *user_data, sd_bus_error *err)
> +{
> +    int ret;
> +    uint32_t response;
> +    struct DbusSignalData *dbus_signal_data = user_data;
> +    AVFilterContext *ctx = dbus_signal_data->ctx;
> +
> +    dbus_signal_data_free(dbus_signal_data);
> +
> +    ret = sd_bus_message_read(message, "u", &response);
> +    if (ret < 0) {
> +        portal_abort(ctx, ret, "Failed to read DBus response");
> +        return -1;
> +    }
> +    if (response != 0) {
> +        portal_abort(ctx, AVERROR(EACCES),
> +                     "Failed to select screen cast sources");
> +        return -1;
> +    }
> +
> +    portal_start(ctx);
> +    return 0;
> +}
> +
> +static void portal_select_sources(AVFilterContext *ctx)
> +{
> +    int ret;
> +    uint32_t cursor_mode;
> +    sd_bus_error err = SD_BUS_ERROR_NULL;
> +    PipewireGrabContext *pw_ctx = ctx->priv;
> +
> +    const char *method_name = "SelectSources";
> +    const char *request_token = "pipewiregrabSelectSources";
> +
> +    ret = subscribe_to_signal(ctx, pw_ctx->sender_name, request_token,
> +                              on_select_sources_response_received_callback);
> +    if (ret < 0) {
> +        portal_abort(ctx, ret, "Failed to subscribe to DBus signal");
> +        return;
> +    }
> +
> +    if ((pw_ctx->available_cursor_modes & PORTAL_CURSOR_MODE_EMBEDDED)
> +             && pw_ctx->draw_mouse)
> +        cursor_mode = PORTAL_CURSOR_MODE_EMBEDDED;
> +    else
> +        cursor_mode = PORTAL_CURSOR_MODE_HIDDEN;
> +
> +    ret = sd_bus_call_method(pw_ctx->connection,
> +                             DESTINATION,
> +                             OBJECT_PATH,
> +                             INTERFACE,
> +                             method_name,
> +                             &err,
> +                             NULL,
> +                             "oa{sv}",
> +                             pw_ctx->session_handle,
> +                             4,
> +                             "types", "u", pw_ctx->capture_type,
> +                             "multiple", "b", 0,
> +                             "handle_token", "s", request_token,
> +                             "cursor_mode", "u", cursor_mode);
> +    if (ret < 0) {
> +        av_log(ctx, AV_LOG_ERROR,
> +               "Call to DBus method '%s' failed: %s\n",
> +               method_name, err.message);
> +        sd_bus_error_free(&err);
> +        portal_abort(ctx, ret, "Failed to select sources for screen cast session");
> +    }
> +}
> +
> +static int on_create_session_response_received_callback(
> +    sd_bus_message *message, void *user_data, sd_bus_error *err)
> +{
> +    int ret;
> +    uint32_t response;
> +    const char *session_handle;
> +    const char *type;
> +    struct DbusSignalData *dbus_signal_data = user_data;
> +    AVFilterContext *ctx = dbus_signal_data->ctx;
> +    PipewireGrabContext *pw_ctx = ctx->priv;
> +
> +    dbus_signal_data_free(dbus_signal_data);
> +
> +    ret = sd_bus_message_read(message, "u", &response);
> +    if (ret < 0) {
> +        portal_abort(ctx, ret, "Failed to read DBus response");
> +        return -1;
> +    }
> +    if (response != 0) {
> +        portal_abort(ctx, AVERROR(EACCES),
> +                     "Failed to create screen cast session");
> +        return -1;
> +    }
> +
> +    sd_bus_message_enter_container(message, SD_BUS_TYPE_ARRAY, "{sv}");
> +    sd_bus_message_enter_container(message, SD_BUS_TYPE_DICT_ENTRY, "sv");
> +    sd_bus_message_skip(message, "s");
> +    // The XDG Desktop Portal documentation says that the type of `session_handle`
> +    // is "o" (object path), but at least on some systems it's actually "s" (string),
> +    // so we need to check to make sure we're using the right one.
> +    sd_bus_message_peek_type(message, NULL, &type);
> +    ret = sd_bus_message_read(message, "v", type, &session_handle);
> +    if (ret < 0) {
> +        portal_abort(ctx, ret, "Failed to read session handle");
> +        return -1;
> +    }
> +    pw_ctx->session_handle = av_strdup(session_handle);
> +
> +    portal_select_sources(ctx);
> +    return 0;
> +}
> +
> +/**
> + * Function to create a screen cast session
> + *
> + * @param ctx
> + */
> +static void portal_create_session(AVFilterContext *ctx)
> +{
> +    int ret;
> +    sd_bus_error err = SD_BUS_ERROR_NULL;
> +    PipewireGrabContext *pw_ctx = ctx->priv;
> +
> +    const char *method_name = "CreateSession";
> +    const char *request_token = "pipewiregrabCreateSession";
> +
> +    ret = subscribe_to_signal(ctx, pw_ctx->sender_name, request_token,
> +                              on_create_session_response_received_callback);
> +    if (ret < 0) {
> +        portal_abort(ctx, ret, "Failed to subscribe to DBus signal");
> +        return;
> +    }
> +
> +    ret = sd_bus_call_method(pw_ctx->connection,
> +                             DESTINATION,
> +                             OBJECT_PATH,
> +                             INTERFACE,
> +                             method_name,
> +                             &err,
> +                             NULL,
> +                             "a{sv}",
> +                             2,
> +                             "handle_token", "s", request_token,
> +                             "session_handle_token", "s", "pipewiregrab");
> +    if (ret < 0) {
> +        av_log(ctx, AV_LOG_ERROR,
> +               "Call to DBus method '%s' failed: %s\n",
> +               method_name, err.message);
> +        sd_bus_error_free(&err);
> +        portal_abort(ctx, ret, "Failed to create screen cast session");
> +    }
> +}
> +
> +/**
> + * Helper function: get available cursor modes and update the
> + *                  PipewireGrabContext accordingly
> + *
> + * @param ctx
> + */
> +static int portal_update_available_cursor_modes(AVFilterContext *ctx)
> +{
> +    int ret;
> +    sd_bus_error err = SD_BUS_ERROR_NULL;
> +    PipewireGrabContext *pw_ctx = ctx->priv;
> +
> +    ret = sd_bus_get_property_trivial(pw_ctx->connection,
> +                                      DESTINATION,
> +                                      OBJECT_PATH,
> +                                      INTERFACE,
> +                                      "AvailableCursorModes",
> +                                      &err,
> +                                      'u',
> +                                      &pw_ctx->available_cursor_modes);
> +    if (ret < 0)
> +        av_log(ctx, AV_LOG_ERROR,
> +               "Couldn't retrieve available cursor modes: %s\n", err.message);
> +
> +    sd_bus_error_free(&err);
> +    return ret;
> +}
> +
> +static int create_dbus_connection(AVFilterContext *ctx)
> +{
> +    const char *aux;
> +    int ret;
> +    PipewireGrabContext *pw_ctx = ctx->priv;
> +
> +    ret = sd_bus_open_user(&pw_ctx->connection);
> +    if (ret < 0) {
> +        av_log(ctx, AV_LOG_ERROR,
> +               "Failed to create DBus connection: %s\n", strerror(-ret));
> +        return ret;
> +    }
> +
> +    ret = sd_bus_get_unique_name(pw_ctx->connection, &aux);
> +    if (ret < 0) {
> +        av_log(ctx, AV_LOG_ERROR,
> +               "Failed to get bus name: %s\n", strerror(-ret));
> +        return ret;
> +    }
> +    // From https://flatpak.github.io/xdg-desktop-portal/docs/doc-org.freedesktop.portal.Request.html:
> +    // "SENDER is the caller's unique name, with the initial ':' removed and all '.' replaced by '_'"
> +    pw_ctx->sender_name = av_strireplace(aux + 1, ".", "_");
> +    av_log(ctx, AV_LOG_DEBUG,
> +           "DBus connection created (sender name: %s)\n", pw_ctx->sender_name);
> +    return 0;
> +}
> +
> +
> +/**
> + * Use XDG Desktop Portal's ScreenCast interface to open a file descriptor that
> + * can be used by PipeWire to access the screen cast streams.
> + * (https://flatpak.github.io/xdg-desktop-portal/docs/doc-org.freedesktop.portal.ScreenCast.html)
> + *
> + * @param ctx
> + */
> +static int portal_init_screencast(AVFilterContext *ctx)
> +{
> +    int ret;
> +    PipewireGrabContext *pw_ctx = ctx->priv;
> +
> +    ret = create_dbus_connection(ctx);
> +    if (ret < 0)
> +        return ret;
> +
> +    ret = portal_update_available_cursor_modes(ctx);
> +    if (ret < 0)
> +        return ret;
> +
> +    portal_create_session(ctx);
> +    if (pw_ctx->portal_error)
> +        return pw_ctx->portal_error;
> +
> +    // The event loop will run until it's stopped by portal_open_pipewire_remote (if
> +    // all DBus method calls completed successfully) or portal_abort (in case of error).
> +    // In the latter case, pw_ctx->portal_error gets set to a negative value.
> +    atomic_store(&pw_ctx->dbus_event_loop_running, 1);
> +    while(atomic_load(&pw_ctx->dbus_event_loop_running)) {
> +        ret = sd_bus_process(pw_ctx->connection, NULL);
> +        if (ret < 0) {
> +            av_log(ctx, AV_LOG_ERROR,
> +                   "Failed to process DBus event: %s\n", strerror(-ret));
> +            return ret;
> +        }
> +
> +        ret = sd_bus_wait(pw_ctx->connection, 2000);
> +        if (ret < 0) {
> +            av_log(ctx, AV_LOG_ERROR,
> +                   "Error while waiting on bus: %s\n", strerror(-ret));
> +            return ret;
> +        }
> +    }
> +    return pw_ctx->portal_error;
> +}
> +
> +static av_cold int pipewiregrab_init(AVFilterContext *ctx)
> +{
> +    int ret;
> +    PipewireGrabContext *pw_ctx = ctx->priv;
> +    if (!pw_ctx) {
> +        av_log(ctx, AV_LOG_ERROR,
> +               "Invalid private context data\n");
> +        return AVERROR(EINVAL);
> +    }
> +
> +    atomic_init(&pw_ctx->dbus_event_loop_running, 0);
> +    atomic_init(&pw_ctx->pipewire_initialization_over, 0);
> +    pthread_cond_init(&pw_ctx->pipewire_initialization_cond_var, NULL);
> +    pthread_mutex_init(&pw_ctx->pipewire_initialization_mutex, NULL);
> +    pthread_mutex_init(&pw_ctx->current_frame_mutex, NULL);
> +
> +    if (pw_ctx->pipewire_fd == 0) {
> +        ret = portal_init_screencast(ctx);
> +        if (ret != 0) {
> +            av_log(ctx, AV_LOG_ERROR, "Couldn't init screen cast\n");
> +            return ret;
> +        }
> +    }
> +
> +    ret = play_pipewire_stream(ctx);
> +    if (ret != 0)
> +        return ret;
> +
> +    // Wait until PipeWire initialization is over
> +    pthread_mutex_lock(&pw_ctx->pipewire_initialization_mutex);
> +    while (!atomic_load(&pw_ctx->pipewire_initialization_over)) {
> +        pthread_cond_wait(&pw_ctx->pipewire_initialization_cond_var,
> +                          &pw_ctx->pipewire_initialization_mutex);
> +    }
> +    pthread_mutex_unlock(&pw_ctx->pipewire_initialization_mutex);
> +
> +    return pw_ctx->pipewire_error;
> +}
> +
> +static void pipewiregrab_uninit(AVFilterContext *ctx)
> +{
> +    int ret;
> +    PipewireGrabContext *pw_ctx = ctx->priv;
> +    if (!pw_ctx)
> +        return;
> +
> +    // PipeWire cleanup
> +    if (pw_ctx->thread_loop) {
> +        pw_thread_loop_signal(pw_ctx->thread_loop, false);
> +        pw_thread_loop_unlock(pw_ctx->thread_loop);
> +        pw_thread_loop_stop(pw_ctx->thread_loop);
> +    }
> +    if (pw_ctx->stream) {
> +        pw_stream_disconnect(pw_ctx->stream);
> +        pw_stream_destroy(pw_ctx->stream);
> +        pw_ctx->stream = NULL;
> +    }
> +    if (pw_ctx->core){
> +        pw_core_disconnect(pw_ctx->core);
> +        pw_ctx->core = NULL;
> +    }
> +    if (pw_ctx->context) {
> +        pw_context_destroy(pw_ctx->context);
> +        pw_ctx->context = NULL;
> +    }
> +    if (pw_ctx->thread_loop) {
> +        pw_thread_loop_destroy(pw_ctx->thread_loop);
> +        pw_ctx->thread_loop = NULL;
> +    }
> +    if (pw_ctx->pw_init_called) {
> +        pw_deinit();
> +        pw_ctx->pw_init_called = 0;
> +    }
> +    if (pw_ctx->pipewire_fd > 0) {
> +        close(pw_ctx->pipewire_fd);
> +        pw_ctx->pipewire_fd = 0;
> +    }
> +    av_frame_free(&pw_ctx->current_frame);
> +    av_buffer_unref(&pw_ctx->hw_frames_ref);
> +    av_buffer_unref(&pw_ctx->hw_device_ref);
> +
> +    // DBus cleanup
> +    if (pw_ctx->session_handle) {
> +        ret = sd_bus_call_method(pw_ctx->connection,
> +                                 DESTINATION,
> +                                 pw_ctx->session_handle,
> +                                 "org.freedesktop.portal.Session",
> +                                 "Close",
> +                                 NULL, NULL, NULL);
> +        if (ret < 0)
> +            av_log(ctx, AV_LOG_DEBUG,
> +                   "Failed to close portal session: %s\n", strerror(-ret));
> +
> +        av_freep(&pw_ctx->session_handle);
> +    }
> +    sd_bus_flush_close_unref(pw_ctx->connection);
> +    av_freep(&pw_ctx->sender_name);
> +}
> +
> +static int pipewiregrab_config_props(AVFilterLink *outlink)
> +{
> +    AVFrame *frame;
> +    PipewireGrabContext *pw_ctx = outlink->src->priv;
> +
> +    AVRational time_base = av_inv_q(pw_ctx->framerate);
> +    pw_ctx->frame_duration = av_rescale_q(1, time_base, AV_TIME_BASE_Q);
> +    pw_ctx->time_frame = av_gettime_relative();
> +
> +    outlink->w = pw_ctx->width;
> +    outlink->h = pw_ctx->height;
> +    outlink->time_base = AV_TIME_BASE_Q;
> +    outlink->frame_rate = pw_ctx->framerate;
> +
> +    frame = ff_get_video_buffer(outlink, pw_ctx->width, pw_ctx->height);
> +    if (!frame)
> +        return AVERROR(ENOMEM);
> +    pthread_mutex_lock(&pw_ctx->current_frame_mutex);
> +    pw_ctx->current_frame = frame;
> +    pthread_mutex_unlock(&pw_ctx->current_frame_mutex);
> +
> +    return 0;
> +}
> +
> +static int pipewiregrab_request_frame(AVFilterLink *outlink)
> +{
> +    int ret;
> +    int64_t curtime, delay;
> +    PipewireGrabContext *pw_ctx = outlink->src->priv;
> +    AVFrame *frame = av_frame_alloc();
> +    if (!frame)
> +        return AVERROR(ENOMEM);
> +
> +    pw_ctx->time_frame += pw_ctx->frame_duration;
> +    while (1) {
> +        curtime = av_gettime_relative();
> +        delay   = pw_ctx->time_frame - curtime;
> +        if (delay <= 0)
> +            break;
> +        av_usleep(delay);
> +    }
> +
> +    pthread_mutex_lock(&pw_ctx->current_frame_mutex);
> +    ret = av_frame_ref(frame, pw_ctx->current_frame);
> +    pthread_mutex_unlock(&pw_ctx->current_frame_mutex);
> +    if (ret < 0) {
> +        av_frame_free(&frame);
> +        return ret;
> +    }
> +
> +    frame->pts = av_gettime();
> +    frame->duration = pw_ctx->frame_duration;
> +    frame->sample_aspect_ratio = (AVRational) {1, 1};
> +
> +    return ff_filter_frame(outlink, frame);
> +}
> +
> +static int pipewiregrab_query_formats(AVFilterContext *ctx)
> +{
> +    PipewireGrabContext *pw_ctx = ctx->priv;
> +    enum AVPixelFormat pix_fmts[] = {pw_ctx->av_pxl_format, AV_PIX_FMT_NONE};
> +
> +    return ff_set_common_formats_from_list(ctx, pix_fmts);
> +}
> +
> +static const AVFilterPad pipewiregrab_outputs[] = {
> +    {
> +        .name          = "default",
> +        .type          = AVMEDIA_TYPE_VIDEO,
> +        .request_frame = pipewiregrab_request_frame,
> +        .config_props  = pipewiregrab_config_props,
> +    },
> +};
> +
> +const AVFilter ff_vsrc_pipewiregrab= {
> +    .name = "pipewiregrab",
> +    .description = NULL_IF_CONFIG_SMALL("Capture screen or window using PipeWire."),
> +    .priv_size = sizeof(struct PipewireGrabContext),
> +    .priv_class = &pipewiregrab_class,
> +    .init = pipewiregrab_init,
> +    .uninit = pipewiregrab_uninit,
> +    .inputs = NULL,
> +    FILTER_OUTPUTS(pipewiregrab_outputs),
> +    FILTER_QUERY_FUNC(pipewiregrab_query_formats),
> +};
> -- 
> 2.34.1
> 
> _______________________________________________
> ffmpeg-devel mailing list
> ffmpeg-devel@ffmpeg.org
> https://ffmpeg.org/mailman/listinfo/ffmpeg-devel
> 
> To unsubscribe, visit link above, or email
> ffmpeg-devel-request@ffmpeg.org with subject "unsubscribe".
_______________________________________________
ffmpeg-devel mailing list
ffmpeg-devel@ffmpeg.org
https://ffmpeg.org/mailman/listinfo/ffmpeg-devel

To unsubscribe, visit link above, or email
ffmpeg-devel-request@ffmpeg.org with subject "unsubscribe".

^ permalink raw reply	[flat|nested] 13+ messages in thread

* Re: [FFmpeg-devel] [PATCH v2] libavfilter: add PipeWire-based grab
  2024-08-07  3:00       ` Quack Doc
@ 2024-08-07 17:29         ` François-Simon Fauteux-Chapleau
  0 siblings, 0 replies; 13+ messages in thread
From: François-Simon Fauteux-Chapleau @ 2024-08-07 17:29 UTC (permalink / raw)
  To: FFmpeg development discussions and patches

----- On Aug 6, 2024, at 11:00 PM, Quack Doc quackdoctech@gmail.com wrote:
> For a POC I commented out the fd stuff so it wouldn't fail on me and
> replaced pw_context_connect_fd() with pw_context_connect() and it
> seemed to work fine with gamescope, gamescope only produces RGB frames
> and not YUV frames however. I did try my camera and it did not seem to
> work, which is more or less what I had expected.
> 
> perhaps a couple simple if statements would be good enough and adding
> YUV formats?

If all it takes to solve the issue is adding a couple of if statements,
then I'll be more than happy to make the change and include it in the
next version of the patch. Adding a few YUV formats should also be
straightforward.

> also Looking at it, it seems like this only supports 8bit formats?
> Though it didn't make my capture fail despite my monitor running in
> 10bit so perhaps pipewire can handle that itself?

Maybe. I don't know how PipeWire handles pixel formats under the hood,
all I can say is that the patch doesn't explicitly support any 10-bit
format, as you've already noticed.
_______________________________________________
ffmpeg-devel mailing list
ffmpeg-devel@ffmpeg.org
https://ffmpeg.org/mailman/listinfo/ffmpeg-devel

To unsubscribe, visit link above, or email
ffmpeg-devel-request@ffmpeg.org with subject "unsubscribe".

^ permalink raw reply	[flat|nested] 13+ messages in thread

* Re: [FFmpeg-devel] [PATCH v2] libavfilter: add PipeWire-based grab
  2024-08-06 15:34     ` François-Simon Fauteux-Chapleau
  2024-08-06 16:39       ` Quack Doc
@ 2024-08-07  3:00       ` Quack Doc
  2024-08-07 17:29         ` François-Simon Fauteux-Chapleau
  1 sibling, 1 reply; 13+ messages in thread
From: Quack Doc @ 2024-08-07  3:00 UTC (permalink / raw)
  To: FFmpeg development discussions and patches

> Sorry for the ambiguity in my previous reply. The answer is yes: the current
> version of the patch will try to use the XDG portal unless a file descriptor
> is provided. The "node" option was meant to be used together with the "fd"
> option, not as an alternative to it. So when I said that I thought these
> options could be used to bypass the portal, I should have specified I was
> thinking of the case where FFmpeg is used by a program as a library (which
> is what we're doing at my company); we'll need a different solution if we want
> this to work when using the command-line tool. Sorry again for the confusion.

For a POC I commented out the fd stuff so it wouldn't fail on me and
replaced pw_context_connect_fd() with pw_context_connect() and it
seemed to work fine with gamescope, gamescope only produces RGB frames
and not YUV frames however. I did try my camera and it did not seem to
work, which is more or less what I had expected.

perhaps a couple simple if statements would be good enough and adding
YUV formats?

also Looking at it, it seems like this only supports 8bit formats?
Though it didn't make my capture fail despite my monitor running in
10bit so perhaps pipewire can handle that itself?
_______________________________________________
ffmpeg-devel mailing list
ffmpeg-devel@ffmpeg.org
https://ffmpeg.org/mailman/listinfo/ffmpeg-devel

To unsubscribe, visit link above, or email
ffmpeg-devel-request@ffmpeg.org with subject "unsubscribe".

^ permalink raw reply	[flat|nested] 13+ messages in thread

* Re: [FFmpeg-devel] [PATCH v2] libavfilter: add PipeWire-based grab
  2024-08-06 15:34     ` François-Simon Fauteux-Chapleau
@ 2024-08-06 16:39       ` Quack Doc
  2024-08-07  3:00       ` Quack Doc
  1 sibling, 0 replies; 13+ messages in thread
From: Quack Doc @ 2024-08-06 16:39 UTC (permalink / raw)
  To: FFmpeg development discussions and patches

> Sorry for the ambiguity in my previous reply. The answer is yes: the current
> version of the patch will try to use the XDG portal unless a file descriptor
> is provided. The "node" option was meant to be used together with the "fd"
> option, not as an alternative to it. So when I said that I thought these
> options could be used to bypass the portal, I should have specified I was
> thinking of the case where FFmpeg is used by a program as a library (which
> is what we're doing at my company); we'll need a different solution if we want
> this to work when using the command-line tool. Sorry again for the confusion.

Ah I see, It would indeed be nice to have

> Does it work reliably if you don't try to use hardware acceleration?
> (You'll have to set the "enable_dmabuf" option to 0 for that.)

while it does start reliably, and seems to get the proper dimensions,
it doesn't actually connect to the pipewire video source node,
encoding just black frames. below is the ffmpeg output. I am currently
running pipewire-git commit hash 3e5a85b3 as well as stable 1:1.2.2-1

This issue may be a cosmic specific issue however.

[Parsed_pipewiregrab_0 @ 0x647dc471c440] stream state: "connecting"
[Parsed_pipewiregrab_0 @ 0x647dc471c440] Starting screen capture ...
[Parsed_pipewiregrab_0 @ 0x647dc471c440] Library version: 1.3.0
[Parsed_pipewiregrab_0 @ 0x647dc471c440] stream state: "paused"
[Parsed_pipewiregrab_0 @ 0x647dc471c440] Ignoring non-Format param change
    Last message repeated 1 times
[Parsed_pipewiregrab_0 @ 0x647dc471c440] Negotiated format:
[Parsed_pipewiregrab_0 @ 0x647dc471c440] Format: 11 (Spa:Enum:VideoFormat:RGBA)
[Parsed_pipewiregrab_0 @ 0x647dc471c440] Size: 1280x960
[Parsed_pipewiregrab_0 @ 0x647dc471c440] Framerate: 60/1
[Parsed_pipewiregrab_0 @ 0x647dc471c440] stream state: "error"
[Parsed_pipewiregrab_0 @ 0x647dc471c440] PipeWire core error: error
alloc buffers: Invalid argument (id=2, seq=15, res=-32: Broken pipe)


On Tue, Aug 6, 2024 at 11:34 AM François-Simon Fauteux-Chapleau
<francois-simon.fauteux-chapleau@savoirfairelinux.com> wrote:
>
> ----- On Aug 4, 2024, at 4:11 PM, Quack Doc quackdoctech@gmail.com wrote:
> > I see, I tried to pull the patch and test it. How does invocation with
> > node work? I'm a bit confused with the invocation. For testing I tried
> > using "gamescope --headless -- glxgears" to generate a raw pipewire
> > stream. (cameras will automatically create one with pipewire) used
> > "pw-dump | jq '.[] | select(.info.props["node.name"] == "gamescope") |
> > .id'" to get the node id and tried to use it but it still seemed to
> > trigger the portal. If you have a camera installed I use the below
> > command to dump all of the video sources, gamescope and cameras
> > included
> >
> > pw-dump | jq '.[] | select(.info.props["media.class"] ==
> > "Video/Source") | .info.props."node.name" + " | " +
> > .info.props."node.description" + " | " + (.id|tostring)'
> >
> > does the current patch have a hard requirement on file descriptors to
> > not use xdg?
>
> Sorry for the ambiguity in my previous reply. The answer is yes: the current
> version of the patch will try to use the XDG portal unless a file descriptor
> is provided. The "node" option was meant to be used together with the "fd"
> option, not as an alternative to it. So when I said that I thought these
> options could be used to bypass the portal, I should have specified I was
> thinking of the case where FFmpeg is used by a program as a library (which
> is what we're doing at my company); we'll need a different solution if we want
> this to work when using the command-line tool. Sorry again for the confusion.
>
> > I did also test xdg capture on cosmic, it seems to only sporadically
> > work, usually spitting out the below error. I can spam it to keep
> > retrying it until it works
> >
> > [Parsed_hwmap_0 @ 0x79fabc003600] Mapping requires a hardware context
> > (a device, or frames on input).
> > [Parsed_hwmap_0 @ 0x79fabc003600] Failed to configure output pad on
> > Parsed_hwmap_0
> > [vf#0:0 @ 0x55cf4daff480] Error reinitializing filters!
> > [vf#0:0 @ 0x55cf4daff480] Task finished with error code: -22 (Invalid argument)
> > [vf#0:0 @ 0x55cf4daff480] Terminating thread with return code -22
> > (Invalid argument)
> > [vost#0:0/h264_vaapi @ 0x55cf4db38080] Could not open encoder before EOF
> > [vost#0:0/h264_vaapi @ 0x55cf4db38080] Task finished with error code:
> > -22 (Invalid argument)
> > [vost#0:0/h264_vaapi @ 0x55cf4db38080] Terminating thread with return
> > code -22 (Invalid argument)
> > [out#0/mp4 @ 0x55cf4db37800] Nothing was written into output file,
> > because at least one of its streams received no packets.
>
> Does it work reliably if you don't try to use hardware acceleration?
> (You'll have to set the "enable_dmabuf" option to 0 for that.)
>
> _______________________________________________
> ffmpeg-devel mailing list
> ffmpeg-devel@ffmpeg.org
> https://ffmpeg.org/mailman/listinfo/ffmpeg-devel
>
> To unsubscribe, visit link above, or email
> ffmpeg-devel-request@ffmpeg.org with subject "unsubscribe".
_______________________________________________
ffmpeg-devel mailing list
ffmpeg-devel@ffmpeg.org
https://ffmpeg.org/mailman/listinfo/ffmpeg-devel

To unsubscribe, visit link above, or email
ffmpeg-devel-request@ffmpeg.org with subject "unsubscribe".

^ permalink raw reply	[flat|nested] 13+ messages in thread

* Re: [FFmpeg-devel] [PATCH v2] libavfilter: add PipeWire-based grab
  2024-08-04 20:11   ` Quack Doc
@ 2024-08-06 15:34     ` François-Simon Fauteux-Chapleau
  2024-08-06 16:39       ` Quack Doc
  2024-08-07  3:00       ` Quack Doc
  0 siblings, 2 replies; 13+ messages in thread
From: François-Simon Fauteux-Chapleau @ 2024-08-06 15:34 UTC (permalink / raw)
  To: FFmpeg development discussions and patches

----- On Aug 4, 2024, at 4:11 PM, Quack Doc quackdoctech@gmail.com wrote:
> I see, I tried to pull the patch and test it. How does invocation with
> node work? I'm a bit confused with the invocation. For testing I tried
> using "gamescope --headless -- glxgears" to generate a raw pipewire
> stream. (cameras will automatically create one with pipewire) used
> "pw-dump | jq '.[] | select(.info.props["node.name"] == "gamescope") |
> .id'" to get the node id and tried to use it but it still seemed to
> trigger the portal. If you have a camera installed I use the below
> command to dump all of the video sources, gamescope and cameras
> included
> 
> pw-dump | jq '.[] | select(.info.props["media.class"] ==
> "Video/Source") | .info.props."node.name" + " | " +
> .info.props."node.description" + " | " + (.id|tostring)'
> 
> does the current patch have a hard requirement on file descriptors to
> not use xdg?

Sorry for the ambiguity in my previous reply. The answer is yes: the current
version of the patch will try to use the XDG portal unless a file descriptor
is provided. The "node" option was meant to be used together with the "fd"
option, not as an alternative to it. So when I said that I thought these
options could be used to bypass the portal, I should have specified I was
thinking of the case where FFmpeg is used by a program as a library (which
is what we're doing at my company); we'll need a different solution if we want
this to work when using the command-line tool. Sorry again for the confusion.

> I did also test xdg capture on cosmic, it seems to only sporadically
> work, usually spitting out the below error. I can spam it to keep
> retrying it until it works
> 
> [Parsed_hwmap_0 @ 0x79fabc003600] Mapping requires a hardware context
> (a device, or frames on input).
> [Parsed_hwmap_0 @ 0x79fabc003600] Failed to configure output pad on
> Parsed_hwmap_0
> [vf#0:0 @ 0x55cf4daff480] Error reinitializing filters!
> [vf#0:0 @ 0x55cf4daff480] Task finished with error code: -22 (Invalid argument)
> [vf#0:0 @ 0x55cf4daff480] Terminating thread with return code -22
> (Invalid argument)
> [vost#0:0/h264_vaapi @ 0x55cf4db38080] Could not open encoder before EOF
> [vost#0:0/h264_vaapi @ 0x55cf4db38080] Task finished with error code:
> -22 (Invalid argument)
> [vost#0:0/h264_vaapi @ 0x55cf4db38080] Terminating thread with return
> code -22 (Invalid argument)
> [out#0/mp4 @ 0x55cf4db37800] Nothing was written into output file,
> because at least one of its streams received no packets.

Does it work reliably if you don't try to use hardware acceleration?
(You'll have to set the "enable_dmabuf" option to 0 for that.)

_______________________________________________
ffmpeg-devel mailing list
ffmpeg-devel@ffmpeg.org
https://ffmpeg.org/mailman/listinfo/ffmpeg-devel

To unsubscribe, visit link above, or email
ffmpeg-devel-request@ffmpeg.org with subject "unsubscribe".

^ permalink raw reply	[flat|nested] 13+ messages in thread

* Re: [FFmpeg-devel] [PATCH v2] libavfilter: add PipeWire-based grab
  2024-08-02 19:41 ` François-Simon Fauteux-Chapleau
@ 2024-08-04 20:11   ` Quack Doc
  2024-08-06 15:34     ` François-Simon Fauteux-Chapleau
  0 siblings, 1 reply; 13+ messages in thread
From: Quack Doc @ 2024-08-04 20:11 UTC (permalink / raw)
  To: FFmpeg development discussions and patches

> The patch already supports passing a file descriptor and a PipeWire node ID
> directly via the "fd" and "node" options. The portal is only used if these
> values are not provided by the user.
>
> The original motivation for adding these options was to allow communication
> with the portal to be handled outside of FFmpeg, but I imagine they could also
> be used on e.g. Weston to bypass the portal entirely.

I see, I tried to pull the patch and test it. How does invocation with
node work? I'm a bit confused with the invocation. For testing I tried
using "gamescope --headless -- glxgears" to generate a raw pipewire
stream. (cameras will automatically create one with pipewire) used
"pw-dump | jq '.[] | select(.info.props["node.name"] == "gamescope") |
.id'" to get the node id and tried to use it but it still seemed to
trigger the portal. If you have a camera installed I use the below
command to dump all of the video sources, gamescope and cameras
included

pw-dump | jq '.[] | select(.info.props["media.class"] ==
"Video/Source") | .info.props."node.name" + " | " +
.info.props."node.description" + " | " + (.id|tostring)'

does the current patch have a hard requirement on file descriptors to
not use xdg?

I did also test xdg capture on cosmic, it seems to only sporadically
work, usually spitting out the below error. I can spam it to keep
retrying it until it works

[Parsed_hwmap_0 @ 0x79fabc003600] Mapping requires a hardware context
(a device, or frames on input).
[Parsed_hwmap_0 @ 0x79fabc003600] Failed to configure output pad on
Parsed_hwmap_0
[vf#0:0 @ 0x55cf4daff480] Error reinitializing filters!
[vf#0:0 @ 0x55cf4daff480] Task finished with error code: -22 (Invalid argument)
[vf#0:0 @ 0x55cf4daff480] Terminating thread with return code -22
(Invalid argument)
[vost#0:0/h264_vaapi @ 0x55cf4db38080] Could not open encoder before EOF
[vost#0:0/h264_vaapi @ 0x55cf4db38080] Task finished with error code:
-22 (Invalid argument)
[vost#0:0/h264_vaapi @ 0x55cf4db38080] Terminating thread with return
code -22 (Invalid argument)
[out#0/mp4 @ 0x55cf4db37800] Nothing was written into output file,
because at least one of its streams received no packets.


On Fri, Aug 2, 2024 at 3:41 PM François-Simon Fauteux-Chapleau
<francois-simon.fauteux-chapleau@savoirfairelinux.com> wrote:
>
> ----- On Aug 2, 2024, at 12:11 PM, Quack Doc quackdoctech@gmail.com wrote:
> > Pipewire video capture is more generic. Some compositors like weston
> > support pipewire as a backend without portals. Gamescope also creates a
> > pipewire output without need for portals, it would be *really* nice to
> > support gamescope capture with this. Pipewire also gives access to video
> > devices directly as well without needing portals, which allows
> > ergonomically letting multiple apps accsess v4l2 devices for instance like
> > firefox and say discord. So being able to support the file descriptor
> > directly, or using target-object much like the pipewiresrc gstreamer would
> > be greatly appreciated.
>
> The patch already supports passing a file descriptor and a PipeWire node ID
> directly via the "fd" and "node" options. The portal is only used if these
> values are not provided by the user.
>
> The original motivation for adding these options was to allow communication
> with the portal to be handled outside of FFmpeg, but I imagine they could also
> be used on e.g. Weston to bypass the portal entirely.
> _______________________________________________
> ffmpeg-devel mailing list
> ffmpeg-devel@ffmpeg.org
> https://ffmpeg.org/mailman/listinfo/ffmpeg-devel
>
> To unsubscribe, visit link above, or email
> ffmpeg-devel-request@ffmpeg.org with subject "unsubscribe".
_______________________________________________
ffmpeg-devel mailing list
ffmpeg-devel@ffmpeg.org
https://ffmpeg.org/mailman/listinfo/ffmpeg-devel

To unsubscribe, visit link above, or email
ffmpeg-devel-request@ffmpeg.org with subject "unsubscribe".

^ permalink raw reply	[flat|nested] 13+ messages in thread

* Re: [FFmpeg-devel] [PATCH v2] libavfilter: add PipeWire-based grab
  2024-08-02 16:11 Quack Doc
@ 2024-08-02 19:41 ` François-Simon Fauteux-Chapleau
  2024-08-04 20:11   ` Quack Doc
  0 siblings, 1 reply; 13+ messages in thread
From: François-Simon Fauteux-Chapleau @ 2024-08-02 19:41 UTC (permalink / raw)
  To: FFmpeg development discussions and patches

----- On Aug 2, 2024, at 12:11 PM, Quack Doc quackdoctech@gmail.com wrote:
> Pipewire video capture is more generic. Some compositors like weston
> support pipewire as a backend without portals. Gamescope also creates a
> pipewire output without need for portals, it would be *really* nice to
> support gamescope capture with this. Pipewire also gives access to video
> devices directly as well without needing portals, which allows
> ergonomically letting multiple apps accsess v4l2 devices for instance like
> firefox and say discord. So being able to support the file descriptor
> directly, or using target-object much like the pipewiresrc gstreamer would
> be greatly appreciated.

The patch already supports passing a file descriptor and a PipeWire node ID
directly via the "fd" and "node" options. The portal is only used if these
values are not provided by the user.

The original motivation for adding these options was to allow communication
with the portal to be handled outside of FFmpeg, but I imagine they could also
be used on e.g. Weston to bypass the portal entirely.
_______________________________________________
ffmpeg-devel mailing list
ffmpeg-devel@ffmpeg.org
https://ffmpeg.org/mailman/listinfo/ffmpeg-devel

To unsubscribe, visit link above, or email
ffmpeg-devel-request@ffmpeg.org with subject "unsubscribe".

^ permalink raw reply	[flat|nested] 13+ messages in thread

* Re: [FFmpeg-devel] [PATCH v2] libavfilter: add PipeWire-based grab
@ 2024-08-02 16:11 Quack Doc
  2024-08-02 19:41 ` François-Simon Fauteux-Chapleau
  0 siblings, 1 reply; 13+ messages in thread
From: Quack Doc @ 2024-08-02 16:11 UTC (permalink / raw)
  To: ffmpeg-devel

Pipewire video capture is more generic. Some compositors like weston
support pipewire as a backend without portals. Gamescope also creates a
pipewire output without need for portals, it would be *really* nice to
support gamescope capture with this. Pipewire also gives access to video
devices directly as well without needing portals, which allows
ergonomically letting multiple apps accsess v4l2 devices for instance like
firefox and say discord. So being able to support the file descriptor
directly, or using target-object much like the pipewiresrc gstreamer would
be greatly appreciated.

Many XDG portals are not limited to systemd either. However as long as we
can provide a file descriptor at the very minimum a small python script can
be used to get the file descriptor.
_______________________________________________
ffmpeg-devel mailing list
ffmpeg-devel@ffmpeg.org
https://ffmpeg.org/mailman/listinfo/ffmpeg-devel

To unsubscribe, visit link above, or email
ffmpeg-devel-request@ffmpeg.org with subject "unsubscribe".

^ permalink raw reply	[flat|nested] 13+ messages in thread

* Re: [FFmpeg-devel] [PATCH v2] libavfilter: add PipeWire-based grab
  2024-08-01  7:44   ` Anton Khirnov
@ 2024-08-01 11:34     ` Rémi Denis-Courmont
  0 siblings, 0 replies; 13+ messages in thread
From: Rémi Denis-Courmont @ 2024-08-01 11:34 UTC (permalink / raw)
  To: FFmpeg development discussions and patches



Le 1 août 2024 10:44:43 GMT+03:00, Anton Khirnov <anton@khirnov.net> a écrit :
>Quoting François-Simon Fauteux-Chapleau (2024-07-30 21:53:05)
>> ----- On Jul 28, 2024, at 10:53 AM, Quack Doc quackdoctech@gmail.com wrote:
>> > Is it possible to use this without using the portals API and without
>> > systemd? That would be much appreciated if so since the portal is not very
>> > flexible.
>> 
>> I don't like the portals API either and would rather not have to use it, but
>> as far as I can tell there is no alternative. All the other projects that
>> use PipeWire for screen capture that I've looked at (e.g. WebRTC, OBS Studio)
>> use it, and according to the XDG Desktop Portal documentation,
>> "the primary way of capturing screens and windows on Wayland desktops is
>> through the ScreenCast portal, and some desktop environments don’t even expose
>> other means to capture the screen or windows."
>> (https://flatpak.github.io/xdg-desktop-portal/docs/reasons-to-use-portals.html)
>
>That looks a lot like "vendor says its product is the best thing ever".
>Does the pipewire project itself have a position?

Pipewire can't capture the desktop without the cooperation of the Wayland display server. There may be ways to grab the screen, which may involve Pipewire, or some vendor-specific Wayland protocol extension, but that will be dependent on each and every display server.

It should be possible to capture video from sensors though. But we can use V4L2 directly without needing Pipewire for that - unless we are in a container and then we have to use the desktop portal.
_______________________________________________
ffmpeg-devel mailing list
ffmpeg-devel@ffmpeg.org
https://ffmpeg.org/mailman/listinfo/ffmpeg-devel

To unsubscribe, visit link above, or email
ffmpeg-devel-request@ffmpeg.org with subject "unsubscribe".

^ permalink raw reply	[flat|nested] 13+ messages in thread

* Re: [FFmpeg-devel] [PATCH v2] libavfilter: add PipeWire-based grab
  2024-07-30 19:53 ` François-Simon Fauteux-Chapleau
@ 2024-08-01  7:44   ` Anton Khirnov
  2024-08-01 11:34     ` Rémi Denis-Courmont
  0 siblings, 1 reply; 13+ messages in thread
From: Anton Khirnov @ 2024-08-01  7:44 UTC (permalink / raw)
  To: FFmpeg development discussions and patches

Quoting François-Simon Fauteux-Chapleau (2024-07-30 21:53:05)
> ----- On Jul 28, 2024, at 10:53 AM, Quack Doc quackdoctech@gmail.com wrote:
> > Is it possible to use this without using the portals API and without
> > systemd? That would be much appreciated if so since the portal is not very
> > flexible.
> 
> I don't like the portals API either and would rather not have to use it, but
> as far as I can tell there is no alternative. All the other projects that
> use PipeWire for screen capture that I've looked at (e.g. WebRTC, OBS Studio)
> use it, and according to the XDG Desktop Portal documentation,
> "the primary way of capturing screens and windows on Wayland desktops is
> through the ScreenCast portal, and some desktop environments don’t even expose
> other means to capture the screen or windows."
> (https://flatpak.github.io/xdg-desktop-portal/docs/reasons-to-use-portals.html)

That looks a lot like "vendor says its product is the best thing ever".
Does the pipewire project itself have a position?

-- 
Anton Khirnov
_______________________________________________
ffmpeg-devel mailing list
ffmpeg-devel@ffmpeg.org
https://ffmpeg.org/mailman/listinfo/ffmpeg-devel

To unsubscribe, visit link above, or email
ffmpeg-devel-request@ffmpeg.org with subject "unsubscribe".

^ permalink raw reply	[flat|nested] 13+ messages in thread

* Re: [FFmpeg-devel] [PATCH v2] libavfilter: add PipeWire-based grab
  2024-07-28 14:53 Quack Doc
@ 2024-07-30 19:53 ` François-Simon Fauteux-Chapleau
  2024-08-01  7:44   ` Anton Khirnov
  0 siblings, 1 reply; 13+ messages in thread
From: François-Simon Fauteux-Chapleau @ 2024-07-30 19:53 UTC (permalink / raw)
  To: FFmpeg development discussions and patches

----- On Jul 28, 2024, at 10:53 AM, Quack Doc quackdoctech@gmail.com wrote:
> Is it possible to use this without using the portals API and without
> systemd? That would be much appreciated if so since the portal is not very
> flexible.

I don't like the portals API either and would rather not have to use it, but
as far as I can tell there is no alternative. All the other projects that
use PipeWire for screen capture that I've looked at (e.g. WebRTC, OBS Studio)
use it, and according to the XDG Desktop Portal documentation,
"the primary way of capturing screens and windows on Wayland desktops is
through the ScreenCast portal, and some desktop environments don’t even expose
other means to capture the screen or windows."
(https://flatpak.github.io/xdg-desktop-portal/docs/reasons-to-use-portals.html)

> As for systemd it would be great to be able to use this on
> non-systemd platforms.

Using the ScreenCast portal requires an XDG Desktop Portal backend
implementation, see e.g.
https://wiki.archlinux.org/title/XDG_Desktop_Portal
https://flatpak.github.io/xdg-desktop-portal/docs/writing-a-new-backend.html
I don't know if any backend currently supports non-systemd platforms (on my
system at least, the backend is implemented as a systemd service), but
assuming you can find one, then it should be possible to get screen capture
to work.



_______________________________________________
ffmpeg-devel mailing list
ffmpeg-devel@ffmpeg.org
https://ffmpeg.org/mailman/listinfo/ffmpeg-devel

To unsubscribe, visit link above, or email
ffmpeg-devel-request@ffmpeg.org with subject "unsubscribe".

^ permalink raw reply	[flat|nested] 13+ messages in thread

* Re: [FFmpeg-devel] [PATCH v2] libavfilter: add PipeWire-based grab
@ 2024-07-28 14:53 Quack Doc
  2024-07-30 19:53 ` François-Simon Fauteux-Chapleau
  0 siblings, 1 reply; 13+ messages in thread
From: Quack Doc @ 2024-07-28 14:53 UTC (permalink / raw)
  To: ffmpeg-devel

Is it possible to use this without using the portals API and without
systemd? That would be much appreciated if so since the portal is not very
flexible. As for systemd it would be great to be able to use this on
non-systemd platforms.
_______________________________________________
ffmpeg-devel mailing list
ffmpeg-devel@ffmpeg.org
https://ffmpeg.org/mailman/listinfo/ffmpeg-devel

To unsubscribe, visit link above, or email
ffmpeg-devel-request@ffmpeg.org with subject "unsubscribe".

^ permalink raw reply	[flat|nested] 13+ messages in thread

end of thread, other threads:[~2024-08-07 17:30 UTC | newest]

Thread overview: 13+ messages (download: mbox.gz / follow: Atom feed)
-- links below jump to the message on this page --
2024-05-10 21:12 [FFmpeg-devel] [PATCH v2] libavfilter: add PipeWire-based grab François-Simon Fauteux-Chapleau
2024-05-11 14:08 ` Andrew Sayers
2024-07-28 14:53 Quack Doc
2024-07-30 19:53 ` François-Simon Fauteux-Chapleau
2024-08-01  7:44   ` Anton Khirnov
2024-08-01 11:34     ` Rémi Denis-Courmont
2024-08-02 16:11 Quack Doc
2024-08-02 19:41 ` François-Simon Fauteux-Chapleau
2024-08-04 20:11   ` Quack Doc
2024-08-06 15:34     ` François-Simon Fauteux-Chapleau
2024-08-06 16:39       ` Quack Doc
2024-08-07  3:00       ` Quack Doc
2024-08-07 17:29         ` François-Simon Fauteux-Chapleau

Git Inbox Mirror of the ffmpeg-devel mailing list - see https://ffmpeg.org/mailman/listinfo/ffmpeg-devel

This inbox may be cloned and mirrored by anyone:

	git clone --mirror https://master.gitmailbox.com/ffmpegdev/0 ffmpegdev/git/0.git

	# If you have public-inbox 1.1+ installed, you may
	# initialize and index your mirror using the following commands:
	public-inbox-init -V2 ffmpegdev ffmpegdev/ https://master.gitmailbox.com/ffmpegdev \
		ffmpegdev@gitmailbox.com
	public-inbox-index ffmpegdev

Example config snippet for mirrors.


AGPL code for this site: git clone https://public-inbox.org/public-inbox.git