From: 
To: 
Date: Sat, 8 Mar 2025 15:59:01 +0100
Message-ID: <007501db903a$a0ecd4a0$e2c67de0$@gmail.com>
Subject: [FFmpeg-devel] [PATCH FFmpeg 3/15] libavfilter: tokenizer implementation for batch tokenization using the tokenizers-cpp library

Implements batch tokenization support using the tokenizers-cpp library,
providing functions to load a tokenizer and encode batches of text. This is
required by CLIP/CLAP models, which need to tokenize their text prompts.
https://github.com/mlc-ai/tokenizers-cpp

Try the new filters using my GitHub repo
https://github.com/MaximilianKaindl/DeepFFMPEGVideoClassification.
Any feedback is appreciated!

Signed-off-by: MaximilianKaindl 
---
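Usage sketch (notes only, not part of the commit): one way a CLIP/CLAP-style
filter could drive the new helpers. The filter context, tokenizer path and
label array are placeholders, and the token_ids/len fields are assumed to
match the TokenizerEncodeResult definition in tokenizers_c.h.

#if (CONFIG_LIBTOKENIZERS == 1)
/* Hypothetical helper inside a filter: tokenize all label prompts in one batch. */
static int tokenize_labels_sketch(AVFilterContext *ctx, const char *tokenizer_path,
                                  const char **labels, int label_count)
{
    TokenizerEncodeResult *results = NULL;
    int ret;

    /* Loads the tokenizer JSON from tokenizer_path and encodes every label. */
    ret = ff_dnn_create_tokenizer_and_encode_batch(tokenizer_path, labels,
                                                   label_count, &results, ctx);
    if (ret < 0)
        return ret;

    for (int i = 0; i < label_count; i++) {
        /* results[i].token_ids / results[i].len (layout assumed from
         * tokenizers_c.h) would be copied into the model input here. */
        av_log(ctx, AV_LOG_DEBUG, "label %d tokenized to %zu tokens\n",
               i, results[i].len);
    }

    ff_dnn_tokenizer_free_results(results, label_count);
    return 0;
}
#endif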
 libavfilter/dnn_filter_common.c | 155 ++++++++++++++++++++++++++++++++
 libavfilter/dnn_filter_common.h |  19 ++++
 2 files changed, 174 insertions(+)

diff --git a/libavfilter/dnn_filter_common.c b/libavfilter/dnn_filter_common.c
index 6b9c6f8d7f..6a1e9ace2e 100644
--- a/libavfilter/dnn_filter_common.c
+++ b/libavfilter/dnn_filter_common.c
@@ -20,6 +20,11 @@
 #include "libavutil/avstring.h"
 #include "libavutil/mem.h"
 #include "libavutil/opt.h"
+#include "libavformat/avio.h"
+
+#if (CONFIG_LIBTOKENIZERS == 1)
+#include "tokenizers_c.h"
+#endif
 
 #define MAX_SUPPORTED_OUTPUTS_NB 4
 
@@ -217,3 +222,153 @@ void ff_dnn_uninit(DnnContext *ctx)
         av_freep(&ctx->model_outputnames);
     }
 }
+
+static int load_file_content(const char *path, char **data, size_t *data_size,
+                             void *log_ctx) {
+    AVIOContext *avio_ctx = NULL;
+    int ret;
+    int64_t size;
+
+    ret = avio_open(&avio_ctx, path, AVIO_FLAG_READ);
+    if (ret < 0) {
+        if (log_ctx)
+            av_log(log_ctx, AV_LOG_ERROR, "Cannot open file: %s\n", path);
+        return ret;
+    }
+
+    size = avio_size(avio_ctx);
+    if (size < 0) {
+        if (log_ctx)
+            av_log(log_ctx, AV_LOG_ERROR, "Failed to determine file size: %s\n", path);
+        avio_closep(&avio_ctx);
+        return size;
+    }
+
+    *data = av_malloc(size + 1);
+    if (!*data) {
+        avio_closep(&avio_ctx);
+        return AVERROR(ENOMEM);
+    }
+
+    ret = avio_read(avio_ctx, (unsigned char *)*data, size);
+    avio_closep(&avio_ctx);
+
+    if (ret < 0) {
+        if (log_ctx)
+            av_log(log_ctx, AV_LOG_ERROR, "Failed to read file: %s\n", path);
+        av_freep(data);
+        return ret;
+    }
+
+    if (ret != size) {
+        if (log_ctx)
+            av_log(log_ctx, AV_LOG_ERROR, "Incomplete read: %s\n", path);
+        av_freep(data);
+        return AVERROR(EIO);
+    }
+
+    // Null-terminate the data
+    (*data)[size] = '\0';
+    *data_size = size;
+
+    return 0;
+}
+
+#if (CONFIG_LIBTOKENIZERS == 1)
+TokenizerHandle ff_dnn_tokenizer_create(const char *path, void *log_ctx)
+{
+    char *blob = NULL;
+    size_t blob_size = 0;
+    TokenizerHandle handle = NULL;
+    int ret;
+
+    if (!path) {
+        if (log_ctx)
+            av_log(log_ctx, AV_LOG_ERROR, "Tokenizer path is NULL\n");
+        return NULL;
+    }
+
+    ret = load_file_content(path, &blob, &blob_size, log_ctx);
+    if (ret < 0)
+        return NULL;
+
+    handle = tokenizers_new_from_str(blob, blob_size);
+    av_freep(&blob);
+
+    if (!handle && log_ctx)
+        av_log(log_ctx, AV_LOG_ERROR, "Error creating tokenizer\n");
+
+    return handle;
+}
+
+int ff_dnn_tokenizer_encode_batch(TokenizerHandle tokenizer, const char **texts, int text_count,
+                                  TokenizerEncodeResult **results, void *log_ctx)
+{
+    size_t *lengths = NULL;
+    int ret = 0;
+
+    if (!tokenizer) {
+        if (log_ctx)
+            av_log(log_ctx, AV_LOG_ERROR, "Tokenizer is NULL\n");
+        return AVERROR(EINVAL);
+    }
+
+    if (!texts || text_count <= 0 || !results) {
+        if (log_ctx)
+            av_log(log_ctx, AV_LOG_ERROR, "Invalid parameters\n");
+        return AVERROR(EINVAL);
+    }
+
+    *results = av_calloc(text_count, sizeof(**results));
+    if (!*results) {
+        ret = AVERROR(ENOMEM);
+        goto fail;
+    }
+
+    lengths = av_calloc(text_count, sizeof(*lengths));
+    if (!lengths) {
+        ret = AVERROR(ENOMEM);
+        goto fail;
+    }
+
+    // Calculate text lengths
+    for (int i = 0; i < text_count; i++) {
+        lengths[i] = texts[i] ? strlen(texts[i]) : 0;
+    }
+
+    // Tokenize all texts in batch - directly store results in the output array
+    tokenizers_encode_batch(tokenizer, texts, lengths, text_count, 1, *results);
+
+    av_freep(&lengths);
+    return 0;
+
+fail:
+    av_freep(results);
+    av_freep(&lengths);
+    return ret;
+}
+
+int ff_dnn_create_tokenizer_and_encode_batch(const char *path, const char **texts, int text_count,
+                                             TokenizerEncodeResult **results, void *log_ctx)
+{
+    int ret;
+
+    // Create tokenizer
+    TokenizerHandle tokenizer = ff_dnn_tokenizer_create(path, log_ctx);
+    if (!tokenizer) {
+        av_log(log_ctx, AV_LOG_ERROR, "Error creating tokenizer\n");
+        return AVERROR(EINVAL);
+    }
+
+    // Tokenize batch
+    ret = ff_dnn_tokenizer_encode_batch(tokenizer, texts, text_count, results, log_ctx);
+
+    if (ret < 0) {
+        av_log(log_ctx, AV_LOG_ERROR, "Failed to tokenize batch text\n");
+    }
+
+    // Clean up tokenizer
+    ff_dnn_tokenizer_free(tokenizer);
+    return ret;
+}
+#endif
\ No newline at end of file
diff --git a/libavfilter/dnn_filter_common.h b/libavfilter/dnn_filter_common.h
index 42a4719997..fffa676a9e 100644
--- a/libavfilter/dnn_filter_common.h
+++ b/libavfilter/dnn_filter_common.h
@@ -25,6 +25,9 @@
 #define AVFILTER_DNN_FILTER_COMMON_H
 
 #include "dnn_interface.h"
+#if (CONFIG_LIBTOKENIZERS == 1)
+#include "tokenizers_c.h"
+#endif
 
 #define DNN_FILTER_CHILD_CLASS_ITERATE(name, backend_mask)        \
 static const AVClass *name##_child_class_iterate(void **iter)     \
@@ -63,4 +66,20 @@ DNNAsyncStatusType ff_dnn_get_result(DnnContext *ctx, AVFrame **in_frame, AVFram
 int ff_dnn_flush(DnnContext *ctx);
 void ff_dnn_uninit(DnnContext *ctx);
 
+#if (CONFIG_LIBTOKENIZERS == 1)
+TokenizerHandle ff_dnn_tokenizer_create(const char *path, void *log_ctx);
+int ff_dnn_tokenizer_encode_batch(TokenizerHandle tokenizer, const char **texts, int text_count, TokenizerEncodeResult **results, void *log_ctx);
+int ff_dnn_create_tokenizer_and_encode_batch(const char *path, const char **texts, int text_count, TokenizerEncodeResult **results, void *log_ctx);
+
+static inline void ff_dnn_tokenizer_free(TokenizerHandle tokenizer) {
+    if (tokenizer)
+        tokenizers_free(tokenizer);
+}
+static inline void ff_dnn_tokenizer_free_results(TokenizerEncodeResult *results, int count) {
+    if (results) {
+        tokenizers_free_encode_results(results, count);
+    }
+}
+#endif
+
 #endif
-- 
2.34.1

_______________________________________________
ffmpeg-devel mailing list
ffmpeg-devel@ffmpeg.org
https://ffmpeg.org/mailman/listinfo/ffmpeg-devel

To unsubscribe, visit link above, or
email ffmpeg-devel-request@ffmpeg.org with subject "unsubscribe".