From mboxrd@z Thu Jan  1 00:00:00 1970
From: Zhao Zhili
To: ffmpeg-devel@ffmpeg.org
Cc: Zhao Zhili
Date: Tue, 30 Apr 2024 15:12:06 +0800
X-OQ-MSGID: <20240430071208.126817-9-quinkblack@foxmail.com>
In-Reply-To: <20240430071208.126817-1-quinkblack@foxmail.com>
References: <20240430071208.126817-1-quinkblack@foxmail.com>
MIME-Version: 1.0
Subject: [FFmpeg-devel] [PATCH v3 08/10] avfilter/dnn: Remove a level of dereference
List-Id: FFmpeg development discussions and patches
Content-Type: text/plain; charset="us-ascii"
Content-Transfer-Encoding: 7bit

Code such as 'model->model = ov_model' is confusing. We can just drop
the member variable and use a cast to get the subclass.
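The casts introduced by this patch are only valid because every backend
context (OVModel, TFModel, THModel) embeds its DNNModel as the first
struct member. A minimal sketch of the pattern, using hypothetical
Base/Sub names rather than the real FFmpeg types:

```c
#include <assert.h>
#include <stdlib.h>

/* Hypothetical stand-ins for DNNModel and a backend struct such as OVModel. */
typedef struct Base {
    int func_type;
} Base;

typedef struct Sub {
    Base base;   /* must stay the FIRST member for the cast below to be valid */
    int extra;
} Sub;

/* The loader hands out a pointer to the embedded base, just as
 * dnn_load_model_ov returns &ov_model->model. */
Base *sub_create(void)
{
    Sub *sub = calloc(1, sizeof(*sub));
    if (!sub)
        return NULL;
    sub->extra = 42;
    return &sub->base;
}

/* Callbacks downcast with a plain cast, as in (OVModel *)model.
 * C11 6.7.2.1 guarantees that a pointer to a struct, suitably
 * converted, points to its initial member, so the round trip between
 * Base * and Sub * is well defined. */
int sub_get_extra(const Base *model)
{
    const Sub *sub = (const Sub *)model;
    return sub->extra;
}
```

Compared with the old void *model back-pointer, this removes one
dereference and one field that had to be kept in sync at load time.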
---
 libavfilter/dnn/dnn_backend_openvino.c | 17 ++++++++---------
 libavfilter/dnn/dnn_backend_tf.c       | 19 +++++++++----------
 libavfilter/dnn/dnn_backend_torch.cpp  | 15 +++++++--------
 libavfilter/dnn_filter_common.c        |  6 +++---
 libavfilter/dnn_interface.h            |  6 ++----
 5 files changed, 29 insertions(+), 34 deletions(-)

diff --git a/libavfilter/dnn/dnn_backend_openvino.c b/libavfilter/dnn/dnn_backend_openvino.c
index 1acc54b791..d8a6820dc2 100644
--- a/libavfilter/dnn/dnn_backend_openvino.c
+++ b/libavfilter/dnn/dnn_backend_openvino.c
@@ -517,7 +517,7 @@ static void dnn_free_model_ov(DNNModel **model)
     if (!model || !*model)
         return;
 
-    ov_model = (*model)->model;
+    ov_model = (OVModel *)(*model);
     while (ff_safe_queue_size(ov_model->request_queue) != 0) {
         OVRequestItem *item = ff_safe_queue_pop_front(ov_model->request_queue);
         if (item && item->infer_request) {
@@ -1059,9 +1059,9 @@ err:
     return ret;
 }
 
-static int get_input_ov(void *model, DNNData *input, const char *input_name)
+static int get_input_ov(DNNModel *model, DNNData *input, const char *input_name)
 {
-    OVModel *ov_model = model;
+    OVModel *ov_model = (OVModel *)model;
     DnnContext *ctx = ov_model->ctx;
     int input_resizable = ctx->ov_option.input_resizable;
 
@@ -1255,7 +1255,7 @@ static int extract_lltask_from_task(DNNFunctionType func_type, TaskItem *task, Q
     }
 }
 
-static int get_output_ov(void *model, const char *input_name, int input_width, int input_height,
+static int get_output_ov(DNNModel *model, const char *input_name, int input_width, int input_height,
                          const char *output_name, int *output_width, int *output_height)
 {
 #if HAVE_OPENVINO2
@@ -1268,7 +1268,7 @@ static int get_output_ov(void *model, const char *input_name, int input_width, i
     input_shapes_t input_shapes;
 #endif
     int ret;
-    OVModel *ov_model = model;
+    OVModel *ov_model = (OVModel *)model;
     DnnContext *ctx = ov_model->ctx;
     TaskItem task;
     OVRequestItem *request;
@@ -1383,7 +1383,6 @@ static DNNModel *dnn_load_model_ov(DnnContext *ctx, DNNFunctionType func_type, A
         return NULL;
     ov_model->ctx = ctx;
     model = &ov_model->model;
-    model->model = ov_model;
 
 #if HAVE_OPENVINO2
     status = ov_core_create(&core);
@@ -1470,7 +1469,7 @@ err:
 
 static int dnn_execute_model_ov(const DNNModel *model, DNNExecBaseParams *exec_params)
 {
-    OVModel *ov_model = model->model;
+    OVModel *ov_model = (OVModel *)model;
     DnnContext *ctx = ov_model->ctx;
     OVRequestItem *request;
     TaskItem *task;
@@ -1558,13 +1557,13 @@ static int dnn_execute_model_ov(const DNNModel *model, DNNExecBaseParams *exec_p
 
 static DNNAsyncStatusType dnn_get_result_ov(const DNNModel *model, AVFrame **in, AVFrame **out)
 {
-    OVModel *ov_model = model->model;
+    OVModel *ov_model = (OVModel *)model;
     return ff_dnn_get_result_common(ov_model->task_queue, in, out);
 }
 
 static int dnn_flush_ov(const DNNModel *model)
 {
-    OVModel *ov_model = model->model;
+    OVModel *ov_model = (OVModel *)model;
     DnnContext *ctx = ov_model->ctx;
     OVRequestItem *request;
 #if HAVE_OPENVINO2
diff --git a/libavfilter/dnn/dnn_backend_tf.c b/libavfilter/dnn/dnn_backend_tf.c
index c7716e696d..06ea6cbb8c 100644
--- a/libavfilter/dnn/dnn_backend_tf.c
+++ b/libavfilter/dnn/dnn_backend_tf.c
@@ -262,9 +262,9 @@ static TF_Tensor *allocate_input_tensor(const DNNData *input)
                           input_dims[1] * input_dims[2] * input_dims[3] * size);
 }
 
-static int get_input_tf(void *model, DNNData *input, const char *input_name)
+static int get_input_tf(DNNModel *model, DNNData *input, const char *input_name)
 {
-    TFModel *tf_model = model;
+    TFModel *tf_model = (TFModel *)model;
     DnnContext *ctx = tf_model->ctx;
     TF_Status *status;
     TF_DataType dt;
@@ -310,11 +310,11 @@ static int get_input_tf(void *model, DNNData *input, const char *input_name)
     return 0;
 }
 
-static int get_output_tf(void *model, const char *input_name, int input_width, int input_height,
+static int get_output_tf(DNNModel *model, const char *input_name, int input_width, int input_height,
                          const char *output_name, int *output_width, int *output_height)
 {
     int ret;
-    TFModel *tf_model = model;
+    TFModel *tf_model = (TFModel *)model;
     DnnContext *ctx = tf_model->ctx;
     TaskItem task;
     TFRequestItem *request;
@@ -486,7 +486,7 @@ static void dnn_free_model_tf(DNNModel **model)
     if (!model || !*model)
         return;
 
-    tf_model = (*model)->model;
+    tf_model = (TFModel *)(*model);
     while (ff_safe_queue_size(tf_model->request_queue) != 0) {
         TFRequestItem *item = ff_safe_queue_pop_front(tf_model->request_queue);
         destroy_request_item(&item);
@@ -530,7 +530,6 @@ static DNNModel *dnn_load_model_tf(DnnContext *ctx, DNNFunctionType func_type, A
     if (!tf_model)
         return NULL;
     model = &tf_model->model;
-    model->model = tf_model;
     tf_model->ctx = ctx;
 
     if (load_tf_model(tf_model, ctx->model_filename) != 0){
@@ -611,7 +610,7 @@ static int fill_model_input_tf(TFModel *tf_model, TFRequestItem *request) {
     task = lltask->task;
     request->lltask = lltask;
 
-    ret = get_input_tf(tf_model, &input, task->input_name);
+    ret = get_input_tf(&tf_model->model, &input, task->input_name);
     if (ret != 0) {
         goto err;
     }
@@ -803,7 +802,7 @@ err:
 
 static int dnn_execute_model_tf(const DNNModel *model, DNNExecBaseParams *exec_params)
 {
-    TFModel *tf_model = model->model;
+    TFModel *tf_model = (TFModel *)model;
     DnnContext *ctx = tf_model->ctx;
     TaskItem *task;
     TFRequestItem *request;
@@ -851,13 +850,13 @@ static int dnn_execute_model_tf(const DNNModel *model, DNNExecBaseParams *exec_p
 
 static DNNAsyncStatusType dnn_get_result_tf(const DNNModel *model, AVFrame **in, AVFrame **out)
 {
-    TFModel *tf_model = model->model;
+    TFModel *tf_model = (TFModel *)model;
     return ff_dnn_get_result_common(tf_model->task_queue, in, out);
 }
 
 static int dnn_flush_tf(const DNNModel *model)
 {
-    TFModel *tf_model = model->model;
+    TFModel *tf_model = (TFModel *)model;
     DnnContext *ctx = tf_model->ctx;
     TFRequestItem *request;
     int ret;
diff --git a/libavfilter/dnn/dnn_backend_torch.cpp b/libavfilter/dnn/dnn_backend_torch.cpp
index 818ec5b713..24e9f2c8e2 100644
--- a/libavfilter/dnn/dnn_backend_torch.cpp
+++ b/libavfilter/dnn/dnn_backend_torch.cpp
@@ -119,7 +119,7 @@ static void dnn_free_model_th(DNNModel **model)
     if (!model || !*model)
         return;
 
-    th_model = (THModel *) (*model)->model;
+    th_model = (THModel *) (*model);
     while (ff_safe_queue_size(th_model->request_queue) != 0) {
         THRequestItem *item = (THRequestItem *)ff_safe_queue_pop_front(th_model->request_queue);
         destroy_request_item(&item);
@@ -144,7 +144,7 @@ static void dnn_free_model_th(DNNModel **model)
     *model = NULL;
 }
 
-static int get_input_th(void *model, DNNData *input, const char *input_name)
+static int get_input_th(DNNModel *model, DNNData *input, const char *input_name)
 {
     input->dt = DNN_FLOAT;
     input->order = DCO_RGB;
@@ -179,7 +179,7 @@ static int fill_model_input_th(THModel *th_model, THRequestItem *request)
     task = lltask->task;
     infer_request = request->infer_request;
 
-    ret = get_input_th(th_model, &input, NULL);
+    ret = get_input_th(&th_model->model, &input, NULL);
     if ( ret != 0) {
         goto err;
     }
@@ -356,7 +356,7 @@ err:
     return ret;
 }
 
-static int get_output_th(void *model, const char *input_name, int input_width, int input_height,
+static int get_output_th(DNNModel *model, const char *input_name, int input_width, int input_height,
                          const char *output_name, int *output_width, int *output_height)
 {
     int ret = 0;
@@ -421,7 +421,6 @@ static DNNModel *dnn_load_model_th(DnnContext *ctx, DNNFunctionType func_type, A
     if (!th_model)
         return NULL;
     model = &th_model->model;
-    model->model = th_model;
     th_model->ctx = ctx;
 
     c10::Device device = c10::Device(device_name);
@@ -489,7 +488,7 @@ fail:
 
 static int dnn_execute_model_th(const DNNModel *model, DNNExecBaseParams *exec_params)
 {
-    THModel *th_model = (THModel *)model->model;
+    THModel *th_model = (THModel *)model;
     DnnContext *ctx = th_model->ctx;
     TaskItem *task;
     THRequestItem *request;
@@ -538,13 +537,13 @@ static int dnn_execute_model_th(const DNNModel *model, DNNExecBaseParams *exec_p
 
 static DNNAsyncStatusType dnn_get_result_th(const DNNModel *model, AVFrame **in, AVFrame **out)
 {
-    THModel *th_model = (THModel *)model->model;
+    THModel *th_model = (THModel *)model;
     return ff_dnn_get_result_common(th_model->task_queue, in, out);
 }
 
 static int dnn_flush_th(const DNNModel *model)
 {
-    THModel *th_model = (THModel *)model->model;
+    THModel *th_model = (THModel *)model;
     THRequestItem *request;
 
     if (ff_queue_size(th_model->lltask_queue) == 0)
diff --git a/libavfilter/dnn_filter_common.c b/libavfilter/dnn_filter_common.c
index 860ca7591f..6b9c6f8d7f 100644
--- a/libavfilter/dnn_filter_common.c
+++ b/libavfilter/dnn_filter_common.c
@@ -157,15 +157,15 @@ int ff_dnn_set_classify_post_proc(DnnContext *ctx, ClassifyPostProc post_proc)
 
 int ff_dnn_get_input(DnnContext *ctx, DNNData *input)
 {
-    return ctx->model->get_input(ctx->model->model, input, ctx->model_inputname);
+    return ctx->model->get_input(ctx->model, input, ctx->model_inputname);
 }
 
 int ff_dnn_get_output(DnnContext *ctx, int input_width, int input_height, int *output_width, int *output_height)
 {
     char * output_name = ctx->model_outputnames && ctx->backend_type != DNN_TH ? ctx->model_outputnames[0] : NULL;
-    return ctx->model->get_output(ctx->model->model, ctx->model_inputname, input_width, input_height,
-                                    (const char *)output_name, output_width, output_height);
+    return ctx->model->get_output(ctx->model, ctx->model_inputname, input_width, input_height,
+                                  (const char *)output_name, output_width, output_height);
 }
 
 int ff_dnn_execute_model(DnnContext *ctx, AVFrame *in_frame, AVFrame *out_frame)
diff --git a/libavfilter/dnn_interface.h b/libavfilter/dnn_interface.h
index 4e544486cc..1154d50629 100644
--- a/libavfilter/dnn_interface.h
+++ b/libavfilter/dnn_interface.h
@@ -91,17 +91,15 @@ typedef int (*DetectPostProc)(AVFrame *frame, DNNData *output, uint32_t nb, AVFi
 typedef int (*ClassifyPostProc)(AVFrame *frame, DNNData *output, uint32_t bbox_index, AVFilterContext *filter_ctx);
 
 typedef struct DNNModel{
-    // Stores model that can be different for different backends.
-    void *model;
     // Stores FilterContext used for the interaction between AVFrame and DNNData
     AVFilterContext *filter_ctx;
     // Stores function type of the model
     DNNFunctionType func_type;
     // Gets model input information
     // Just reuse struct DNNData here, actually the DNNData.data field is not needed.
-    int (*get_input)(void *model, DNNData *input, const char *input_name);
+    int (*get_input)(struct DNNModel *model, DNNData *input, const char *input_name);
     // Gets model output width/height with given input w/h
-    int (*get_output)(void *model, const char *input_name, int input_width, int input_height,
+    int (*get_output)(struct DNNModel *model, const char *input_name, int input_width, int input_height,
                       const char *output_name, int *output_width, int *output_height);
     // set the pre process to transfer data from AVFrame to DNNData
     // the default implementation within DNN is used if it is not provided by the filter
-- 
2.25.1

_______________________________________________
ffmpeg-devel mailing list
ffmpeg-devel@ffmpeg.org
https://ffmpeg.org/mailman/listinfo/ffmpeg-devel

To unsubscribe, visit link above, or email
ffmpeg-devel-request@ffmpeg.org with subject "unsubscribe".