From: WenzheWang <wongwwz@foxmail.com>
To: ffmpeg-devel <ffmpeg-devel@ffmpeg.org>
Date: Wed, 10 May 2023 10:25:37 +0800
Subject: Re: [FFmpeg-devel] [PATCH v1] libavfi/dnn: add Paddle Inference as one of DNN backend

Dear Madam or Sir,

Hope this email finds you well. I am writing because I recently found that FFmpeg removed the DNN native backend, and I would be really grateful if you could let me know whether there is any new plan for libavfilter/dnn.

I would like to explain again why adding a Paddle DNN backend is worthwhile. At present, FFmpeg only supports the OpenVINO and TensorFlow backends. Among current deep learning frameworks, TensorFlow is the most actively developed: TensorFlow has 174k GitHub stars and PyTorch has 66.5k, while OpenVINO has 4.2k, and relatively few models can be implemented with OpenVINO. Judging by attention on GitHub, there is no doubt that TensorFlow and PyTorch are the more promising frameworks. The Paddle framework has currently reached 20.2k stars on GitHub, which makes it much more widely used and active than frameworks such as MXNet and Caffe. TensorFlow also has a very rich ecosystem.
The TensorFlow models library updates very quickly and already has examples of deep learning applications for image classification, object detection, image-to-text generation, and generative adversarial network models. Support for the TensorFlow backend in the dnn libavfilter module is undoubtedly necessary. But the complexity of the TensorFlow API and of its training workflow is almost prohibitive, making it a love-hate framework.

The PyTorch framework tends to be used for fast academic prototyping, and its performance in industrial deployment is not as good: for example, a PyTorch model running on a server, an Android phone, or an embedded system performs poorly compared with other deep learning frameworks.

PaddlePaddle is an open-source framework from Baidu, which is also used by many people in China. It fits developers' usage habits very well, although the practicality of its API still needs further strengthening. However, Paddle is the only deep learning framework I have used that requires no third-party library configuration and can be used directly after cloning and running make. Besides, Paddle has a small memory footprint and is fast. It also serves a considerable number of projects inside Baidu, so it is very strong in industrial applications, and PaddlePaddle supports multi-machine, multi-card training.

The choice of deep learning framework is a personal one, and the reason most of us chose Paddle is its better support for embedded development and different hardware platforms, and because the community is very active and has contributed industrial improvements and implementations of some advanced models. In particular, for GPUs it supports both CUDA and OpenCL, which means the model can be optimized no matter what kind of graphics card is used. In my opinion, more backend support can only improve the dnn libavfilter module.
If there are any new changes in the dnn libavfilter module, I will be very willing to adapt our implementation to the new plan and provide continuous maintenance.

Best Regards,
Wenzhe Wang
wongwwz@foxmail.com