Full Code of gesanqiu/gstreamer-example for AI

(Preview only; the full extract totals 788K characters and is truncated below.)
Repository: gesanqiu/gstreamer-example
Branch: main
Commit: fd320e812221
Files: 94
Total size: 39.2 MB

Directory structure:
gstreamer-example/

├── .gitignore
├── LICENSE
├── README.md
├── ai_integration/
│   ├── deepstream/
│   │   ├── CMakeLists.txt
│   │   ├── doc/
│   │   │   └── video-pipeline.dot
│   │   ├── inc/
│   │   │   ├── Common.h
│   │   │   ├── DoubleBufferCache.h
│   │   │   ├── Logger.h
│   │   │   └── VideoPipeline.h
│   │   ├── sp_mp4.json
│   │   ├── sp_rtsp.json
│   │   ├── sp_uc.json
│   │   └── src/
│   │       ├── VideoPipeline.cpp
│   │       └── main.cpp
│   ├── test_30fps.h264
│   ├── test_30fps.ts
│   └── video-pipeline.dot
├── application_develop/
│   ├── GstPadProbe/
│   │   ├── CMakeLists.txt
│   │   ├── README.md
│   │   ├── doc/
│   │   │   └── gstpadprobe.md
│   │   ├── inc/
│   │   │   ├── Common.h
│   │   │   ├── DoubleBufferCache.h
│   │   │   └── VideoPipeline.h
│   │   └── src/
│   │       ├── VideoPipeline.cpp
│   │       └── main.cpp
│   ├── README.md
│   ├── app/
│   │   ├── CMakeLists.txt
│   │   ├── README.md
│   │   ├── doc/
│   │   │   ├── app.md
│   │   │   ├── appsink.md
│   │   │   └── appsrc.md
│   │   ├── inc/
│   │   │   ├── Common.h
│   │   │   ├── DoubleBufferCache.h
│   │   │   ├── appsink.h
│   │   │   └── appsrc.h
│   │   └── src/
│   │       ├── appsink.cpp
│   │       ├── appsrc.cpp
│   │       └── main.cpp
│   ├── build_pipeline/
│   │   ├── CMakeLists.txt
│   │   ├── README.md
│   │   ├── doc/
│   │   │   └── build_pipeline.md
│   │   ├── inc/
│   │   │   ├── Common.h
│   │   │   └── VideoPipeline.h
│   │   └── src/
│   │       ├── VideoPipeline.cpp
│   │       ├── gst_element_factory_make.cpp
│   │       └── gst_parse_launch.cpp
│   ├── custom_user_plugin/
│   │   ├── CMakeLists.txt
│   │   ├── README.md
│   │   ├── config.h
│   │   ├── gstoverlay.c
│   │   └── gstoverlay.h
│   └── uridecodebin/
│       ├── CMakeLists.txt
│       ├── README.md
│       ├── doc/
│       │   └── uridecodebin.md
│       ├── inc/
│       │   ├── Common.h
│       │   ├── DoubleBufferCache.h
│       │   └── VideoPipeline.h
│       └── src/
│           ├── VideoPipeline.cpp
│           └── main.cpp
├── basic_theory/
│   ├── README.md
│   ├── app_dev_manual/
│   │   ├── autoplugging.md
│   │   ├── fundamental.md
│   │   ├── interfaces.md
│   │   ├── metadata.md
│   │   └── threads.md
│   ├── basic_tutorial/
│   │   ├── dynamic_pipelines.md
│   │   ├── gstreamer_concepts.md
│   │   ├── hello_world.md
│   │   ├── media_format.md
│   │   ├── multithread.md
│   │   └── short_cutting_pipeline.md
│   └── playback/
│       ├── hardware_decode.md
│       ├── playbin.md
│       ├── playbin_sink.md
│       ├── progressive_stream.md
│       ├── shortcut_pipeline.md
│       └── subtitle.md
├── deepstream/
│   ├── DeepStreamSample.md
│   └── nvdsosd.md
├── postscript.md
├── qti_gst_plugins/
│   └── qtioverlay/
│       ├── README.md
│       ├── qtimlmeta/
│       │   ├── CMakeLists.txt
│       │   ├── ml_meta.c
│       │   └── ml_meta.h
│       ├── qtioverlay/
│       │   ├── CMakeLists.txt
│       │   ├── config.h.in
│       │   ├── gstoverlay.cc
│       │   └── gstoverlay.h
│       └── qtiqmmf_overlay/
│           ├── CMakeLists.txt
│           ├── overlay_blit_kernel.cl
│           ├── qmmf_overlay.cc
│           └── qmmf_overlay_item.h
└── useful_tricks/
    ├── rtspsrc_1.md
    └── uridecodebin_1.md

================================================
FILE CONTENTS
================================================

================================================
FILE: .gitignore
================================================
# Prerequisites
*.d

# Compiled Object files
*.slo
*.lo
*.o
*.obj

# Precompiled Headers
*.gch
*.pch

# Compiled Dynamic libraries
*.so
*.dylib
*.dll

# Fortran module files
*.mod
*.smod

# Compiled Static libraries
*.lai
*.la
*.a
*.lib

# Executables
*.exe
*.out
*.app

# Build Files
build/

# MP4 files
*.mp4

# VS Code config files
.vscode/

================================================
FILE: LICENSE
================================================
MIT License

Copyright (c) 2021 Ricardo Lu

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.


================================================
FILE: README.md
================================================
# GStreamer-example

[![](https://img.shields.io/badge/Author-@RicardoLu-red.svg)](https://github.com/gesanqiu)![](https://img.shields.io/badge/Version-2.0.0-blue.svg)[![](https://img.shields.io/github/stars/gesanqiu/gstreamer-example.svg?style=social&label=Stars)](https://github.com/gesanqiu/gstreamer-example)

[GStreamer](https://gstreamer.freedesktop.org/documentation/index.html?gi-language=c) is a very powerful and versatile framework for building streaming-media applications. Many of GStreamer's strengths come from its modularity: new plugin modules can be merged in seamlessly. But because modularity and power usually come at the cost of complexity, developing new applications is not always easy.

Two things motivated me to start this project:

- There is relatively little GStreamer development material on the web; in practice you can rely on almost nothing but the official English [API Reference](https://gstreamer.freedesktop.org/documentation/libs.html?gi-language=c) and [Tutorials](https://gstreamer.freedesktop.org/documentation/tutorials/index.html?gi-language=c);
- At the moment I am the only maintainer, so this project is mostly a record of my own learning, but everyone is welcome to join.

## Changelog

- 2022.11.06: Updated the threads tutorial.
- 2022.09.12: Updated the `uridecodebin` source-code analysis, part 1.
- 2022.09.10: Updated the `rtspsrc` source-code analysis, part 1.
- 2022.07.17: Updated the pipeline built on DeepStream 6.1, to be used later for integrating the yolov5s.trt model.
- 2022.02.10: Added this changelog, revised the update plan, reorganized the published content, and removed redundant placeholder documents; further updates will come as time allows.
- 2022.02.06: Updated the interfaces tutorial.
- 2022.02.06: Fixed ai_integration pipeline bugs.
- 2022.02.03: Updated the ai_integration pipeline to v2.0, adding USB camera and RTMP streaming support.
- 2022.01.26: Updated the metadata tutorial.
- 2022.01.25: Merged the Tutorial documents into this repository.
- 2021.10.31: Updated the `nvdsosd` plugin tutorial.
- 2021.09.09: Updated the GstPadProbe tutorial.
- 2021.09.04: Added an audio-track processing branch.
- 2021.08.31: Updated the `uridecodebin` plugin tutorial.
- 2021.08.29: Updated the `appsink`/`appsrc` plugin tutorial.
- 2021.08.27: Updated the pipeline-building tutorial.
- 2021.08.26: Updated the `qtioverlay` plugin tutorial.
- 2021.08.24: Initial commit.

## Update Plan

### Basic Theory

This chapter is mainly a translation of the official [GStreamer Tutorials](https://gstreamer.freedesktop.org/documentation/tutorials/index.html?gi-language=c).

### Application Development

Drawing on my own development experience, this chapter covers the fundamental techniques needed to build a video-streaming application with GStreamer.

- The two ways to build a pipeline: `gst_parse_launch()` and `gst_element_factory_make()` (done)
- `uridecodebin` in depth (done)
- `appsink`/`appsrc` (done)
- GstPadProbe (done)
- Writing a custom plugin

### Platform-Specific Plugins

This chapter introduces some custom plugins from the `Qualcomm` and `Nvidia` platforms. Since I currently develop mostly on `Qualcomm`, and `Nvidia` maintains a relatively mature issue tracker and forums, **the** `Nvidia` **content is supplementary only, and its update schedule is undecided**.

- [Qualcomm GStreamer Plugins](https://developer.qualcomm.com/qualcomm-robotics-rb5-kit/software-reference-manual/application-semantics/gstreamer-plugins)
  - qtivdec
  - qtioverlay
- [Nvidia GStreamer Plugin Overview](https://docs.nvidia.com/metropolis/deepstream/dev-guide/text/DS_plugin_Intro.html)
  - nvdsosd

**Note**: My knowledge is limited; if you spot any mistakes, corrections are welcome.

## Contributors

Translation: [@Yinan Fu](https://github.com/fengxueem)

Proofreading: [@thetffs](https://github.com/thetffs)

## Contact

- Read online: https://ricardolu.gitbook.io/gstreamer/
- GitHub: https://github.com/gesanqiu/gstreamer-example
- E-mail: [shenglu1202@163.com](mailto:shenglu1202@163.com)


================================================
FILE: ai_integration/deepstream/CMakeLists.txt
================================================
# Created by Ricardo Lu on 07/15/2022

cmake_minimum_required(VERSION 3.10)

project(ds-yolov5s)

set(CMAKE_CXX_STANDARD 17)

find_package(OpenCV REQUIRED)
find_package(spdlog REQUIRED)

include(FindPkgConfig)
pkg_check_modules(GST    REQUIRED gstreamer-1.0)
pkg_check_modules(GSTAPP REQUIRED gstreamer-app-1.0)
pkg_check_modules(GLIB   REQUIRED glib-2.0)
pkg_check_modules(GFLAGS REQUIRED gflags)
pkg_check_modules(JSONCPP REQUIRED jsoncpp)

set(DeepStream_ROOT "/opt/nvidia/deepstream/deepstream-6.1")
set(DeepStream_INCLUDE_DIRS "${DeepStream_ROOT}/sources/includes")
set(DeepStream_LIBRARY_DIRS "${DeepStream_ROOT}/lib")

message(STATUS "GST:   ${GST_INCLUDE_DIRS},${GST_LIBRARY_DIRS},${GST_LIBRARIES}")
message(STATUS "GSTAPP:${GSTAPP_INCLUDE_DIRS},${GSTAPP_LIBRARY_DIRS},${GSTAPP_LIBRARIES}")
message(STATUS "GLIB:  ${GLIB_INCLUDE_DIRS},${GLIB_LIBRARY_DIRS},${GLIB_LIBRARIES}")
message(STATUS "JSONCPP:${JSONCPP_INCLUDE_DIRS},${JSONCPP_LIBRARY_DIRS},${JSONCPP_LIBRARIES}")
message(STATUS "GFLAGS:${GFLAGS_INCLUDE_DIRS},${GFLAGS_LIBRARY_DIRS},${GFLAGS_LIBRARIES}")
message(STATUS "OpenCV:${OpenCV_INCLUDE_DIRS},${OpenCV_LIBRARY_DIRS},${OpenCV_LIBRARIES}")
message(STATUS "DeepStream: ${DeepStream_INCLUDE_DIRS}, ${DeepStream_LIBRARY_DIRS}")

include_directories(
    ${PROJECT_SOURCE_DIR}/inc
    ${GST_INCLUDE_DIRS}
    ${GSTAPP_INCLUDE_DIRS}
    ${GLIB_INCLUDE_DIRS}
    ${GFLAGS_INCLUDE_DIRS}
    ${JSONCPP_INCLUDE_DIRS}
    ${OpenCV_INCLUDE_DIRS}
    ${spdlog_INCLUDE_DIRS}
    ${DeepStream_INCLUDE_DIRS}
)

link_directories(
    ${GST_LIBRARY_DIRS}
    ${GSTAPP_LIBRARY_DIRS}
    ${GLIB_LIBRARY_DIRS}
    ${GFLAGS_LIBRARY_DIRS}
    ${JSONCPP_LIBRARY_DIRS}
    ${OpenCV_LIBRARY_DIRS}
    ${spdlog_LIBRARY_DIRS}
    ${DeepStream_LIBRARY_DIRS}
)

# Config Logger
if(NOT DEFINED LOG_LEVEL)
    message(STATUS "LOG_LEVEL not defined; defaulting to 'info'")
    set(LOG_LEVEL "info")
endif()
add_definitions(-DLOG_LEVEL="${LOG_LEVEL}")
message(STATUS "log level: ${LOG_LEVEL}")

option(DUMP_LOG "Dump log into a file." OFF)
option(MULTI_LOG "Log to both a file and stdout." OFF)

if(DUMP_LOG OR MULTI_LOG)
    if(NOT DEFINED LOG_PATH)
        message(STATUS "LOG_PATH not defined; using default")
        set(LOG_PATH "./log")
        message(STATUS "log path: ${LOG_PATH}")
    endif()
    if(NOT DEFINED LOG_FILE_PREFIX)
        message(STATUS "LOG_FILE_PREFIX not defined; using default")
        set(LOG_FILE_PREFIX ${PROJECT_NAME})
        message(STATUS "log file prefix: ${LOG_FILE_PREFIX}")
    endif()

    add_definitions(
        -DDUMP_LOG
        -DLOG_PATH="${LOG_PATH}"
        -DLOG_FILE_PREFIX="${LOG_FILE_PREFIX}"
    )
    if(MULTI_LOG)
        message(STATUS "Multi log set.")
        add_definitions(-DMULTI_LOG)
    endif()
endif()
# End Config Logger

add_executable(${PROJECT_NAME}
    src/VideoPipeline.cpp
    src/main.cpp
)

target_link_libraries(${PROJECT_NAME}
    ${GST_LIBRARIES}
    ${GSTAPP_LIBRARIES}
    ${GLIB_LIBRARIES}
    ${GFLAGS_LIBRARIES}
    ${JSONCPP_LIBRARIES}
    ${OpenCV_LIBRARIES}
    nvbufsurface
    nvdsgst_meta
    nvds_meta
    nvds_utils
)

================================================
FILE: ai_integration/deepstream/doc/video-pipeline.dot
================================================
digraph pipeline {
  rankdir=LR;
  fontname="sans";
  fontsize="10";
  labelloc=t;
  nodesep=.1;
  ranksep=.2;
  label="<GstPipeline>\nvideo-pipeline\n[>]";
  node [style="filled,rounded", shape=box, fontsize="9", fontname="sans", margin="0.0,0.0"];
  edge [labelfontsize="6", fontsize="9", fontname="monospace"];
  
  legend [
    pos="0,0!",
    margin="0.05,0.05",
    style="filled",
    label="Legend\lElement-States: [~] void-pending, [0] null, [-] ready, [=] paused, [>] playing\lPad-Activation: [-] none, [>] push, [<] pull\lPad-Flags: [b]locked, [f]lushing, [b]locking, [E]OS; upper-case is set\lPad-Task: [T] has started task, [t] has paused task\l",
  ];
  subgraph cluster_appsink_0x55d4d81b3c80 {
    fontname="Bitstream Vera Sans";
    fontsize="8";
    style="filled,rounded";
    color=black;
    label="GstAppSink\nappsink\n[>]\nparent=(GstPipeline) video-pipeline\nlast-sample=((GstSample*) 0x7fe6c4124260)\neos=FALSE\nemit-signals=TRUE";
    subgraph cluster_appsink_0x55d4d81b3c80_sink {
      label="";
      style="invis";
      appsink_0x55d4d81b3c80_sink_0x55d4d81b4820 [color=black, fillcolor="#aaaaff", label="sink\n[>][bfb]", height="0.2", style="filled,solid"];
    }

    fillcolor="#aaaaff";
  }

  subgraph cluster_capfilter1_0x55d4d77c65b0 {
    fontname="Bitstream Vera Sans";
    fontsize="8";
    style="filled,rounded";
    color=black;
    label="GstCapsFilter\ncapfilter1\n[>]\nparent=(GstPipeline) video-pipeline\ncaps=video/x-raw(memory:NVMM), format=(string)RGBA";
    subgraph cluster_capfilter1_0x55d4d77c65b0_sink {
      label="";
      style="invis";
      capfilter1_0x55d4d77c65b0_sink_0x55d4d81b4380 [color=black, fillcolor="#aaaaff", label="sink\n[>][bfb]", height="0.2", style="filled,solid"];
    }

    subgraph cluster_capfilter1_0x55d4d77c65b0_src {
      label="";
      style="invis";
      capfilter1_0x55d4d77c65b0_src_0x55d4d81b45d0 [color=black, fillcolor="#ffaaaa", label="src\n[>][bfb]", height="0.2", style="filled,solid"];
    }

    capfilter1_0x55d4d77c65b0_sink_0x55d4d81b4380 -> capfilter1_0x55d4d77c65b0_src_0x55d4d81b45d0 [style="invis"];
    fillcolor="#aaffaa";
  }

  capfilter1_0x55d4d77c65b0_src_0x55d4d81b45d0 -> appsink_0x55d4d81b3c80_sink_0x55d4d81b4820 [label="video/x-raw(memory:NVMM)\l               width: 1920\l              height: 1080\l      interlace-mode: progressive\l      multiview-mode: mono\l     multiview-flags: 0:ffffffff:/right-view...\l  pixel-aspect-ratio: 1/1\l           framerate: 20000/333\l              format: RGBA\l        block-linear: false\l"]
  subgraph cluster_videocvt1_0x55d4d81b1df0 {
    fontname="Bitstream Vera Sans";
    fontsize="8";
    style="filled,rounded";
    color=black;
    label="Gstnvvideoconvert\nvideocvt1\n[>]\nparent=(GstPipeline) video-pipeline\nsrc-crop=\"0:0:0:0\"\ndest-crop=\"0:0:0:0\"\nnvbuf-memory-type=nvbuf-mem-cuda-unified";
    subgraph cluster_videocvt1_0x55d4d81b1df0_sink {
      label="";
      style="invis";
      videocvt1_0x55d4d81b1df0_sink_0x55d4d76e9d00 [color=black, fillcolor="#aaaaff", label="sink\n[>][bfb]", height="0.2", style="filled,solid"];
    }

    subgraph cluster_videocvt1_0x55d4d81b1df0_src {
      label="";
      style="invis";
      videocvt1_0x55d4d81b1df0_src_0x55d4d81b4130 [color=black, fillcolor="#ffaaaa", label="src\n[>][bfb]", height="0.2", style="filled,solid"];
    }

    videocvt1_0x55d4d81b1df0_sink_0x55d4d76e9d00 -> videocvt1_0x55d4d81b1df0_src_0x55d4d81b4130 [style="invis"];
    fillcolor="#aaffaa";
  }

  videocvt1_0x55d4d81b1df0_src_0x55d4d81b4130 -> capfilter1_0x55d4d77c65b0_sink_0x55d4d81b4380 [label="video/x-raw(memory:NVMM)\l               width: 1920\l              height: 1080\l      interlace-mode: progressive\l      multiview-mode: mono\l     multiview-flags: 0:ffffffff:/right-view...\l  pixel-aspect-ratio: 1/1\l           framerate: 20000/333\l              format: RGBA\l        block-linear: false\l"]
  subgraph cluster_queue1_0x55d4d76ec390 {
    fontname="Bitstream Vera Sans";
    fontsize="8";
    style="filled,rounded";
    color=black;
    label="GstQueue\nqueue1\n[>]\nparent=(GstPipeline) video-pipeline\ncurrent-level-buffers=4\ncurrent-level-bytes=256\ncurrent-level-time=133200000";
    subgraph cluster_queue1_0x55d4d76ec390_sink {
      label="";
      style="invis";
      queue1_0x55d4d76ec390_sink_0x55d4d76e9860 [color=black, fillcolor="#aaaaff", label="sink\n[>][bfb]", height="0.2", style="filled,solid"];
    }

    subgraph cluster_queue1_0x55d4d76ec390_src {
      label="";
      style="invis";
      queue1_0x55d4d76ec390_src_0x55d4d76e9ab0 [color=black, fillcolor="#ffaaaa", label="src\n[>][bfb][T]", height="0.2", style="filled,solid"];
    }

    queue1_0x55d4d76ec390_sink_0x55d4d76e9860 -> queue1_0x55d4d76ec390_src_0x55d4d76e9ab0 [style="invis"];
    fillcolor="#aaffaa";
  }

  queue1_0x55d4d76ec390_src_0x55d4d76e9ab0 -> videocvt1_0x55d4d81b1df0_sink_0x55d4d76e9d00 [label="video/x-raw(memory:NVMM)\l              format: NV12\l               width: 1920\l              height: 1080\l      interlace-mode: progressive\l      multiview-mode: mono\l     multiview-flags: 0:ffffffff:/right-view...\l  pixel-aspect-ratio: 1/1\l         chroma-site: mpeg2\l         colorimetry: bt709\l           framerate: 20000/333\l"]
  subgraph cluster_display_0x55d4d81ac3a0 {
    fontname="Bitstream Vera Sans";
    fontsize="8";
    style="filled,rounded";
    color=black;
    label="GstEglGlesSink\ndisplay\n[>]\nparent=(GstPipeline) video-pipeline\nmax-lateness=5000000\nqos=TRUE\nlast-sample=((GstSample*) 0x7fe6c4124340)\nprocessing-deadline=15000000\nwindow-x=0\nwindow-y=0\nwindow-width=1920\nwindow-height=1080";
    subgraph cluster_display_0x55d4d81ac3a0_sink {
      label="";
      style="invis";
      display_0x55d4d81ac3a0_sink_0x55d4d76e9610 [color=black, fillcolor="#aaaaff", label="sink\n[>][bfb]", height="0.2", style="filled,solid"];
    }

    fillcolor="#aaaaff";
  }

  subgraph cluster_overlay_0x55d4d80f3c20 {
    fontname="Bitstream Vera Sans";
    fontsize="8";
    style="filled,rounded";
    color=black;
    label="GstNvDsOsd\noverlay\n[>]\nparent=(GstPipeline) video-pipeline\nclock-font=NULL\nclock-font-size=0\nclock-color=0\nhw-blend-color-attr=\"3,1.000000,1.000000,0.000000,0.300000:\"\ndisplay-mask=FALSE";
    subgraph cluster_overlay_0x55d4d80f3c20_sink {
      label="";
      style="invis";
      overlay_0x55d4d80f3c20_sink_0x55d4d76e9170 [color=black, fillcolor="#aaaaff", label="sink\n[>][bfb]", height="0.2", style="filled,solid"];
    }

    subgraph cluster_overlay_0x55d4d80f3c20_src {
      label="";
      style="invis";
      overlay_0x55d4d80f3c20_src_0x55d4d76e93c0 [color=black, fillcolor="#ffaaaa", label="src\n[>][bfb]", height="0.2", style="filled,solid"];
    }

    overlay_0x55d4d80f3c20_sink_0x55d4d76e9170 -> overlay_0x55d4d80f3c20_src_0x55d4d76e93c0 [style="invis"];
    fillcolor="#aaffaa";
  }

  overlay_0x55d4d80f3c20_src_0x55d4d76e93c0 -> display_0x55d4d81ac3a0_sink_0x55d4d76e9610 [label="video/x-raw(memory:NVMM)\l               width: 1920\l              height: 1080\l      interlace-mode: progressive\l      multiview-mode: mono\l     multiview-flags: 0:ffffffff:/right-view...\l  pixel-aspect-ratio: 1/1\l           framerate: 20000/333\l              format: RGBA\l        block-linear: false\l"]
  subgraph cluster_capfilter0_0x55d4d77c6270 {
    fontname="Bitstream Vera Sans";
    fontsize="8";
    style="filled,rounded";
    color=black;
    label="GstCapsFilter\ncapfilter0\n[>]\nparent=(GstPipeline) video-pipeline\ncaps=video/x-raw(memory:NVMM), format=(string)RGBA";
    subgraph cluster_capfilter0_0x55d4d77c6270_sink {
      label="";
      style="invis";
      capfilter0_0x55d4d77c6270_sink_0x55d4d76e8cd0 [color=black, fillcolor="#aaaaff", label="sink\n[>][bfb]", height="0.2", style="filled,solid"];
    }

    subgraph cluster_capfilter0_0x55d4d77c6270_src {
      label="";
      style="invis";
      capfilter0_0x55d4d77c6270_src_0x55d4d76e8f20 [color=black, fillcolor="#ffaaaa", label="src\n[>][bfb]", height="0.2", style="filled,solid"];
    }

    capfilter0_0x55d4d77c6270_sink_0x55d4d76e8cd0 -> capfilter0_0x55d4d77c6270_src_0x55d4d76e8f20 [style="invis"];
    fillcolor="#aaffaa";
  }

  capfilter0_0x55d4d77c6270_src_0x55d4d76e8f20 -> overlay_0x55d4d80f3c20_sink_0x55d4d76e9170 [label="video/x-raw(memory:NVMM)\l               width: 1920\l              height: 1080\l      interlace-mode: progressive\l      multiview-mode: mono\l     multiview-flags: 0:ffffffff:/right-view...\l  pixel-aspect-ratio: 1/1\l           framerate: 20000/333\l              format: RGBA\l        block-linear: false\l"]
  subgraph cluster_videocvt0_0x55d4d777e980 {
    fontname="Bitstream Vera Sans";
    fontsize="8";
    style="filled,rounded";
    color=black;
    label="Gstnvvideoconvert\nvideocvt0\n[>]\nparent=(GstPipeline) video-pipeline\nsrc-crop=\"0:0:0:0\"\ndest-crop=\"0:0:0:0\"\nnvbuf-memory-type=nvbuf-mem-cuda-unified";
    subgraph cluster_videocvt0_0x55d4d777e980_sink {
      label="";
      style="invis";
      videocvt0_0x55d4d777e980_sink_0x55d4d76e8830 [color=black, fillcolor="#aaaaff", label="sink\n[>][bfb]", height="0.2", style="filled,solid"];
    }

    subgraph cluster_videocvt0_0x55d4d777e980_src {
      label="";
      style="invis";
      videocvt0_0x55d4d777e980_src_0x55d4d76e8a80 [color=black, fillcolor="#ffaaaa", label="src\n[>][bfb]", height="0.2", style="filled,solid"];
    }

    videocvt0_0x55d4d777e980_sink_0x55d4d76e8830 -> videocvt0_0x55d4d777e980_src_0x55d4d76e8a80 [style="invis"];
    fillcolor="#aaffaa";
  }

  videocvt0_0x55d4d777e980_src_0x55d4d76e8a80 -> capfilter0_0x55d4d77c6270_sink_0x55d4d76e8cd0 [label="video/x-raw(memory:NVMM)\l               width: 1920\l              height: 1080\l      interlace-mode: progressive\l      multiview-mode: mono\l     multiview-flags: 0:ffffffff:/right-view...\l  pixel-aspect-ratio: 1/1\l           framerate: 20000/333\l              format: RGBA\l        block-linear: false\l"]
  subgraph cluster_queue0_0x55d4d76ec090 {
    fontname="Bitstream Vera Sans";
    fontsize="8";
    style="filled,rounded";
    color=black;
    label="GstQueue\nqueue0\n[>]\nparent=(GstPipeline) video-pipeline\ncurrent-level-buffers=4\ncurrent-level-bytes=256\ncurrent-level-time=133200000";
    subgraph cluster_queue0_0x55d4d76ec090_sink {
      label="";
      style="invis";
      queue0_0x55d4d76ec090_sink_0x55d4d76e8390 [color=black, fillcolor="#aaaaff", label="sink\n[>][bfb]", height="0.2", style="filled,solid"];
    }

    subgraph cluster_queue0_0x55d4d76ec090_src {
      label="";
      style="invis";
      queue0_0x55d4d76ec090_src_0x55d4d76e85e0 [color=black, fillcolor="#ffaaaa", label="src\n[>][bfb][T]", height="0.2", style="filled,solid"];
    }

    queue0_0x55d4d76ec090_sink_0x55d4d76e8390 -> queue0_0x55d4d76ec090_src_0x55d4d76e85e0 [style="invis"];
    fillcolor="#aaffaa";
  }

  queue0_0x55d4d76ec090_src_0x55d4d76e85e0 -> videocvt0_0x55d4d777e980_sink_0x55d4d76e8830 [label="video/x-raw(memory:NVMM)\l              format: NV12\l               width: 1920\l              height: 1080\l      interlace-mode: progressive\l      multiview-mode: mono\l     multiview-flags: 0:ffffffff:/right-view...\l  pixel-aspect-ratio: 1/1\l         chroma-site: mpeg2\l         colorimetry: bt709\l           framerate: 20000/333\l"]
  subgraph cluster_tee0_0x55d4d76e6000 {
    fontname="Bitstream Vera Sans";
    fontsize="8";
    style="filled,rounded";
    color=black;
    label="GstTee\ntee0\n[>]\nparent=(GstPipeline) video-pipeline\nnum-src-pads=2";
    subgraph cluster_tee0_0x55d4d76e6000_sink {
      label="";
      style="invis";
      tee0_0x55d4d76e6000_sink_0x55d4d76e8140 [color=black, fillcolor="#aaaaff", label="sink\n[>][bfb]", height="0.2", style="filled,solid"];
    }

    subgraph cluster_tee0_0x55d4d76e6000_src {
      label="";
      style="invis";
      tee0_0x55d4d76e6000_src_0_0x55d4d76e02e0 [color=black, fillcolor="#ffaaaa", label="src_0\n[>][bfb]", height="0.2", style="filled,dashed"];
      tee0_0x55d4d76e6000_src_1_0x55d4d76e0540 [color=black, fillcolor="#ffaaaa", label="src_1\n[>][bfb]", height="0.2", style="filled,dashed"];
    }

    tee0_0x55d4d76e6000_sink_0x55d4d76e8140 -> tee0_0x55d4d76e6000_src_0_0x55d4d76e02e0 [style="invis"];
    fillcolor="#aaffaa";
  }

  tee0_0x55d4d76e6000_src_0_0x55d4d76e02e0 -> queue0_0x55d4d76ec090_sink_0x55d4d76e8390 [label="video/x-raw(memory:NVMM)\l              format: NV12\l               width: 1920\l              height: 1080\l      interlace-mode: progressive\l      multiview-mode: mono\l     multiview-flags: 0:ffffffff:/right-view...\l  pixel-aspect-ratio: 1/1\l         chroma-site: mpeg2\l         colorimetry: bt709\l           framerate: 20000/333\l"]
  tee0_0x55d4d76e6000_src_1_0x55d4d76e0540 -> queue1_0x55d4d76ec390_sink_0x55d4d76e9860 [label="video/x-raw(memory:NVMM)\l              format: NV12\l               width: 1920\l              height: 1080\l      interlace-mode: progressive\l      multiview-mode: mono\l     multiview-flags: 0:ffffffff:/right-view...\l  pixel-aspect-ratio: 1/1\l         chroma-site: mpeg2\l         colorimetry: bt709\l           framerate: 20000/333\l"]
  subgraph cluster_uri_0x55d4d76e0060 {
    fontname="Bitstream Vera Sans";
    fontsize="8";
    style="filled,rounded";
    color=black;
    label="GstURIDecodeBin\nuri\n[>]\nparent=(GstPipeline) video-pipeline\nuri=\"file:///home/ricardo/workSpace/gstreamer-example/ai_integration/test.mp4\"\nsource=(GstFileSrc) source\ncaps=video/x-raw(ANY); audio/x-raw(ANY); text/x-raw(ANY); subpicture/x-dvd; subpictur…";
    subgraph cluster_uri_0x55d4d76e0060_src {
      label="";
      style="invis";
      _proxypad4_0x55d4d76e1d10 [color=black, fillcolor="#ffdddd", label="proxypad4\n[>][bfb]", height="0.2", style="filled,dotted"];
    _proxypad4_0x55d4d76e1d10 -> uri_0x55d4d76e0060_src_0_0x55d4d8fdcaf0 [style=dashed, minlen=0]
      uri_0x55d4d76e0060_src_0_0x55d4d8fdcaf0 [color=black, fillcolor="#ffdddd", label="src_0\n[>][bfb]", height="0.2", style="filled,dotted"];
      _proxypad5_0x7fe6c832e130 [color=black, fillcolor="#ffdddd", label="proxypad5\n[>][bfb]", height="0.2", style="filled,dotted"];
    _proxypad5_0x7fe6c832e130 -> uri_0x55d4d76e0060_src_1_0x55d4d8fdcd70 [style=dashed, minlen=0]
      uri_0x55d4d76e0060_src_1_0x55d4d8fdcd70 [color=black, fillcolor="#ffdddd", label="src_1\n[>][bfb]", height="0.2", style="filled,dotted"];
    }

    fillcolor="#ffffff";
    subgraph cluster_decodebin0_0x55d4d8fda090 {
      fontname="Bitstream Vera Sans";
      fontsize="8";
      style="filled,rounded";
      color=black;
      label="GstDecodeBin\ndecodebin0\n[>]\nparent=(GstURIDecodeBin) uri\ncaps=video/x-raw(ANY); audio/x-raw(ANY); text/x-raw(ANY); subpicture/x-dvd; subpictur…";
      subgraph cluster_decodebin0_0x55d4d8fda090_sink {
        label="";
        style="invis";
        _proxypad0_0x55d4d76e07b0 [color=black, fillcolor="#ddddff", label="proxypad0\n[<][bfb]", height="0.2", style="filled,solid"];
      decodebin0_0x55d4d8fda090_sink_0x55d4d8fdc0f0 -> _proxypad0_0x55d4d76e07b0 [style=dashed, minlen=0]
        decodebin0_0x55d4d8fda090_sink_0x55d4d8fdc0f0 [color=black, fillcolor="#ddddff", label="sink\n[<][bfb]", height="0.2", style="filled,solid"];
      }

      subgraph cluster_decodebin0_0x55d4d8fda090_src {
        label="";
        style="invis";
        _proxypad2_0x55d4d76e0a10 [color=black, fillcolor="#ffdddd", label="proxypad2\n[>][bfb]", height="0.2", style="filled,dotted"];
      _proxypad2_0x55d4d76e0a10 -> decodebin0_0x55d4d8fda090_src_0_0x7fe6d00320a0 [style=dashed, minlen=0]
        decodebin0_0x55d4d8fda090_src_0_0x7fe6d00320a0 [color=black, fillcolor="#ffdddd", label="src_0\n[>][bfb]", height="0.2", style="filled,dotted"];
        _proxypad3_0x55d4d76e1390 [color=black, fillcolor="#ffdddd", label="proxypad3\n[>][bfb]", height="0.2", style="filled,dotted"];
      _proxypad3_0x55d4d76e1390 -> decodebin0_0x55d4d8fda090_src_1_0x7fe6d0032b20 [style=dashed, minlen=0]
        decodebin0_0x55d4d8fda090_src_1_0x7fe6d0032b20 [color=black, fillcolor="#ffdddd", label="src_1\n[>][bfb]", height="0.2", style="filled,dotted"];
      }

      decodebin0_0x55d4d8fda090_sink_0x55d4d8fdc0f0 -> decodebin0_0x55d4d8fda090_src_0_0x7fe6d00320a0 [style="invis"];
      fillcolor="#ffffff";
      subgraph cluster_nvv4l2decoder0_0x7fe6c8018ee0 {
        fontname="Bitstream Vera Sans";
        fontsize="8";
        style="filled,rounded";
        color=black;
        label="nvv4l2decoder\nnvv4l2decoder0\n[>]\nparent=(GstDecodeBin) decodebin0\ndevice=\"/dev/nvidia0\"\ndevice-name=\"\"\ndevice-fd=31\ndrop-frame-interval=0\nnum-extra-surfaces=0";
        subgraph cluster_nvv4l2decoder0_0x7fe6c8018ee0_sink {
          label="";
          style="invis";
          nvv4l2decoder0_0x7fe6c8018ee0_sink_0x7fe6c4132410 [color=black, fillcolor="#aaaaff", label="sink\n[>][bfb]", height="0.2", style="filled,solid"];
        }

        subgraph cluster_nvv4l2decoder0_0x7fe6c8018ee0_src {
          label="";
          style="invis";
          nvv4l2decoder0_0x7fe6c8018ee0_src_0x7fe6c4132660 [color=black, fillcolor="#ffaaaa", label="src\n[>][bfb][T]", height="0.2", style="filled,solid"];
        }

        nvv4l2decoder0_0x7fe6c8018ee0_sink_0x7fe6c4132410 -> nvv4l2decoder0_0x7fe6c8018ee0_src_0x7fe6c4132660 [style="invis"];
        fillcolor="#aaffaa";
      }

      nvv4l2decoder0_0x7fe6c8018ee0_src_0x7fe6c4132660 -> _proxypad2_0x55d4d76e0a10 [label="video/x-raw(memory:NVMM)\l              format: NV12\l               width: 1920\l              height: 1080\l      interlace-mode: progressive\l      multiview-mode: mono\l     multiview-flags: 0:ffffffff:/right-view...\l  pixel-aspect-ratio: 1/1\l         chroma-site: mpeg2\l         colorimetry: bt709\l           framerate: 20000/333\l"]
      subgraph cluster_avdec_aac0_0x7fe6c41314d0 {
        fontname="Bitstream Vera Sans";
        fontsize="8";
        style="filled,rounded";
        color=black;
        label="avdec_aac\navdec_aac0\n[>]\nparent=(GstDecodeBin) decodebin0";
        subgraph cluster_avdec_aac0_0x7fe6c41314d0_sink {
          label="";
          style="invis";
          avdec_aac0_0x7fe6c41314d0_sink_0x7fe6c400b8f0 [color=black, fillcolor="#aaaaff", label="sink\n[>][bfb]", height="0.2", style="filled,solid"];
        }

        subgraph cluster_avdec_aac0_0x7fe6c41314d0_src {
          label="";
          style="invis";
          avdec_aac0_0x7fe6c41314d0_src_0x7fe6c400bb40 [color=black, fillcolor="#ffaaaa", label="src\n[>][bfb]", height="0.2", style="filled,solid"];
        }

        avdec_aac0_0x7fe6c41314d0_sink_0x7fe6c400b8f0 -> avdec_aac0_0x7fe6c41314d0_src_0x7fe6c400bb40 [style="invis"];
        fillcolor="#aaffaa";
      }

      avdec_aac0_0x7fe6c41314d0_src_0x7fe6c400bb40 -> _proxypad3_0x55d4d76e1390 [label="audio/x-raw\l              format: F32LE\l              layout: non-interleaved\l                rate: 48000\l            channels: 2\l        channel-mask: 0x0000000000000003\l"]
      subgraph cluster_aacparse0_0x7fe6c40900f0 {
        fontname="Bitstream Vera Sans";
        fontsize="8";
        style="filled,rounded";
        color=black;
        label="GstAacParse\naacparse0\n[>]\nparent=(GstDecodeBin) decodebin0";
        subgraph cluster_aacparse0_0x7fe6c40900f0_sink {
          label="";
          style="invis";
          aacparse0_0x7fe6c40900f0_sink_0x7fe6c400b450 [color=black, fillcolor="#aaaaff", label="sink\n[>][bfb]", height="0.2", style="filled,solid"];
        }

        subgraph cluster_aacparse0_0x7fe6c40900f0_src {
          label="";
          style="invis";
          aacparse0_0x7fe6c40900f0_src_0x7fe6c400b6a0 [color=black, fillcolor="#ffaaaa", label="src\n[>][bfb]", height="0.2", style="filled,solid"];
        }

        aacparse0_0x7fe6c40900f0_sink_0x7fe6c400b450 -> aacparse0_0x7fe6c40900f0_src_0x7fe6c400b6a0 [style="invis"];
        fillcolor="#aaffaa";
      }

      aacparse0_0x7fe6c40900f0_src_0x7fe6c400b6a0 -> avdec_aac0_0x7fe6c41314d0_sink_0x7fe6c400b8f0 [label="audio/mpeg\l         mpegversion: 4\l              framed: true\l       stream-format: raw\l               level: 2\l        base-profile: lc\l             profile: lc\l          codec_data: 1190\l                rate: 48000\l            channels: 2\l"]
      subgraph cluster_capsfilter0_0x55d4d77c6f70 {
        fontname="Bitstream Vera Sans";
        fontsize="8";
        style="filled,rounded";
        color=black;
        label="GstCapsFilter\ncapsfilter0\n[>]\nparent=(GstDecodeBin) decodebin0\ncaps=video/x-h264, stream-format=(string)byte-stream, alignment=(string)au, parsed=(b…";
        subgraph cluster_capsfilter0_0x55d4d77c6f70_sink {
          label="";
          style="invis";
          capsfilter0_0x55d4d77c6f70_sink_0x7fe6c400a8c0 [color=black, fillcolor="#aaaaff", label="sink\n[>][bfb]", height="0.2", style="filled,solid"];
        }

        subgraph cluster_capsfilter0_0x55d4d77c6f70_src {
          label="";
          style="invis";
          capsfilter0_0x55d4d77c6f70_src_0x7fe6c400ab10 [color=black, fillcolor="#ffaaaa", label="src\n[>][bfb]", height="0.2", style="filled,solid"];
        }

        capsfilter0_0x55d4d77c6f70_sink_0x7fe6c400a8c0 -> capsfilter0_0x55d4d77c6f70_src_0x7fe6c400ab10 [style="invis"];
        fillcolor="#aaffaa";
      }

      capsfilter0_0x55d4d77c6f70_src_0x7fe6c400ab10 -> nvv4l2decoder0_0x7fe6c8018ee0_sink_0x7fe6c4132410 [label="video/x-h264\l       stream-format: byte-stream\l           alignment: au\l               level: 4.2\l             profile: high\l               width: 1920\l              height: 1080\l  pixel-aspect-ratio: 1/1\l           framerate: 20000/333\l      interlace-mode: progressive\l       chroma-format: 4:2:0\l      bit-depth-luma: 8\l    bit-depth-chroma: 8\l              parsed: true\l"]
      subgraph cluster_h264parse0_0x7fe6c40108a0 {
        fontname="Bitstream Vera Sans";
        fontsize="8";
        style="filled,rounded";
        color=black;
        label="GstH264Parse\nh264parse0\n[>]\nparent=(GstDecodeBin) decodebin0\nconfig-interval=-1";
        subgraph cluster_h264parse0_0x7fe6c40108a0_sink {
          label="";
          style="invis";
          h264parse0_0x7fe6c40108a0_sink_0x7fe6c400a420 [color=black, fillcolor="#aaaaff", label="sink\n[>][bfb]", height="0.2", style="filled,solid"];
        }

        subgraph cluster_h264parse0_0x7fe6c40108a0_src {
          label="";
          style="invis";
          h264parse0_0x7fe6c40108a0_src_0x7fe6c400a670 [color=black, fillcolor="#ffaaaa", label="src\n[>][bfb]", height="0.2", style="filled,solid"];
        }

        h264parse0_0x7fe6c40108a0_sink_0x7fe6c400a420 -> h264parse0_0x7fe6c40108a0_src_0x7fe6c400a670 [style="invis"];
        fillcolor="#aaffaa";
      }

      h264parse0_0x7fe6c40108a0_src_0x7fe6c400a670 -> capsfilter0_0x55d4d77c6f70_sink_0x7fe6c400a8c0 [label="video/x-h264\l       stream-format: byte-stream\l           alignment: au\l               level: 4.2\l             profile: high\l               width: 1920\l              height: 1080\l  pixel-aspect-ratio: 1/1\l           framerate: 20000/333\l      interlace-mode: progressive\l       chroma-format: 4:2:0\l      bit-depth-luma: 8\l    bit-depth-chroma: 8\l              parsed: true\l"]
      subgraph cluster_multiqueue0_0x7fe6c400d060 {
        fontname="Bitstream Vera Sans";
        fontsize="8";
        style="filled,rounded";
        color=black;
        label="GstMultiQueue\nmultiqueue0\n[>]\nparent=(GstDecodeBin) decodebin0\nmax-size-bytes=2097152\nmax-size-time=0";
        subgraph cluster_multiqueue0_0x7fe6c400d060_sink {
          label="";
          style="invis";
          multiqueue0_0x7fe6c400d060_sink_0_0x55d4d81b5cf0 [color=black, fillcolor="#aaaaff", label="sink_0\n[>][bfb]", height="0.2", style="filled,dashed"];
          multiqueue0_0x7fe6c400d060_sink_1_0x7fe6c400afb0 [color=black, fillcolor="#aaaaff", label="sink_1\n[>][bfb]", height="0.2", style="filled,dashed"];
        }

        subgraph cluster_multiqueue0_0x7fe6c400d060_src {
          label="";
          style="invis";
          multiqueue0_0x7fe6c400d060_src_0_0x7fe6c400a1d0 [color=black, fillcolor="#ffaaaa", label="src_0\n[>][bfb][T]", height="0.2", style="filled,dotted"];
          multiqueue0_0x7fe6c400d060_src_1_0x7fe6c400b200 [color=black, fillcolor="#ffaaaa", label="src_1\n[>][bfb][T]", height="0.2", style="filled,dotted"];
        }

        multiqueue0_0x7fe6c400d060_sink_0_0x55d4d81b5cf0 -> multiqueue0_0x7fe6c400d060_src_0_0x7fe6c400a1d0 [style="invis"];
        fillcolor="#aaffaa";
      }

      multiqueue0_0x7fe6c400d060_src_0_0x7fe6c400a1d0 -> h264parse0_0x7fe6c40108a0_sink_0x7fe6c400a420 [label="video/x-h264\l       stream-format: avc\l           alignment: au\l               level: 4.2\l             profile: high\l          codec_data: 0164002affe10018676400...\l               width: 1920\l              height: 1080\l  pixel-aspect-ratio: 1/1\l"]
      multiqueue0_0x7fe6c400d060_src_1_0x7fe6c400b200 -> aacparse0_0x7fe6c40900f0_sink_0x7fe6c400b450 [label="audio/mpeg\l         mpegversion: 4\l              framed: true\l       stream-format: raw\l               level: 2\l        base-profile: lc\l             profile: lc\l          codec_data: 1190\l                rate: 48000\l            channels: 2\l"]
      subgraph cluster_qtdemux0_0x7fe6d007e140 {
        fontname="Bitstream Vera Sans";
        fontsize="8";
        style="filled,rounded";
        color=black;
        label="GstQTDemux\nqtdemux0\n[>]\nparent=(GstDecodeBin) decodebin0";
        subgraph cluster_qtdemux0_0x7fe6d007e140_sink {
          label="";
          style="invis";
          qtdemux0_0x7fe6d007e140_sink_0x55d4d81b5160 [color=black, fillcolor="#aaaaff", label="sink\n[<][bfb][T]", height="0.2", style="filled,solid"];
        }

        subgraph cluster_qtdemux0_0x7fe6d007e140_src {
          label="";
          style="invis";
          qtdemux0_0x7fe6d007e140_video_0_0x55d4d81b5aa0 [color=black, fillcolor="#ffaaaa", label="video_0\n[>][bfb]", height="0.2", style="filled,dotted"];
          qtdemux0_0x7fe6d007e140_audio_0_0x7fe6c400ad60 [color=black, fillcolor="#ffaaaa", label="audio_0\n[>][bfb]", height="0.2", style="filled,dotted"];
        }

        qtdemux0_0x7fe6d007e140_sink_0x55d4d81b5160 -> qtdemux0_0x7fe6d007e140_video_0_0x55d4d81b5aa0 [style="invis"];
        fillcolor="#aaffaa";
      }

      qtdemux0_0x7fe6d007e140_video_0_0x55d4d81b5aa0 -> multiqueue0_0x7fe6c400d060_sink_0_0x55d4d81b5cf0 [label="video/x-h264\l       stream-format: avc\l           alignment: au\l               level: 4.2\l             profile: high\l          codec_data: 0164002affe10018676400...\l               width: 1920\l              height: 1080\l  pixel-aspect-ratio: 1/1\l"]
      qtdemux0_0x7fe6d007e140_audio_0_0x7fe6c400ad60 -> multiqueue0_0x7fe6c400d060_sink_1_0x7fe6c400afb0 [label="audio/mpeg\l         mpegversion: 4\l              framed: true\l       stream-format: raw\l               level: 2\l        base-profile: lc\l             profile: lc\l          codec_data: 1190\l                rate: 48000\l            channels: 2\l"]
      subgraph cluster_typefind_0x55d4d90810b0 {
        fontname="Bitstream Vera Sans";
        fontsize="8";
        style="filled,rounded";
        color=black;
        label="GstTypeFindElement\ntypefind\n[>]\nparent=(GstDecodeBin) decodebin0\ncaps=video/quicktime, variant=(string)iso";
        subgraph cluster_typefind_0x55d4d90810b0_sink {
          label="";
          style="invis";
          typefind_0x55d4d90810b0_sink_0x55d4d81b4cc0 [color=black, fillcolor="#aaaaff", label="sink\n[<][bfb][t]", height="0.2", style="filled,solid"];
        }

        subgraph cluster_typefind_0x55d4d90810b0_src {
          label="";
          style="invis";
          typefind_0x55d4d90810b0_src_0x55d4d81b4f10 [color=black, fillcolor="#ffaaaa", label="src\n[<][bfb]", height="0.2", style="filled,solid"];
        }

        typefind_0x55d4d90810b0_sink_0x55d4d81b4cc0 -> typefind_0x55d4d90810b0_src_0x55d4d81b4f10 [style="invis"];
        fillcolor="#aaffaa";
      }

      _proxypad0_0x55d4d76e07b0 -> typefind_0x55d4d90810b0_sink_0x55d4d81b4cc0 [label="ANY"]
      typefind_0x55d4d90810b0_src_0x55d4d81b4f10 -> qtdemux0_0x7fe6d007e140_sink_0x55d4d81b5160 [labeldistance="10", labelangle="0", label="                                                  ", taillabel="ANY", headlabel="video/quicktime\lvideo/mj2\laudio/x-m4a\lapplication/x-3gp\l"]
    }

    decodebin0_0x55d4d8fda090_src_0_0x7fe6d00320a0 -> _proxypad4_0x55d4d76e1d10 [label="video/x-raw(memory:NVMM)\l              format: NV12\l               width: 1920\l              height: 1080\l      interlace-mode: progressive\l      multiview-mode: mono\l     multiview-flags: 0:ffffffff:/right-view...\l  pixel-aspect-ratio: 1/1\l         chroma-site: mpeg2\l         colorimetry: bt709\l           framerate: 20000/333\l"]
    decodebin0_0x55d4d8fda090_src_1_0x7fe6d0032b20 -> _proxypad5_0x7fe6c832e130 [label="audio/x-raw\l              format: F32LE\l              layout: non-interleaved\l                rate: 48000\l            channels: 2\l        channel-mask: 0x0000000000000003\l"]
    subgraph cluster_source_0x55d4d86dc3e0 {
      fontname="Bitstream Vera Sans";
      fontsize="8";
      style="filled,rounded";
      color=black;
      label="GstFileSrc\nsource\n[>]\nparent=(GstURIDecodeBin) uri\nlocation=\"/home/ricardo/workSpace/gstreamer-example/ai_integration/test.mp4\"";
      subgraph cluster_source_0x55d4d86dc3e0_src {
        label="";
        style="invis";
        source_0x55d4d86dc3e0_src_0x55d4d81b4a70 [color=black, fillcolor="#ffaaaa", label="src\n[<][bfb]", height="0.2", style="filled,solid"];
      }

      fillcolor="#ffaaaa";
    }

    source_0x55d4d86dc3e0_src_0x55d4d81b4a70 -> decodebin0_0x55d4d8fda090_sink_0x55d4d8fdc0f0 [label="ANY"]
  }

  uri_0x55d4d76e0060_src_0_0x55d4d8fdcaf0 -> tee0_0x55d4d76e6000_sink_0x55d4d76e8140 [label="video/x-raw(memory:NVMM)\l              format: NV12\l               width: 1920\l              height: 1080\l      interlace-mode: progressive\l      multiview-mode: mono\l     multiview-flags: 0:ffffffff:/right-view...\l  pixel-aspect-ratio: 1/1\l         chroma-site: mpeg2\l         colorimetry: bt709\l           framerate: 20000/333\l"]
}


================================================
FILE: ai_integration/deepstream/inc/Common.h
================================================
/*
 * @Description: Common Utils.
 * @version: 1.0
 * @Author: Ricardo Lu<shenglu1202@163.com>
 * @Date: 2021-08-27 12:24:25
 * @LastEditors: Ricardo Lu
 * @LastEditTime: 2022-07-16 13:49:59
 */
#pragma once

#include <iostream>
#include <string>
#include <memory>
#include <functional>
#include <unistd.h>
#include <vector>

#include <opencv2/opencv.hpp>
#include <gst/gst.h>
#include <gst/app/app.h>

#include "Logger.h"

class OSDObject {
public:
    int x, y, width, height;
    double r, g, b, a;

    OSDObject(int _x, int _y, int _width, int _height,
        double _r, double _g, double _b, double _a = 1.0) :
        x(_x), y(_y), width(_width), height(_height),
        r(_r), g(_g), b(_b), a(_a) {

    }
};

class GstBufferObject {
public:
    GstBufferObject(GstBuffer* buffer) : m_buffer(nullptr) {
        if (buffer) {
            gst_buffer_ref(buffer);
            m_buffer = buffer;
        }
    }

    ~GstBufferObject() {
        if (m_buffer) {
            gst_buffer_unref(m_buffer);
            m_buffer = nullptr;
        }
    }

    GstBuffer* GetBuffer() {
        return m_buffer;
    }

    GstBuffer* RefBuffer() {
        if (m_buffer) {
            gst_buffer_ref(m_buffer);
        }

        return m_buffer;
    }

private:
    GstBuffer* m_buffer;
};

class GstSampleObject {
public:
    GstSampleObject(GstSample* sample, uint64_t timestamp) :
        m_sample   (sample),
        m_buffer   (nullptr),
        m_cols     (0),
        m_rows     (0),
        m_fpsn     (0),
        m_fpsd     (0),
        m_format   (nullptr),
        m_timestamp(timestamp) {
    }

    ~GstSampleObject() {
        if (m_sample) {
            gst_sample_unref(m_sample);
            m_sample = nullptr;
        }
    }

    GstSample* GetSample() {
        return m_sample;
    }

    GstSample* RefSample() {
        return gst_sample_ref(m_sample);
    }

    GstBuffer* GetBuffer(int& width, int& height, std::string& format) {
        if (!m_buffer) {
            GstCaps* caps = gst_sample_get_caps(m_sample);
            GstStructure* structure = gst_caps_get_structure(caps, 0);
            gst_structure_get_int(structure, "width",  &m_cols);
            gst_structure_get_int(structure, "height", &m_rows);
            gst_structure_get_fraction(structure, "framerate", &m_fpsn, &m_fpsd);
            m_format = gst_structure_get_string(structure, "format");
            m_buffer = gst_sample_get_buffer(m_sample);
        }

        width  = m_cols;
        height = m_rows;
        format = m_format;
        return m_buffer;
    }

    GstBuffer* RefBuffer(int& width, int& height, std::string& format) {
        return gst_buffer_ref(GetBuffer(width, height, format));
    }

    uint64_t GetTimestamp() {
        return m_timestamp;
    }

private:

    GstSample* m_sample;
    GstBuffer* m_buffer;

    int         m_cols;
    int         m_rows;
    int         m_fpsn;
    int         m_fpsd;
    const char* m_format;
    uint64_t    m_timestamp;
};

// callback functions
typedef std::function<bool(GstSample* , void*)> PutFrameFunc;

typedef std::function<std::shared_ptr<GstSampleObject>(void*)> GetFrameFunc;

typedef std::function<bool(std::shared_ptr<std::vector<OSDObject> >, void*)> PutResultFunc;

typedef std::function<std::shared_ptr<std::vector<OSDObject> >(void*)> GetResultFunc;

typedef std::function<void(GstBuffer* buffer, const std::shared_ptr<std::vector<OSDObject> >& results)> ProcResultFunc;

================================================
FILE: ai_integration/deepstream/inc/DoubleBufferCache.h
================================================
/*
 * @Description: Double Buffer Cache Implement.
 * @version: 1.0
 * @Author: Ricardo Lu<shenglu1202@163.com>
 * @Date: 2021-08-29 08:51:01
 * @LastEditors: Ricardo Lu
 * @LastEditTime: 2022-07-16 13:52:32
 */
#pragma once

#include <mutex>
#include <atomic>
#include <memory>
#include <list>
#include <functional>
#include <string>
#include <cstdio>

/** 
 * @brief Shared-buffer cache manager.
 */
template<typename T>
class DoubleBufCache {
public:
    /**
     * @brief: constructor
     * @Author: Ricardo Lu
     * @param[in] notify_func Callback invoked each time a new buffer is fed.
     * @return {*}
     */    
    DoubleBufCache(std::function<bool()> notify_func = nullptr,
            std::string debug_info = "") noexcept :
            notify_func(notify_func), swap_ready(false),
            debug_info(debug_info) {
    }

    /**
     * @brief destructor
     * @Author: Ricardo Lu
     */
    ~DoubleBufCache() noexcept {
        if (!debug_info.empty()) {
            printf("DoubleBufCache %s destroyed.\n", debug_info.c_str());
        }
    }

    /**
     * @brief Put the latest buffer into cache queue to be processed.
     * Giving up control of previous front buffer.
     * @Author: Ricardo Lu
     * @param[in] pending - The latest buffer.
     */
    void feed(std::shared_ptr<T> pending) {
        if (nullptr == pending.get()) {
            throw "ERROR: feed an empty buffer to DoubleBufCache";
        }

        swap_mtx.lock();
        front_sp = pending;
        swap_mtx.unlock();
        swap_ready = true;
        if (notify_func) {
            notify_func();
        }
        return;
    }

    /**
     * @brief Get the front buffer.
     * @Author: Ricardo Lu
     * @return Front buffer.
     * */
    std::shared_ptr<T> front()  noexcept {
        return front_sp;
    }

    /**
     * @brief Fetch the shared back buffer.
     * @Author: Ricardo Lu
     * @return Back buffer.
     */
    std::shared_ptr<T> fetch()  noexcept {
        if (swap_ready) {
            swap_mtx.lock();
            back_sp = front_sp;
            swap_mtx.unlock();
            swap_ready = false;
        }
        return back_sp;
    }

private:
    //! Notification function called whenever a new buffer is fed.
    std::function<bool()> notify_func;
    //! The buffers can be swapped when this flag is true.
    std::atomic<bool> swap_ready;
    //! Swapping mutex lock for thread safety.
    std::mutex swap_mtx;
    //! Front buffer for previous results saving.
    std::shared_ptr<T> front_sp;
    //! Back buffer to be fetched.
    std::shared_ptr<T> back_sp;
public:
    //! Indicate the name of an instantiated object for debug.
    std::string debug_info;
};

================================================
FILE: ai_integration/deepstream/inc/Logger.h
================================================
/*
 * @Description: Single Instance logger based on spdlog.
 * @version: 1.0
 * @Author: Ricardo Lu<shenglu1202@163.com>
 * @Date: 2021-09-11 20:05:38
 * @Last Editor: Ricardo Lu
 * @LastEditTime: 2022-07-09 08:22:06
 */
#pragma once

#include <iostream>
#include <cstring>
#include <sstream>
#include <time.h>
#include <chrono>
#include <memory>
#include <unistd.h>
#include <sys/types.h>
#include <sys/stat.h>

#include "spdlog/spdlog.h"
#include "spdlog/async.h"
#include "spdlog/sinks/basic_file_sink.h"
#include "spdlog/sinks/rotating_file_sink.h"
#include "spdlog/sinks/stdout_color_sinks.h"

static inline int NowDateToInt()
{
    time_t now;
    time(&now);

    tm p;
    localtime_r(&now, &p);
    int now_date = (1900 + p.tm_year) * 10000 + (p.tm_mon + 1) * 100 + p.tm_mday;
    return now_date;
}

static inline int NowTimeToInt()
{
    time_t now;
    time(&now);

    tm p;
    localtime_r(&now, &p);

    int now_int = p.tm_hour * 10000 + p.tm_min * 100 + p.tm_sec;
    return now_int;
}

static inline spdlog::level::level_enum GetLogLevel(std::string& level)
{
    if (!(level.compare("trace"))) {
        return spdlog::level::trace;
    } else if (!(level.compare("debug"))) {
        return spdlog::level::debug;
    } else if (!(level.compare("info"))) {
        return spdlog::level::info;
    } else if (!(level.compare("warn"))) {
        return spdlog::level::warn;
    } else if (!(level.compare("error"))) {
        return spdlog::level::err;
    }
    return spdlog::level::trace;
}

class XLogger {
public:
    static XLogger* getInstance() {
        static XLogger xlogger;
        return &xlogger;
    }

    std::shared_ptr<spdlog::logger> getLogger() {
        return m_logger;
    }
private:
    XLogger() {
        try {
#ifdef DUMP_LOG
            int date = NowDateToInt();
            int timestamp = NowTimeToInt();
            std::stringstream file_logger_name;
            std::stringstream file_log_full_path;

            if (access(LOG_PATH, F_OK)) {
                spdlog::warn("Log directory does not exist, creating it");
                mkdir(LOG_PATH, S_IRWXU);
            }

            file_logger_name  << LOG_FILE_PREFIX << "_" << date << "_" << timestamp;
            file_log_full_path << LOG_PATH << "/" << file_logger_name.str() << ".log";

    #ifdef MULTI_LOG
            auto console_sink = std::make_shared<spdlog::sinks::stdout_color_sink_mt>();
            auto file_sink = std::make_shared<spdlog::sinks::basic_file_sink_mt>
                                (file_log_full_path.str(), true);

            spdlog::logger logger("multi_sink", {console_sink, file_sink});
            m_logger = std::make_shared<spdlog::logger>(logger);
    #else // fileout only
            m_logger = spdlog::basic_logger_mt("file_logger", file_log_full_path.str());
    #endif // MULTI_LOG
#else // stdout only
            m_logger = spdlog::stdout_color_mt("console_logger");
#endif // DUMP_LOG

            m_logger->set_pattern("%Y-%m-%d %H:%M:%S.%f <thread %t> [%^%l%$] [%@] [%!] %v");

            std::string log_level(LOG_LEVEL);
            spdlog::info("Set log level to {}.", log_level);
            m_logger->set_level(GetLogLevel(log_level));
            m_logger->flush_on(GetLogLevel(log_level));
        } catch(const spdlog::spdlog_ex& ex) {
            spdlog::error("XLogger initialization failed: {}", ex.what());
        }
    }

    ~XLogger() {
        spdlog::drop_all(); // must do this
    }

    XLogger(const XLogger&) = delete;
    XLogger& operator=(const XLogger&) = delete;
private:
    std::shared_ptr<spdlog::logger> m_logger;
};

// use embedded macro to support file and line number
#define LOG_TRACE(...) SPDLOG_LOGGER_CALL(XLogger::getInstance()->getLogger().get(), spdlog::level::trace, __VA_ARGS__)
#define LOG_DEBUG(...) SPDLOG_LOGGER_CALL(XLogger::getInstance()->getLogger().get(), spdlog::level::debug, __VA_ARGS__)
#define LOG_INFO(...)  SPDLOG_LOGGER_CALL(XLogger::getInstance()->getLogger().get(), spdlog::level::info, __VA_ARGS__)
#define LOG_WARN(...)  SPDLOG_LOGGER_CALL(XLogger::getInstance()->getLogger().get(), spdlog::level::warn, __VA_ARGS__)
#define LOG_ERROR(...) SPDLOG_LOGGER_CALL(XLogger::getInstance()->getLogger().get(), spdlog::level::err, __VA_ARGS__)


================================================
FILE: ai_integration/deepstream/inc/VideoPipeline.h
================================================
/*
 * @Description: Implement of VideoPipeline on DeepStream.
 * @version: 2.0
 * @Author: Ricardo Lu<shenglu1202@163.com>
 * @Date: 2022-07-15 22:07:29
 * @LastEditors: Ricardo Lu
 * @LastEditTime: 2023-02-06 21:03:47
 */
#pragma once

#include "Common.h"

static int pipeline_id = 0;

typedef enum _VideoType {
    FILE_STREAM = 0,
    RTSP_STREAM = 1,
    USB_CAMERE = 2
}VideoType;

typedef struct _VideoPipelineConfig {
    std::string pipeline_id;
    int         input_type { VideoType::FILE_STREAM };
    /*------------------uridecodebin------------------*/
    std::string src_uri;
    bool        file_loop;
    int         rtsp_latency;
    int         rtp_protocol;
    /*--------------------v4l2src--------------------*/
    std::string src_device;
    std::string src_format;
    int         src_width;
    int         src_height;
    int         src_framerate_n;
    int         src_framerate_d;
    /*-------------nveglglessink branch-------------*/
    bool        enable_hdmi;
    bool        hdmi_sync;
    int         window_x;
    int         window_y;
    int         window_width;
    int         window_height;
    /*----------------rtmpsink branch---------------*/
    // nvvideoconvert in this branch only converts the color space to NV12 (default behavior) //
    bool        enable_rtmp;
    int         enc_bitrate;
    int         enc_iframe_interval;
    std::string rtmp_uri;
    /*---------------inference branch---------------*/
    bool        enable_appsink;
    /*----------------nvvideoconvert----------------*/
    int         cvt_memory_type;
    std::string cvt_format;
    int         cvt_width;
    int         cvt_height;
    std::string crop;
}VideoPipelineConfig;

class VideoPipeline {
public:
    VideoPipeline      (const VideoPipelineConfig& config);
    ~VideoPipeline     ();
    bool Create        ();
    bool Start         ();
    bool Pause         ();
    bool Resume        ();
    void Destroy       ();
    void SetCallbacks  (PutFrameFunc func, void* args);
    void SetCallbacks  (GetResultFunc func, void* args);
    void SetCallbacks  (ProcResultFunc func);

private:
    GstElement* CreateUridecodebin();
    GstElement* CreateV4l2src();

public:
    PutFrameFunc        m_putFrameFunc;
    void*               m_putFrameArgs;
    GetResultFunc       m_getResultFunc;
    void*               m_getResultArgs;
    ProcResultFunc      m_procResultFunc;


    uint64_t            m_queue00_src_probe;     /* probe for nvvideoconvert sync and osd process */
    uint64_t            m_cvt_sink_probe;        /* probe for inference rate control */
    uint64_t            m_cvt_src_probe;         /* probe for convert lock sync */
    uint64_t            m_dec_sink_probe;        /* probe for seek */

    uint64_t            m_prev_accumulated_base;    /* PTS offset for seek */
    uint64_t            m_accumulated_base;         /* PTS offset for seek */

    VideoPipelineConfig m_config;

    volatile int        m_syncCount;
    volatile bool       m_isExited;
    GMutex              m_syncMuxtex;
    GCond               m_syncCondition;
    GMutex              m_mutex;
    bool                m_dumped;           /* dump pipeline to dot */

    GstElement*         m_pipeline;
    GstElement*         m_source;           /* uridecodebin or v4l2src */
    GstElement*         m_streammuxer;      /* nvstreammux */
    GstElement*         m_capfilter0;        /* image/jpeg */
    GstElement*         m_decoder;          /* nvv4l2decoder or nvjpegdec */
    GstElement*         m_tee0;             /* display branch & inference branch */
    GstElement*         m_queue00;          /* for display branch */ 
    GstElement*         m_fakesink;         /* sync stream when disabled display */
    GstElement*         m_tee1;             /* nveglglessink branch & rtmpsink branch */
    GstElement*         m_queue10;          /* for nveglglessink branch */
    GstElement*         m_nveglglessink;    /* nveglglessink */
    GstElement*         m_queue11;          /* for rtmpsink branch */
    GstElement*         m_nvvideoconvert0;  /* convert RGBA(nvjpegdec) to NV12 */
    GstElement*         m_capfilter1;
    GstElement*         m_encoder;          /* nvv4l2h264enc */
    GstElement*         m_h264parse;        /* h264parse */
    GstElement*         m_flvmux;           /* flvmux */
    GstElement*         m_rtmpsink;         /* rtmpsink */
    GstElement*         m_queue01;          /* for inference branch */
    GstElement*         m_nvvideoconvert1;  /* convert NV12(nvv4l2decoder) to RGBA */
    GstElement*         m_capfilter2;
    GstElement*         m_appsink;          /* for AI inference */
};


/*

gst-launch-1.0 uridecodebin uri="rtsp://127.0.0.1:554/live/test" ! tee name=tee0 ! queue ! \
tee name=tee1 ! queue ! nveglglessink tee1. ! queue ! nvvideoconvert ! video/x-raw(memory:NVMM),format=NV12 ! \
nvv4l2h264enc bitrate=4000000 iframeinterval=30 ! flvmux ! rtmpsink location=rtmp://127.0.0.1:1935/live/test \
tee0. ! queue ! nvvideoconvert ! video/x-raw(memory:NVMM),format=RGBA ! appsink

gst-launch-1.0 v4l2src device=/dev/video0 ! image/jpeg,format=MJPG,width=1920,height=1080,framerate=30/1 ! \
nvjpegdec ! tee name=tee0 ! queue ! tee name=tee1 ! queue ! nveglglessink tee1. ! queue ! nvvideoconvert ! \
video/x-raw(memory:NVMM),format=NV12 ! nvv4l2h264enc bitrate=4000000 iframeinterval=30 ! flvmux ! \
rtmpsink location=rtmp://127.0.0.1:1935/live/test tee0. ! queue ! nvvideoconvert ! video/x-raw(memory:NVMM),format=RGBA ! appsink

*/

================================================
FILE: ai_integration/deepstream/sp_mp4.json
================================================
{
    "name":"pipeline0",
    "input-config":{
        "type":0,
        "stream":{
            "uri":"file:///home/ricardo/workSpace/gstreamer-example/ai_integration/test.mp4",
            "file-loop":false,
            "rtsp-latency":0,
            "rtp-protocol":4
        },
        "usb-camera":{
            "device":"/dev/video0",
            "format":"MJPG",
            "width":1920,
            "height":1080,
            "framerate-n":30,
            "framerate-d":1
        }
    },
    "output-config":{
        "display":{
            "enable":true,
            "sync":true,
            "left":0,
            "top":0,
            "width":1920,
            "height":1080
        },
        "rtmp":{
            "enable":true,
            "bitrate":100000,
            "iframeinterval":30,
            "uri":"rtmp://127.0.0.1:1935/live/test"
        },
        "inference":{
            "enable":true,
            "memory-type":3,
            "format":"RGBA"
        }
    }
}

================================================
FILE: ai_integration/deepstream/sp_rtsp.json
================================================
{
    "name":"pipeline0",
    "input-config":{
        "type":1,
        "stream":{
            "uri":"rtsp://127.0.0.1:554/live/test",
            "file-loop":false,
            "rtsp-latency":0,
            "rtp-protocol":4
        },
        "usb-camera":{
            "device":"/dev/video0",
            "format":"MJPG",
            "width":1920,
            "height":1080,
            "framerate-n":30,
            "framerate-d":1
        }
    },
    "output-config":{
        "display":{
            "enable":true,
            "sync":true,
            "left":0,
            "top":0,
            "width":1920,
            "height":1080
        },
        "rtmp":{
            "enable":false,
            "bitrate":100000,
            "iframeinterval":30,
            "uri":"rtmp://127.0.0.1:1935/live/test"
        },
        "inference":{
            "enable":true,
            "memory-type":3,
            "format":"RGBA"
        }
    }
}

================================================
FILE: ai_integration/deepstream/sp_uc.json
================================================
{
    "name":"pipeline0",
    "input-config":{
        "type":2,
        "stream":{
            "uri":"rtsp://127.0.0.1:554/live/test",
            "file-loop":false,
            "rtsp-latency":0,
            "rtp-protocol":4
        },
        "usb-camera":{
            "device":"/dev/video0",
            "format":"MJPG",
            "width":1920,
            "height":1080,
            "framerate-n":30,
            "framerate-d":1
        }
    },
    "output-config":{
        "display":{
            "enable":true,
            "sync":true,
            "left":0,
            "top":0,
            "width":1920,
            "height":1080
        },
        "rtmp":{
            "enable":true,
            "bitrate":100000,
            "iframeinterval":30,
            "uri":"rtmp://127.0.0.1:1935/live/test"
        },
        "inference":{
            "enable":true,
            "memory-type":3,
            "format":"RGBA"
        }
    }
}

================================================
FILE: ai_integration/deepstream/src/VideoPipeline.cpp
================================================
/*
 * @Description: Implement of VideoPipeline on DeepStream.
 * @version: 2.0
 * @Author: Ricardo Lu<shenglu1202@163.com>
 * @Date: 2022-07-15 22:07:19
 * @LastEditors: Ricardo Lu
 * @LastEditTime: 2023-02-06 21:04:48
 */

#include "VideoPipeline.h"

static GstPadProbeReturn cb_sync_before_buffer_probe(
    GstPad* pad,
    GstPadProbeInfo* info,
    gpointer user_data)
{
    // LOG_INFO("cb_sync_before_buffer_probe called");

    VideoPipeline* vp = static_cast<VideoPipeline*>(user_data);
    GstBuffer* buffer = (GstBuffer*) info->data;

    return GST_PAD_PROBE_OK;
}

static GstPadProbeReturn cb_sync_after_buffer_probe(
    GstPad* pad,
    GstPadProbeInfo* info,
    gpointer user_data)
{
    // LOG_INFO("cb_sync_after_buffer_probe called");

    VideoPipeline* vp = static_cast<VideoPipeline*>(user_data);
    GstBuffer* buffer = (GstBuffer*) info->data;

    // sync
    if (info->type & GST_PAD_PROBE_TYPE_BUFFER) {
        g_mutex_lock(&vp->m_syncMuxtex);
        g_atomic_int_inc(&vp->m_syncCount);
        g_cond_signal(&vp->m_syncCondition);
        g_mutex_unlock(&vp->m_syncMuxtex);
    }

    return GST_PAD_PROBE_OK;
}

static GstPadProbeReturn cb_queue0_probe(
    GstPad* pad, 
    GstPadProbeInfo* info,
    gpointer user_data)
{
    // LOG_INFO("cb_queue0_probe called");

    VideoPipeline* vp = static_cast<VideoPipeline*>(user_data);
    GstBuffer* buffer = (GstBuffer*) info->data;

    // sync
    // if (info->type & GST_PAD_PROBE_TYPE_BUFFER && !vp->m_isExited) {
    //     g_mutex_lock(&vp->m_syncMuxtex);
    //     while(g_atomic_int_get(&vp->m_syncCount) <= 0)
    //         g_cond_wait(&vp->m_syncCondition, &vp->m_syncMuxtex);
    //     if (!g_atomic_int_dec_and_test(&vp->m_syncCount)) {
    //         //LOG_INFO("m_syncCount:{}/{}", vp->m_syncCount,
    //         //    pipeline_id);
    //     }
    //     g_mutex_unlock(&vp->m_syncMuxtex);
    // }

    // osd the result
    if (vp->m_getResultFunc) {
        const std::shared_ptr<std::vector<OSDObject> > results =
            vp->m_getResultFunc(vp->m_getResultArgs);
        // to-do: construct nvdsosd metadata
        if (vp->m_procResultFunc) {
            vp->m_procResultFunc(buffer, results);
        }
    }

    // LOG_INFO("cb_queue0_probe exited");

    return GST_PAD_PROBE_OK;
}

static GstFlowReturn cb_appsink_new_sample(
    GstElement* appsink,
    gpointer user_data)
{
    // LOG_INFO("cb_appsink_new_sample called");

    VideoPipeline* vp = static_cast<VideoPipeline*>(user_data);
    GstSample* sample = nullptr;

    if (!vp->m_dumped) {
        GST_DEBUG_BIN_TO_DOT_FILE(GST_BIN(vp->m_pipeline), GST_DEBUG_GRAPH_SHOW_ALL, "video-pipeline");
        vp->m_dumped = true;
    }

    g_signal_emit_by_name(appsink, "pull-sample", &sample);
    if (!sample) {
        return GST_FLOW_OK;
    }

    if (vp->m_putFrameFunc) {
        vp->m_putFrameFunc(sample, vp->m_putFrameArgs);
    } else {
        gst_sample_unref(sample);
    }

    return GST_FLOW_OK;
}
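
// A PutFrameFunc installed via SetCallbacks() receives ownership of the
// pulled sample and must unref it when done (the callback above only
// unrefs when no consumer is registered). A minimal consumer sketch,
// assuming a CPU-mappable buffer (names are hypothetical; NVMM surfaces
// would normally go through NvBufSurface instead):
//
//     static void MyPutFrame(GstSample* sample, void* args) {
//         GstBuffer* buffer = gst_sample_get_buffer(sample);
//         GstMapInfo map;
//         if (gst_buffer_map(buffer, &map, GST_MAP_READ)) {
//             // consume map.data / map.size here
//             gst_buffer_unmap(buffer, &map);
//         }
//         gst_sample_unref(sample);
//     }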

static gboolean cb_seek_decoded_file(gpointer user_data)
{
    VideoPipeline* vp = static_cast<VideoPipeline*>(user_data);

    LOG_INFO("============================================");
    LOG_INFO("cb_seek_decoded_file called({})", pipeline_id);
    LOG_INFO("============================================");

    gst_element_set_state(vp->m_pipeline, GST_STATE_PAUSED);

    if (!gst_element_seek(vp->m_pipeline, 1.0, GST_FORMAT_TIME,
        (GstSeekFlags)(GST_SEEK_FLAG_KEY_UNIT | GST_SEEK_FLAG_FLUSH),
        GST_SEEK_TYPE_SET, 0, GST_SEEK_TYPE_NONE, GST_CLOCK_TIME_NONE)) {
        LOG_WARN("Failed to seek the source file in pipeline");
    }

    gst_element_set_state(vp->m_pipeline, GST_STATE_PLAYING);

    return false;
}

static GstPadProbeReturn cb_reset_stream_probe(
    GstPad* pad,
    GstPadProbeInfo* info,
    gpointer user_data)
{
    VideoPipeline* vp = static_cast<VideoPipeline*>(user_data);

    if (info->type & GST_PAD_PROBE_TYPE_BUFFER) {
        GST_BUFFER_PTS(GST_BUFFER(info->data)) += vp->m_prev_accumulated_base;
    }

    if (info->type & GST_PAD_PROBE_TYPE_EVENT_BOTH) {
        GstEvent* event = GST_EVENT(info->data);
        if (GST_EVENT_TYPE(event) == GST_EVENT_SEGMENT) {
            GstSegment *segment;
            gst_event_parse_segment(event, (const GstSegment **) &segment);
            segment->base = vp->m_accumulated_base;
            vp->m_prev_accumulated_base = vp->m_accumulated_base;
            vp->m_accumulated_base += segment->stop;
        } else if (GST_EVENT_TYPE(event) == GST_EVENT_EOS) {
            g_timeout_add(1, cb_seek_decoded_file, vp);
        }

        switch(GST_EVENT_TYPE(event)) {
            case GST_EVENT_EOS:
            case GST_EVENT_QOS:
            case GST_EVENT_SEGMENT:
            case GST_EVENT_FLUSH_START:
            case GST_EVENT_FLUSH_STOP:
                return GST_PAD_PROBE_DROP;
            default:
                break;
        }
    }

    return GST_PAD_PROBE_OK;
}
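
// How the file loop stays seamless: after each flushing seek the decoder
// restarts with PTS at 0, so the probe above shifts every buffer by the
// base accumulated from previous segments and drops the SEGMENT/EOS/flush
// events so downstream never observes the restart. With a 10 s file
// (illustrative numbers):
//   pass 1: raw PTS 0-10s, offset  0s -> downstream PTS  0-10s
//   pass 2: raw PTS 0-10s, offset 10s -> downstream PTS 10-20s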

static void cb_decodebin_child_added(GstChildProxy* child_proxy, GObject* object,
    gchar* name, gpointer user_data)
{
    VideoPipeline* vp = static_cast<VideoPipeline*>(user_data);

    LOG_INFO("cb_decodebin_child_added called({},'{}' added)", pipeline_id, name);

    if (g_strrstr(name, "nvv4l2decoder") == name) {
        g_object_set(object, "cudadec-memtype", 2, nullptr);

        if (g_strstr_len(vp->m_config.src_uri.c_str(), -1, "file:/") ==
            vp->m_config.src_uri.c_str() && vp->m_config.file_loop) {
            GstPad* gst_pad = gst_element_get_static_pad(GST_ELEMENT(object), "sink");
            vp->m_dec_sink_probe = gst_pad_add_probe(gst_pad, (GstPadProbeType)(
                GST_PAD_PROBE_TYPE_EVENT_BOTH | GST_PAD_PROBE_TYPE_EVENT_FLUSH |
                GST_PAD_PROBE_TYPE_BUFFER), cb_reset_stream_probe, static_cast<void*>(vp), nullptr);
            gst_object_unref(gst_pad);

            vp->m_decoder = GST_ELEMENT(object);
            gst_object_ref(object);
        } else if (g_strstr_len(vp->m_config.src_uri.c_str(), -1, "rtsp:/") ==
            vp->m_config.src_uri.c_str()) {
            vp->m_decoder = GST_ELEMENT(object);
            gst_object_ref(object);
        }
    } else if ((g_strrstr(name, "h264parse") == name) ||
            (g_strrstr(name, "h265parse") == name)) {
        LOG_INFO("set config-interval of {} to {}", name, -1);
        g_object_set(object, "config-interval", -1, nullptr);
    }

}

static void cb_uridecodebin_source_setup(GstElement* object, GstElement* source,
    gpointer user_data)
{
    LOG_INFO("cb_uridecodebin_source_setup called");
    VideoPipeline* vp = static_cast<VideoPipeline*>(user_data);

    if (g_object_class_find_property(G_OBJECT_GET_CLASS(source), "latency")) {
        LOG_INFO("set latency of source to {}", vp->m_config.rtsp_latency);
        g_object_set(G_OBJECT(source), "latency", vp->m_config.rtsp_latency, nullptr);
    }

    if (g_object_class_find_property(G_OBJECT_GET_CLASS(source), "protocols")) {
        LOG_INFO("set protocols of source to {}", vp->m_config.rtp_protocol);
        g_object_set(G_OBJECT(source), "protocols", vp->m_config.rtp_protocol, nullptr);
    }
}

static void cb_uridecodebin_pad_added(GstElement* decodebin, GstPad* pad,
    gpointer user_data)
{
    VideoPipeline* vp = static_cast<VideoPipeline*>(user_data);
    GstPad* sinkpad = nullptr;

    GstCaps* caps = gst_pad_query_caps(pad, nullptr);
    const GstStructure* str = gst_caps_get_structure(caps, 0);
    const gchar* name = gst_structure_get_name(str);
    
    LOG_INFO("cb_uridecodebin_pad_added called {}", name);
    gchar* structure_str = gst_structure_to_string(str);
    LOG_INFO("structure:{}", structure_str);
    g_free(structure_str);

    if (g_str_has_prefix (name, "video/x-raw")) {
        if (vp->m_config.enable_hdmi || vp->m_config.enable_rtmp || vp->m_config.enable_appsink) {
            sinkpad = gst_element_get_static_pad(vp->m_tee0, "sink");
        } else {
            sinkpad = gst_element_get_static_pad(vp->m_fakesink, "sink");
        }

        if (sinkpad && gst_pad_link(pad, sinkpad) == GST_PAD_LINK_OK) {
            LOG_INFO("Successfully linked uridecodebin into pipeline");
        } else {
            LOG_ERROR("Failed to link uridecodebin to pipeline");
        }

        if (sinkpad) {
            gst_object_unref(sinkpad);
        }
    }

    gst_caps_unref(caps);
}

static void cb_uridecodebin_child_added(GstChildProxy* child_proxy,
    GObject* object, gchar* name, gpointer user_data)
{
    VideoPipeline* vp = static_cast<VideoPipeline*>(user_data);

    LOG_INFO("cb_uridecodebin_child_added called({},'{}' added)", pipeline_id, name);

    if (g_strrstr(name, "decodebin") == name) {
        g_signal_connect(G_OBJECT(object), "child-added",
            G_CALLBACK(cb_decodebin_child_added), vp);
    }

}

VideoPipeline::VideoPipeline(const VideoPipelineConfig& config)
{
    m_config = config;
    m_syncCount = 0;
    m_isExited = false;
    m_queue00_src_probe = -1;
    m_cvt_sink_probe = -1;
    m_cvt_src_probe = -1;
    m_dec_sink_probe = -1;
    m_prev_accumulated_base = 0;
    m_accumulated_base = 0;
    m_dumped = false;

    m_putFrameFunc = nullptr;
    m_putFrameArgs = nullptr;
    m_getResultFunc = nullptr;
    m_getResultArgs = nullptr;
    m_procResultFunc = nullptr;

    g_mutex_init(&m_syncMuxtex);
    g_cond_init(&m_syncCondition);
    g_mutex_init(&m_mutex);
}

VideoPipeline::~VideoPipeline()
{
    Destroy();
}

GstElement* VideoPipeline::CreateUridecodebin()
{
    if (!(m_source = gst_element_factory_make("uridecodebin", "uridecodebin0"))) {
        LOG_ERROR("Failed to create element uridecodebin named uridecodebin0");
        return nullptr;
    }

    g_object_set (G_OBJECT(m_source), "uri", m_config.src_uri.c_str(), nullptr);
    LOG_INFO("Set uri of uridecodebin to {}", m_config.src_uri);

    g_signal_connect(G_OBJECT(m_source), "source-setup", G_CALLBACK(
        cb_uridecodebin_source_setup), this);
    g_signal_connect(G_OBJECT(m_source), "pad-added",    G_CALLBACK(
        cb_uridecodebin_pad_added),    this);
    g_signal_connect(G_OBJECT(m_source), "child-added",  G_CALLBACK(
        cb_uridecodebin_child_added),  this);

    gst_bin_add_many(GST_BIN(m_pipeline), m_source, nullptr);

    return m_source;
}

GstElement* VideoPipeline::CreateV4l2src()
{
    if (!(m_source = gst_element_factory_make("v4l2src", "v4l2src0"))) {
        LOG_ERROR("Failed to create element v4l2src named v4l2src0");
        return nullptr;
    }
    g_object_set (G_OBJECT (m_source), "device", m_config.src_device.c_str(), nullptr);
    gst_bin_add_many (GST_BIN (m_pipeline), m_source, nullptr);

    GstCaps* caps = gst_caps_new_simple ("image/jpeg",
            "width", G_TYPE_INT, m_config.src_width,
            "height", G_TYPE_INT, m_config.src_height,
            "framerate", GST_TYPE_FRACTION, m_config.src_framerate_n, m_config.src_framerate_d,
            "format", G_TYPE_STRING, m_config.src_format.c_str(), nullptr);

    if (!(m_capfilter0 = gst_element_factory_make ("capsfilter", "capfilter0"))) {
        LOG_ERROR("Failed to create element capsfilter named capfilter0");
        return nullptr;
    }

    g_object_set(G_OBJECT(m_capfilter0), "caps", caps, nullptr);
    gst_caps_unref(caps);

    gst_bin_add_many (GST_BIN (m_pipeline), m_capfilter0, nullptr);

    if (!(m_decoder = gst_element_factory_make("jpegdec", "jpegdec0"))) {
        LOG_ERROR("Failed to create element jpegdec named jpegdec0");
        return nullptr;
    }
    gst_bin_add_many(GST_BIN(m_pipeline), m_decoder, nullptr);

    if (!gst_element_link_many(m_source, m_capfilter0, m_decoder, nullptr)) {
        LOG_ERROR("Failed to link v4l2src0->capfilter0->jpegdec0");
        return nullptr;
    }

    return m_decoder;
}
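
// The branch built above is roughly equivalent to this launch line
// (probe omitted; caps values come from the "usb-camera" config section
// and are illustrative here):
//   gst-launch-1.0 v4l2src device=/dev/video0 ! \
//     image/jpeg,width=1280,height=720,framerate=30/1 ! jpegdec ! ...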

bool VideoPipeline::Create()
{
    GstCaps* cvt_caps;
    GstPad* gst_pad;
    GstCapsFeatures* feature;
    GstElement* input;

    if (!(m_pipeline = gst_pipeline_new("video-pipeline"))) {
        LOG_ERROR("Failed to create pipeline named video-pipeline");
        goto exit;
    }
    gst_pipeline_set_auto_flush_bus(GST_PIPELINE(m_pipeline), true);

    input = m_config.input_type == VideoType::USB_CAMERE ? CreateV4l2src() : CreateUridecodebin();
    if (!input) {
        LOG_ERROR("Can't process input source.");
        goto exit;
    }

    if (!(m_tee0 = gst_element_factory_make("tee", "tee0"))) {
        LOG_ERROR("Failed to create element tee named tee0");
        goto exit;
    }
    gst_bin_add_many(GST_BIN(m_pipeline), m_tee0, nullptr);

    if (m_config.input_type == VideoType::USB_CAMERE) {
        if (!gst_element_link_many(input, m_tee0, nullptr)) {
            LOG_ERROR("Failed to link jpegdec0->tee0");
            goto exit;
        }
    }

    if (!(m_queue00 = gst_element_factory_make("queue", "queue00"))) {
        LOG_ERROR("Failed to create element queue named queue00");
        goto exit;
    }

    // add probe to queue0
    gst_pad = gst_element_get_static_pad(m_queue00, "src");
    m_queue00_src_probe = gst_pad_add_probe(gst_pad, (GstPadProbeType)(
                        GST_PAD_PROBE_TYPE_BUFFER), cb_queue0_probe, 
                        static_cast<void*>(this), nullptr);
    gst_object_unref(gst_pad);

    gst_bin_add_many(GST_BIN(m_pipeline), m_queue00, nullptr);

    if (!gst_element_link_many(m_tee0, m_queue00, nullptr)) {
        LOG_ERROR("Failed to link tee0->queue00");
        goto exit;
    }

    if (!m_config.enable_hdmi && !m_config.enable_rtmp) {
        if (!(m_fakesink = gst_element_factory_make("fakesink", "fakesink0"))) {
            LOG_ERROR("Failed to create element fakesink named fakesink0");
            goto exit;
        }
        g_object_set(G_OBJECT(m_fakesink), "sync", true, nullptr);

        gst_bin_add_many(GST_BIN(m_pipeline), m_fakesink, nullptr);

        if (!gst_element_link_many(m_queue00, m_fakesink, nullptr)) {
            LOG_ERROR("Failed to link queue00->fakesink0");
            goto exit;
        }
    } else {
        if (!(m_tee1 = gst_element_factory_make("tee", "tee1"))) {
            LOG_ERROR("Failed to create element tee named tee1");
            goto exit;
        }
        gst_bin_add_many(GST_BIN(m_pipeline), m_tee1, nullptr);

        if (!gst_element_link_many(m_queue00, m_tee1, nullptr)) {
            LOG_ERROR("Failed to link queue00->tee1");
            goto exit;
        }

        if (m_config.enable_hdmi) {
            if (!(m_queue10 = gst_element_factory_make("queue", "queue10"))) {
                LOG_ERROR("Failed to create element queue named queue10");
                goto exit;
            }
            gst_bin_add_many(GST_BIN(m_pipeline), m_queue10, nullptr);

            if (!(m_nveglglessink = gst_element_factory_make("nveglglessink", "nveglglessink0"))) {
                LOG_ERROR("Failed to create element nveglglessink named nveglglessink0");
                goto exit;
            }
            g_object_set(G_OBJECT(m_nveglglessink),
                "sync", m_config.hdmi_sync,
                "window-x", m_config.window_x,
                "window-y", m_config.window_y,
                "window-width", m_config.window_width,
                "window-height", m_config.window_height, nullptr);

            gst_bin_add_many(GST_BIN(m_pipeline), m_nveglglessink, nullptr);

            if (!gst_element_link_many(m_tee1, m_queue10, m_nveglglessink, nullptr)) {
                LOG_ERROR("Failed to link tee1->queue10->nveglglessink0");
                goto exit;
            }
        }

        if (m_config.enable_rtmp) {
            if (!(m_queue11 = gst_element_factory_make("queue", "queue11"))) {
                LOG_ERROR("Failed to create element queue named queue11");
                goto exit;
            }
            gst_bin_add_many(GST_BIN(m_pipeline), m_queue11, nullptr);

            if (!(m_nvvideoconvert0 = gst_element_factory_make("nvvideoconvert", "nvvideoconvert0"))) {
                LOG_ERROR("Failed to create element nvvideoconvert named nvvideoconvert0");
                goto exit;
            }
            g_object_set(G_OBJECT(m_nvvideoconvert0), "nvbuf-memory-type", m_config.cvt_memory_type, nullptr);
            gst_bin_add_many(GST_BIN(m_pipeline), m_nvvideoconvert0, nullptr);

            cvt_caps = gst_caps_new_simple("video/x-raw", "format", G_TYPE_STRING, "NV12", nullptr);
            feature = gst_caps_features_new("memory:NVMM", nullptr);
            gst_caps_set_features(cvt_caps, 0, feature);

            if (!(m_capfilter1 = gst_element_factory_make("capsfilter", "capfilter1"))) {
                LOG_ERROR("Failed to create element capsfilter named capfilter1");
                goto exit;
            }

            g_object_set(G_OBJECT(m_capfilter1), "caps", cvt_caps, nullptr);
            gst_caps_unref(cvt_caps);

            gst_bin_add_many(GST_BIN(m_pipeline), m_capfilter1, nullptr);

            if (!(m_encoder = gst_element_factory_make("nvv4l2h264enc", "nvv4l2h264enc0"))) {
                LOG_ERROR("Failed to create element nvv4l2h264enc named nvv4l2h264enc0");
                goto exit;
            }
            g_object_set(G_OBJECT(m_encoder), "bitrate", m_config.enc_bitrate,
                "iframeinterval", m_config.enc_iframe_interval, nullptr);
            gst_bin_add_many(GST_BIN(m_pipeline), m_encoder, nullptr);

            if (!(m_h264parse = gst_element_factory_make("h264parse", "h264parse0"))) {
                LOG_ERROR("Failed to create element h264parse named h264parse0");
                goto exit;
            }
            gst_bin_add_many(GST_BIN(m_pipeline), m_h264parse, nullptr);

            if (!(m_flvmux = gst_element_factory_make("flvmux", "flvmux0"))) {
                LOG_ERROR("Failed to create element flvmux named flvmux0");
                goto exit;
            }
            gst_bin_add_many(GST_BIN(m_pipeline), m_flvmux, nullptr);

            if (!(m_rtmpsink = gst_element_factory_make("rtmpsink", "rtmpsink"))) {
                LOG_ERROR("Failed to create element rtmpsink named rtmpsink");
                goto exit;
            }
            g_object_set(G_OBJECT(m_rtmpsink), "location", m_config.rtmp_uri.c_str(), nullptr);
            gst_bin_add_many(GST_BIN(m_pipeline), m_rtmpsink, nullptr);

            if (!gst_element_link_many(m_tee1, m_queue11, m_nvvideoconvert0,
                m_capfilter1, m_encoder, m_h264parse, m_flvmux, m_rtmpsink, nullptr)) {
                LOG_ERROR("Failed to link tee1->queue11->nvvideoconvert0->capfilter1->nvv4l2h264enc0->h264parse0->flvmux0->rtmpsink");
                goto exit;
            }
        }
    }

    if (m_config.enable_appsink) {
        if (!(m_queue01 = gst_element_factory_make("queue", "queue01"))) {
            LOG_ERROR("Failed to create element queue named queue01");
            goto exit;
        }
        gst_bin_add_many(GST_BIN(m_pipeline), m_queue01, nullptr);

        if (!(m_nvvideoconvert1 = gst_element_factory_make("nvvideoconvert", "nvvideoconvert1"))) {
            LOG_ERROR("Failed to create element nvvideoconvert named nvvideoconvert1");
            goto exit;
        }

        g_object_set(G_OBJECT(m_nvvideoconvert1), "nvbuf-memory-type", m_config.cvt_memory_type, nullptr);

        gst_bin_add_many(GST_BIN(m_pipeline), m_nvvideoconvert1, nullptr);

        cvt_caps = gst_caps_new_simple("video/x-raw", "format", G_TYPE_STRING, m_config.cvt_format.c_str(), nullptr);
        feature = gst_caps_features_new("memory:NVMM", nullptr);
        gst_caps_set_features(cvt_caps, 0, feature);

        if (!(m_capfilter2 = gst_element_factory_make("capsfilter", "capfilter2"))) {
            LOG_ERROR("Failed to create element capsfilter named capfilter2");
            goto exit;
        }

        g_object_set(G_OBJECT(m_capfilter2), "caps", cvt_caps, nullptr);
        gst_caps_unref(cvt_caps);

        gst_bin_add_many(GST_BIN(m_pipeline), m_capfilter2, nullptr);

        // gst_pad = gst_element_get_static_pad(m_nvvideoconvert1, "sink");
        // m_cvt_sink_probe = gst_pad_add_probe(gst_pad, (GstPadProbeType)(
        //                     GST_PAD_PROBE_TYPE_BUFFER), cb_sync_before_buffer_probe,
        //                     static_cast<void*>(this), nullptr);
        // gst_object_unref(gst_pad);

        // gst_pad = gst_element_get_static_pad(m_nvvideoconvert1, "src");
        // m_cvt_sink_probe = gst_pad_add_probe(gst_pad, (GstPadProbeType)(
        //                     GST_PAD_PROBE_TYPE_BUFFER), cb_sync_after_buffer_probe,
        //                     static_cast<void*>(this), nullptr);
        // gst_object_unref(gst_pad);

        if (!(m_appsink = gst_element_factory_make("appsink", "appsink"))) {
            LOG_ERROR("Failed to create element appsink named appsink");
            goto exit;
        }

        g_object_set(m_appsink, "emit-signals", true, nullptr);

        g_signal_connect(m_appsink, "new-sample",
            G_CALLBACK(cb_appsink_new_sample), static_cast<void*>(this));

        gst_bin_add_many(GST_BIN(m_pipeline), m_appsink, nullptr);

        if (!gst_element_link_many(m_tee0, m_queue01, m_nvvideoconvert1, m_capfilter2, m_appsink, nullptr)) {
            LOG_ERROR("Failed to link tee0->queue01->nvvideoconvert1->capfilter2->appsink");
            goto exit;
        }
    }

    return true;

exit:
    LOG_ERROR("Failed to create video pipeline");
    return false;
}
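
// Topology produced by Create() with every output enabled:
//
//   input -> tee0 -+-> queue00 -> tee1 -+-> queue10 -> nveglglessink0
//                  |                    +-> queue11 -> nvvideoconvert0 ->
//                  |                        capfilter1 -> nvv4l2h264enc0 ->
//                  |                        h264parse0 -> flvmux0 -> rtmpsink
//                  +-> queue01 -> nvvideoconvert1 -> capfilter2 -> appsink
//
// With both hdmi and rtmp disabled, queue00 feeds fakesink0 instead of tee1.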

bool VideoPipeline::Start(void)
{
    LOG_INFO("Start pipeline called");

    if (GST_STATE_CHANGE_FAILURE == gst_element_set_state(m_pipeline,
        GST_STATE_PLAYING)) {
        LOG_ERROR("Failed to set pipeline to playing state");
        return false;
    }

    return true;
}

bool VideoPipeline::Pause(void)
{
    GstState state, pending;

    LOG_INFO("Pause pipeline called");

    if (GST_STATE_CHANGE_ASYNC == gst_element_get_state(
        m_pipeline, &state, &pending, 5 * GST_SECOND / 1000)) {
        LOG_WARN("Pipeline state change still in progress");
        return false;
    }

    if (state == GST_STATE_PAUSED) {
        return true;
    } else if (state == GST_STATE_PLAYING) {
        gst_element_set_state(m_pipeline, GST_STATE_PAUSED);
        gst_element_get_state(m_pipeline, &state, &pending,
            GST_CLOCK_TIME_NONE);
        return true;
    } else {
        LOG_WARN("Unexpected pipeline state: {}",
            static_cast<int>(state));
        return false;
    }
}

bool VideoPipeline::Resume(void)
{
    GstState state, pending;

    LOG_INFO("Resume pipeline called");

    if (GST_STATE_CHANGE_ASYNC == gst_element_get_state(
        m_pipeline, &state, &pending, 5 * GST_SECOND / 1000)) {
        LOG_WARN("Pipeline state change still in progress");
        return false;
    }

    if (state == GST_STATE_PLAYING) {
        return true;
    } else if (state == GST_STATE_PAUSED) {
        gst_element_set_state(m_pipeline, GST_STATE_PLAYING);
        gst_element_get_state(m_pipeline, &state, &pending,
            GST_CLOCK_TIME_NONE);
        return true;
    } else {
        LOG_WARN("Unexpected pipeline state: {}",
            static_cast<int>(state));
        return false;
    }
}

void VideoPipeline::Destroy(void)
{
    // Release the request pads still held by the tees. Iterate the
    // existing src pads rather than requesting "src_%u" in a loop (each
    // request would mint a brand-new pad, so that loop never ends), and
    // skip tees that were never created.
    for (GstElement* tee : { m_tee0, m_tee1 }) {
        if (!tee) continue;

        GstIterator* it = gst_element_iterate_src_pads(tee);
        GValue item = G_VALUE_INIT;
        while (gst_iterator_next(it, &item) == GST_ITERATOR_OK) {
            GstPad* teeSrcPad = GST_PAD(g_value_get_object(&item));
            gst_element_release_request_pad(tee, teeSrcPad);
            g_value_unset(&item);
            gst_iterator_resync(it);  // pad list changed, restart iteration
        }
        gst_iterator_free(it);
    }

    if (m_pipeline) {
        m_isExited = true;
        g_mutex_lock(&m_syncMuxtex);
        g_atomic_int_inc(&m_syncCount);
        g_cond_signal(&m_syncCondition);
        g_mutex_unlock(&m_syncMuxtex);

        gst_element_set_state(m_pipeline, GST_STATE_NULL);

        gst_object_unref(m_pipeline);

        m_pipeline = nullptr;
    }

    // if (m_cvt_sink_probe != -1 && m_nvvideoconvert1) {
    //     GstPad *gstpad = gst_element_get_static_pad(m_nvvideoconvert1, "sink");
    //     if (!gstpad) {
    //         LOG_ERROR("Could not find '{}' in '{}'", "sink", GST_ELEMENT_NAME(m_nvvideoconvert1));
    //     }
    //     gst_pad_remove_probe(gstpad, m_cvt_sink_probe);
    //     gst_object_unref(gstpad);
    //     m_cvt_sink_probe = -1;
    // }

    // if (m_cvt_src_probe != -1 && m_nvvideoconvert1) {
    //     GstPad *gstpad = gst_element_get_static_pad(m_nvvideoconvert1, "src");
    //     if (!gstpad) {
    //         LOG_ERROR("Could not find '{}' in '{}'", "src", GST_ELEMENT_NAME(m_nvvideoconvert1));
    //     }
    //     gst_pad_remove_probe(gstpad, m_cvt_src_probe);
    //     gst_object_unref(gstpad);
    //     m_cvt_src_probe = -1;
    // }

    if (m_dec_sink_probe != -1 && m_decoder) {
        GstPad *gstpad = gst_element_get_static_pad(m_decoder, "sink");
        if (!gstpad) {
            LOG_ERROR("Could not find '{}' in '{}'", "sink", GST_ELEMENT_NAME(m_decoder));
        }
        gst_pad_remove_probe(gstpad, m_dec_sink_probe);
        gst_object_unref(gstpad);
        m_dec_sink_probe = -1;
    }

    if (m_queue00_src_probe != -1 && m_queue00) {
        GstPad *gstpad = gst_element_get_static_pad(m_queue00, "src");
        if (!gstpad) {
            LOG_ERROR("Could not find '{}' in '{}'", "src", GST_ELEMENT_NAME(m_queue00));
        }
        gst_pad_remove_probe(gstpad, m_queue00_src_probe);
        gst_object_unref(gstpad);
        m_queue00_src_probe = -1;
    }

    g_mutex_clear(&m_mutex);
    g_mutex_clear(&m_syncMuxtex);
    g_cond_clear(&m_syncCondition);
}

void VideoPipeline::SetCallbacks(PutFrameFunc func, void* args)
{
    LOG_INFO("set PutFrameFunc callback called");

    m_putFrameFunc = func;
    m_putFrameArgs = args;
}

void VideoPipeline::SetCallbacks(GetResultFunc func, void* args)
{
    LOG_INFO("set GetResultFunc callback called");

    m_getResultFunc = func;
    m_getResultArgs = args;
}

void VideoPipeline::SetCallbacks(ProcResultFunc func)
{
    LOG_INFO("set ProcResultFunc callback called");

    m_procResultFunc = func;
}


================================================
FILE: ai_integration/deepstream/src/main.cpp
================================================
/*
 * @Description: Test program of VideoPipeline.
 * @version: 1.0
 * @Author: Ricardo Lu<shenglu1202@163.com>
 * @Date: 2022-07-15 22:07:33
 * @LastEditors: Ricardo Lu
 * @LastEditTime: 2023-02-06 20:45:00
 */

#include <sys/stat.h>
#include <iostream>
#include <sstream>
#include <fstream>

#include <jsoncpp/json/json.h>
#include <gflags/gflags.h>
#include <gstnvdsmeta.h>
#include <nvbufsurface.h>

#include "Common.h"
#include "VideoPipeline.h"
#include "DoubleBufferCache.h"

static GMainLoop* g_main_loop = NULL;
Json::Reader g_reader;

static void Parse(VideoPipelineConfig& config, std::string& config_path)
{
    Json::Value root;
    std::ifstream in(config_path, std::ios::binary);
    g_reader.parse(in, root);

    if (root.isMember("name")) {
        config.pipeline_id = root["name"].asString();
        LOG_INFO("New pipeline name: {}", config.pipeline_id);
    }

    if (root.isMember("input-config")) {
        Json::Value inputConfig = root["input-config"];
        config.input_type = inputConfig["type"].asInt();    // 0-MP4 / 1-RTSP / 2-USB Camera
        LOG_INFO("Pipeline[{}]: type: {}", config.pipeline_id, config.input_type);

        config.src_uri = inputConfig["stream"]["uri"].asString();
        LOG_INFO("Pipeline[{}]: input: {}", config.pipeline_id, config.src_uri);
        config.file_loop = inputConfig["stream"]["file-loop"].asBool();
        LOG_INFO("Pipeline[{}]: file-loop: {}", config.pipeline_id, config.file_loop);
        config.rtsp_latency = inputConfig["stream"]["rtsp-latency"].asInt();
        LOG_INFO("Pipeline[{}]: rtsp-latency: {}", config.pipeline_id, config.rtsp_latency);
        config.rtp_protocol = inputConfig["stream"]["rtp-protocol"].asInt();
        LOG_INFO("Pipeline[{}]: rtp-protocol: {}", config.pipeline_id, config.rtp_protocol);
        config.src_device = inputConfig["usb-camera"]["device"].asString();
        LOG_INFO("Pipeline[{}]: usb camera device: {}", config.pipeline_id, config.src_device);
        config.src_format = inputConfig["usb-camera"]["format"].asString();
        LOG_INFO("Pipeline[{}]: usb camera output format: {}", config.pipeline_id, config.src_format);
        config.src_width = inputConfig["usb-camera"]["width"].asInt();
        LOG_INFO("Pipeline[{}]: usb camera output width: {}", config.pipeline_id, config.src_width);
        config.src_height = inputConfig["usb-camera"]["height"].asInt();
        LOG_INFO("Pipeline[{}]: usb camera output height: {}", config.pipeline_id, config.src_height);
        config.src_framerate_n = inputConfig["usb-camera"]["framerate-n"].asInt();
        LOG_INFO("Pipeline[{}]: usb camera framerate numerator: {}", config.pipeline_id, config.src_framerate_n);
        config.src_framerate_d = inputConfig["usb-camera"]["framerate-d"].asInt();
        LOG_INFO("Pipeline[{}]: usb camera framerate denominator: {}", config.pipeline_id, config.src_framerate_d);
    }

    if (root.isMember("output-config")) {
        Json::Value outputConfig = root["output-config"];
        if (outputConfig.isMember("display")) {
            Json::Value displayConfig = outputConfig["display"];
            config.enable_hdmi = displayConfig["enable"].asBool();
            LOG_INFO("Pipeline[{}]: enable-hdmi: {}", config.pipeline_id, config.enable_hdmi);
            config.hdmi_sync = displayConfig["sync"].asBool();
            LOG_INFO("Pipeline[{}]: hdmi-sync: {}", config.pipeline_id, config.hdmi_sync);
            config.window_x = displayConfig["left"].asInt();
            LOG_INFO("Pipeline[{}]: window-x: {}", config.pipeline_id, config.window_x);
            config.window_y = displayConfig["top"].asInt();
            LOG_INFO("Pipeline[{}]: window-y: {}", config.pipeline_id, config.window_y);
            config.window_width = displayConfig["width"].asInt();
            LOG_INFO("Pipeline[{}]: window-width: {}", config.pipeline_id, config.window_width);
            config.window_height = displayConfig["height"].asInt();
            LOG_INFO("Pipeline[{}]: window-height: {}", config.pipeline_id, config.window_height);
        }

        if (outputConfig.isMember("rtmp")) {
            Json::Value rtmpConfig = outputConfig["rtmp"];
            config.enable_rtmp = rtmpConfig["enable"].asBool();
            LOG_INFO("Pipeline[{}]: enable-rtmp: {}", config.pipeline_id, config.enable_rtmp);
            config.enc_bitrate = rtmpConfig["bitrate"].asInt();
            LOG_INFO("Pipeline[{}]: encode-bitrate: {}", config.pipeline_id, config.enc_bitrate);
            config.enc_iframe_interval = rtmpConfig["iframeinterval"].asInt();
            LOG_INFO("Pipeline[{}]: encode-iframeinterval: {}", config.pipeline_id, config.enc_iframe_interval);
            config.rtmp_uri = rtmpConfig["uri"].asString();
            LOG_INFO("Pipeline[{}]: rtmp-uri: {}", config.pipeline_id, config.rtmp_uri);
        }

        if (outputConfig.isMember("inference")) {
            Json::Value inferenceConfig = outputConfig["inference"];
            config.enable_appsink = inferenceConfig["enable"].asBool();
            LOG_INFO("Pipeline[{}]: enable-appsink: {}", config.pipeline_id, config.enable_appsink);
            config.cvt_memory_type = inferenceConfig["memory-type"].asInt();
            LOG_INFO("Pipeline[{}]: videoconvert memory type: {}", config.pipeline_id, config.cvt_memory_type);
            config.cvt_format = inferenceConfig["format"].asString();
            LOG_INFO("Pipeline[{}]: videoconvert format: {}", config.pipeline_id, config.cvt_format);
        }
    }
}
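
// Parse() above expects a JSON document shaped like the sp_*.json samples
// in this directory. The keys below mirror the parser exactly; the values
// are illustrative only:
// {
//   "name": "pipeline0",
//   "input-config": {
//     "type": 0,                       // 0-MP4 / 1-RTSP / 2-USB Camera
//     "stream": { "uri": "file:///path/test.mp4", "file-loop": true,
//                 "rtsp-latency": 100, "rtp-protocol": 4 },
//     "usb-camera": { "device": "/dev/video0", "format": "MJPG",
//                     "width": 1280, "height": 720,
//                     "framerate-n": 30, "framerate-d": 1 }
//   },
//   "output-config": {
//     "display": { "enable": true, "sync": true, "left": 0, "top": 0,
//                  "width": 1920, "height": 1080 },
//     "rtmp": { "enable": false, "bitrate": 4000000,
//               "iframeinterval": 30, "uri": "rtmp://..." },
//     "inference": { "enable": true, "memory-type": 3, "format": "RGBA" }
//   }
// }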

static bool validateConfigPath(const char* name, const std::string& value) 
{ 
    if (0 == value.compare ("")) {
        LOG_ERROR("You must specify a config file!");
        return false;
    }

    struct stat statbuf;
    if (0 == stat(value.c_str(), &statbuf)) {
        return true;
    }

    LOG_ERROR("Can't stat config file: {}", value);
    return false;
}

DEFINE_string(config_path, "./pipeline.json", "Pipeline config file path.");
DEFINE_validator(config_path, &validateConfigPath);

int main(int argc, char* argv[])
{
    google::ParseCommandLineFlags(&argc, &argv, true);

    VideoPipelineConfig m_vpConfig;
    VideoPipeline* m_vp = nullptr;

    Parse(m_vpConfig, FLAGS_config_path);

    gst_init(&argc, &argv);

    g_setenv("GST_DEBUG_DUMP_DOT_DIR", "/home/ricardo/workSpace/gstreamer-example/ai_integration/deepstream/build", true);

    if (!(g_main_loop = g_main_loop_new(NULL, FALSE))) {
        LOG_ERROR("Failed to create a GMainLoop object");
        goto exit;
    }

    m_vp = new VideoPipeline(m_vpConfig);

    if (!m_vp->Create()) {
        LOG_ERROR("Pipeline Create failed: lack of elements");
        goto exit;
    }

    m_vp->Start();

    g_main_loop_run(g_main_loop);

exit:
    if (g_main_loop) g_main_loop_unref(g_main_loop);

    if (m_vp) {
        // m_vp->Destroy();
        delete m_vp;
        m_vp = NULL;
    }

    google::ShutDownCommandLineFlags();
    return 0;
}

================================================
FILE: ai_integration/test_30fps.h264
================================================
[File too large to display: 15.0 MB]

================================================
FILE: ai_integration/test_30fps.ts
================================================
[File too large to display: 23.5 MB]

================================================
FILE: ai_integration/video-pipeline.dot
================================================
digraph pipeline {
  rankdir=LR;
  fontname="sans";
  fontsize="10";
  labelloc=t;
  nodesep=.1;
  ranksep=.2;
  label="<GstPipeline>\nvideo-pipeline\n[>]";
  node [style="filled,rounded", shape=box, fontsize="9", fontname="sans", margin="0.0,0.0"];
  edge [labelfontsize="6", fontsize="9", fontname="monospace"];
  
  legend [
    pos="0,0!",
    margin="0.05,0.05",
    style="filled",
    label="Legend\lElement-States: [~] void-pending, [0] null, [-] ready, [=] paused, [>] playing\lPad-Activation: [-] none, [>] push, [<] pull\lPad-Flags: [b]locked, [f]lushing, [b]locking, [E]OS; upper-case is set\lPad-Task: [T] has started task, [t] has paused task\l",
  ];
  subgraph cluster_appsink_0x55726e1fbc80 {
    fontname="Bitstream Vera Sans";
    fontsize="8";
    style="filled,rounded";
    color=black;
    label="GstAppSink\nappsink\n[>]\nparent=(GstPipeline) video-pipeline\nlast-sample=((GstSample*) 0x7f59fc124260)\neos=FALSE\nemit-signals=TRUE";
    subgraph cluster_appsink_0x55726e1fbc80_sink {
      label="";
      style="invis";
      appsink_0x55726e1fbc80_sink_0x55726e1fc820 [color=black, fillcolor="#aaaaff", label="sink\n[>][bfb]", height="0.2", style="filled,solid"];
    }

    fillcolor="#aaaaff";
  }

  subgraph cluster_capfilter1_0x55726d80e5b0 {
    fontname="Bitstream Vera Sans";
    fontsize="8";
    style="filled,rounded";
    color=black;
    label="GstCapsFilter\ncapfilter1\n[>]\nparent=(GstPipeline) video-pipeline\ncaps=video/x-raw(memory:NVMM), format=(string)RGBA";
    subgraph cluster_capfilter1_0x55726d80e5b0_sink {
      label="";
      style="invis";
      capfilter1_0x55726d80e5b0_sink_0x55726e1fc380 [color=black, fillcolor="#aaaaff", label="sink\n[>][bfb]", height="0.2", style="filled,solid"];
    }

    subgraph cluster_capfilter1_0x55726d80e5b0_src {
      label="";
      style="invis";
      capfilter1_0x55726d80e5b0_src_0x55726e1fc5d0 [color=black, fillcolor="#ffaaaa", label="src\n[>][bfb]", height="0.2", style="filled,solid"];
    }

    capfilter1_0x55726d80e5b0_sink_0x55726e1fc380 -> capfilter1_0x55726d80e5b0_src_0x55726e1fc5d0 [style="invis"];
    fillcolor="#aaffaa";
  }

  capfilter1_0x55726d80e5b0_src_0x55726e1fc5d0 -> appsink_0x55726e1fbc80_sink_0x55726e1fc820 [label="video/x-raw(memory:NVMM)\l               width: 1920\l              height: 1080\l      interlace-mode: progressive\l      multiview-mode: mono\l     multiview-flags: 0:ffffffff:/right-view...\l  pixel-aspect-ratio: 1/1\l           framerate: 20000/333\l              format: RGBA\l        block-linear: false\l"]
  subgraph cluster_videocvt1_0x55726e1f9df0 {
    fontname="Bitstream Vera Sans";
    fontsize="8";
    style="filled,rounded";
    color=black;
    label="Gstnvvideoconvert\nvideocvt1\n[>]\nparent=(GstPipeline) video-pipeline\nsrc-crop=\"0:0:0:0\"\ndest-crop=\"0:0:0:0\"\nnvbuf-memory-type=nvbuf-mem-cuda-unified";
    subgraph cluster_videocvt1_0x55726e1f9df0_sink {
      label="";
      style="invis";
      videocvt1_0x55726e1f9df0_sink_0x55726d731d00 [color=black, fillcolor="#aaaaff", label="sink\n[>][bfb]", height="0.2", style="filled,solid"];
    }

    subgraph cluster_videocvt1_0x55726e1f9df0_src {
      label="";
      style="invis";
      videocvt1_0x55726e1f9df0_src_0x55726e1fc130 [color=black, fillcolor="#ffaaaa", label="src\n[>][bfb]", height="0.2", style="filled,solid"];
    }

    videocvt1_0x55726e1f9df0_sink_0x55726d731d00 -> videocvt1_0x55726e1f9df0_src_0x55726e1fc130 [style="invis"];
    fillcolor="#aaffaa";
  }

  videocvt1_0x55726e1f9df0_src_0x55726e1fc130 -> capfilter1_0x55726d80e5b0_sink_0x55726e1fc380 [label="video/x-raw(memory:NVMM)\l               width: 1920\l              height: 1080\l      interlace-mode: progressive\l      multiview-mode: mono\l     multiview-flags: 0:ffffffff:/right-view...\l  pixel-aspect-ratio: 1/1\l           framerate: 20000/333\l              format: RGBA\l        block-linear: false\l"]
  subgraph cluster_queue1_0x55726d734390 {
    fontname="Bitstream Vera Sans";
    fontsize="8";
    style="filled,rounded";
    color=black;
    label="GstQueue\nqueue1\n[>]\nparent=(GstPipeline) video-pipeline\ncurrent-level-buffers=4\ncurrent-level-bytes=256\ncurrent-level-time=133200000";
    subgraph cluster_queue1_0x55726d734390_sink {
      label="";
      style="invis";
      queue1_0x55726d734390_sink_0x55726d731860 [color=black, fillcolor="#aaaaff", label="sink\n[>][bfb]", height="0.2", style="filled,solid"];
    }

    subgraph cluster_queue1_0x55726d734390_src {
      label="";
      style="invis";
      queue1_0x55726d734390_src_0x55726d731ab0 [color=black, fillcolor="#ffaaaa", label="src\n[>][bfb][T]", height="0.2", style="filled,solid"];
    }

    queue1_0x55726d734390_sink_0x55726d731860 -> queue1_0x55726d734390_src_0x55726d731ab0 [style="invis"];
    fillcolor="#aaffaa";
  }

  queue1_0x55726d734390_src_0x55726d731ab0 -> videocvt1_0x55726e1f9df0_sink_0x55726d731d00 [label="video/x-raw(memory:NVMM)\l              format: NV12\l               width: 1920\l              height: 1080\l      interlace-mode: progressive\l      multiview-mode: mono\l     multiview-flags: 0:ffffffff:/right-view...\l  pixel-aspect-ratio: 1/1\l         chroma-site: mpeg2\l         colorimetry: bt709\l           framerate: 20000/333\l"]
  subgraph cluster_display_0x55726e1f43a0 {
    fontname="Bitstream Vera Sans";
    fontsize="8";
    style="filled,rounded";
    color=black;
    label="GstEglGlesSink\ndisplay\n[>]\nparent=(GstPipeline) video-pipeline\nmax-lateness=5000000\nqos=TRUE\nlast-sample=((GstSample*) 0x7f59fc124340)\nprocessing-deadline=15000000\nwindow-x=0\nwindow-y=0\nwindow-width=1920\nwindow-height=1080";
    subgraph cluster_display_0x55726e1f43a0_sink {
      label="";
      style="invis";
      display_0x55726e1f43a0_sink_0x55726d731610 [color=black, fillcolor="#aaaaff", label="sink\n[>][bfb]", height="0.2", style="filled,solid"];
    }

    fillcolor="#aaaaff";
  }

  subgraph cluster_overlay_0x55726e13bc20 {
    fontname="Bitstream Vera Sans";
    fontsize="8";
    style="filled,rounded";
    color=black;
    label="GstNvDsOsd\noverlay\n[>]\nparent=(GstPipeline) video-pipeline\nclock-font=NULL\nclock-font-size=0\nclock-color=0\nhw-blend-color-attr=\"3,1.000000,1.000000,0.000000,0.300000:\"\ndisplay-mask=FALSE";
    subgraph cluster_overlay_0x55726e13bc20_sink {
      label="";
      style="invis";
      overlay_0x55726e13bc20_sink_0x55726d731170 [color=black, fillcolor="#aaaaff", label="sink\n[>][bfb]", height="0.2", style="filled,solid"];
    }

    subgraph cluster_overlay_0x55726e13bc20_src {
      label="";
      style="invis";
      overlay_0x55726e13bc20_src_0x55726d7313c0 [color=black, fillcolor="#ffaaaa", label="src\n[>][bfb]", height="0.2", style="filled,solid"];
    }

    overlay_0x55726e13bc20_sink_0x55726d731170 -> overlay_0x55726e13bc20_src_0x55726d7313c0 [style="invis"];
    fillcolor="#aaffaa";
  }

  overlay_0x55726e13bc20_src_0x55726d7313c0 -> display_0x55726e1f43a0_sink_0x55726d731610 [label="video/x-raw(memory:NVMM)\l               width: 1920\l              height: 1080\l      interlace-mode: progressive\l      multiview-mode: mono\l     multiview-flags: 0:ffffffff:/right-view...\l  pixel-aspect-ratio: 1/1\l           framerate: 20000/333\l              format: RGBA\l        block-linear: false\l"]
  subgraph cluster_capfilter0_0x55726d80e270 {
    fontname="Bitstream Vera Sans";
    fontsize="8";
    style="filled,rounded";
    color=black;
    label="GstCapsFilter\ncapfilter0\n[>]\nparent=(GstPipeline) video-pipeline\ncaps=video/x-raw(memory:NVMM), format=(string)RGBA";
    subgraph cluster_capfilter0_0x55726d80e270_sink {
      label="";
      style="invis";
      capfilter0_0x55726d80e270_sink_0x55726d730cd0 [color=black, fillcolor="#aaaaff", label="sink\n[>][bfb]", height="0.2", style="filled,solid"];
    }

    subgraph cluster_capfilter0_0x55726d80e270_src {
      label="";
      style="invis";
      capfilter0_0x55726d80e270_src_0x55726d730f20 [color=black, fillcolor="#ffaaaa", label="src\n[>][bfb]", height="0.2", style="filled,solid"];
    }

    capfilter0_0x55726d80e270_sink_0x55726d730cd0 -> capfilter0_0x55726d80e270_src_0x55726d730f20 [style="invis"];
    fillcolor="#aaffaa";
  }

  capfilter0_0x55726d80e270_src_0x55726d730f20 -> overlay_0x55726e13bc20_sink_0x55726d731170 [label="video/x-raw(memory:NVMM)\l               width: 1920\l              height: 1080\l      interlace-mode: progressive\l      multiview-mode: mono\l     multiview-flags: 0:ffffffff:/right-view...\l  pixel-aspect-ratio: 1/1\l           framerate: 20000/333\l              format: RGBA\l        block-linear: false\l"]
  subgraph cluster_videocvt0_0x55726d7c6980 {
    fontname="Bitstream Vera Sans";
    fontsize="8";
    style="filled,rounded";
    color=black;
    label="Gstnvvideoconvert\nvideocvt0\n[>]\nparent=(GstPipeline) video-pipeline\nsrc-crop=\"0:0:0:0\"\ndest-crop=\"0:0:0:0\"\nnvbuf-memory-type=nvbuf-mem-cuda-unified";
    subgraph cluster_videocvt0_0x55726d7c6980_sink {
      label="";
      style="invis";
      videocvt0_0x55726d7c6980_sink_0x55726d730830 [color=black, fillcolor="#aaaaff", label="sink\n[>][bfb]", height="0.2", style="filled,solid"];
    }

    subgraph cluster_videocvt0_0x55726d7c6980_src {
      label="";
      style="invis";
      videocvt0_0x55726d7c6980_src_0x55726d730a80 [color=black, fillcolor="#ffaaaa", label="src\n[>][bfb]", height="0.2", style="filled,solid"];
    }

    videocvt0_0x55726d7c6980_sink_0x55726d730830 -> videocvt0_0x55726d7c6980_src_0x55726d730a80 [style="invis"];
    fillcolor="#aaffaa";
  }

  videocvt0_0x55726d7c6980_src_0x55726d730a80 -> capfilter0_0x55726d80e270_sink_0x55726d730cd0 [label="video/x-raw(memory:NVMM)\l               width: 1920\l              height: 1080\l      interlace-mode: progressive\l      multiview-mode: mono\l     multiview-flags: 0:ffffffff:/right-view...\l  pixel-aspect-ratio: 1/1\l           framerate: 20000/333\l              format: RGBA\l        block-linear: false\l"]
  subgraph cluster_queue0_0x55726d734090 {
    fontname="Bitstream Vera Sans";
    fontsize="8";
    style="filled,rounded";
    color=black;
    label="GstQueue\nqueue0\n[>]\nparent=(GstPipeline) video-pipeline\ncurrent-level-buffers=3\ncurrent-level-bytes=192\ncurrent-level-time=99900000";
    subgraph cluster_queue0_0x55726d734090_sink {
      label="";
      style="invis";
      queue0_0x55726d734090_sink_0x55726d730390 [color=black, fillcolor="#aaaaff", label="sink\n[>][bfb]", height="0.2", style="filled,solid"];
    }

    subgraph cluster_queue0_0x55726d734090_src {
      label="";
      style="invis";
      queue0_0x55726d734090_src_0x55726d7305e0 [color=black, fillcolor="#ffaaaa", label="src\n[>][bfb][T]", height="0.2", style="filled,solid"];
    }

    queue0_0x55726d734090_sink_0x55726d730390 -> queue0_0x55726d734090_src_0x55726d7305e0 [style="invis"];
    fillcolor="#aaffaa";
  }

  queue0_0x55726d734090_src_0x55726d7305e0 -> videocvt0_0x55726d7c6980_sink_0x55726d730830 [label="video/x-raw(memory:NVMM)\l              format: NV12\l               width: 1920\l              height: 1080\l      interlace-mode: progressive\l      multiview-mode: mono\l     multiview-flags: 0:ffffffff:/right-view...\l  pixel-aspect-ratio: 1/1\l         chroma-site: mpeg2\l         colorimetry: bt709\l           framerate: 20000/333\l"]
  subgraph cluster_tee0_0x55726d72e000 {
    fontname="Bitstream Vera Sans";
    fontsize="8";
    style="filled,rounded";
    color=black;
    label="GstTee\ntee0\n[>]\nparent=(GstPipeline) video-pipeline\nnum-src-pads=2";
    subgraph cluster_tee0_0x55726d72e000_sink {
      label="";
      style="invis";
      tee0_0x55726d72e000_sink_0x55726d730140 [color=black, fillcolor="#aaaaff", label="sink\n[>][bfb]", height="0.2", style="filled,solid"];
    }

    subgraph cluster_tee0_0x55726d72e000_src {
      label="";
      style="invis";
      tee0_0x55726d72e000_src_0_0x55726d7282e0 [color=black, fillcolor="#ffaaaa", label="src_0\n[>][bfb]", height="0.2", style="filled,dashed"];
      tee0_0x55726d72e000_src_1_0x55726d728540 [color=black, fillcolor="#ffaaaa", label="src_1\n[>][bfb]", height="0.2", style="filled,dashed"];
    }

    tee0_0x55726d72e000_sink_0x55726d730140 -> tee0_0x55726d72e000_src_0_0x55726d7282e0 [style="invis"];
    fillcolor="#aaffaa";
  }

  tee0_0x55726d72e000_src_0_0x55726d7282e0 -> queue0_0x55726d734090_sink_0x55726d730390 [label="video/x-raw(memory:NVMM)\l              format: NV12\l               width: 1920\l              height: 1080\l      interlace-mode: progressive\l      multiview-mode: mono\l     multiview-flags: 0:ffffffff:/right-view...\l  pixel-aspect-ratio: 1/1\l         chroma-site: mpeg2\l         colorimetry: bt709\l           framerate: 20000/333\l"]
  tee0_0x55726d72e000_src_1_0x55726d728540 -> queue1_0x55726d734390_sink_0x55726d731860 [label="video/x-raw(memory:NVMM)\l              format: NV12\l               width: 1920\l              height: 1080\l      interlace-mode: progressive\l      multiview-mode: mono\l     multiview-flags: 0:ffffffff:/right-view...\l  pixel-aspect-ratio: 1/1\l         chroma-site: mpeg2\l         colorimetry: bt709\l           framerate: 20000/333\l"]
  subgraph cluster_uri_0x55726d728060 {
    fontname="Bitstream Vera Sans";
    fontsize="8";
    style="filled,rounded";
    color=black;
    label="GstURIDecodeBin\nuri\n[>]\nparent=(GstPipeline) video-pipeline\nuri=\"file:///home/ricardo/workSpace/gstreamer-example/ai_integration/test.mp4\"\nsource=(GstFileSrc) source\ncaps=video/x-raw(ANY); audio/x-raw(ANY); text/x-raw(ANY); subpicture/x-dvd; subpictur…";
    subgraph cluster_uri_0x55726d728060_src {
      label="";
      style="invis";
      _proxypad4_0x55726d729d10 [color=black, fillcolor="#ffdddd", label="proxypad4\n[>][bfb]", height="0.2", style="filled,dotted"];
    _proxypad4_0x55726d729d10 -> uri_0x55726d728060_src_0_0x55726f06eaf0 [style=dashed, minlen=0]
      uri_0x55726d728060_src_0_0x55726f06eaf0 [color=black, fillcolor="#ffdddd", label="src_0\n[>][bfb]", height="0.2", style="filled,dotted"];
      _proxypad5_0x7f5a0032c130 [color=black, fillcolor="#ffdddd", label="proxypad5\n[>][bfb]", height="0.2", style="filled,dotted"];
    _proxypad5_0x7f5a0032c130 -> uri_0x55726d728060_src_1_0x55726f06ed70 [style=dashed, minlen=0]
      uri_0x55726d728060_src_1_0x55726f06ed70 [color=black, fillcolor="#ffdddd", label="src_1\n[>][bfb]", height="0.2", style="filled,dotted"];
    }

    fillcolor="#ffffff";
    subgraph cluster_decodebin0_0x55726f06c090 {
      fontname="Bitstream Vera Sans";
      fontsize="8";
      style="filled,rounded";
      color=black;
      label="GstDecodeBin\ndecodebin0\n[>]\nparent=(GstURIDecodeBin) uri\ncaps=video/x-raw(ANY); audio/x-raw(ANY); text/x-raw(ANY); subpicture/x-dvd; subpictur…";
      subgraph cluster_decodebin0_0x55726f06c090_sink {
        label="";
        style="invis";
        _proxypad0_0x55726d7287b0 [color=black, fillcolor="#ddddff", label="proxypad0\n[<][bfb]", height="0.2", style="filled,solid"];
      decodebin0_0x55726f06c090_sink_0x55726f06e0f0 -> _proxypad0_0x55726d7287b0 [style=dashed, minlen=0]
        decodebin0_0x55726f06c090_sink_0x55726f06e0f0 [color=black, fillcolor="#ddddff", label="sink\n[<][bfb]", height="0.2", style="filled,solid"];
      }

      subgraph cluster_decodebin0_0x55726f06c090_src {
        label="";
        style="invis";
        _proxypad2_0x55726d728a10 [color=black, fillcolor="#ffdddd", label="proxypad2\n[>][bfb]", height="0.2", style="filled,dotted"];
      _proxypad2_0x55726d728a10 -> decodebin0_0x55726f06c090_src_0_0x7f5a080320a0 [style=dashed, minlen=0]
        decodebin0_0x55726f06c090_src_0_0x7f5a080320a0 [color=black, fillcolor="#ffdddd", label="src_0\n[>][bfb]", height="0.2", style="filled,dotted"];
        _proxypad3_0x55726d729390 [color=black, fillcolor="#ffdddd", label="proxypad3\n[>][bfb]", height="0.2", style="filled,dotted"];
      _proxypad3_0x55726d729390 -> decodebin0_0x55726f06c090_src_1_0x7f5a08032b20 [style=dashed, minlen=0]
        decodebin0_0x55726f06c090_src_1_0x7f5a08032b20 [color=black, fillcolor="#ffdddd", label="src_1\n[>][bfb]", height="0.2", style="filled,dotted"];
      }

      decodebin0_0x55726f06c090_sink_0x55726f06e0f0 -> decodebin0_0x55726f06c090_src_0_0x7f5a080320a0 [style="invis"];
      fillcolor="#ffffff";
      subgraph cluster_nvv4l2decoder0_0x7f5a00018ee0 {
        fontname="Bitstream Vera Sans";
        fontsize="8";
        style="filled,rounded";
        color=black;
        label="nvv4l2decoder\nnvv4l2decoder0\n[>]\nparent=(GstDecodeBin) decodebin0\ndevice=\"/dev/nvidia0\"\ndevice-name=\"\"\ndevice-fd=31\ndrop-frame-interval=0\nnum-extra-surfaces=0";
        subgraph cluster_nvv4l2decoder0_0x7f5a00018ee0_sink {
          label="";
          style="invis";
          nvv4l2decoder0_0x7f5a00018ee0_sink_0x7f59fc132410 [color=black, fillcolor="#aaaaff", label="sink\n[>][bfb]", height="0.2", style="filled,solid"];
        }

        subgraph cluster_nvv4l2decoder0_0x7f5a00018ee0_src {
          label="";
          style="invis";
          nvv4l2decoder0_0x7f5a00018ee0_src_0x7f59fc132660 [color=black, fillcolor="#ffaaaa", label="src\n[>][bfb][T]", height="0.2", style="filled,solid"];
        }

        nvv4l2decoder0_0x7f5a00018ee0_sink_0x7f59fc132410 -> nvv4l2decoder0_0x7f5a00018ee0_src_0x7f59fc132660 [style="invis"];
        fillcolor="#aaffaa";
      }

      nvv4l2decoder0_0x7f5a00018ee0_src_0x7f59fc132660 -> _proxypad2_0x55726d728a10 [label="video/x-raw(memory:NVMM)\l              format: NV12\l               width: 1920\l              height: 1080\l      interlace-mode: progressive\l      multiview-mode: mono\l     multiview-flags: 0:ffffffff:/right-view...\l  pixel-aspect-ratio: 1/1\l         chroma-site: mpeg2\l         colorimetry: bt709\l           framerate: 20000/333\l"]
      subgraph cluster_avdec_aac0_0x7f59fc1314d0 {
        fontname="Bitstream Vera Sans";
        fontsize="8";
        style="filled,rounded";
        color=black;
        label="avdec_aac\navdec_aac0\n[>]\nparent=(GstDecodeBin) decodebin0";
        subgraph cluster_avdec_aac0_0x7f59fc1314d0_sink {
          label="";
          style="invis";
          avdec_aac0_0x7f59fc1314d0_sink_0x7f59fc00b8f0 [color=black, fillcolor="#aaaaff", label="sink\n[>][bfb]", height="0.2", style="filled,solid"];
        }

        subgraph cluster_avdec_aac0_0x7f59fc1314d0_src {
          label="";
          style="invis";
          avdec_aac0_0x7f59fc1314d0_src_0x7f59fc00bb40 [color=black, fillcolor="#ffaaaa", label="src\n[>][bfb]", height="0.2", style="filled,solid"];
        }

        avdec_aac0_0x7f59fc1314d0_sink_0x7f59fc00b8f0 -> avdec_aac0_0x7f59fc1314d0_src_0x7f59fc00bb40 [style="invis"];
        fillcolor="#aaffaa";
      }

      avdec_aac0_0x7f59fc1314d0_src_0x7f59fc00bb40 -> _proxypad3_0x55726d729390 [label="audio/x-raw\l              format: F32LE\l              layout: non-interleaved\l                rate: 48000\l            channels: 2\l        channel-mask: 0x0000000000000003\l"]
      subgraph cluster_aacparse0_0x7f59fc0900f0 {
        fontname="Bitstream Vera Sans";
        fontsize="8";
        style="filled,rounded";
        color=black;
        label="GstAacParse\naacparse0\n[>]\nparent=(GstDecodeBin) decodebin0";
        subgraph cluster_aacparse0_0x7f59fc0900f0_sink {
          label="";
          style="invis";
          aacparse0_0x7f59fc0900f0_sink_0x7f59fc00b450 [color=black, fillcolor="#aaaaff", label="sink\n[>][bfb]", height="0.2", style="filled,solid"];
        }

        subgraph cluster_aacparse0_0x7f59fc0900f0_src {
          label="";
          style="invis";
          aacparse0_0x7f59fc0900f0_src_0x7f59fc00b6a0 [color=black, fillcolor="#ffaaaa", label="src\n[>][bfb]", height="0.2", style="filled,solid"];
        }

        aacparse0_0x7f59fc0900f0_sink_0x7f59fc00b450 -> aacparse0_0x7f59fc0900f0_src_0x7f59fc00b6a0 [style="invis"];
        fillcolor="#aaffaa";
      }

      aacparse0_0x7f59fc0900f0_src_0x7f59fc00b6a0 -> avdec_aac0_0x7f59fc1314d0_sink_0x7f59fc00b8f0 [label="audio/mpeg\l         mpegversion: 4\l              framed: true\l       stream-format: raw\l               level: 2\l        base-profile: lc\l             profile: lc\l          codec_data: 1190\l                rate: 48000\l            channels: 2\l"]
      subgraph cluster_capsfilter0_0x55726d80ef70 {
        fontname="Bitstream Vera Sans";
        fontsize="8";
        style="filled,rounded";
        color=black;
        label="GstCapsFilter\ncapsfilter0\n[>]\nparent=(GstDecodeBin) decodebin0\ncaps=video/x-h264, stream-format=(string)byte-stream, alignment=(string)au, parsed=(b…";
        subgraph cluster_capsfilter0_0x55726d80ef70_sink {
          label="";
          style="invis";
          capsfilter0_0x55726d80ef70_sink_0x7f59fc00a8c0 [color=black, fillcolor="#aaaaff", label="sink\n[>][bfb]", height="0.2", style="filled,solid"];
        }

        subgraph cluster_capsfilter0_0x55726d80ef70_src {
          label="";
          style="invis";
          capsfilter0_0x55726d80ef70_src_0x7f59fc00ab10 [color=black, fillcolor="#ffaaaa", label="src\n[>][bfb]", height="0.2", style="filled,solid"];
        }

        capsfilter0_0x55726d80ef70_sink_0x7f59fc00a8c0 -> capsfilter0_0x55726d80ef70_src_0x7f59fc00ab10 [style="invis"];
        fillcolor="#aaffaa";
      }

      capsfilter0_0x55726d80ef70_src_0x7f59fc00ab10 -> nvv4l2decoder0_0x7f5a00018ee0_sink_0x7f59fc132410 [label="video/x-h264\l       stream-format: byte-stream\l           alignment: au\l               level: 4.2\l             profile: high\l               width: 1920\l              height: 1080\l  pixel-aspect-ratio: 1/1\l           framerate: 20000/333\l      interlace-mode: progressive\l       chroma-format: 4:2:0\l      bit-depth-luma: 8\l    bit-depth-chroma: 8\l              parsed: true\l"]
      subgraph cluster_h264parse0_0x7f59fc0108a0 {
        fontname="Bitstream Vera Sans";
        fontsize="8";
        style="filled,rounded";
        color=black;
        label="GstH264Parse\nh264parse0\n[>]\nparent=(GstDecodeBin) decodebin0\nconfig-interval=-1";
        subgraph cluster_h264parse0_0x7f59fc0108a0_sink {
          label="";
          style="invis";
          h264parse0_0x7f59fc0108a0_sink_0x7f59fc00a420 [color=black, fillcolor="#aaaaff", label="sink\n[>][bfb]", height="0.2", style="filled,solid"];
        }

        subgraph cluster_h264parse0_0x7f59fc0108a0_src {
          label="";
          style="invis";
          h264parse0_0x7f59fc0108a0_src_0x7f59fc00a670 [color=black, fillcolor="#ffaaaa", label="src\n[>][bfb]", height="0.2", style="filled,solid"];
        }

        h264parse0_0x7f59fc0108a0_sink_0x7f59fc00a420 -> h264parse0_0x7f59fc0108a0_src_0x7f59fc00a670 [style="invis"];
        fillcolor="#aaffaa";
      }

      h264parse0_0x7f59fc0108a0_src_0x7f59fc00a670 -> capsfilter0_0x55726d80ef70_sink_0x7f59fc00a8c0 [label="video/x-h264\l       stream-format: byte-stream\l           alignment: au\l               level: 4.2\l             profile: high\l               width: 1920\l              height: 1080\l  pixel-aspect-ratio: 1/1\l           framerate: 20000/333\l      interlace-mode: progressive\l       chroma-format: 4:2:0\l      bit-depth-luma: 8\l    bit-depth-chroma: 8\l              parsed: true\l"]
      subgraph cluster_multiqueue0_0x7f59fc00d060 {
        fontname="Bitstream Vera Sans";
        fontsize="8";
        style="filled,rounded";
        color=black;
        label="GstMultiQueue\nmultiqueue0\n[>]\nparent=(GstDecodeBin) decodebin0\nmax-size-bytes=2097152\nmax-size-time=0";
        subgraph cluster_multiqueue0_0x7f59fc00d060_sink {
          label="";
          style="invis";
          multiqueue0_0x7f59fc00d060_sink_0_0x55726e1fdcf0 [color=black, fillcolor="#aaaaff", label="sink_0\n[>][bfb]", height="0.2", style="filled,dashed"];
          multiqueue0_0x7f59fc00d060_sink_1_0x7f59fc00afb0 [color=black, fillcolor="#aaaaff", label="sink_1\n[>][bfb]", height="0.2", style="filled,dashed"];
        }

        subgraph cluster_multiqueue0_0x7f59fc00d060_src {
          label="";
          style="invis";
          multiqueue0_0x7f59fc00d060_src_0_0x7f59fc00a1d0 [color=black, fillcolor="#ffaaaa", label="src_0\n[>][bfb][T]", height="0.2", style="filled,dotted"];
          multiqueue0_0x7f59fc00d060_src_1_0x7f59fc00b200 [color=black, fillcolor="#ffaaaa", label="src_1\n[>][bfb][T]", height="0.2", style="filled,dotted"];
        }

        multiqueue0_0x7f59fc00d060_sink_0_0x55726e1fdcf0 -> multiqueue0_0x7f59fc00d060_src_0_0x7f59fc00a1d0 [style="invis"];
        fillcolor="#aaffaa";
      }

      multiqueue0_0x7f59fc00d060_src_0_0x7f59fc00a1d0 -> h264parse0_0x7f59fc0108a0_sink_0x7f59fc00a420 [label="video/x-h264\l       stream-format: avc\l           alignment: au\l               level: 4.2\l             profile: high\l          codec_data: 0164002affe10018676400...\l               width: 1920\l              height: 1080\l  pixel-aspect-ratio: 1/1\l"]
      multiqueue0_0x7f59fc00d060_src_1_0x7f59fc00b200 -> aacparse0_0x7f59fc0900f0_sink_0x7f59fc00b450 [label="audio/mpeg\l         mpegversion: 4\l              framed: true\l       stream-format: raw\l               level: 2\l        base-profile: lc\l             profile: lc\l          codec_data: 1190\l                rate: 48000\l            channels: 2\l"]
      subgraph cluster_qtdemux0_0x7f5a0807e140 {
        fontname="Bitstream Vera Sans";
        fontsize="8";
        style="filled,rounded";
        color=black;
        label="GstQTDemux\nqtdemux0\n[>]\nparent=(GstDecodeBin) decodebin0";
        subgraph cluster_qtdemux0_0x7f5a0807e140_sink {
          label="";
          style="invis";
          qtdemux0_0x7f5a0807e140_sink_0x55726e1fd160 [color=black, fillcolor="#aaaaff", label="sink\n[<][bfb][T]", height="0.2", style="filled,solid"];
        }

        subgraph cluster_qtdemux0_0x7f5a0807e140_src {
          label="";
          style="invis";
          qtdemux0_0x7f5a0807e140_video_0_0x55726e1fdaa0 [color=black, fillcolor="#ffaaaa", label="video_0\n[>][bfb]", height="0.2", style="filled,dotted"];
          qtdemux0_0x7f5a0807e140_audio_0_0x7f59fc00ad60 [color=black, fillcolor="#ffaaaa", label="audio_0\n[>][bfb]", height="0.2", style="filled,dotted"];
        }

        qtdemux0_0x7f5a0807e140_sink_0x55726e1fd160 -> qtdemux0_0x7f5a0807e140_video_0_0x55726e1fdaa0 [style="invis"];
        fillcolor="#aaffaa";
      }

      qtdemux0_0x7f5a0807e140_video_0_0x55726e1fdaa0 -> multiqueue0_0x7f59fc00d060_sink_0_0x55726e1fdcf0 [label="video/x-h264\l       stream-format: avc\l           alignment: au\l               level: 4.2\l             profile: high\l          codec_data: 0164002affe10018676400...\l               width: 1920\l              height: 1080\l  pixel-aspect-ratio: 1/1\l"]
      qtdemux0_0x7f5a0807e140_audio_0_0x7f59fc00ad60 -> multiqueue0_0x7f59fc00d060_sink_1_0x7f59fc00afb0 [label="audio/mpeg\l         mpegversion: 4\l              framed: true\l       stream-format: raw\l               level: 2\l        base-profile: lc\l             profile: lc\l          codec_data: 1190\l                rate: 48000\l            channels: 2\l"]
      subgraph cluster_typefind_0x55726f3c40b0 {
        fontname="Bitstream Vera Sans";
        fontsize="8";
        style="filled,rounded";
        color=black;
        label="GstTypeFindElement\ntypefind\n[>]\nparent=(GstDecodeBin) decodebin0\ncaps=video/quicktime, variant=(string)iso";
        subgraph cluster_typefind_0x55726f3c40b0_sink {
          label="";
          style="invis";
          typefind_0x55726f3c40b0_sink_0x55726e1fccc0 [color=black, fillcolor="#aaaaff", label="sink\n[<][bfb][t]", height="0.2", style="filled,solid"];
        }

        subgraph cluster_typefind_0x55726f3c40b0_src {
          label="";
          style="invis";
          typefind_0x55726f3c40b0_src_0x55726e1fcf10 [color=black, fillcolor="#ffaaaa", label="src\n[<][bfb]", height="0.2", style="filled,solid"];
        }

        typefind_0x55726f3c40b0_sink_0x55726e1fccc0 -> typefind_0x55726f3c40b0_src_0x55726e1fcf10 [style="invis"];
        fillcolor="#aaffaa";
      }

      _proxypad0_0x55726d7287b0 -> typefind_0x55726f3c40b0_sink_0x55726e1fccc0 [label="ANY"]
      typefind_0x55726f3c40b0_src_0x55726e1fcf10 -> qtdemux0_0x7f5a0807e140_sink_0x55726e1fd160 [labeldistance="10", labelangle="0", label="                                                  ", taillabel="ANY", headlabel="video/quicktime\lvideo/mj2\laudio/x-m4a\lapplication/x-3gp\l"]
    }

    decodebin0_0x55726f06c090_src_0_0x7f5a080320a0 -> _proxypad4_0x55726d729d10 [label="video/x-raw(memory:NVMM)\l              format: NV12\l               width: 1920\l              height: 1080\l      interlace-mode: progressive\l      multiview-mode: mono\l     multiview-flags: 0:ffffffff:/right-view...\l  pixel-aspect-ratio: 1/1\l         chroma-site: mpeg2\l         colorimetry: bt709\l           framerate: 20000/333\l"]
    decodebin0_0x55726f06c090_src_1_0x7f5a08032b20 -> _proxypad5_0x7f5a0032c130 [label="audio/x-raw\l              format: F32LE\l              layout: non-interleaved\l                rate: 48000\l            channels: 2\l        channel-mask: 0x0000000000000003\l"]
    subgraph cluster_source_0x55726e7243e0 {
      fontname="Bitstream Vera Sans";
      fontsize="8";
      style="filled,rounded";
      color=black;
      label="GstFileSrc\nsource\n[>]\nparent=(GstURIDecodeBin) uri\nlocation=\"/home/ricardo/workSpace/gstreamer-example/ai_integration/test.mp4\"";
      subgraph cluster_source_0x55726e7243e0_src {
        label="";
        style="invis";
        source_0x55726e7243e0_src_0x55726e1fca70 [color=black, fillcolor="#ffaaaa", label="src\n[<][bfb]", height="0.2", style="filled,solid"];
      }

      fillcolor="#ffaaaa";
    }

    source_0x55726e7243e0_src_0x55726e1fca70 -> decodebin0_0x55726f06c090_sink_0x55726f06e0f0 [label="ANY"]
  }

  uri_0x55726d728060_src_0_0x55726f06eaf0 -> tee0_0x55726d72e000_sink_0x55726d730140 [label="video/x-raw(memory:NVMM)\l              format: NV12\l               width: 1920\l              height: 1080\l      interlace-mode: progressive\l      multiview-mode: mono\l     multiview-flags: 0:ffffffff:/right-view...\l  pixel-aspect-ratio: 1/1\l         chroma-site: mpeg2\l         colorimetry: bt709\l           framerate: 20000/333\l"]
}


================================================
FILE: application_develop/GstPadProbe/CMakeLists.txt
================================================
# created by Ricardo Lu on 08/29/2021

cmake_minimum_required(VERSION 3.10)

project(GstPadProbe)

set(CMAKE_CXX_STANDARD 11)

set(OpenCV_DIR "/opt/thundersoft/opencv-4.2.0/lib/cmake/opencv4")
find_package(OpenCV REQUIRED)

include(FindPkgConfig)
pkg_check_modules(GST    REQUIRED gstreamer-1.0)
pkg_check_modules(GSTAPP REQUIRED gstreamer-app-1.0)
pkg_check_modules(GLIB   REQUIRED glib-2.0)
pkg_check_modules(GFLAGS REQUIRED gflags)

include_directories(
    ${PROJECT_SOURCE_DIR}/inc
    ${GST_INCLUDE_DIRS}
    ${GSTAPP_INCLUDE_DIRS}
    ${GLIB_INCLUDE_DIRS}
    ${GFLAGS_INCLUDE_DIRS}
    ${OpenCV_INCLUDE_DIRS}
)

link_directories(
    ${GST_LIBRARY_DIRS}
    ${GSTAPP_LIBRARY_DIRS}
    ${GLIB_LIBRARY_DIRS}
    ${GFLAGS_LIBRARY_DIRS}
    ${OpenCV_LIBRARY_DIRS}
)

OPTION(COMPILE_FILE_SOURCE "build filesrc" OFF)
OPTION(COMPILE_RTSP_SOURCE "build rtspsrc" OFF)

if(COMPILE_FILE_SOURCE)
add_definitions(-DFILE_SOURCE)
endif(COMPILE_FILE_SOURCE)

if(COMPILE_RTSP_SOURCE)
add_definitions(-DRTSP_SOURCE)
endif(COMPILE_RTSP_SOURCE)

add_executable(${PROJECT_NAME}
    src/VideoPipeline.cpp
    src/main.cpp
)

target_link_libraries(${PROJECT_NAME}
    ${GST_LIBRARIES}
    ${GSTAPP_LIBRARIES}
    ${GLIB_LIBRARIES}
    ${GFLAGS_LIBRARIES}
    ${OpenCV_LIBRARIES}
    qtimlmeta
)

================================================
FILE: application_develop/GstPadProbe/README.md
================================================
# GstPadProbe

Data flow, events, and queries passing over a GstPad can be monitored with probes. Probes are installed with `gst_pad_add_probe()` and give developers another way to access the data flowing through a GStreamer pipeline.

**Tutorial: [GstPadProbe](https://ricardolu.gitbook.io/gstreamer/application-development/gstpadprobe)**

References:

- [GstPad](https://gstreamer.freedesktop.org/documentation/gstreamer/gstpad.html)
- [Basic tutorial 7: Multithreading and Pad Availability](https://gstreamer.freedesktop.org/documentation/tutorials/basic/multithreading-and-pad-availability.html?gi-language=c)
- [Buffers not writable after tee](https://gitlab.freedesktop.org/gstreamer/gstreamer/-/issues/609)

## build & run

```shell
mkdir build
cd build

cmake .. -DCOMPILE_RTSP_SOURCE=ON -DCOMPILE_FILE_SOURCE=OFF
make
# rtspsrc
./GstPadProbe --srcuri rtsp://admin:1234@10.0.23.227:554

cmake .. -DCOMPILE_RTSP_SOURCE=OFF -DCOMPILE_FILE_SOURCE=ON
make
# filesrc
./GstPadProbe --srcuri /user/local/gstreamer-example/application_develop/video.mp4
```



================================================
FILE: application_develop/GstPadProbe/doc/gstpadprobe.md
================================================
# GstPadProbe

The [GStreamer-APP](https://ricardolu.gitbook.io/gstreamer/application-development/app) chapter covered one way for an application to exchange data with a GStreamer pipeline: in its example, `appsink` pulled image data out of the pipeline for drawing, and the annotated images were fed back into the pipeline through `appsrc`. This is the simplest architecture for an application built on the GStreamer framework. Note, however, that the `appsink` and `appsrc` there actually belong to two separate pipelines, which is quite cumbersome to work with. In this tutorial I will show how to achieve the same goal with an example pipeline similar to the one in [Basic tutorial 7: Multithreading and Pad Availability](https://ricardolu.gitbook.io/gstreamer/basic-theory/basic-tutorial-7-multithreading-and-pad-availability).

## GstPadProbe

GstElements are actually connected through GstPads, which are very lightweight, generic link points. Data is passed from one GstPad to the next, so a pad is a natural place to observe or intercept the stream.

### gst_pad_add_probe()

```c
gulong
gst_pad_add_probe (GstPad * pad,
                   GstPadProbeType mask,
                   GstPadProbeCallback callback,
                   gpointer user_data,
                   GDestroyNotify destroy_data)
```

Registers a callback to be notified of activity on the pad: the provided callback is invoked for every pad state that matches the mask.

`pad`: the GstPad to attach the probe to

`mask`: the probe mask; see [GstPadProbeType](https://gstreamer.freedesktop.org/documentation/gstreamer/gstpad.html?gi-language=c#GstPadProbeType) for details

`callback`: the callback function pointer

`user_data`: user data passed to the callback

`destroy_data`: a `GDestroyNotify` invoked on `user_data` when the probe is removed (may be `NULL`)

The return value is an unsigned long id identifying the probe; pass it to `gst_pad_remove_probe()` to remove the probe later.

### GstPadProbeCallback

```c
GstPadProbeReturn
(*GstPadProbeCallback) (GstPad * pad,
                        GstPadProbeInfo * info,
                        gpointer user_data)
```

The probe callback invoked for the matching pad state; it may modify the data pointed to by `info`.

### GstPadProbeInfo

```c
struct _GstPadProbeInfo
{
  GstPadProbeType type;
  gulong id;
  gpointer data;
  guint64 offset;
  guint size;

  /*< private >*/
  union {
    gpointer _gst_reserved[GST_PADDING];
    struct {
      GstFlowReturn flow_ret;
    } abi;
  } ABI;
};
```

`data` has a different type depending on the probe type: you can operate on the `data` pointer directly, or use the accessors that GstPadProbeInfo provides. A commonly used one is `gst_pad_probe_info_get_buffer()`, which returns the GstBuffer currently passing through the pad.

## Pipeline

### Overview

![SampleFrame](images/gstpadprobe/SampleFrame.png)

在[GStreamer-APP](https://ricardolu.gitbook.io/gstreamer/application-development/app)教程的实例中,我们通过`appsink`将GstBuffer传递到用户空间然后使用OpenCV绘制了一个红色的矩形框和appsink字符串,并且通过`appsrc`的回调中绘制了一个绿色的矩形框和appsrc字符串,最后将绘制后的cv::Mat转为GstBuffer送回pipeline中并用`waylandsink`显示在屏幕上。

开头说过,`appsink`和`appsrc`各为一条pipeline,为了程序的正常运行,需要用户自行维护两条pipeline的数据同步,这是一个令人头疼的问题,并且为了画图总共发生了两次内存拷贝,这在应用中将占用一部分CPU性能。在本教程中,我们通过在`queue0`的`src pad`中注册一个`GST_PAD_PROBE_TYPE_BUFFER`类型的probe回调,取出经过`queue0`的GstBuffer并将要绘制的内容直接加到该buffer的metadata中,使用`qtioverlay`完成了相关内容的绘制。

### qtioverlay

`qtioverlay` is an overlay plugin on Qualcomm platforms. It reads the buffer's `metadata` and calls the C2D library to draw bounding boxes and simple bbox text on `NV12` images. To support changing the overlay color dynamically, I added a `meta-color` property to it; for the details of the modification and its usage, see [Qualcomm-gst-plugin: qtioverlay](https://ricardolu.gitbook.io/gstreamer/qualcomm-gstreamer-plugins/qtioverlay).

**Note:** if you only need to draw rectangles on `NV12` images, the [draw-yuv-rectangle](https://github.com/gesanqiu/draw-rectangle-on-YUV) library implements the same functionality.

## Issue

### tee's request pads

[Basic tutorial 7: Multithreading and Pad Availability](https://ricardolu.gitbook.io/gstreamer/basic-theory/basic-tutorial-7-multithreading-and-pad-availability) uses `gst_element_request_pad_simple()` to request a generated `src` pad from `tee`, and `gst_element_release_request_pad()` to release the requested GstPad.

```c
GstPad *
gst_element_request_pad_simple (GstElement * element,
                                const gchar * name)
```

Note, however, that `gst_element_request_pad_simple()` was only introduced in GStreamer 1.20; on older versions you should use `gst_element_get_request_pad()` instead.

### GstBuffer isn't writable

- `cb_queue0_probe()`

```shell
(GstPadProbe:9069): GStreamer-CRITICAL **: 14:05:03.871: gst_buffer_add_meta: assertion 'gst_buffer_is_writable (buffer)' failed
```

1. Wait for GstBuffer synchronization

   The `tee` issue [Buffers not writable after tee](https://gitlab.freedesktop.org/gstreamer/gstreamer/-/issues/609) notes:

   > If there are multiple references to a single buffer, writing while another thread may be reading results in data corruption.

   The crux is that the threads of the pipeline's branches each hold a reference to one and the same buffer, which presumes that `tee` only does a shallow copy rather than a deep copy. The official description of `tee` is rather vague: it only says `Split data to multiple pads.`, using "split" rather than "copy" or "reference", so I am not entirely sure about `tee`'s underlying mechanism.

   If we assume `tee` merely increments the reference count, the display branch and the appsink branch are using different references to the same GstBuffer. So when `cb_queue0_probe()` requests access to the probed buffer, `qtivtransform` may be reading or writing that very buffer at the same moment; for thread safety it is naturally locked, and the probed buffer is therefore not writable.

   My solution is to add another probe on the `src` pad of `qtivtransform`: when a GstBuffer arrives at that pad, `qtivtransform` has finished its work on it, so that probe unlocks and notifies `cb_queue0_probe()` that it may fetch the buffer and operate on it.

```c
    // sync
    if (info->type & GST_PAD_PROBE_TYPE_BUFFER && !vp->isExited) {
        g_mutex_lock (&vp->m_syncMuxtex);
        while (g_atomic_int_get (&vp->m_syncCount) <= 0)
            g_cond_wait (&vp->m_syncCondition, &vp->m_syncMuxtex);
        if (!g_atomic_int_dec_and_test (&vp->m_syncCount)) {
            //LOG_INFO_MSG ("m_syncCount:%d/%d", vp->m_syncCount,
            //    vp->pipeline_id_);
        }
        g_mutex_unlock (&vp->m_syncMuxtex);
    }

    // osd the result
    if (vp->m_getResultFunc) {
        const std::shared_ptr<cv::Rect> result =
            vp->m_getResultFunc (vp->m_getResultArgs);
        if (result && vp->m_procDataFunc) {
            vp->m_procDataFunc (buffer, result);
        }
    }
```

2. `gst_buffer_make_writable()`

   The GstBuffer documentation shows another option: `gst_buffer_make_writable()` copies the buffer if necessary so that it becomes writable. If the buffer is already writable, the call simply returns it and no copy takes place, so the extra performance cost is small.

   After operating on the buffer, pass it on with `gst_pad_push()` to the `sink` pad of the next element linked to this `src` pad.

   ```c++
       buffer = gst_buffer_make_writable (buffer);
   
       // osd the result
       if (vp->m_getResultFunc) {
           const std::shared_ptr<cv::Rect> result =
               vp->m_getResultFunc (vp->m_getResultArgs);
           if (result && vp->m_procDataFunc) {
               vp->m_procDataFunc (buffer, result);
           }
       }
   
       gst_pad_push (pad, buffer);
   ```

## Summary

By now I believe readers are able to build pipelines of their own. For an embedded developer, performance is always the first goal, so in practice a pipeline's architecture needs to be weighed and optimized repeatedly. The GStreamer-APP and GstPadProbe examples should already have given you a basic feel for optimization. For more on architecture optimization, I encourage you to read the READMEs of the following three repos in order; together they record the complete journey, from birth to refinement, of a yolov3 object-detection video application I built on the GStreamer framework:

[Ericsson-Yolov3-SNPE](https://github.com/gesanqiu/Ericsson-Yolov3-SNPE)

[Gst-AIDemo-Optimize](https://github.com/gesanqiu/Gst-AIDemo-Optimize)

[yolov3-thread-pool](https://github.com/gesanqiu/yolov3-thread-pool)

I hope they give you some inspiration.



================================================
FILE: application_develop/GstPadProbe/inc/Common.h
================================================
/*
 * @Description: Common Utils.
 * @version: 1.0
 * @Author: Ricardo Lu<shenglu1202@163.com>
 * @Date: 2021-08-27 12:24:25
 * @LastEditors: Ricardo Lu
 * @LastEditTime: 2021-08-29 13:31:00
 */
#pragma once

#include <iostream>
#include <string>
#include <memory>
#include <functional>
#include <unistd.h>

#include <opencv2/opencv.hpp>
#include <gst/gst.h>
#include <gst/app/app.h>

#define LOG_ERROR_MSG(msg, ...)  \
    g_print("** ERROR: <%s:%s:%d>: " msg "\n", __FILE__, __func__, __LINE__, ##__VA_ARGS__)

#define LOG_INFO_MSG(msg, ...) \
    g_print("** INFO:  <%s:%s:%d>: " msg "\n", __FILE__, __func__, __LINE__, ##__VA_ARGS__)

#define LOG_WARN_MSG(msg, ...) \
    g_print("** WARN:  <%s:%s:%d>: " msg "\n", __FILE__, __func__, __LINE__, ##__VA_ARGS__)

// callback functions
typedef std::function<void(std::shared_ptr<cv::Mat>, void*)> SinkPutDataFunc;

typedef std::function<std::shared_ptr<cv::Mat>(void*)>       SrcGetDataFunc;

typedef std::function<std::shared_ptr<cv::Rect>(void*)>      ProbeGetResultFunc;

typedef std::function<void(GstBuffer*,
            const std::shared_ptr<cv::Rect>&)> ProcDataFunc;


================================================
FILE: application_develop/GstPadProbe/inc/DoubleBufferCache.h
================================================
/*
 * @Description: Double Buffer Cache Implement.
 * @version: 1.0
 * @Author: Ricardo Lu<shenglu1202@163.com>
 * @Date: 2021-08-29 08:51:01
 * @LastEditors: Ricardo Lu
 * @LastEditTime: 2021-08-29 12:39:35
 */
#pragma once

#include <mutex>
#include <atomic>
#include <memory>
#include <list>

/** @brief Shared-buffer cache manager.
 *
 * */
template<typename T>
class DoubleBufCache {
public:
    /** @brief constructor
     * @param[in] notify_func Callback invoked whenever a new buffer is fed.
     * */
    DoubleBufCache(std::function<bool()> notify_func =
            std::function<bool()>{nullptr}) noexcept : swap_ready(false) {
        this->notify_func = notify_func;
    }

    /** @brief destructor
     * */
    ~DoubleBufCache() noexcept {
        if (!debug_info.empty()) {
            printf("DoubleBufCache %s destroyed.\n", debug_info.c_str());
        }
    }

    /** @brief Put the latest buffer into cache queue to be processed.
     *
     * Giving up control of previous front buffer.
     * @param[in] pending The latest buffer.
     * */
    void feed(std::shared_ptr<T> pending) {
        if (nullptr == pending.get()) {
            throw "ERROR: feed an empty buffer to DoubleBufCache";
        }
        swap_mtx.lock();
        front_sp = pending;
        swap_mtx.unlock();
        swap_ready = true;
        if (notify_func) {
            notify_func();
        }
        return;
    }

    /** @brief Get the front buffer.
     * @return Front buffer.
     * */
    std::shared_ptr<T> front()  noexcept {
        return front_sp;
    }

    /** @brief Fetch the shared back buffer.
     * @return Back buffer.
     * */
    std::shared_ptr<T> fetch()  noexcept {
        if (swap_ready) {
            swap_mtx.lock();
            back_sp = front_sp;
            swap_mtx.unlock();
            swap_ready = false;
        }
        return back_sp;
    }

private:
    //! Notification function will be called, if a new buffer fed.
    std::function<bool()> notify_func;
    //! The buffer cache can be swapped if the flag is equal to true.
    std::atomic<bool> swap_ready;
    //! Swapping mutex lock for thread safety.
    std::mutex swap_mtx;
    //! Front buffer for previous results saving.
    std::shared_ptr<T> front_sp;
    //! Back buffer to be fetched.
    std::shared_ptr<T> back_sp;
public:
    //! Indicate the name of an instantiated object for debug.
    std::string debug_info;
};

================================================
FILE: application_develop/GstPadProbe/inc/VideoPipeline.h
================================================
/*
 * @Description: GstPipeline common header.
 * @version: 1.0
 * @Author: Ricardo Lu<shenglu1202@163.com>
 * @Date: 2021-08-27 08:11:39
 * @LastEditors: Ricardo Lu
 * @LastEditTime: 2021-09-10 03:26:45
 */
#pragma once

#include "Common.h"

typedef struct _VideoPipelineConfig {
    std::string src;
    /*-------------qtivtransform-------------*/
    std::string conv_format;
    int         conv_width;
    int         conv_height;
}VideoPipelineConfig;

class VideoPipeline
{
public:
    VideoPipeline     (const VideoPipelineConfig& config);
    bool Create       (void);
    bool Start        (void);
    bool Pause        (void);
    bool Resume       (void);
    void Destroy      (void);
    void SetCallbacks (SinkPutDataFunc func, void* args);
    void SetCallbacks (ProbeGetResultFunc func, void* args);
    void SetCallbacks (ProcDataFunc func, void* args);
    ~VideoPipeline    (void);

public:
    SinkPutDataFunc     m_putDataFunc;
    void*               m_putDataArgs;
    ProbeGetResultFunc  m_getResultFunc;
    void*               m_getResultArgs;
    ProcDataFunc        m_procDataFunc;
    void*               m_procDataArgs;

    unsigned long       m_queue0_probe;
    unsigned long       m_trans_sink_probe;
    unsigned long       m_trans_src_probe;

    VideoPipelineConfig m_config;
    GstElement*         m_gstPipeline;

    volatile gint       m_syncCount;
    volatile gboolean   isExited;
    GMutex              m_syncMuxtex;
    GCond               m_syncCondition;
    GMutex              m_mutex;

    GstElement*         m_source;
    GstElement*         m_qtdemux;
    GstElement*         m_rtph264depay;
    GstElement*         m_h264parse;
    GstElement*         m_decoder;
    GstElement*         m_tee;
    GstPad*             m_teeDisplayPad;
    GstPad*             m_teeAppsinkPad;
    GstElement*         m_queue0;
    GstElement*         m_qtioverlay;
    GstElement*         m_display;
    GstElement*         m_queue1;
    GstElement*         m_qtivtrans;
    GstElement*         m_capfilter;
    GstElement*         m_appsink;
};

/*
gst-launch-1.0 filesrc location=video.mp4 ! qtdemux ! h264parse ! qtivdec ! 
tee name=t1 t1. ! queue ! qtioverlay meta-color=true ! waylandsink t1. ! queue ! qtivtransform ! 
video/x-raw,format=BGR,width=1920,height=1080 ! appsink
*/

================================================
FILE: application_develop/GstPadProbe/src/VideoPipeline.cpp
================================================
/*
 * @Description: Implement of VideoPipeline.
 * @version: 1.0
 * @Author: Ricardo Lu<shenglu1202@163.com>
 * @Date: 2021-08-27 12:01:39
 * @LastEditors: Ricardo Lu
 * @LastEditTime: 2021-09-10 08:16:03
 */

#include "VideoPipeline.h"

static GstPadProbeReturn cb_sync_before_buffer_probe (
    GstPad* pad,
    GstPadProbeInfo* info,
    gpointer user_data)
{
    //LOG_INFO_MSG ("cb_sync_before_buffer_probe called");

    VideoPipeline* vp = reinterpret_cast<VideoPipeline*> (user_data);
    GstBuffer* buffer = (GstBuffer*) info->data;

    return GST_PAD_PROBE_OK;
}

static GstPadProbeReturn cb_sync_buffer_probe (
    GstPad* pad,
    GstPadProbeInfo* info,
    gpointer user_data)
{
    //LOG_INFO_MSG ("cb_sync_buffer_probe called");

    VideoPipeline* vp = reinterpret_cast<VideoPipeline*> (user_data);
    GstBuffer* buffer = (GstBuffer*) info->data;

    // sync
    if (info->type & GST_PAD_PROBE_TYPE_BUFFER) {
        g_mutex_lock (&vp->m_syncMuxtex);
        g_atomic_int_inc (&vp->m_syncCount);
        g_cond_signal (&vp->m_syncCondition);
        g_mutex_unlock (&vp->m_syncMuxtex);
    }

    return GST_PAD_PROBE_OK;
}

static GstPadProbeReturn cb_queue0_probe (
    GstPad* pad, 
    GstPadProbeInfo* info,
    gpointer user_data)
{
    // LOG_INFO_MSG ("cb_queue0_probe called");

    VideoPipeline* vp = reinterpret_cast<VideoPipeline*> (user_data);
    GstBuffer* buffer = (GstBuffer*) info->data;

    // sync
    if (info->type & GST_PAD_PROBE_TYPE_BUFFER && !vp->isExited) {
        g_mutex_lock (&vp->m_syncMuxtex);
        while (g_atomic_int_get (&vp->m_syncCount) <= 0)
            g_cond_wait (&vp->m_syncCondition, &vp->m_syncMuxtex);
        if (!g_atomic_int_dec_and_test (&vp->m_syncCount)) {
            //LOG_INFO_MSG ("m_syncCount:%d/%d", vp->m_syncCount,
            //    vp->pipeline_id_);
        }
        g_mutex_unlock (&vp->m_syncMuxtex);
    }

    // osd the result
    if (vp->m_getResultFunc) {
        const std::shared_ptr<cv::Rect> result =
            vp->m_getResultFunc (vp->m_getResultArgs);
        if (result && vp->m_procDataFunc) {
            vp->m_procDataFunc (buffer, result);
        }
    }

    // LOG_INFO_MSG ("cb_osd_buffer_probe exited");

    return GST_PAD_PROBE_OK;
}

static GstFlowReturn cb_appsink_new_sample (
    GstElement* appsink,
    gpointer user_data)
{
    // LOG_INFO_MSG ("cb_appsink_new_sample called, user data: %p", user_data);

    VideoPipeline* vp = reinterpret_cast<VideoPipeline*> (user_data);
    GstSample* sample = NULL;
    GstBuffer* buffer = NULL;
    GstMapInfo map;
    const GstStructure* info = NULL;
    GstCaps* caps = NULL;
    int sample_width = 0;
    int sample_height = 0;

    g_signal_emit_by_name (appsink, "pull-sample", &sample);

    if (sample) {
        buffer = gst_sample_get_buffer (sample);
        if ( buffer == NULL ) {
            LOG_ERROR_MSG ("get buffer is null");
            goto exit;
        }

        gst_buffer_map (buffer, &map, GST_MAP_READ);

        caps = gst_sample_get_caps (sample);
        if ( caps == NULL ) {
            LOG_ERROR_MSG ("get caps is null");
            goto exit;
        }

        info = gst_caps_get_structure (caps, 0);
        if ( info == NULL ) {
            LOG_ERROR_MSG ("get info is null");
            goto exit;
        }

        // ---- Read frame and convert to opencv format ---------------
        // convert gstreamer data to OpenCV Mat, you could actually
        // resolve height / width from caps...
        gst_structure_get_int (info, "width", &sample_width);
        gst_structure_get_int (info, "height", &sample_height);

        // appsink product queue produce
        {
            // the mapped data must be valid before wrapping it in a cv::Mat
            if (map.data == NULL) {
                LOG_ERROR_MSG ("appsink buffer data empty");
                // go through exit so the buffer is unmapped and the sample
                // is unreffed instead of leaking them
                goto exit;
            }

            cv::Mat img (sample_height, sample_width, CV_8UC3,
                        (unsigned char*)map.data, cv::Mat::AUTO_STEP);
            img = img.clone();

            if (vp->m_putDataFunc) {
                vp->m_putDataFunc(std::make_shared<cv::Mat> (img),
                    vp->m_putDataArgs);
            } else {
                goto exit;
            }
        }
    }

exit:
    if (buffer) {
        gst_buffer_unmap (buffer, &map);
    }
    if (sample) {
        gst_sample_unref (sample);
    }
    return GST_FLOW_OK;
}

#ifdef RTSP_SOURCE
static void cb_rtspsrc_pad_added (
    GstElement *src, GstPad *new_pad, gpointer user_data)
{
    GstPadLinkReturn ret;
	GstCaps *new_pad_caps = NULL;
	GstStructure *new_pad_struct = NULL;
	const gchar *new_pad_type = NULL;

	VideoPipeline* vp = reinterpret_cast<VideoPipeline*> (user_data);

    GstPad* sink_pad = gst_element_get_static_pad (
                    reinterpret_cast<GstElement*> (vp->m_rtph264depay), "sink");

	LOG_INFO_MSG ("Received new pad '%s' from '%s':", GST_PAD_NAME (new_pad),
        GST_ELEMENT_NAME (src));

	/* Check the new pad's name */
	if (!g_str_has_prefix (GST_PAD_NAME(new_pad), "recv_rtp_src_")) {
		LOG_ERROR_MSG ("It is not the right pad.  Need recv_rtp_src_. Ignoring.");
		goto exit;
	}

	/* If our converter is already linked, we have nothing to do here */
	if (gst_pad_is_linked(sink_pad)) {
		LOG_ERROR_MSG (" Sink pad from %s already linked. Ignoring.\n",
            GST_ELEMENT_NAME (src));
		goto exit;
	}

	/* Check the new pad's type */
	new_pad_caps = gst_pad_query_caps(new_pad, NULL);
	new_pad_struct = gst_caps_get_structure(new_pad_caps, 0);
	new_pad_type = gst_structure_get_name(new_pad_struct);

	/* Attempt the link */
	ret = gst_pad_link(new_pad, sink_pad);
	if (GST_PAD_LINK_FAILED (ret)) {
		LOG_ERROR_MSG ("Fail to link rtspsrc and rtph264depay");
	}

exit:
	/* Unreference the new pad's caps, if we got them */
	if (new_pad_caps != NULL)
		gst_caps_unref(new_pad_caps);

	/* Unreference the sink pad */
	gst_object_unref(sink_pad);
}
#endif

#ifdef FILE_SOURCE
static void cb_qtdemux_pad_added (
    GstElement* src, GstPad* new_pad, gpointer user_data)
{
    GstPadLinkReturn ret;
    GstCaps*         new_pad_caps = NULL;
    GstStructure*    new_pad_struct = NULL;
    const gchar*     new_pad_type = NULL;

    VideoPipeline* vp = reinterpret_cast<VideoPipeline*> (user_data);

    GstPad* v_sinkpad = gst_element_get_static_pad (
                    reinterpret_cast<GstElement*> (vp->m_h264parse), "sink");

    new_pad_caps = gst_pad_get_current_caps (new_pad);
    new_pad_struct = gst_caps_get_structure (new_pad_caps, 0);
    new_pad_type = gst_structure_get_name (new_pad_struct);

    if (!g_str_has_prefix (new_pad_type, "video/x-h264")) {
        LOG_WARN_MSG ("It has type '%s' which is not raw video. Ignoring.",
            new_pad_type);
        goto exit;
    }

    /* Attempt the link */
    ret = gst_pad_link (new_pad, v_sinkpad);
    if (GST_PAD_LINK_FAILED (ret)) {
        LOG_ERROR_MSG ("Fail to link qtdemux and h264parse");
    }

exit:
    /* Unreference the new pad's caps, if we got them */
    if (new_pad_caps != NULL)
        gst_caps_unref (new_pad_caps);

    /* Unreference the sink pad */
    gst_object_unref (v_sinkpad);
}
#endif

VideoPipeline::VideoPipeline (const VideoPipelineConfig& config)
{
    m_config = config;
    m_syncCount = 0;
    isExited = false;
    m_queue0_probe = -1;
    m_trans_sink_probe = -1;
    m_trans_src_probe = -1;
    g_mutex_init (&m_syncMuxtex);
    g_cond_init  (&m_syncCondition);
    g_mutex_init (&m_mutex);
}

VideoPipeline::~VideoPipeline ()
{
    Destroy ();
}

bool VideoPipeline::Create (void)
{
    GstCaps* m_transCaps;
    GstPad *m_gstPad;

    if (!(m_gstPipeline = gst_pipeline_new ("video-pipeline"))) {
        LOG_ERROR_MSG ("Failed to create pipeline named video-pipeline");
        goto exit;
    }
    gst_pipeline_set_auto_flush_bus (GST_PIPELINE (m_gstPipeline), true);

#ifdef RTSP_SOURCE
    if (!(m_source = gst_element_factory_make ("rtspsrc", "src"))) {
        LOG_ERROR_MSG ("Failed to create element rtspsrc named src");
        goto exit;
    }
    g_object_set (G_OBJECT (m_source), "location",
            m_config.src.c_str(), NULL);
    g_signal_connect(GST_OBJECT (m_source), "pad-added",
        G_CALLBACK(cb_rtspsrc_pad_added), reinterpret_cast<void*> (this));
    gst_bin_add_many (GST_BIN (m_gstPipeline), m_source, NULL);

    if (!(m_rtph264depay = gst_element_factory_make ("rtph264depay", "depay"))) {
        LOG_ERROR_MSG ("Failed to create element rtph264depay named depay");
        goto exit;
    }
    gst_bin_add_many (GST_BIN (m_gstPipeline), m_rtph264depay, NULL);
#endif

#ifdef FILE_SOURCE
    if (!(m_source = gst_element_factory_make ("filesrc", "src"))) {
        LOG_ERROR_MSG ("Failed to create element filesrc named src");
        goto exit;
    }
    g_object_set (G_OBJECT (m_source), "location",
            m_config.src.c_str(), NULL);
    gst_bin_add_many (GST_BIN (m_gstPipeline), m_source, NULL);

    if (!(m_qtdemux = gst_element_factory_make ("qtdemux", "demux"))) {
        LOG_ERROR_MSG ("Failed to create element qtdemux named demux");
        goto exit;
    }
    // Link qtdemux with h264parse
    g_signal_connect (m_qtdemux, "pad-added",
        G_CALLBACK(cb_qtdemux_pad_added), reinterpret_cast<void*> (this));
    gst_bin_add_many (GST_BIN (m_gstPipeline), m_qtdemux, NULL);

    if (!gst_element_link_many (m_source, m_qtdemux, NULL)) {
        LOG_ERROR_MSG ("Failed to link filesrc->qtdemux");
        goto exit;
    }
#endif

    if (!(m_h264parse = gst_element_factory_make ("h264parse", "parse"))) {
        LOG_ERROR_MSG ("Failed to create element h264parse named parse");
        goto exit;
    }
    gst_bin_add_many (GST_BIN (m_gstPipeline), m_h264parse, NULL);

    if (!(m_decoder = gst_element_factory_make ("qtivdec", "decode"))) {
        LOG_ERROR_MSG ("Failed to create element qtivdec named decode");
        goto exit;
    }
    gst_bin_add_many (GST_BIN (m_gstPipeline), m_decoder, NULL);

    if (!(m_tee = gst_element_factory_make ("tee", "t1"))) {
        LOG_ERROR_MSG ("Failed to create element tee named t1");
        goto exit;
    }
    gst_bin_add_many (GST_BIN (m_gstPipeline), m_tee, NULL);

    if (!(m_queue0 = gst_element_factory_make ("queue", "queue0"))) {
        LOG_ERROR_MSG ("Failed to create element queue named queue0");
        goto exit;
    }
    gst_bin_add_many (GST_BIN (m_gstPipeline), m_queue0, NULL);

    // add probe to queue0
    m_gstPad = gst_element_get_static_pad (m_queue0, "src");
    m_queue0_probe = gst_pad_add_probe (m_gstPad, (GstPadProbeType) (
                        GST_PAD_PROBE_TYPE_BUFFER), cb_queue0_probe, 
                        reinterpret_cast<void*> (this), NULL);
    gst_object_unref (m_gstPad);

    if (!(m_qtioverlay = gst_element_factory_make ("qtioverlay", "overlay"))) {
        LOG_ERROR_MSG ("Failed to create element qtioverlay named overlay");
        goto exit;
    }
    g_object_set (G_OBJECT (m_qtioverlay), "meta-color", true, NULL);
    gst_bin_add_many (GST_BIN (m_gstPipeline), m_qtioverlay, NULL);

    if (!(m_display = gst_element_factory_make ("waylandsink", "display"))) {
        LOG_ERROR_MSG ("Failed to create element waylandsink named display");
        goto exit;
    }
    gst_bin_add_many (GST_BIN (m_gstPipeline), m_display, NULL);

#ifdef RTSP_SOURCE
    if (!gst_element_link_many (m_rtph264depay, m_h264parse, m_decoder,
            m_tee, m_queue0, m_qtioverlay, m_display, NULL)) {
        LOG_ERROR_MSG ("Failed to link rtph264depay->h264parse->qtivdec"
            "->tee->queue0->qtioverlay->waylandsink");
        goto exit;
    }
#endif

#ifdef FILE_SOURCE
    if (!gst_element_link_many (m_h264parse, m_decoder, m_tee, 
            m_queue0, m_qtioverlay, m_display, NULL)) {
        LOG_ERROR_MSG ("Failed to link h264parse->qtivdec"
            "->tee->queue0->qtioverlay->waylandsink");
        goto exit;
    }
#endif

    if (!(m_queue1 = gst_element_factory_make ("queue", "queue1"))) {
        LOG_ERROR_MSG ("Failed to create element queue named queue1");
        goto exit;
    }
    gst_bin_add_many (GST_BIN (m_gstPipeline), m_queue1, NULL);

    if (!(m_qtivtrans = gst_element_factory_make ("qtivtransform", "transform"))) {
        LOG_ERROR_MSG ("Failed to create element qtivtransform named transform");
        goto exit;
    }
    gst_bin_add_many (GST_BIN (m_gstPipeline), m_qtivtrans, NULL);

    m_transCaps = gst_caps_new_simple ("video/x-raw", "format", G_TYPE_STRING,
        m_config.conv_format.c_str(), "width", G_TYPE_INT, m_config.conv_width,
        "height", G_TYPE_INT, m_config.conv_height, NULL);

    if (!(m_capfilter = gst_element_factory_make("capsfilter", "capfilter"))) {
        LOG_ERROR_MSG ("Failed to create element capsfilter named capfilter");
        goto exit;
    }

    g_object_set (G_OBJECT(m_capfilter), "caps", m_transCaps, NULL);
    gst_caps_unref (m_transCaps);

    gst_bin_add_many (GST_BIN (m_gstPipeline), m_capfilter, NULL);

    m_gstPad = gst_element_get_static_pad (m_qtivtrans, "sink");
    m_trans_sink_probe = gst_pad_add_probe (m_gstPad, (GstPadProbeType) (
                        GST_PAD_PROBE_TYPE_BUFFER), cb_sync_before_buffer_probe,
                        reinterpret_cast<void*> (this), NULL);
    gst_object_unref (m_gstPad);

    m_gstPad = gst_element_get_static_pad (m_qtivtrans, "src");
    m_trans_src_probe = gst_pad_add_probe (m_gstPad, (GstPadProbeType) (
                        GST_PAD_PROBE_TYPE_BUFFER), cb_sync_buffer_probe,
                        reinterpret_cast<void*> (this), NULL);
    gst_object_unref (m_gstPad);

    if (!(m_appsink = gst_element_factory_make ("appsink", "appsink"))) {
        LOG_ERROR_MSG ("Failed to create element appsink named appsink");
        goto exit;
    }

    g_object_set (m_appsink, "emit-signals", TRUE, NULL);

    g_signal_connect (m_appsink, "new-sample",
        G_CALLBACK (cb_appsink_new_sample), reinterpret_cast<void*> (this));

    gst_bin_add_many (GST_BIN (m_gstPipeline), m_appsink, NULL);

    if (!gst_element_link_many (m_tee, m_queue1, m_qtivtrans, 
            m_capfilter, m_appsink, NULL)) {
        LOG_ERROR_MSG ("Failed to link tee->queue1->"
            "qtivtransform->capfilter->appsink");
        goto exit;
    }

    return true;

exit:
    LOG_ERROR_MSG ("Failed to create video pipeline");
    return false;
}

bool VideoPipeline::Start (void)
{
    if (GST_STATE_CHANGE_FAILURE == gst_element_set_state (m_gstPipeline,
        GST_STATE_PLAYING)) {
        LOG_ERROR_MSG ("Failed to set pipeline to playing state");
        return false;
    }

    return true;
}

bool VideoPipeline::Pause (void)
{
    GstState state, pending;

    LOG_INFO_MSG ("Pause called");

    if (GST_STATE_CHANGE_ASYNC == gst_element_get_state (
        m_gstPipeline, &state, &pending, 5 * GST_SECOND / 1000)) {
        LOG_WARN_MSG ("Failed to get state of pipeline");
        return false;
    }

    if (state == GST_STATE_PAUSED) {
        return true;
    } else if (state == GST_STATE_PLAYING) {
        gst_element_set_state (m_gstPipeline, GST_STATE_PAUSED);
        gst_element_get_state (m_gstPipeline, &state, &pending,
            GST_CLOCK_TIME_NONE);
        return true;
    } else {
        LOG_WARN_MSG ("Invalid state of pipeline(%d)", state);
        return false;
    }
}

bool VideoPipeline::Resume (void)
{
    GstState state, pending;

    LOG_INFO_MSG ("Resume called");

    if (GST_STATE_CHANGE_ASYNC == gst_element_get_state (
        m_gstPipeline, &state, &pending, 5 * GST_SECOND / 1000)) {
        LOG_WARN_MSG ("Failed to get state of pipeline");
        return false;
    }

    if (state == GST_STATE_PLAYING) {
        return true;
    } else if (state == GST_STATE_PAUSED) {
        gst_element_set_state (m_gstPipeline, GST_STATE_PLAYING);
        gst_element_get_state (m_gstPipeline, &state, &pending,
            GST_CLOCK_TIME_NONE);
        return true;
    } else {
        LOG_WARN_MSG ("Invalid state of pipeline(%d)", state);
        return false;
    }
}

void VideoPipeline::Destroy (void)
{
    if (m_gstPipeline) {
        isExited = true;
        // Wake up cb_queue0_probe() in case it is blocked in g_cond_wait().
        g_mutex_lock (&m_syncMuxtex);
        g_atomic_int_inc (&m_syncCount);
        g_cond_signal (&m_syncCondition);
        g_mutex_unlock (&m_syncMuxtex);

        gst_element_set_state (m_gstPipeline, GST_STATE_NULL);

        // Remove the probes while the pads still exist: once the pipeline
        // is unreffed below, its elements (and their pads) are destroyed.
        if (m_trans_sink_probe != -1 && m_qtivtrans) {
            GstPad* gstpad = gst_element_get_static_pad (m_qtivtrans, "sink");
            if (gstpad) {
                gst_pad_remove_probe (gstpad, m_trans_sink_probe);
                gst_object_unref (gstpad);
            } else {
                LOG_ERROR_MSG ("Could not find '%s' in '%s'", "sink",
                    GST_ELEMENT_NAME (m_qtivtrans));
            }
            m_trans_sink_probe = -1;
        }

        if (m_trans_src_probe != -1 && m_qtivtrans) {
            GstPad* gstpad = gst_element_get_static_pad (m_qtivtrans, "src");
            if (gstpad) {
                gst_pad_remove_probe (gstpad, m_trans_src_probe);
                gst_object_unref (gstpad);
            } else {
                LOG_ERROR_MSG ("Could not find '%s' in '%s'", "src",
                    GST_ELEMENT_NAME (m_qtivtrans));
            }
            m_trans_src_probe = -1;
        }

        if (m_queue0_probe != -1 && m_queue0) {
            GstPad* gstpad = gst_element_get_static_pad (m_queue0, "src");
            if (gstpad) {
                gst_pad_remove_probe (gstpad, m_queue0_probe);
                gst_object_unref (gstpad);
            } else {
                LOG_ERROR_MSG ("Could not find '%s' in '%s'", "src",
                    GST_ELEMENT_NAME (m_queue0));
            }
            m_queue0_probe = -1;
        }

        // The tee's "src_%u" request pads were created implicitly while
        // linking and go away with the pipeline. Note that looping on
        // gst_element_get_request_pad() here would never terminate, since
        // every call creates a brand-new request pad.
        gst_object_unref (m_gstPipeline);
        m_gstPipeline = NULL;
    }

    g_mutex_clear (&m_mutex);
    g_mutex_clear (&m_syncMuxtex);
    g_cond_clear  (&m_syncCondition);
}

void VideoPipeline::SetCallbacks (SinkPutDataFunc func, void* args)
{
    LOG_INFO_MSG ("set pudata callback called");

    m_putDataFunc = func;
    m_putDataArgs = args;
}

void VideoPipeline::SetCallbacks (ProbeGetResultFunc func, void* args)
{
    LOG_INFO_MSG ("set getdata callback called");

    m_getResultFunc = func;
    m_getResultArgs = args;
}

void VideoPipeline::SetCallbacks (ProcDataFunc func, void* args)
{
    LOG_INFO_MSG ("set procdata callback called");

    m_procDataFunc = func;
    m_procDataArgs = args;
}

================================================
FILE: application_develop/GstPadProbe/src/main.cpp
================================================
/*
 * @Description: Test Program.
 * @version: 1.0
 * @Author: Ricardo Lu<shenglu1202@163.com>
 * @Date: 2021-08-28 09:17:16
 * @LastEditors: Ricardo Lu
 * @LastEditTime: 2021-09-10 03:27:11
 */

#include <gflags/gflags.h>
#include <sys/stat.h>
#include <iostream>
#include <sstream>
#include <qrb5165/ml-meta/ml_meta.h>

#include "VideoPipeline.h"
#include "DoubleBufferCache.h"

static GMainLoop* g_main_loop = NULL;

static bool validateSrcUri (const char* name, const std::string& value) { 
    if (!value.compare("")) {
        LOG_ERROR_MSG ("Source Uri required!");
        return false;
    }

    // for URI-style paths, e.g. file:///path or rtsp://...
    std::size_t pos = value.find("//");
    if (pos != std::string::npos) {
        std::string uri_type = value.substr(0, pos);
        std::string uri_path = value.substr(pos + 2);

        if (!uri_type.compare ("file:")) {  // make sure the file exists.
            struct stat statbuf;
            if (!stat(uri_path.c_str(), &statbuf)) {
                LOG_INFO_MSG ("Found source file: %s", uri_path.c_str());
                return true;
            }
        } else {
            return true;
        }
    }

    // for relative or plain paths
    struct stat statbuf;
    if (!stat(value.c_str(), &statbuf)) {
        LOG_INFO_MSG ("Found source file: %s", value.c_str());
        return true;
    }

    LOG_ERROR_MSG ("Invalid source uri.");
    return false;
}

DEFINE_string (srcuri, "", "uri of the video source, e.g. file:///path/to/video.mp4");
DEFINE_validator (srcuri, &validateSrcUri);

void putData (std::shared_ptr<cv::Mat> img, void* user_data)
{
    // LOG_INFO_MSG ("putData called");

    DoubleBufCache<cv::Mat>* db =
        reinterpret_cast<DoubleBufCache<cv::Mat>*> (user_data);

    std::string sinkwords ("appsink");
    cv::Point fontpos= cv::Point (100, 115);
    cv::Scalar fongcolor(150, 255, 40);
    cv::putText(*img, sinkwords, fontpos, cv::FONT_HERSHEY_COMPLEX,
                    0.8, fongcolor, 2, 0.3);
    cv::Rect rect (100, 100, 1720, 880);
    cv::Scalar rectcolor(0, 200, 0);
    cv::rectangle (*img, rect, rectcolor, 3);

    db->feed(img);
}

std::shared_ptr<cv::Mat> getData (void* user_data)
{
    // LOG_INFO_MSG ("getData called");

    DoubleBufCache<cv::Mat>* db =
        reinterpret_cast<DoubleBufCache<cv::Mat>*> (user_data);
    std::shared_ptr<cv::Mat> img;

    img = db->fetch();
    if (!img) {  // nothing has been fed into the cache yet
        return img;
    }

    std::string srcwords ("appsrc");
    cv::Point fontpos= cv::Point (1700, 970);
    cv::Scalar fongcolor(150, 255, 40);
    cv::putText(*img, srcwords, fontpos, cv::FONT_HERSHEY_COMPLEX,
                    0.8, fongcolor, 2, 0.3);
    cv::Rect rect (110, 110, 1720, 880);
    cv::Scalar rectcolor(0, 0, 200);
    cv::rectangle (*img, rect, rectcolor, 3);

    return img;
}

std::shared_ptr<cv::Rect> getResult(void* user_data)
{
    // LOG_INFO_MSG ("getResult called");
    cv::Rect rect (110, 110, 1720, 880);

    return std::make_shared<cv::Rect> (rect);
}

// draw rectangle and text on NV12 with qtioverlay meta data.
void procData(GstBuffer* buffer, const std::shared_ptr<cv::Rect>& rect)
{
    // LOG_INFO_MSG ("procData called");
    std::string osd_text ("queue0_probe");

    GstMLDetectionMeta* meta = gst_buffer_add_detection_meta(buffer);

    if (!meta) {
        LOG_ERROR_MSG ("Failed to create metadata");
        return ;
    }

    GstMLClassificationResult *box_info = (GstMLClassificationResult*)calloc(
        1, sizeof(GstMLClassificationResult));

    uint32_t label_size = osd_text.size() + 1;
    box_info->name = (char *)malloc(label_size);
    snprintf(box_info->name, label_size, "%s", osd_text.c_str());

    meta->box_info = g_slist_append (meta->box_info, box_info);

    meta->bbox_color = (200 << 24) + (0 << 16) + (0 << 8) + 0xFF;

    meta->bounding_box.x = rect->x;
    meta->bounding_box.y = rect->y;

    meta->bounding_box.width = rect->width;
    meta->bounding_box.height = rect->height;
}

int main(int argc, char* argv[])
{
    google::ParseCommandLineFlags (&argc, &argv, true);
    VideoPipelineConfig       m_vpConfig;
    VideoPipeline*            m_vp = nullptr;
    SinkPutDataFunc           m_putDataFunc;
    ProbeGetResultFunc        m_getResultFunc;
    ProcDataFunc              m_procDataFunc;
    DoubleBufCache<cv::Mat>*  m_dataBufferCache = nullptr;
    DoubleBufCache<cv::Rect>* m_resultBufferCache = nullptr;

    gst_init(&argc, &argv);

    if (!(g_main_loop = g_main_loop_new (NULL, FALSE))) {
        LOG_ERROR_MSG ("Failed to create GMainLoop object");
        goto exit;
    }

    m_vpConfig.src = FLAGS_srcuri;
    m_vpConfig.conv_format = "BGR";
    m_vpConfig.conv_width = 960;
    m_vpConfig.conv_height = 540;

    m_vp = new VideoPipeline(m_vpConfig);

    m_putDataFunc = std::bind(putData,
                            std::placeholders::_1, std::placeholders::_2);
    m_getResultFunc = std::bind(getResult, std::placeholders::_1);
    m_procDataFunc = std::bind(procData,
                            std::placeholders::_1, std::placeholders::_2);
    m_dataBufferCache = new DoubleBufCache<cv::Mat> ();
    m_resultBufferCache = new DoubleBufCache<cv::Rect> ();

    m_vp->SetCallbacks (m_putDataFunc, m_dataBufferCache);
    m_vp->SetCallbacks (m_getResultFunc, m_resultBufferCache);
    m_vp->SetCallbacks (m_procDataFunc, NULL);

    if (!m_vp->Create()) {
        LOG_ERROR_MSG ("Pipeline Create failed.");
        goto exit;
    }

    m_vp->Start();

    g_main_loop_run (g_main_loop);

exit:
    if (g_main_loop) g_main_loop_unref (g_main_loop);

    if (m_vp) {
        m_vp->Destroy();
        delete m_vp;
        m_vp = NULL;
    }

    if (m_dataBufferCache) {
        delete m_dataBufferCache;
        m_dataBufferCache = NULL;
    }

    if (m_resultBufferCache) {
        delete m_resultBufferCache;
        m_resultBufferCache = NULL;
    }

    google::ShutDownCommandLineFlags ();
    return 0;
}

================================================
FILE: application_develop/README.md
================================================
# Application Development

[![](https://img.shields.io/badge/Author-@RucardoLu-red.svg)](https://github.com/gesanqiu)![](https://img.shields.io/badge/Version-1.0.0-blue.svg)[![](https://img.shields.io/badge/license-GPL-000000.svg)](https://opensource.org/licenses/GPL-3.0/)

## Overview

As an audio/video application development framework, GStreamer ships a rapid prototyping tool, `gst-launch-1.0`, which lets developers combine existing plugins into a pipeline under a set of rules and run it. That is clearly not enough for more advanced needs: developers usually have to operate on the raw audio/video data itself, at a granularity of at least one video frame or one audio segment. This data already travels through the pipeline as a stream passed between plugins; to manipulate it we need higher access privileges and finer-grained control.

This chapter shows how a simple application is built on top of the GStreamer framework, and what such an application can achieve.

Source code for this chapter: [application-develop](https://github.com/gesanqiu/gstreamer-example/tree/main/application_develop)

Chapter contents:

- Two ways to build a pipeline: `gst_parse_launch()` and `gst_element_factory_make()`
- uridecodebin
- appsink/appsrc
- GstBufferPool
- GstPadProbe
- Custom plugins

## Development Platform
- Platform: Qualcomm® QRB5165 (Linux-Ubuntu 18.04)
- GUI: Weston (Wayland)
- Frameworks: GStreamer, OpenCV
- Third-party libraries: gflags, json-glib-1.0, glib-2.0
- Build tool: [CMake](https://ricardolu.gitbook.io/trantor/cmake-in-action)



================================================
FILE: application_develop/app/CMakeLists.txt
================================================
# created by Ricardo Lu in 08/29/2021

cmake_minimum_required(VERSION 3.10)

project(app)

set(CMAKE_CXX_STANDARD 11)

set(OpenCV_DIR "/opt/thundersoft/opencv-4.2.0/lib/cmake/opencv4")
find_package(OpenCV REQUIRED)

include(FindPkgConfig)
pkg_check_modules(GST    REQUIRED gstreamer-1.0)
pkg_check_modules(GSTAPP REQUIRED gstreamer-app-1.0)
pkg_check_modules(GLIB   REQUIRED glib-2.0)
pkg_check_modules(GFLAGS REQUIRED gflags)

include_directories(
    ${PROJECT_SOURCE_DIR}/inc
    ${GST_INCLUDE_DIRS}
    ${GSTAPP_INCLUDE_DIRS}
    ${GLIB_INCLUDE_DIRS}
    ${GFLAGS_INCLUDE_DIRS}
    ${OpenCV_INCLUDE_DIRS}
)

link_directories(
    ${GST_LIBRARY_DIRS}
    ${GSTAPP_LIBRARY_DIRS}
    ${GLIB_LIBRARY_DIRS}
    ${GFLAGS_LIBRARY_DIRS}
    ${OpenCV_LIBRARY_DIRS}
)

add_executable(${PROJECT_NAME}
    src/appsink.cpp
    src/appsrc.cpp
    src/main.cpp
)

target_link_libraries(${PROJECT_NAME}
    ${GST_LIBRARIES}
    ${GSTAPP_LIBRARIES}
    ${GLIB_LIBRARIES}
    ${GFLAGS_LIBRARIES}
    ${OpenCV_LIBRARIES}
)

================================================
FILE: application_develop/app/README.md
================================================
# app

To exchange data between an application and a GStreamer pipeline, GStreamer provides two plugins:

- [GstAppSink](https://gstreamer.freedesktop.org/documentation/applib/gstappsink.html) – an easy way for applications to extract GstSamples from a pipeline
- [GstAppSrc](https://gstreamer.freedesktop.org/documentation/applib/gstappsrc.html) – an easy way for applications to inject GstBuffers into a pipeline

This tutorial is a working example of both plugins. **Tutorial: [GStreamer-app](https://ricardolu.gitbook.io/gstreamer/application-development/app)**

## Build & Run

```shell
cmake -H. -Bbuild/
cd build
make

./app --srcuri ../video.mp4
# you will see one green rectangle named appsink
# and one red rectangle named appsrc on your video
```



================================================
FILE: application_develop/app/doc/app.md
================================================
# App

To exchange data between an application and a GStreamer pipeline, GStreamer provides two plugins:

- [GstAppSink](https://gstreamer.freedesktop.org/documentation/applib/gstappsink.html) – an easy way for applications to extract GstSamples from a pipeline
- [GstAppSrc](https://gstreamer.freedesktop.org/documentation/applib/gstappsrc.html) – an easy way for applications to inject GstBuffers into a pipeline
- **Github: [gstreamer-app](https://github.com/gesanqiu/gstreamer-example/tree/main/application_develop/app)**



================================================
FILE: application_develop/app/doc/appsink.md
================================================
# Appsink

appsink is a sink plugin that offers several ways for an application to obtain a handle on the pipeline's data. Unlike most plugins, besides action signals appsink also provides a set of external functions, `gst_app_sink_<function_name>()`, for data exchange and for setting appsink properties dynamically (requires linking against `libgstapp.so`).

## Properties

### emit-signals

appsink's `emit-signals` property defaults to false; set it to true if the `new-preroll` and `new-sample` signals should be emitted.

### caps

The caps property declares which data formats appsink can accept. Unlike `appsrc`, whose caps must be set so it can be linked to downstream plugins, appsink's caps property is optional: appsink works in units of GstSample, and the underlying GstCaps can be read directly from a GstSample via `gst_sample_get_caps()`.

## signals

- `eos`: end-of-stream signal, emitted from the streaming thread.
- `new-preroll`: a preroll sample is available; emitted from the streaming thread only when the `emit-signals` property is `true`.
  - `preroll`: a sink element can only complete the transition to `PAUSED` once a buffer has arrived on its pad; this process is called preroll. Filling the pipeline with buffers (prerolling) is necessary so that the transition to `PLAYING` completes as quickly as possible, avoiding any visible delay for the user. Preroll is also critical for audio/video synchronization, and it guarantees that no buffer is dropped by the sink element.
- `new-sample`: a new sample is available; emitted from the streaming thread only when the `emit-signals` property is `true`.

## GST_APP_API

### GstAppSinkCallbacks

```c
typedef struct {
  void          (*eos)              (GstAppSink *appsink, gpointer user_data);
  GstFlowReturn (*new_preroll)      (GstAppSink *appsink, gpointer user_data);
  GstFlowReturn (*new_sample)       (GstAppSink *appsink, gpointer user_data);

  /*< private >*/
  gpointer     _gst_reserved[GST_PADDING];
} GstAppSinkCallbacks;
```

- `*eos`: pointer to the callback invoked on the `eos` signal
- `*new_preroll`: pointer to the callback invoked on the `new_preroll` signal
- `*new_sample`: pointer to the callback invoked on the `new_sample` signal
- `*user_data`: data the user passes through to the callbacks.

### pull-sample

Developers normally use `gst_app_sink_pull_sample()` and `gst_app_sink_pull_preroll()` to obtain a GstSample from appsink. Both calls block the thread until a GstSample becomes available or the pipeline stops playing (end-of-stream). Timeout variants are also provided: `gst_app_sink_try_pull_sample()` and `gst_app_sink_try_pull_preroll()`.

- `gst_app_sink_pull_sample`

  ```c
  GstSample *
  gst_app_sink_pull_sample (GstAppSink * appsink)
  ```

- `gst_app_sink_try_pull_sample`

  ```c
  GstSample *
  gst_app_sink_try_pull_sample (GstAppSink * appsink,
                                GstClockTime timeout)
  ```

**Note:** appsink keeps an internal queue of buffers coming out of the streaming thread. If the application does not pull samples fast enough, this queue consumes more and more memory. It is usually recommended to bound the queue length with the `max-buffers` property, combined with the `drop` property to choose between dropping frames and blocking when the queue is full, to avoid unbounded memory growth.

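The queue bounding described above is a one-call configuration fragment. A sketch only: the element pointer name `m_appsink` follows this repository's examples, and the value 4 is an arbitrary choice.

```cpp
/* Bound appsink's internal queue and drop old samples instead of
 * blocking the streaming thread when it fills up. m_appsink and the
 * value 4 are illustrative, not taken from this repository's code. */
g_object_set (G_OBJECT (m_appsink),
    "max-buffers", 4,   /* internal queue holds at most 4 samples */
    "drop", TRUE,       /* when full, discard the oldest sample   */
    NULL);
```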
## Action signals

- `pull-sample`:

  ```c
  g_signal_emit_by_name (self, "pull-sample", user_data, &ret);
  ```

  - Blocks the thread until a GstSample becomes available, EOS is received, or the appsink element changes state to `READY` or `NULL`.
  - A GstSample is returned into user_data only while appsink is in the `PLAYING` state. All newly arriving GstSamples are appended to appsink's internal queue, so the application can pull samples at its own pace, but if it does not consume fast enough the queue incurs a large memory overhead.

- `pull-preroll`

  ```c
  g_signal_emit_by_name (self, "pull-preroll", user_data, &ret);
  ```

  - Gets appsink's last preroll sample, i.e. the sample that brought appsink into the `PAUSED` state.

**Note:** if a `pull-sample` or `pull-preroll` operation returns a NULL GstSample, appsink is stopped or in the EOS state; this can be checked with `gst_app_sink_is_eos()`.

## Code Example

```c++
// using the GST_APP_API and action-signal approaches
void CreatePipeline()
{
    // ...

	if (!(m_appsink = gst_element_factory_make ("appsink", "appsink"))) {
        LOG_ERROR_MSG ("Failed to create element appsink named appsink");
        goto exit;
    }

    // equals to gst_app_sink_set_emit_signals (GST_APP_SINK_CAST (m_appsink), true);
    g_object_set (m_appsink, "emit-signals", TRUE, NULL);

    // full definition of appsink callbacks
    /*
    GstAppSinkCallbacks callbacks = {cb_appsink_eos,
                            cb_appsink_new_preroll, cb_appsink_new_sample};
    gst_app_sink_set_callbacks (GST_APP_SINK_CAST (m_appsink),
        &callbacks, reinterpret_cast<void*> (this), NULL);
    */
    g_signal_connect (m_appsink, "new-sample",
        G_CALLBACK (cb_appsink_new_sample), reinterpret_cast<void*> (this));

    gst_bin_add_many (GST_BIN (m_sinkPipeline), m_appsink, NULL);
    
    //...
}

GstFlowReturn cb_appsink_new_sample (
    GstElement* appsink,
    gpointer user_data)
{
    // LOG_INFO_MSG ("cb_appsink_new_sample called, user data: %p", user_data);

    SinkPipeline* sp = reinterpret_cast<SinkPipeline*> (user_data);
    GstSample* sample = NULL;
    GstBuffer* buffer = NULL;
    GstMapInfo map;
    const GstStructure* info = NULL;
    GstCaps* caps = NULL;
    GstFlowReturn ret = GST_FLOW_OK;
    int sample_width = 0;
    int sample_height = 0;

    // equals to gst_app_sink_pull_sample (GST_APP_SINK_CAST (appsink), sample);
    g_signal_emit_by_name (appsink, "pull-sample", &sample, &ret);
    if (ret != GST_FLOW_OK) {
        LOG_ERROR_MSG ("can't pull GstSample.");
        return ret;
    }

    if (sample) {
        buffer = gst_sample_get_buffer (sample);
        if ( buffer == NULL ) {
            LOG_ERROR_MSG ("get buffer is null");
            goto exit;
        }

        gst_buffer_map (buffer, &map, GST_MAP_READ);

        caps = gst_sample_get_caps (sample);
        if ( caps == NULL ) {
            LOG_ERROR_MSG ("get caps is null");
            goto exit;
        }

        info = gst_caps_get_structure (caps, 0);
        if ( info == NULL ) {
            LOG_ERROR_MSG ("get info is null");
            goto exit;
        }

        // -------- Read frame and convert to opencv format --------
        // convert gstreamer data to OpenCV Mat, you could actually
        // resolve height / width from caps...
        gst_structure_get_int (info, "width", &sample_width);
        gst_structure_get_int (info, "height", &sample_height);

        // customized user action
        {
            // construct a cv::Mat from the gst buffer address, then deep copy
            // sometimes an empty buffer may arrive
            if (map.data == NULL) {
                LOG_ERROR_MSG("appsink buffer data empty\n");
                goto exit;  // unmap the buffer and unref the sample
            }

            cv::Mat img (sample_height, sample_width, CV_8UC3,
                        	(unsigned char*)map.data, cv::Mat::AUTO_STEP);
            img = img.clone();

            // redirection outside operation: for decoupling use
            if (sp->m_putDataFunc) {
                sp->m_putDataFunc(std::make_shared<cv::Mat> (img),
                    sp->m_putDataArgs);
            } else {
                goto exit;
            }
        }
    }

exit:
    if (buffer) {
        gst_buffer_unmap (buffer, &map);
    }
    if (sample) {
        gst_sample_unref (sample);
    }
    return GST_FLOW_OK;
}
```

### customized user action

```c++
{
    cv::Mat img (sample_height, sample_width, CV_8UC3,
                	(unsigned char*)map.data, cv::Mat::AUTO_STEP);
    // deep copy
    img = img.clone();
}
```

The snippet above appears in the example code. To draw a box on every frame I use OpenCV interfaces, so the data in the GstBuffer has to be converted to a `cv::Mat`. The real address of a `GstBuffer`'s data is managed by the associated `GstMapInfo`, and I construct a `cv::Mat` object from the mapped address `map.data`. This construction is a shallow copy; afterwards I make a deep copy with `cv::Mat::clone()`. The deep copy is necessary because the mapping `gst_buffer_map (buffer, &map, GST_MAP_READ);` only requested `READ` access.

Why not request `WRITE` access as well? Because in practice, as soon as I also requested `WRITE` access the program printed an error (attempting to acquire write access on a non-writable buffer), and the `GstBuffer` I received was empty, with no data underneath.

```c
gboolean
gst_buffer_map (GstBuffer * buffer,
                GstMapInfo * info,
                GstMapFlags flags)
```

The API documentation of `gst_buffer_map()` explains that when the buffer you request to map is writable but its memory is not, a writable copy is automatically created and returned, and this copy replaces the read-only one in the buffer.

The writable-buffer/non-writable-memory case is tied to hardware decoding. A hardware decoder needs the corresponding hardware, and reading or writing this kind of memory usually goes through driver interfaces. In the example pipeline I use `qtivdec`, the hardware decoder on Qualcomm platforms, which relies on ION underneath. For background on ION see [ION Memory Control](https://ricardolu.gitbook.io/trantor/ion-memory-control), which describes how ION buffers are used: in short, you need to obtain the ION buffer's handle and `mmap` the ION memory into user space before you can operate on it. However, this ION buffer is allocated inside the `qtivdec` plugin, so to obtain its handle you would have to modify the `qtivdec` source, manage the handle's lifetime, and attach it to the `GstBuffer`'s `GstStructure` when the buffer is sunk.

**Note:** since I currently develop on Qualcomm platforms, for performance I choose hardware-accelerated plugins whenever possible. This introduces factors beyond my control, but to a degree it is worth it. Although this is a tutorial, it also records my own learning process, so I write down the problems I ran into during development and how I approached them, for reference.

### Resource Release

Because the example code makes a deep copy and hands the cv::Mat object over to a smart pointer, we can manually release the GstSample once the callback finishes. Releasing the GstSample automatically releases the GstBuffer beneath it, so we only need to unmap the GstBuffer.

In everyday development you can just let appsink output the GstSample or GstBuffer directly and release it only when it is no longer needed, achieving zero copy.

================================================
FILE: application_develop/app/doc/appsrc.md
================================================
# Appsrc

Applications can insert data into a GStreamer pipeline through the appsrc plugin. Unlike most plugins, besides action signals appsrc also provides a set of external functions, `gst_app_src_<function_name>()`, for data exchange and for setting appsrc properties dynamically (requires linking against `libgstapp.so`).

## Properties

### emit-signals

appsrc's `emit-signals` property defaults to true.

### caps

Unless the buffers you push have caps that cannot be known in advance, when using appsrc you either:

- set the caps property to declare what kind of data will be produced, so that GStreamer can check at link time whether the downstream elements support it; otherwise the callback fires only once and then gets blocked inside another plugin. The caps here must be a GstCaps object.
- push samples directly with `gst_app_src_push_sample()`; the call takes over ownership of the GstSample (releasing it automatically) and uses the GstSample's GstCaps as the caps.

### max-buffers/max-bytes/max-time & block

appsrc maintains an internal data queue; the `max-buffers/max-bytes/max-time` properties bound its length. A full queue emits the `enough-data` signal, at which point the application should stop pushing data into the queue.

If the `block` property is set to true, the push-buffer family of calls blocks while the internal queue is full, until it drains.

### stream-type

### is-live

## signals

- `enough-data`: appsrc's internal data queue is full; it is recommended to stop calling `push-buffer` after this fires, until the need-data signal is triggered.
- `need-data`: appsrc wants more data; call `push-buffer` or `end-of-stream` from the callback or another thread. The callback's `length` argument is a hint: `length=-1` means appsrc can accept any number of bytes. `push-buffer` may be called repeatedly until the `enough-data` signal fires.
- `seek-data`: requires a seekable stream-type; carries an offset indicating the position of the next buffer to be pushed.

### Two working modes

- `push-mode`

  In push mode the application controls data delivery, repeatedly calling the `push-buffer/push-sample` methods until the `enough-data` signal fires. Set the queue length with the `max-buffers` property, and regulate the queue size by handling the `enough-data` and `need-data` signals to stop and resume the `push-buffer/push-sample` calls, respectively.

- `pull-mode`

  In pull mode the `push-buffer` calls are driven by the `need-data` signal.

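The push-mode flow control above can be sketched as a configuration/wiring fragment with two signal handlers toggling a feed flag. This is a hedged sketch, not code from this repository: `feed_enabled` and the handler names are hypothetical; only the signal and property names come from the appsrc API.

```cpp
/* Sketch of push-mode flow control; feed_enabled and the handler
 * names are illustrative, not part of this repository's code. */
static gboolean feed_enabled = FALSE;

/* queue has room again: the push loop may resume push-buffer calls */
static void on_need_data (GstElement* appsrc, guint length, gpointer user_data)
{
    feed_enabled = TRUE;
}

/* queue is full: stop pushing until need-data fires again */
static void on_enough_data (GstElement* appsrc, gpointer user_data)
{
    feed_enabled = FALSE;
}

/* during pipeline setup: bound the queue, then wire both signals */
g_object_set (G_OBJECT (appsrc), "max-bytes", (guint64) 8 * 1024 * 1024, NULL);
g_signal_connect (appsrc, "need-data", G_CALLBACK (on_need_data), NULL);
g_signal_connect (appsrc, "enough-data", G_CALLBACK (on_enough_data), NULL);
```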
## GST_APP_API

### GstAppSrcCallbacks

```c
typedef struct {
  void      (*need_data)    (GstAppSrc *src, guint length, gpointer user_data);
  void      (*enough_data)  (GstAppSrc *src, gpointer user_data);
  gboolean  (*seek_data)    (GstAppSrc *src, guint64 offset, gpointer user_data);

  /*< private >*/
  gpointer     _gst_reserved[GST_PADDING];
} GstAppSrcCallbacks;
```

### push-buffer

- `gst_app_src_push_buffer`

  ```c
  GstFlowReturn
  gst_app_src_push_buffer (GstAppSrc * appsrc,
                           GstBuffer * buffer)
  ```

  - push-buffer only inserts the data into appsrc's internal queue; it is not responsible for transporting the buffer.
  - The API takes over ownership of the GstBuffer and releases it automatically.

### Action signals

- `end-of-stream`: signals that appsrc has no more buffers available

- `push-buffer`:

  ```c
  g_signal_emit_by_name (self, "push-buffer", buffer, user_data, &ret);
  ```

  - Adds the GstBuffer to appsrc's src pad. The signal does not take ownership of the GstBuffer, so it must be released manually.

- `push-sample`:

  ```c
  g_signal_emit_by_name (self, "push-sample", sample, user_data, &ret);
  ```

  - Adds the GstBuffer inside the GstSample to appsrc's src pad. If the GstSample's GstCaps do not match appsrc's current caps, the GstSample's GstCaps are also set as appsrc's caps property. The signal does not take ownership of the GstSample, so it must be released manually.

## Code Example

```c++
// using the GST_APP_API and action-signal approaches
void CreatePipeline()
{
    // ...

    if (!(m_appsrc = gst_element_factory_make ("appsrc", "appsrc"))) {
        LOG_ERROR_MSG ("Failed to create element appsrc named appsrc");
        goto exit;
    }

    m_transCaps = gst_caps_new_simple ("video/x-raw", "format", G_TYPE_STRING,
        m_config.src_format.c_str(), "width", G_TYPE_INT, m_config.src_width,
          "height", G_TYPE_INT, m_config.src_height, NULL);
    // equals to gst_app_src_set_caps (GST_APP_SRC_CAST (m_appsrc), m_transCaps);
    g_object_set (G_OBJECT(m_appsrc), "caps", m_transCaps, NULL);
    gst_caps_unref (m_transCaps); 

    // equals to gst_app_src_set_stream_type (GST_APP_SRC_CAST (m_appsrc),
    //             GST_APP_STREAM_TYPE_STREAM);
    g_object_set (G_OBJECT(m_appsrc), "stream-type",
        GST_APP_STREAM_TYPE_STREAM, NULL);

    g_object_set (G_OBJECT(m_appsrc), "is-live", true, NULL);

    // full definition of appsrc callbacks
    /*
    GstAppSrcCallbacks callbacks = {cb_appsrc_need_data,
                            cb_appsrc_enough_data, cb_appsrc_seek_data};
    gst_app_src_set_callbacks (GST_APP_SRC_CAST (m_appsrc),
        &callbacks, reinterpret_cast<void*> (this), NULL);
    */
    g_signal_connect (m_appsrc, "need-data",
        G_CALLBACK (cb_appsrc_need_data), reinterpret_cast<void*> (this));

    gst_bin_add_many (GST_BIN (m_srcPipeline), m_appsrc, NULL);

    // ...
}

GstFlowReturn cb_appsrc_need_data (
    GstElement* appsrc,
    guint length,
    gpointer user_data)
{
    // LOG_INFO_MSG ("cb_appsrc_need_data called, user_data: %p", user_data);
    SrcPipeline* sp = reinterpret_cast<SrcPipeline*> (user_data);
    GstBuffer* buffer;
    GstMapInfo map;
    GstFlowReturn ret = GST_FLOW_OK;

    std::shared_ptr<cv::Mat> img;

    if (sp->m_getDataFunc) {
        img = sp->m_getDataFunc (sp->m_getDataArgs);

        int len = img->total() * img->elemSize();
        // zero-copy alternative: wrap the cv::Mat data in a GstBuffer
        // buffer = gst_buffer_new_wrapped(img->data, len);
        buffer = gst_buffer_new_allocate (NULL, len, NULL);

        gst_buffer_map (buffer, &map, GST_MAP_WRITE);
        memcpy (map.data, img->data, len);

        GST_BUFFER_PTS (buffer) = sp->m_timestamp;
        GST_BUFFER_DURATION (buffer) = gst_util_uint64_scale_int (1, GST_SECOND, 25);
        sp->m_timestamp += GST_BUFFER_DURATION (buffer) ;

        // equals to gst_app_src_push_buffer (GST_APP_SRC_CAST (appsrc), buffer);
        g_signal_emit_by_name (appsrc, "push-buffer", buffer, &ret);
        gst_buffer_unmap(buffer, &map);
        gst_buffer_unref (buffer);

        if (ret != GST_FLOW_OK) {
            /* something wrong, stop pushing */
            LOG_ERROR_MSG ("push-buffer fail");
        }
    }

    // usleep (25 * 1000);

    return ret;
}
```

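The timestamp bookkeeping in the example above (`GST_BUFFER_PTS`/`GST_BUFFER_DURATION`) can be sketched in plain C++. The names `kNsPerSecond`, `kFps`, and `advance_pts` are illustrative, not part of the appsrc API; `kNsPerSecond` mirrors GStreamer's `GST_SECOND` (nanoseconds) and 25 fps matches the `gst_util_uint64_scale_int` call in the example.

```cpp
#include <cstdint>

constexpr uint64_t kNsPerSecond = 1000000000ull;  // GST_SECOND
constexpr int kFps = 25;                          // frame rate assumed above

// one frame lasts 1/25 s = 40 ms in nanoseconds
inline uint64_t frame_duration_ns() { return kNsPerSecond / kFps; }

// Returns the PTS for the buffer being pushed and advances the running
// timestamp by one frame duration, exactly like the push loop above.
inline uint64_t advance_pts(uint64_t& timestamp) {
    const uint64_t pts = timestamp;
    timestamp += frame_duration_ns();
    return pts;
}
```

Each pushed buffer therefore starts exactly where the previous one ended, which keeps the downstream sink's clock monotonic.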


================================================
FILE: application_develop/app/inc/Common.h
================================================
/*
 * @Description: Common Utils.
 * @version: 1.0
 * @Author: Ricardo Lu<shenglu1202@163.com>
 * @Date: 2021-08-27 12:24:25
 * @LastEditors: Ricardo Lu
 * @LastEditTime: 2021-08-29 12:10:16
 */
#pragma once

#include <iostream>
#include <string>
#include <memory>
#include <functional>
#include <unistd.h>

#include <opencv2/opencv.hpp>
#include <gst/gst.h>
#include <gst/app/app.h>

#define LOG_ERROR_MSG(msg, ...)  \
    g_print("** ERROR: <%s:%s:%d>: " msg "\n", __FILE__, __func__, __LINE__, ##__VA_ARGS__)

#define LOG_INFO_MSG(msg, ...) \
    g_print("** INFO:  <%s:%s:%d>: " msg "\n", __FILE__, __func__, __LINE__, ##__VA_ARGS__)

#define LOG_WARN_MSG(msg, ...) \
    g_print("** WARN:  <%s:%s:%d>: " msg "\n", __FILE__, __func__, __LINE__, ##__VA_ARGS__)

typedef std::function<void(std::shared_ptr<cv::Mat>, void*)> SinkPutDataFunc;
typedef std::function<std::shared_ptr<cv::Mat>(void*)>       SrcGetDataFunc;


================================================
FILE: application_develop/app/inc/DoubleBufferCache.h
================================================
/*
 * @Description: Double Buffer Cache Implement.
 * @version: 1.0
 * @Author: Ricardo Lu<shenglu1202@163.com>
 * @Date: 2021-08-29 08:51:01
 * @LastEditors: Ricardo Lu
 * @LastEditTime: 2021-08-29 12:36:32
 */
#pragma once

#include <mutex>
#include <atomic>
#include <memory>
#include <list>

/** @brief Shared-buffer cache manager.
 *
 * */
template<typename T>
class DoubleBufCache {
public:
    /** @brief constructor
     * @param[in] notify_func When a new buffer is fed, it triggers the function handle.
     * */
    DoubleBufCache(std::function<bool()> notify_func =
            std::function<bool()>{nullptr}) noexcept : swap_ready(false) {
        this->notify_func = notify_func;
    }

    /** @brief destructor
     * */
    ~DoubleBufCache() noexcept {
        if (!debug_info.empty()) {
            printf("DoubleBufCache %s destroyed.\n", debug_info.c_str());
        }
    }

    /** @brief Put the latest buffer into cache queue to be processed.
     *
     * Giving up control of previous front buffer.
     * @param[in] pending The latest buffer.
     * */
    void feed(std::shared_ptr<T> pending) {
        if (nullptr == pending.get()) {
            throw "ERROR: feed an empty buffer to DoubleBufCache";
        }
        swap_mtx.lock();
        front_sp = pending;
        swap_mtx.unlock();
        swap_ready = true;
        if (notify_func) {
            notify_func();
        }
        return;
    }

    /** @brief Get the front buffer.
     * @return Front buffer.
     * */
    std::shared_ptr<T> front()  noexcept {
        return front_sp;
    }

    /** @brief Fetch the shared back buffer.
     * @return Back buffer.
     * */
    std::shared_ptr<T> fetch()  noexcept {
        if (swap_ready) {
            swap_mtx.lock();
            back_sp = front_sp;
            swap_mtx.unlock();
            swap_ready = false;
        }
        return back_sp;
    }

private:
    //! Notification function will be called, if a new buffer fed.
    std::function<bool()> notify_func;
    //! The buffer cache can be swapped if the flag is equal to true.
    std::atomic<bool> swap_ready;
    //! Swapping mutex lock for thread safety.
    std::mutex swap_mtx;
    //! Front buffer for previous results saving.
    std::shared_ptr<T> front_sp;
    //! Back buffer to be fetched.
    std::shared_ptr<T> back_sp;
public:
    //! Indicate the name of an instantiated object for debug.
    std::string debug_info;
};

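The feed()/fetch() handoff above can be condensed into a minimal standalone sketch. The `MiniDoubleBuf` name is illustrative; the logic mirrors the class in this header, minus the notify callback and debug string.

```cpp
#include <atomic>
#include <memory>
#include <mutex>

// Minimal sketch of the double-buffer handoff: the producer overwrites
// the front slot, and the consumer copies it into the back slot only
// when a fresh buffer has arrived since the last fetch.
template <typename T>
class MiniDoubleBuf {
public:
    void feed(std::shared_ptr<T> pending) {
        std::lock_guard<std::mutex> lock(mtx_);
        front_ = std::move(pending);
        ready_ = true;                 // a swap is now pending
    }

    std::shared_ptr<T> fetch() {
        if (ready_) {
            std::lock_guard<std::mutex> lock(mtx_);
            back_ = front_;            // take the newest buffer
            ready_ = false;
        }
        return back_;                  // may repeat the previous buffer
    }

private:
    std::mutex mtx_;
    std::atomic<bool> ready_{false};
    std::shared_ptr<T> front_;
    std::shared_ptr<T> back_;
};
```

The design lets a fast producer keep overwriting the front slot while a slower consumer re-reads the last swapped buffer, which is exactly how the appsink/appsrc pair decouples their frame rates.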
================================================
FILE: application_develop/app/inc/appsink.h
================================================
/*
 * @Description: Appsink Pipeline Header.
 * @version: 1.0
 * @Author: Ricardo Lu<shenglu1202@163.com>
 * @Date: 2021-08-28 10:05:59
 * @LastEditors: Ricardo Lu
 * @LastEditTime: 2021-09-01 11:53:47
 */
#pragma once

#include "Common.h"

typedef struct _SinkPipelineConfig {
    std::string src;
    /*-------------qtivtransform-------------*/
    std::string conv_format;
    int         conv_width;
    int         conv_height;
}SinkPipelineConfig;

class SinkPipeline
{
public:
    SinkPipeline      (const SinkPipelineConfig& config);
    bool Create       (void);
    bool Start        (void);
    bool Pause        (void);
    bool Resume       (void);
    void Destroy      (void);
    void SetCallbacks (SinkPutDataFunc func, void* args);
    ~SinkPipeline     (void);

public:
    SinkPutDataFunc m_putDataFunc;
    void*           m_putDataArgs;

    SinkPipelineConfig m_config;

    GstElement* m_sinkPipeline;
    GstElement* m_source;
    GstElement* m_qtdemux;
    GstElement* m_h264parse;
    GstElement* m_decoder;
    GstElement* m_qtivtrans;
    GstElement* m_capfilter;
    GstElement* m_appsink;
};

/*
Decode Pipeline: 
    filesrc location=test.mp4 ! qtdemux ! qtivdec ! qtivtransform ! 
    video/x-raw,format=BGR,width=1920,height=1080 ! appsink
Display Pipeline: 
    appsrc stream-type=0 is-live=true caps=video/x-raw,format=BGR,width=1920,height=1080
     ! videoconvert ! video/x-raw,format=NV12,width=1920,height=1080 ! waylandsink
*/

================================================
FILE: application_develop/app/inc/appsrc.h
================================================
/*
 * @Description: Appsrc Pipeline Header.
 * @version: 1.0
 * @Author: Ricardo Lu<shenglu1202@163.com>
 * @Date: 2021-08-28 10:06:03
 * @LastEditors: Ricardo Lu
 * @LastEditTime: 2021-09-01 11:53:53
 */
#pragma once

#include "Common.h"

typedef struct _SrcPipelineConfig {
    /*--------------appsrc caps--------------*/
    std::string src_format;
    int         src_width;
    int         src_height;
    /*-------------videoconvert-------------*/
    std::string conv_format;
    int         conv_width;
    int         conv_height;
}SrcPipelineConfig;

class SrcPipeline
{
public:
    SrcPipeline       (const SrcPipelineConfig& config);
    bool Create       (void);
    bool Start        (void);
    bool Pause        (void);
    bool Resume       (void);
    void Destroy      (void);
    void SetCallbacks (SrcGetDataFunc func, void* args);
    ~SrcPipeline      (void);

public:
    SrcGetDataFunc  m_getDataFunc;
    void*           m_getDataArgs;
    uint64_t        m_timestamp;

    SrcPipelineConfig m_config;

    GstElement* m_srcPipeline;
    GstElement* m_appsrc;
    GstElement* m_videoconv;
    GstElement* m_capfilter;
    GstElement* m_display;
};

/*
Decode Pipeline: 
    filesrc location=test.mp4 ! qtdemux ! qtivdec ! qtivtransform ! 
    video/x-raw,format=BGR,width=1920,height=1080 ! appsink
Display Pipeline: 
    appsrc stream-type=0 is-live=true caps=video/x-raw,format=BGR,width=1920,height=1080
     ! videoconvert ! video/x-raw,format=NV12,width=1920,height=1080 ! waylandsink
*/

================================================
FILE: application_develop/app/src/appsink.cpp
================================================
/*
 * @Description: Appsink Pipeline Implement.
 * @version: 1.0
 * @Author: Ricardo Lu<shenglu1202@163.com>
 * @Date: 2021-08-28 09:57:03
 * @LastEditors: Ricardo Lu
 * @LastEditTime: 2021-09-01 12:41:27
 */

#include "appsink.h"

GstFlowReturn cb_appsink_new_sample (
    GstElement* appsink,
    gpointer user_data)
{
    // LOG_INFO_MSG ("cb_appsink_new_sample called, user data: %p", user_data);

    SinkPipeline* sp = reinterpret_cast<SinkPipeline*> (user_data);
    GstSample* sample = NULL;
    GstBuffer* buffer = NULL;
    GstMapInfo map;
    const GstStructure* info = NULL;
    GstCaps* caps = NULL;
    GstFlowReturn ret = GST_FLOW_OK;
    int sample_width = 0;
    int sample_height = 0;

    // equals to gst_app_sink_pull_sample (GST_APP_SINK_CAST (appsink), sample);
    g_signal_emit_by_name (appsink, "pull-sample", &sample, &ret);
    if (ret != GST_FLOW_OK) {
        LOG_ERROR_MSG ("can't pull GstSample.");
        return ret;
    }

    if (sample) {
        buffer = gst_sample_get_buffer (sample);
        if ( buffer == NULL ) {
            LOG_ERROR_MSG ("get buffer is null");
            goto exit;
        }

        gst_buffer_map (buffer, &map, GST_MAP_READ);

        caps = gst_sample_get_caps (sample);
        if ( caps == NULL ) {
            LOG_ERROR_MSG ("get caps is null");
            goto exit;
        }

        info = gst_caps_get_structure (caps, 0);
        if ( info == NULL ) {
            LOG_ERROR_MSG ("get info is null");
            goto exit;
        }

        // ---- Read frame and convert to opencv format ---------------
        // convert gstreamer data to OpenCV Mat, you could actually
        // resolve height / width from caps...
        gst_structure_get_int (info, "width", &sample_width);
        gst_structure_get_int (info, "height", &sample_height);

        // appsink product queue produce
        {
            // construct a cv::Mat from the gst buffer address, then deep copy
            if (map.data == NULL) {
                LOG_ERROR_MSG("appsink buffer data empty\n");
                goto exit;  // unmap the buffer and unref the sample
            }

            cv::Mat img (sample_height, sample_width, CV_8UC3,
                        (unsigned char*)map.data, cv::Mat::AUTO_STEP);
            img = img.clone();

            if (sp->m_putDataFunc) {
                sp->m_putDataFunc(std::make_shared<cv::Mat> (img),
                    sp->m_putDataArgs);
            } else {
                goto exit;
            }
        }
    }

exit:
    if (buffer) {
        gst_buffer_unmap (buffer, &map);
    }
    if (sample) {
        gst_sample_unref (sample);
    }
    return GST_FLOW_OK;
}

static void cb_qtdemux_pad_added (
    GstElement* src, GstPad* new_pad, gpointer user_data)
{
    GstPadLinkReturn ret;
    GstCaps*         new_pad_caps = NULL;
    GstStructure*    new_pad_struct = NULL;
    const gchar*     new_pad_type = NULL;

    SinkPipeline* vp = reinterpret_cast<SinkPipeline*> (user_data);

    GstPad* v_sinkpad = gst_element_get_static_pad (
                    reinterpret_cast<GstElement*> (vp->m_h264parse), "sink");

    new_pad_caps = gst_pad_get_current_caps (new_pad);
    new_pad_struct = gst_caps_get_structure (new_pad_caps, 0);
    new_pad_type = gst_structure_get_name (new_pad_struct);

    if (!g_str_has_prefix (new_pad_type, "video/x-h264")) {
        LOG_WARN_MSG ("It has type '%s' which is not H.264 video. Ignoring.",
            new_pad_type);
        goto exit;
    }

    /* Attempt the link */
    ret = gst_pad_link (new_pad, v_sinkpad);
    if (GST_PAD_LINK_FAILED (ret)) {
        LOG_ERROR_MSG ("fail to link qtdemux and h264parse");
    }

exit:
    /* Unreference the new pad's caps, if we got them */
    if (new_pad_caps != NULL)
        gst_caps_unref (new_pad_caps);

    /* Unreference the sink pad */
    gst_object_unref (v_sinkpad);
}

SinkPipeline::SinkPipeline (const SinkPipelineConfig& config)
{
    m_config = config;
}

SinkPipeline::~SinkPipeline ()
{
    Destroy ();
}

bool SinkPipeline::Create (void)
{
    GstCaps* m_transCaps;

    // decode pipeline
    if (!(m_sinkPipeline = gst_pipeline_new ("decode-pipeline"))) {
        LOG_ERROR_MSG ("Failed to create pipeline named decode-pipeline");
        goto exit;
    }
    gst_pipeline_set_auto_flush_bus (GST_PIPELINE (m_sinkPipeline), true);

    if (!(m_source = gst_element_factory_make ("filesrc", "src"))) {
        LOG_ERROR_MSG ("Failed to create element filesrc named src");
        goto exit;
    }
    g_object_set (G_OBJECT (m_source), "location",
            m_config.src.c_str(), NULL);
    gst_bin_add_many (GST_BIN (m_sinkPipeline), m_source, NULL);

    if (!(m_qtdemux = gst_element_factory_make ("qtdemux", "demux"))) {
        LOG_ERROR_MSG ("Failed to create element qtdemux named demux");
        goto exit;
    }
    gst_bin_add_many (GST_BIN (m_sinkPipeline), m_qtdemux, NULL);

    if (!gst_element_link_many (m_source, m_qtdemux, NULL)) {
        LOG_ERROR_MSG ("Failed to link filesrc->qtdemux");
        goto exit;
    }

    if (!(m_h264parse = gst_element_factory_make ("h264parse", "parse"))) {
        LOG_ERROR_MSG ("Failed to create element h264parse named parse");
        goto exit;
    }
    gst_bin_add_many (GST_BIN (m_sinkPipeline), m_h264parse, NULL);

    // Link qtdemux with h264parse
    g_signal_connect (m_qtdemux, "pad-added",
        G_CALLBACK(cb_qtdemux_pad_added), reinterpret_cast<void*> (this));

    if (!(m_decoder = gst_element_factory_make ("qtivdec", "decode"))) {
        LOG_ERROR_MSG ("Failed to create element qtivdec named decode");
        goto exit;
    }
    gst_bin_add_many (GST_BIN (m_sinkPipeline), m_decoder, NULL);

    if (!(m_qtivtrans = gst_element_factory_make ("qtivtransform", "transform"))) {
        LOG_ERROR_MSG ("Failed to create element qtivtransform named transform");
        goto exit;
    }
    gst_bin_add_many (GST_BIN (m_sinkPipeline), m_qtivtrans, NULL);

    m_transCaps = gst_caps_new_simple ("video/x-raw", "format", G_TYPE_STRING,
        m_config.conv_format.c_str(), "width", G_TYPE_INT, m_config.conv_width,
        "height", G_TYPE_INT, m_config.conv_height, NULL);

    if (!(m_capfilter = gst_element_factory_make("capsfilter", "capfilter"))) {
        LOG_ERROR_MSG ("Failed to create element capsfilter named capfilter");
        goto exit;
    }

    g_object_set (G_OBJECT(m_capfilter), "caps", m_transCaps, NULL);
    gst_caps_unref (m_transCaps);

    gst_bin_add_many (GST_BIN (m_sinkPipeline), m_capfilter, NULL);

    if (!(m_appsink = gst_element_factory_make ("appsink", "appsink"))) {
        LOG_ERROR_MSG ("Failed to create element appsink named appsink");
        goto exit;
    }

    // equals to gst_app_sink_set_emit_signals (GST_APP_SINK_CAST (m_appsink), true);
    g_object_set (m_appsink, "emit-signals", TRUE, NULL);

    // full definition of appsink callbacks
    /*
    GstAppSinkCallbacks callbacks = {cb_appsink_eos,
                            cb_appsink_new_preroll, cb_appsink_new_sample};
    gst_app_sink_set_callbacks (GST_APP_SINK_CAST (m_appsink),
        &callbacks, reinterpret_cast<void*> (this), NULL);
    */
    g_signal_connect (m_appsink, "new-sample",
        G_CALLBACK (cb_appsink_new_sample), reinterpret_cast<void*> (this));

    gst_bin_add_many (GST_BIN (m_sinkPipeline), m_appsink, NULL);

    if (!gst_element_link_many (m_h264parse, m_decoder, m_qtivtrans,
            m_capfilter, m_appsink, NULL)) {
        LOG_ERROR_MSG ("Failed to link h264parse->qtivdec->"
            "qtivtransform->capfilter->appsink");
        goto exit;
    }

    return true;

exit:
    LOG_ERROR_MSG ("Failed to create video pipeline");
    return false;
}

bool SinkPipeline::Start (void)
{
    if (GST_STATE_CHANGE_FAILURE == gst_element_set_state (m_sinkPipeline,
        GST_STATE_PLAYING)) {
        LOG_ERROR_MSG ("Failed to set decode pipeline to playing state");
        return false;
    }
    return true;
}

bool SinkPipeline::Pause (void)
{
    GstState state, pending;

    LOG_INFO_MSG ("Pause called");

    if (GST_STATE_CHANGE_ASYNC == gst_element_get_state (
        m_sinkPipeline, &state, &pending, 5 * GST_MSECOND)) {
        LOG_WARN_MSG ("Failed to get state of decode pipeline");
        return false;
    }

    if (state == GST_STATE_PAUSED) {
        return true;
    } else if (state == GST_STATE_PLAYING) {
        gst_element_set_state (m_sinkPipeline, GST_STATE_PAUSED);
        gst_element_get_state (m_sinkPipeline, &state, &pending,
            GST_CLOCK_TIME_NONE);
        return true;
    } else {
        LOG_WARN_MSG ("Invalid state of decode pipeline(%d)", state);
        return false;
    }
}

bool SinkPipeline::Resume (void)
{
    GstState state, pending;

    LOG_INFO_MSG ("Resume called");

    if (GST_STATE_CHANGE_ASYNC == gst_element_get_state (
        m_sinkPipeline, &state, &pending, 5 * GST_MSECOND)) {
        LOG_WARN_MSG ("Failed to get state of decode pipeline");
        return false;
    }

    if (state == GST_STATE_PLAYING) {
        return true;
    } else if (state == GST_STATE_PAUSED) {
        gst_element_set_state (m_sinkPipeline, GST_STATE_PLAYING);
        gst_element_get_state (m_sinkPipeline, &state, &pending,
            GST_CLOCK_TIME_NONE);
        return true;
    } else {
        LOG_WARN_MSG ("Invalid state of decode pipeline(%d)", state);
        return false;
    }
}
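// Note on the timeout Pause() and Resume() pass to gst_element_get_state():
// GST_SECOND is one second expressed in nanoseconds, so 5 * GST_SECOND / 1000
// evaluates to 5 milliseconds (5 * GST_MSECOND), not 5 seconds. A
// GStreamer-free sanity check of the arithmetic (constants mirror gstclock.h):

```cpp
#include <cstdint>

// Mirror GStreamer's clock constants from gstclock.h.
constexpr uint64_t NS_PER_SECOND = 1000000000ull;  // GST_SECOND
constexpr uint64_t NS_PER_MSEC   = 1000000ull;     // GST_MSECOND

// The timeout expression used by Pause()/Resume() above.
constexpr uint64_t pause_timeout_ns = 5 * NS_PER_SECOND / 1000;
```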

void SinkPipeline::Destroy (void)
{
    if (m_sinkPipeline) {
        gst_element_set_state (m_sinkPipeline, GST_STATE_NULL);

        gst_object_unref (m_sinkPipeline);

        m_sinkPipeline = NULL;
    }
}

void SinkPipeline::SetCallbacks (SinkPutDataFunc func, void* args)
{
    LOG_INFO_MSG ("sink set callback called");

    m_putDataFunc = func;
    m_putDataArgs = args;
}
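// The SetCallbacks() plumbing above is a plain producer hook: the pipeline
// stores a std::function plus an opaque args pointer, and
// cb_appsink_new_sample forwards each decoded frame through it. A minimal
// GStreamer-free sketch of the same pattern (MiniSink and the int "frame"
// are made-up stand-ins for illustration):

```cpp
#include <functional>
#include <memory>

// Stand-in for SinkPipeline: stores a user callback plus opaque args.
struct MiniSink {
    std::function<void (std::shared_ptr<int>, void*)> m_putDataFunc;
    void* m_putDataArgs = nullptr;

    void SetCallbacks (std::function<void (std::shared_ptr<int>, void*)> f,
                       void* args)
    {
        m_putDataFunc = f;
        m_putDataArgs = args;
    }

    // what cb_appsink_new_sample does once a frame is decoded
    void OnNewSample (std::shared_ptr<int> frame)
    {
        if (m_putDataFunc)
            m_putDataFunc (frame, m_putDataArgs);
    }
};

// wire it up and push one "frame" through
int callback_demo ()
{
    MiniSink sink;
    int received = 0;
    sink.SetCallbacks ([] (std::shared_ptr<int> f, void* args) {
        *static_cast<int*> (args) = *f;
    }, &received);
    sink.OnNewSample (std::make_shared<int> (42));
    return received;
}
```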

================================================
FILE: application_develop/app/src/appsrc.cpp
================================================
/*
 * @Description: Appsrc Pipeline Implement.
 * @version: 1.0
 * @Author: Ricardo Lu<shenglu1202@163.com>
 * @Date: 2021-08-28 09:57:13
 * @LastEditors: Ricardo Lu
 * @LastEditTime: 2021-09-01 12:41:48
 */

#include "appsrc.h"

GstFlowReturn cb_appsrc_need_data (
    GstElement* appsrc,
    guint length,
    gpointer user_data)
{
    // LOG_INFO_MSG ("cb_appsrc_need_data called, user_data: %p", user_data);
    
    SrcPipeline* sp = reinterpret_cast<SrcPipeline*> (user_data);
    GstBuffer* buffer;
    GstMapInfo map;
    GstFlowReturn ret = GST_FLOW_OK;

    std::shared_ptr<cv::Mat> img;

    if (sp->m_getDataFunc) {
        img = sp->m_getDataFunc (sp->m_getDataArgs);

        int len = img->total() * img->elemSize();
        // zero-copy GstBuffer
        // buffer = gst_buffer_new_wrapped(img->data, len);
        buffer = gst_buffer_new_allocate (NULL, len, NULL);

        // map for writing, then copy the frame into the new buffer
        gst_buffer_map (buffer, &map, GST_MAP_WRITE);
        memcpy (map.data, img->data, len);
        gst_buffer_unmap (buffer, &map);

        GST_BUFFER_PTS (buffer) = sp->m_timestamp;
        GST_BUFFER_DURATION (buffer) = gst_util_uint64_scale_int (1, GST_SECOND, 25);
        sp->m_timestamp += GST_BUFFER_DURATION (buffer);

        // equals to gst_app_src_push_buffer (GST_APP_SRC_CAST (appsrc), buffer);
        // the action signal does not take ownership, so unref afterwards
        g_signal_emit_by_name (appsrc, "push-buffer", buffer, &ret);
        gst_buffer_unref (buffer);

        if (ret != GST_FLOW_OK) {
            /* something wrong, stop pushing */
            LOG_ERROR_MSG ("push-buffer failed");
        }
    }

    // usleep (25 * 1000);

    return ret;
}
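// The PTS bookkeeping above advances m_timestamp by one frame duration per
// need-data call; at 25 fps, gst_util_uint64_scale_int (1, GST_SECOND, 25)
// is 40 ms per frame. A GStreamer-free check of the arithmetic (constant
// mirrors gstclock.h):

```cpp
#include <cstdint>

constexpr uint64_t NS_PER_SECOND = 1000000000ull;  // GST_SECOND

// mirrors gst_util_uint64_scale_int (1, GST_SECOND, fps)
constexpr uint64_t frame_duration_ns (int fps)
{
    return NS_PER_SECOND / fps;
}

// push n frames' worth of duration, as cb_appsrc_need_data does
constexpr uint64_t timestamp_after (int frames, int fps)
{
    return static_cast<uint64_t> (frames) * frame_duration_ns (fps);
}
```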

SrcPipeline::SrcPipeline (const SrcPipelineConfig& config)
{
    m_config = config;
    m_timestamp = 0;
}
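// The commented-out gst_buffer_new_wrapped() path in cb_appsrc_need_data
// above would avoid the memcpy, but then something must keep the cv::Mat
// alive until GStreamer is done with the buffer; gst_buffer_new_wrapped_full()
// accepts a GDestroyNotify for exactly that. A GStreamer-free sketch of the
// ownership pattern (Frame and WrapCtx are made-up names):

```cpp
#include <memory>
#include <vector>

using Frame = std::vector<unsigned char>;

// Holds a reference to the frame while the (hypothetical) wrapped
// GstBuffer is in flight.
struct WrapCtx {
    std::shared_ptr<Frame> keep_alive;
};

// What a GDestroyNotify handed to gst_buffer_new_wrapped_full() would do:
// drop the reference once the buffer is freed downstream.
void release_frame (void* user_data)
{
    delete static_cast<WrapCtx*> (user_data);
}

// simulate wrap -> push -> free and report the final use_count
long zero_copy_demo ()
{
    auto frame = std::make_shared<Frame> (640 * 480 * 3);
    auto* ctx = new WrapCtx{frame};   // the "buffer" now shares ownership
    release_frame (ctx);              // buffer freed downstream
    return frame.use_count ();        // only the producer's ref remains
}
```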

SrcPipeline::~SrcPipeline ()
{
    Destroy ();
}

bool SrcPipeline::Create (void)
{
    GstCaps* m_transCaps;

    // display pipeline
    if (!(m_srcPipeline = gst_pipeline_new ("display-pipeline"))) {
        LOG_ERROR_MSG ("Failed to create pipeline named display-pipeline");
        goto exit;
    }
    gst_pipeline_set_auto_flush_bus (GST_PIPELINE (m_srcPipeline), true);

    if (!(m_appsrc = gst_element_factory_make ("appsrc", "appsrc"))) {
        LOG_ERROR_MSG ("Failed to create element appsrc named appsrc");
        goto exit;
    }

    m_transCaps = gst_caps_new_simple ("video/x-raw", "format", G_TYPE_STRING,
        m_config.src_format.c_str(), "width", G_TYPE_INT, m_config.src_width,
        "height", G_TYPE_INT, m_config.src_height, NULL);
    // equals to gst_app_src_set_caps (GST_APP_SRC_CAST (m_appsrc), m_transCaps);
    g_object_set (G_OBJECT(m_appsrc), "caps", m_transCaps, NULL);
    gst_caps_unref (m_transCaps); 

    // equals to gst_app_src_set_stream_type (GST_APP_SRC_CAST (m_appsrc),
    //             GST_APP_STREAM_TYPE_STREAM);
    g_object_set (G_OBJECT(m_appsrc), "stream-type",
        GST_APP_STREAM_TYPE_STREAM, NULL);

    g_object_set (G_OBJECT(m_appsrc), "is-live", true, NULL);

    // full definition of appsrc callbacks
    /*
    GstAppSrcCallbacks callbacks = {cb_appsrc_need_data,
                            cb_appsrc_enough_data, cb_appsrc_seek_data};
    gst_app_src_set_callbacks (GST_APP_SRC_CAST (m_appsrc),
        &callbacks, reinterpret_cast<void*> (this), NULL);
    */
    g_signal_connect (m_appsrc, "need-data",
        G_CALLBACK (cb_appsrc_need_data), reinterpret_cast<void*> (this));

    gst_bin_add_many (GST_BIN (m_srcPipeline), m_appsrc, NULL);

    if (!(m_videoconv = gst_element_factory_make ("videoconvert", "videoconv"))) {
        LOG_ERROR_MSG ("Failed to create element videoconvert named videoconv");
        goto exit;
    }
    gst_bin_add_many (GST_BIN (m_srcPipeline), m_videoconv, NULL);

    m_transCaps = gst_caps_new_simple ("video/x-raw", "format", G_TYPE_STRING,
        m_config.conv_format.c_str(), "width", G_TYPE_INT, m_config.conv_width,
        "height", G_TYPE_INT, m_config.conv_height, NULL);

    if (!(m_capfilter = gst_element_factory_make("capsfilter", "capfilter"))) {
        LOG_ERROR_MSG ("Failed to create element capsfilter named capfilter");
        goto exit;
    }

    g_object_set (G_OBJECT(m_capfilter), "caps", m_transCaps, NULL);
    gst_caps_unref (m_transCaps); 

    gst_bin_add_many (GST_BIN (m_srcPipeline), m_capfilter, NULL);

    if (!(m_displ
SYMBOL INDEX (212 symbols across 35 files)

FILE: ai_integration/deepstream/inc/Common.h
  function class (line 24) | class OSDObject {

FILE: ai_integration/deepstream/inc/DoubleBufferCache.h
  function feed (line 50) | void feed(std::shared_ptr<T> pending) {

FILE: ai_integration/deepstream/inc/Logger.h
  function NowDateToInt (line 24) | static inline int NowDateToInt()
  function NowTimeToInt (line 35) | static inline int NowTimeToInt()
  function spdlog (line 47) | static inline spdlog::level::level_enum GetLogLevel(std::string& level)

FILE: ai_integration/deepstream/inc/VideoPipeline.h
  type VideoType (line 15) | typedef enum _VideoType {
  type VideoPipelineConfig (line 21) | typedef struct _VideoPipelineConfig {
  function class (line 59) | class VideoPipeline {

FILE: ai_integration/deepstream/src/VideoPipeline.cpp
  function GstPadProbeReturn (line 12) | static GstPadProbeReturn cb_sync_before_buffer_probe(
  function GstPadProbeReturn (line 25) | static GstPadProbeReturn cb_sync_after_buffer_probe(
  function GstPadProbeReturn (line 46) | static GstPadProbeReturn cb_queue0_probe(
  function GstFlowReturn (line 83) | static GstFlowReturn cb_appsink_new_sample(
  function gboolean (line 111) | static gboolean cb_seek_decoded_file(gpointer user_data)
  function GstPadProbeReturn (line 132) | static GstPadProbeReturn cb_reset_stream_probe(
  function cb_decodebin_child_added (line 170) | static void cb_decodebin_child_added(GstChildProxy* child_proxy, GObject...
  function cb_uridecodebin_source_setup (line 205) | static void cb_uridecodebin_source_setup(GstElement* object, GstElement*...
  function cb_uridecodebin_pad_added (line 222) | static void cb_uridecodebin_pad_added(GstElement* decodebin, GstPad* pad,
  function cb_uridecodebin_child_added (line 256) | static void cb_uridecodebin_child_added(GstChildProxy* child_proxy,
  function GstElement (line 301) | GstElement* VideoPipeline::CreateUridecodebin()
  function GstElement (line 323) | GstElement* VideoPipeline::CreateV4l2src()

FILE: ai_integration/deepstream/src/main.cpp
  function Parse (line 27) | static void Parse(VideoPipelineConfig& config, std::string& config_path)
  function validateConfigPath (line 107) | static bool validateConfigPath(const char* name, const std::string& value)
  function main (line 126) | int main(int argc, char* argv[])

FILE: application_develop/GstPadProbe/inc/Common.h
  type std (line 31) | typedef std::function<void(std::shared_ptr<cv::Mat>, void*)> SinkPutData...
  type std (line 33) | typedef std::function<std::shared_ptr<cv::Mat>(void*)>       SrcGetDataF...
  type std (line 35) | typedef std::function<std::shared_ptr<cv::Rect>(void*)>      ProbeGetRes...
  type std (line 37) | typedef std::function<void(GstBuffer*,

FILE: application_develop/GstPadProbe/inc/DoubleBufferCache.h
  function feed (line 43) | void feed(std::shared_ptr<T> pending) {

FILE: application_develop/GstPadProbe/inc/VideoPipeline.h
  type VideoPipelineConfig (line 13) | typedef struct _VideoPipelineConfig {
  function class (line 21) | class VideoPipeline

FILE: application_develop/GstPadProbe/src/VideoPipeline.cpp
  function GstPadProbeReturn (line 12) | static GstPadProbeReturn cb_sync_before_buffer_probe (
  function GstPadProbeReturn (line 25) | static GstPadProbeReturn cb_sync_buffer_probe (
  function GstPadProbeReturn (line 46) | static GstPadProbeReturn cb_queue0_probe (
  function GstFlowReturn (line 82) | static GstFlowReturn cb_appsink_new_sample (
  function cb_rtspsrc_pad_added (line 158) | static void cb_rtspsrc_pad_added (
  function cb_qtdemux_pad_added (line 209) | static void cb_qtdemux_pad_added (

FILE: application_develop/GstPadProbe/src/main.cpp
  function validateSrcUri (line 21) | static bool validateSrcUri (const char* name, const std::string& value) {
  function putData (line 58) | void putData (std::shared_ptr<cv::Mat> img, void* user_data)
  function getData (line 77) | std::shared_ptr<cv::Mat> getData (void* user_data)
  function getResult (line 99) | std::shared_ptr<cv::Rect> getResult(void* user_data)
  function procData (line 108) | void procData(GstBuffer* buffer, const std::shared_ptr<cv::Rect>& rect)
  function main (line 138) | int main(int argc, char* argv[])

FILE: application_develop/app/inc/Common.h
  type std (line 30) | typedef std::function<void(std::shared_ptr<cv::Mat>, void*)> SinkPutData...
  type std (line 31) | typedef std::function<std::shared_ptr<cv::Mat>(void*)>       SrcGetDataF...

FILE: application_develop/app/inc/DoubleBufferCache.h
  function feed (line 43) | void feed(std::shared_ptr<T> pending) {

FILE: application_develop/app/inc/appsink.h
  type SinkPipelineConfig (line 13) | typedef struct _SinkPipelineConfig {
  function class (line 21) | class SinkPipeline

FILE: application_develop/app/inc/appsrc.h
  type SrcPipelineConfig (line 13) | typedef struct _SrcPipelineConfig {
  function class (line 24) | class SrcPipeline

FILE: application_develop/app/src/appsink.cpp
  function GstFlowReturn (line 12) | GstFlowReturn cb_appsink_new_sample (
  function cb_qtdemux_pad_added (line 93) | static void cb_qtdemux_pad_added (

FILE: application_develop/app/src/appsrc.cpp
  function GstFlowReturn (line 12) | GstFlowReturn cb_appsrc_need_data (

FILE: application_develop/app/src/main.cpp
  function validateSrcUri (line 21) | static bool validateSrcUri (const char* name, const std::string& value) {
  function putData (line 47) | void putData (std::shared_ptr<cv::Mat> img, void* user_data)
  function getData (line 72) | std::shared_ptr<cv::Mat> getData (void* user_data)
  function main (line 94) | int main(int argc, char* argv[])

FILE: application_develop/build_pipeline/inc/VideoPipeline.h
  type VideoPipelineConfig (line 13) | typedef struct _VideoPipelineConfig {
  function class (line 17) | class VideoPipeline

FILE: application_develop/build_pipeline/src/VideoPipeline.cpp
  function cb_qtdemux_pad_added (line 13) | static void cb_qtdemux_pad_added (

FILE: application_develop/build_pipeline/src/gst_element_factory_make.cpp
  function validateSrcUri (line 17) | static bool validateSrcUri(const char* name, const std::string& value) {
  function main (line 36) | int main(int argc, char* argv[])

FILE: application_develop/build_pipeline/src/gst_parse_launch.cpp
  function validateSrcUri (line 19) | static bool validateSrcUri(const char* name, const std::string& value) {
  function main (line 38) | int main(int argc, char* argv[])

FILE: application_develop/custom_user_plugin/gstoverlay.c
  function gst_overlay_set_property (line 62) | static void gst_overlay_set_property(GObject *object, guint prop_id,
  function gst_overlay_get_property (line 126) | static void gst_overlay_get_property(GObject * object, guint prop_id,
  function gst_overlay_finalize (line 189) | static void gst_overlay_finalize(GObject *object)
  function gboolean (line 218) | static gboolean gst_overlay_set_info(GstVideoFilter *filter,
  function GstFlowReturn (line 267) | static GstFlowReturn gst_overlay_transform_frame_ip(GstVideoFilter *filt...
  function gst_overlay_init (line 287) | static void gst_overlay_init(GstOverlay *gst_overlay)
  function gst_overlay_class_init (line 314) | static void gst_overlay_class_init(GstOverlayClass *klass)
  function gboolean (line 378) | static gboolean plugin_init(GstPlugin * plugin)

FILE: application_develop/custom_user_plugin/gstoverlay.h
  type BufferFormat (line 26) | typedef enum _BufferFormat {
  type GstOverlay (line 31) | typedef struct _GstOverlay      GstOverlay;
  type GstOverlayClass (line 32) | typedef struct _GstOverlayClass GstOverlayClass;
  type GstOverlayText (line 33) | typedef struct _GstOverlayText  GstOverlayText;
  type GstOverlayBBox (line 34) | typedef struct _GstOverlayBBox  GstOverlayBBox;
  type _GstOverlay (line 36) | struct _GstOverlay
  type _GstOverlayClass (line 54) | struct _GstOverlayClass {
  type _GstOverlayText (line 65) | struct _GstOverlayText {
  type _GstOverlayBBox (line 79) | struct _GstOverlayBBox {

FILE: application_develop/uridecodebin/inc/Common.h
  type std (line 31) | typedef std::function<void(std::shared_ptr<cv::Mat>, void*)> SinkPutData...
  type std (line 33) | typedef std::function<std::shared_ptr<cv::Mat>(void*)>       SrcGetDataF...
  type std (line 35) | typedef std::function<std::shared_ptr<cv::Rect>(void*)>      ProbeGetRes...
  type std (line 37) | typedef std::function<void(GstBuffer*,

FILE: application_develop/uridecodebin/inc/DoubleBufferCache.h
  function feed (line 43) | void feed(std::shared_ptr<T> pending) {

FILE: application_develop/uridecodebin/inc/VideoPipeline.h
  type VideoPipelineConfig (line 13) | typedef struct _VideoPipelineConfig {
  function class (line 20) | class VideoPipeline

FILE: application_develop/uridecodebin/src/VideoPipeline.cpp
  function cb_decodebin_child_added (line 19) | static void
  function cb_uridecodebin_source_setup (line 55) | static void cb_uridecodebin_source_setup (
  function cb_uridecodebin_pad_added (line 82) | static void cb_uridecodebin_pad_added (
  function cb_uridecodebin_child_added (line 131) | static void cb_uridecodebin_child_added (

FILE: application_develop/uridecodebin/src/main.cpp
  function validateSrcUri (line 20) | static bool validateSrcUri (const char* name, const std::string& value) {
  function main (line 48) | int main(int argc, char* argv[])

FILE: qti_gst_plugins/qtioverlay/qtimlmeta/ml_meta.c
  function GstDebugCategory (line 34) | static GstDebugCategory *
  function gboolean (line 53) | static gboolean
  function gst_ml_detection_free (line 62) | static void
  function GType (line 70) | GType
  function GstMetaInfo (line 83) | const GstMetaInfo *
  function gboolean (line 100) | static gboolean
  function gst_ml_segmentation_free (line 113) | static void
  function GType (line 124) | GType
  function GstMetaInfo (line 137) | const GstMetaInfo *
  function gboolean (line 154) | static gboolean
  function gst_ml_classification_free (line 163) | static void
  function GType (line 174) | GType
  function GstMetaInfo (line 188) | const GstMetaInfo *
  function gboolean (line 205) | static gboolean
  function GType (line 214) | GType
  function GstMetaInfo (line 228) | const GstMetaInfo *
  function GstMLDetectionMeta (line 245) | GstMLDetectionMeta *
  function GSList (line 257) | GSList *
  function GstMLSegmentationMeta (line 275) | GstMLSegmentationMeta *
  function GSList (line 287) | GSList *
  function GstMLClassificationMeta (line 305) | GstMLClassificationMeta *
  function GSList (line 317) | GSList *
  function GstMLPoseNetMeta (line 335) | GstMLPoseNetMeta *
  function GSList (line 347) | GSList *

FILE: qti_gst_plugins/qtioverlay/qtimlmeta/ml_meta.h
  type _GstMLClassificationResult (line 38) | struct _GstMLClassificationResult
  type GstMLBoundingBox (line 39) | typedef struct _GstMLBoundingBox GstMLBoundingBox;
  type GstMLDetectionMeta (line 40) | typedef struct _GstMLDetectionMeta GstMLDetectionMeta;
  type GstMLSegmentationMeta (line 41) | typedef struct _GstMLSegmentationMeta GstMLSegmentationMeta;
  type GstMLClassificationMeta (line 42) | typedef struct _GstMLClassificationMeta GstMLClassificationMeta;
  type GstMLKeyPoint (line 44) | typedef struct _GstMLKeyPoint GstMLKeyPoint;
  type GstMLPose (line 45) | typedef struct _GstMLPose GstMLPose;
  type GstMLPoseNetMeta (line 46) | typedef struct _GstMLPoseNetMeta GstMLPoseNetMeta;
  type _GstMLBoundingBox (line 70) | struct _GstMLBoundingBox {
  type _GstMLClassificationResult (line 84) | struct _GstMLClassificationResult {
  type _GstMLDetectionMeta (line 97) | struct _GstMLDetectionMeta {
  type _GstMLSegmentationMeta (line 116) | struct _GstMLSegmentationMeta {
  type _GstMLClassificationMeta (line 136) | struct _GstMLClassificationMeta {
  type GstMLKeyPointsType (line 144) | enum GstMLKeyPointsType{
  type _GstMLKeyPoint (line 173) | struct _GstMLKeyPoint {
  type _GstMLPoseNetMeta (line 188) | struct _GstMLPoseNetMeta {

FILE: qti_gst_plugins/qtioverlay/qtioverlay/gstoverlay.cc
  function GstCaps (line 164) | static GstCaps *
  function GstPadTemplate (line 181) | static GstPadTemplate *
  function GstPadTemplate (line 193) | static GstPadTemplate *
  function gst_overlay_destroy_overlay_item (line 207) | static void
  function gboolean (line 240) | static gboolean
  function gboolean (line 288) | static gboolean
  function gboolean (line 360) | static gboolean
  function gst_overlay_apply_user_bbox_item (line 394) | static void
  function gboolean (line 428) | static gboolean
  function gboolean (line 503) | static gboolean
  function gst_overlay_apply_user_simg_item (line 537) | static void
  function gboolean (line 583) | static gboolean
  function gboolean (line 655) | static gboolean
  function gst_overlay_apply_user_text_item (line 678) | static void
  function gboolean (line 710) | static gboolean
  function gboolean (line 916) | static gboolean
  function gst_overlay_apply_user_date_item (line 979) | static void
  function gboolean (line 1013) | static gboolean
  function gst_overlay_apply_user_mask_item (line 1084) | static void
  function gboolean (line 1114) | static gboolean
  function gboolean (line 1148) | static gboolean
  function gboolean (line 1216) | static gboolean
  function gboolean (line 1305) | static gboolean
  function gboolean (line 1407) | static gboolean
  function gboolean (line 1480) | static gboolean
  function gint (line 1589) | static gint
  function gst_overlay_set_user_overlay (line 1610) | static void
  function gst_overlay_text_overlay_to_string (line 1685) | static void
  function gst_overlay_date_overlay_to_string (line 1732) | static void
  function gst_overlay_simg_overlay_to_string (line 1813) | static void
  function gst_overlay_bbox_overlay_to_string (line 1861) | static void
  function gst_overlay_mask_overlay_to_string (line 1909) | static void
  function gst_overlay_get_user_overlay (line 1976) | static void
  function gst_overlay_set_property (line 2008) | static void
  function gst_overlay_get_property (line 2089) | static void
  function gst_overlay_finalize (line 2163) | static void
  function gboolean (line 2212) | static gboolean
  function GstFlowReturn (line 2275) | static GstFlowReturn
  function gst_overlay_free_user_overlay_entry (line 2373) | static void
  function gst_overlay_free_user_text_entry (line 2393) | static void
  function gst_overlay_free_user_simg_entry (line 2409) | static void
  function gst_overlay_free_user_bbox_entry (line 2426) | static void
  function gst_overlay_init (line 2436) | static void
  function gst_overlay_class_init (line 2467) | static void
  function gboolean (line 2564) | static gboolean

FILE: qti_gst_plugins/qtioverlay/qtioverlay/gstoverlay.h
  type GstOverlay (line 55) | typedef struct _GstOverlay GstOverlay;
  type GstOverlayClass (line 56) | typedef struct _GstOverlayClass GstOverlayClass;
  type GstOverlayUser (line 57) | typedef struct _GstOverlayUser GstOverlayUser;
  type GstOverlayUsrText (line 58) | typedef struct _GstOverlayUsrText GstOverlayUsrText;
  type GstOverlayUsrDate (line 59) | typedef struct _GstOverlayUsrDate GstOverlayUsrDate;
  type GstOverlayUsrSImg (line 60) | typedef struct _GstOverlayUsrSImg GstOverlayUsrSImg;
  type GstOverlayUsrBBox (line 61) | typedef struct _GstOverlayUsrBBox GstOverlayUsrBBox;
  type GstOverlayUsrMask (line 62) | typedef struct _GstOverlayUsrMask GstOverlayUsrMask;
  type GstOverlayString (line 63) | typedef struct _GstOverlayString GstOverlayString;
  type _GstOverlay (line 65) | struct _GstOverlay {
  type _GstOverlayClass (line 95) | struct _GstOverlayClass {
  type _GstOverlayUser (line 105) | struct _GstOverlayUser {
  type _GstOverlayUsrText (line 119) | struct _GstOverlayUsrText {
  type _GstOverlayUsrDate (line 133) | struct _GstOverlayUsrDate {
  type _GstOverlayUsrSImg (line 150) | struct _GstOverlayUsrSImg {
  type _GstOverlayUsrBBox (line 166) | struct _GstOverlayUsrBBox {
  type _GstOverlayUsrMask (line 181) | struct _GstOverlayUsrMask {
  type _GstOverlayString (line 194) | struct _GstOverlayString {

FILE: qti_gst_plugins/qtioverlay/qtiqmmf_overlay/qmmf_overlay.cc
  type qmmf (line 59) | namespace qmmf {
    type overlay (line 61) | namespace overlay {
      type SyncObject (line 473) | struct SyncObject
      type SyncObject (line 473) | struct SyncObject
      type timeval (line 1749) | struct timeval
      type timeval (line 1941) | struct timeval
      type tm (line 1943) | struct tm

FILE: qti_gst_plugins/qtioverlay/qtiqmmf_overlay/qmmf_overlay_item.h
  type OpenClFrame (line 105) | struct OpenClFrame {
  type OpenCLArgs (line 114) | struct OpenCLArgs {
  type SyncObject (line 196) | struct SyncObject {
  type DrawInfo (line 205) | struct DrawInfo {
  type RGBAValues (line 222) | struct RGBAValues {
  type C2dObjects (line 229) | struct C2dObjects {
  function class (line 233) | class OverlaySurface {
  function class (line 261) | class OverlayItem {
  function UpdateAndDraw (line 432) | int32_t UpdateAndDraw() override;
    "chars": 988,
    "preview": "{\n    \"name\":\"pipeline0\",\n    \"input-config\":{\n        \"type\":1,\n        \"stream\":{\n            \"uri\":\"file:///home/rica"
  },
  {
    "path": "ai_integration/deepstream/sp_rtsp.json",
    "chars": 947,
    "preview": "{\n    \"name\":\"pipeline0\",\n    \"input-config\":{\n        \"type\":1,\n        \"stream\":{\n            \"uri\":\"rtsp://127.0.0.1:"
  },
  {
    "path": "ai_integration/deepstream/sp_uc.json",
    "chars": 946,
    "preview": "{\n    \"name\":\"pipeline0\",\n    \"input-config\":{\n        \"type\":2,\n        \"stream\":{\n            \"uri\":\"rtsp://127.0.0.1:"
  },
  {
    "path": "ai_integration/deepstream/src/VideoPipeline.cpp",
    "chars": 26034,
    "preview": "/*\n * @Description: Implement of VideoPipeline on DeepStream.\n * @version: 2.0\n * @Author: Ricardo Lu<shenglu1202@163.co"
  },
  {
    "path": "ai_integration/deepstream/src/main.cpp",
    "chars": 6922,
    "preview": "/*\n * @Description: Test program of VideoPipeline.\n * @version: 1.0\n * @Author: Ricardo Lu<shenglu1202@163.com>\n * @Date"
  },
  {
    "path": "ai_integration/video-pipeline.dot",
    "chars": 30367,
    "preview": "digraph pipeline {\n  rankdir=LR;\n  fontname=\"sans\";\n  fontsize=\"10\";\n  labelloc=t;\n  nodesep=.1;\n  ranksep=.2;\n  label=\""
  },
  {
    "path": "application_develop/GstPadProbe/CMakeLists.txt",
    "chars": 1279,
    "preview": "# created by Ricardo Lu in 08/29/2021\n\ncmake_minimum_required(VERSION 3.10)\n\nproject(GstPadProbe)\n\nset(CMAKE_CXX_STANDAR"
  },
  {
    "path": "application_develop/GstPadProbe/README.md",
    "chars": 924,
    "preview": "# GstPadProbe\n\nGstPad上发生的数据流、事件和查询可以通过探针进行监控,探针通过`gst_pad_add_probe()`安装,这为开发者提供了另外一种访问GStreamer pipeline数据的方式。\n\n**教程地址:"
  },
  {
    "path": "application_develop/GstPadProbe/doc/gstpadprobe.md",
    "chars": 6239,
    "preview": "# GstPadProbe\n\n在[GStreamer-APP](https://ricardolu.gitbook.io/gstreamer/application-development/app)章节讲到了应用程序和GStreamer p"
  },
  {
    "path": "application_develop/GstPadProbe/inc/Common.h",
    "chars": 1126,
    "preview": "/*\n * @Description: Common Utils.\n * @version: 1.0\n * @Author: Ricardo Lu<shenglu1202@163.com>\n * @Date: 2021-08-27 12:2"
  },
  {
    "path": "application_develop/GstPadProbe/inc/DoubleBufferCache.h",
    "chars": 2439,
    "preview": "/*\n * @Description: Double Buffer Cache Implement.\n * @version: 1.0\n * @Author: Ricardo Lu<shenglu1202@163.com>\n * @Date"
  },
  {
    "path": "application_develop/GstPadProbe/inc/VideoPipeline.h",
    "chars": 2322,
    "preview": "/*\n * @Description: GstPipeline common header.\n * @version: 1.0\n * @Author: Ricardo Lu<shenglu1202@163.com>\n * @Date: 20"
  },
  {
    "path": "application_develop/GstPadProbe/src/VideoPipeline.cpp",
    "chars": 18531,
    "preview": "/*\n * @Description: Implement of VideoPipeline.\n * @version: 1.0\n * @Author: Ricardo Lu<shenglu1202@163.com>\n * @Date: 2"
  },
  {
    "path": "application_develop/GstPadProbe/src/main.cpp",
    "chars": 5839,
    "preview": "/*\n * @Description: Test Program.\n * @version: 1.0\n * @Author: Ricardo Lu<shenglu1202@163.com>\n * @Date: 2021-08-28 09:1"
  },
  {
    "path": "application_develop/README.md",
    "chars": 1031,
    "preview": "# Application Development\n\n[![](https://img.shields.io/badge/Author-@RucardoLu-red.svg)](https://github.com/gesanqiu)![]"
  },
  {
    "path": "application_develop/app/CMakeLists.txt",
    "chars": 1007,
    "preview": "# created by Ricardo Lu in 08/29/2021\n\ncmake_minimum_required(VERSION 3.10)\n\nproject(app)\n\nset(CMAKE_CXX_STANDARD 11)\n\ns"
  },
  {
    "path": "application_develop/app/README.md",
    "chars": 587,
    "preview": "# app\n\n为了完成应用程序与GStreamer Pipeline的数据交互,GStreamer提供了两个插件:\n\n- [GstAppSink](https://gstreamer.freedesktop.org/documentatio"
  },
  {
    "path": "application_develop/app/doc/app.md",
    "chars": 396,
    "preview": "# App\n\n为了完成应用程序与GStreamer Pipeline的数据交互,GStreamer提供了两个插件:\n\n- [GstAppSink](https://gstreamer.freedesktop.org/documentatio"
  },
  {
    "path": "application_develop/app/doc/appsink.md",
    "chars": 7665,
    "preview": "# Appsink\n\nappsink是一个sink插件,具有多种方法允许应用程序获取Pipeline的数据句柄。与大部分插件不同,除了Action signals的方式以外,appsink还提供了一系列的外部接口`gst_app_sink_"
  },
  {
    "path": "application_develop/app/doc/appsrc.md",
    "chars": 5269,
    "preview": "# Appsrc\n\n应用程序可以通过Appsrc插件向GStreamer pipeline中插入数据,与大部分插件不同,除了Action signals的方式以外,appsrc还提供了一系列的外部接口`gst_app_src_<functi"
  },
  {
    "path": "application_develop/app/inc/Common.h",
    "chars": 920,
    "preview": "/*\n * @Description: Common Utils.\n * @version: 1.0\n * @Author: Ricardo Lu<shenglu1202@163.com>\n * @Date: 2021-08-27 12:2"
  },
  {
    "path": "application_develop/app/inc/DoubleBufferCache.h",
    "chars": 2439,
    "preview": "/*\n * @Description: Double Buffer Cache Implement.\n * @version: 1.0\n * @Author: Ricardo Lu<shenglu1202@163.com>\n * @Date"
  },
  {
    "path": "application_develop/app/inc/appsink.h",
    "chars": 1467,
    "preview": "/*\n * @Description: Appsink Pipeline Header.\n * @version: 1.0\n * @Author: Ricardo Lu<shenglu1202@163.com>\n * @Date: 2021"
  },
  {
    "path": "application_develop/app/inc/appsrc.h",
    "chars": 1518,
    "preview": "/*\n * @Description: Appsrc Pipeline Header.\n * @version: 1.0\n * @Author: Ricardo Lu<shenglu1202@163.com>\n * @Date: 2021-"
  },
  {
    "path": "application_develop/app/src/appsink.cpp",
    "chars": 9826,
    "preview": "/*\n * @Description: Appsink Pipeline Implement.\n * @version: 1.0\n * @Author: Ricardo Lu<shenglu1202@163.com>\n * @Date: 2"
  },
  {
    "path": "application_develop/app/src/appsrc.cpp",
    "chars": 6957,
    "preview": "/*\n * @Description: Appsrc Pipeline Implement.\n * @version: 1.0\n * @Author: Ricardo Lu<shenglu1202@163.com>\n * @Date: 20"
  },
  {
    "path": "application_develop/app/src/main.cpp",
    "chars": 4905,
    "preview": "/*\n * @Description: Test Program.\n * @version: 1.0\n * @Author: Ricardo Lu<shenglu1202@163.com>\n * @Date: 2021-08-28 09:1"
  },
  {
    "path": "application_develop/build_pipeline/CMakeLists.txt",
    "chars": 1358,
    "preview": "# created by Ricardo Lu in 08/28/2021\n\ncmake_minimum_required(VERSION 3.10)\n\nproject(build_pipeline)\n\nset(CMAKE_CXX_STAN"
  },
  {
    "path": "application_develop/build_pipeline/README.md",
    "chars": 1312,
    "preview": "# Build Pipeline\n\nGStreamer提供了一个命令行工具`gst-launch-1.0`用于快速构建运行Pipeline,同样的GStreamer也提供了C-API用于在C/C++应用开发中引入GStreamer Pipe"
  },
  {
    "path": "application_develop/build_pipeline/doc/build_pipeline.md",
    "chars": 9430,
    "preview": "# Build Pipeline\n\nGStreamer提供了一个命令行工具`gst-launch-1.0`用于快速构建运行Pipeline,同样的GStreamer也提供了C-API用于在C/C++开发中引入GStreamer Pipeli"
  },
  {
    "path": "application_develop/build_pipeline/inc/Common.h",
    "chars": 667,
    "preview": "/*\n * @Description: Common Utils.\n * @version: 1.0\n * @Author: Ricardo Lu<shenglu1202@163.com>\n * @Date: 2021-08-27 12:2"
  },
  {
    "path": "application_develop/build_pipeline/inc/VideoPipeline.h",
    "chars": 1355,
    "preview": "/*\n * @Description: GstPipeline common header.\n * @version: 1.0\n * @Author: Ricardo Lu<shenglu1202@163.com>\n * @Date: 20"
  },
  {
    "path": "application_develop/build_pipeline/src/VideoPipeline.cpp",
    "chars": 8896,
    "preview": "/*\n * @Description: Implement of VideoPipeline.\n * @version: 1.0\n * @Author: Ricardo Lu<shenglu1202@163.com>\n * @Date: 2"
  },
  {
    "path": "application_develop/build_pipeline/src/gst_element_factory_make.cpp",
    "chars": 1655,
    "preview": "/*\n * @Description: Build GstPipeline with GstElementFactory.\n * @version: 1.0\n * @Author: Ricardo Lu<shenglu1202@163.co"
  },
  {
    "path": "application_develop/build_pipeline/src/gst_parse_launch.cpp",
    "chars": 2147,
    "preview": "/*\n * @Description: Build GstPipeline with GstParse.\n * @version: 1.0\n * @Author: Ricardo Lu<shenglu1202@163.com>\n * @Da"
  },
  {
    "path": "application_develop/custom_user_plugin/CMakeLists.txt",
    "chars": 36,
    "preview": "# create by Ricardo Lu in 05-19-2022"
  },
  {
    "path": "application_develop/custom_user_plugin/README.md",
    "chars": 37,
    "preview": "# gstreamer-example\n\nGstreamer开发教程。\n\n"
  },
  {
    "path": "application_develop/custom_user_plugin/config.h",
    "chars": 496,
    "preview": "/*\n * @Description: Config herader.\n * @version: 1.0\n * @Author: Ricardo Lu<sheng.lu@thundercomm.com>\n * @Date: 2022-05-"
  },
  {
    "path": "application_develop/custom_user_plugin/gstoverlay.c",
    "chars": 13968,
    "preview": "/*\n * @Description: GStreamer overlay plugin.\n * @version: 1.0\n * @Author: Ricardo Lu<shenglu1202@163.com>\n * @Date: 202"
  },
  {
    "path": "application_develop/custom_user_plugin/gstoverlay.h",
    "chars": 2844,
    "preview": "/*\n * @Description: GStreamer overlay plugin.\n * @version: 1.0\n * @Author: Ricardo Lu<shenglu1202@163.com>\n * @Date: 202"
  },
  {
    "path": "application_develop/uridecodebin/CMakeLists.txt",
    "chars": 1004,
    "preview": "# created by Ricardo Lu in 09/01/2021\n\ncmake_minimum_required(VERSION 3.10)\n\nproject(uridecoderbin)\n\nset(CMAKE_CXX_STAND"
  },
  {
    "path": "application_develop/uridecodebin/README.md",
    "chars": 1321,
    "preview": "# uridecodebin\n\nuridecodebin是属于Playback部分的内容,内部集成了一系列自动化操作,可以有效缩短pipeline的元素,但是整个pipeline的构建过程对用户并不透明,因此不能很好的控制内部元素的链接,这"
  },
  {
    "path": "application_develop/uridecodebin/doc/uridecodebin.md",
    "chars": 6047,
    "preview": "# uridecodebin\n\n`uridecodebin`能够将URI数据解码成raw media,它会自动选择一个能处理uri数据的source element并将其和`decodebin`链接。\n\n**Github: [urideco"
  },
  {
    "path": "application_develop/uridecodebin/inc/Common.h",
    "chars": 1126,
    "preview": "/*\n * @Description: Common Utils.\n * @version: 1.0\n * @Author: Ricardo Lu<shenglu1202@163.com>\n * @Date: 2021-08-27 12:2"
  },
  {
    "path": "application_develop/uridecodebin/inc/DoubleBufferCache.h",
    "chars": 2439,
    "preview": "/*\n * @Description: Double Buffer Cache Implement.\n * @version: 1.0\n * @Author: Ricardo Lu<shenglu1202@163.com>\n * @Date"
  },
  {
    "path": "application_develop/uridecodebin/inc/VideoPipeline.h",
    "chars": 1456,
    "preview": "/*\n * @Description: GstPipeline common header.\n * @version: 1.0\n * @Author: Ricardo Lu<shenglu1202@163.com>\n * @Date: 20"
  },
  {
    "path": "application_develop/uridecodebin/src/VideoPipeline.cpp",
    "chars": 10124,
    "preview": "/*\n * @Description: Implement of VideoPipeline.\n * @version: 1.0\n * @Author: Ricardo Lu<shenglu1202@163.com>\n * @Date: 2"
  },
  {
    "path": "application_develop/uridecodebin/src/main.cpp",
    "chars": 2139,
    "preview": "/*\n * @Description: Test Program.\n * @version: 1.0\n * @Author: Ricardo Lu<shenglu1202@163.com>\n * @Date: 2021-08-28 09:1"
  },
  {
    "path": "basic_theory/README.md",
    "chars": 1471,
    "preview": "# Basic Theory\n\n[![](https://img.shields.io/badge/Author-@RucardoLu-red.svg)](https://github.com/gesanqiu)![](https://im"
  },
  {
    "path": "basic_theory/app_dev_manual/autoplugging.md",
    "chars": 3163,
    "preview": "# Autoplugging\n\n由于autoplugging这一概念具备充分的动态性,GStreamer可以自动拓展以支持新的数据类型而无需修改autoplugger。\n\n## Media types as a way to identif"
  },
  {
    "path": "basic_theory/app_dev_manual/fundamental.md",
    "chars": 7735,
    "preview": "# Building an Application\n\n这一章节,我们将讨论GStreamer的基础概念和最常用的objects,例如elements,pads和buffers。我们将使用这些objects的可视化表示,以便能够可视化您稍后将"
  },
  {
    "path": "basic_theory/app_dev_manual/interfaces.md",
    "chars": 1577,
    "preview": "# Interfaces\n\n在应用程序中,将element定义为GObject对象便可沿用GObject中设置对象属性的方式,这为应用程序与element的交互带来便利。当然,此种设置对象属性的方式仅包含面向对象编程中常见的gette与se"
  },
  {
    "path": "basic_theory/app_dev_manual/metadata.md",
    "chars": 4775,
    "preview": "# Metadata\n\nGStreamer对其支持的2种元数据有着清晰的分类。其一,Stream-tag,这类元数据以非技术的方式描述数据流中的内容。其二,Stream-info,它则以技术方式准确描述数据流中的各项属性。 例如,当数据流为"
  },
  {
    "path": "basic_theory/app_dev_manual/threads.md",
    "chars": 8315,
    "preview": "# Threads\n\nGStreamer的设计原生支持多线程,并完全保证线程安全。大多数情况下,多线程实现细节对基于GStreamer开发的应用程序隐藏,因为这会让应用程序开发更便利。而在某些场景下,应用程序可能会介入Gstreamer的多"
  },
  {
    "path": "basic_theory/basic_tutorial/dynamic_pipelines.md",
    "chars": 13373,
    "preview": "# Basic tutorial 3: Dynamic pipelines\n\n## 目标\n\n这篇教程展示了使用GStreamer需要的剩余基本概念,允许你随着数据流动来构建pipeline, 而不是在应用程序的一开始就定义一个完整的管道。\n"
  },
  {
    "path": "basic_theory/basic_tutorial/gstreamer_concepts.md",
    "chars": 7885,
    "preview": "# Basic tutorial 2: GStreamer concepts\n\n## 目标\n\n上一篇教程展示了如何自动地都剑一条pipeline。现在我们将手动构建一条pipeline:初始化每一个element并将它们连接起来。在本教程中"
  },
  {
    "path": "basic_theory/basic_tutorial/hello_world.md",
    "chars": 5111,
    "preview": "# Basic tutorial1: Hello world!\n\n## 目标\n\n大部分人对大多数编程教程的第一印象都是在屏幕上输出”Hello World“,但是GStreamer作为一个多媒体框架,将以播放一个视频作为第一个教程。\n\n##"
  },
  {
    "path": "basic_theory/basic_tutorial/media_format.md",
    "chars": 12478,
    "preview": "# Basic tutorial 6: Media formats and Pad Capabilities\n\n## 目标\n\nPad的Capabilities时GStreamer的一个基本元素,由于大部分时间都由框架自动处理它们,所以用户很"
  },
  {
    "path": "basic_theory/basic_tutorial/multithread.md",
    "chars": 9551,
    "preview": "# Basic tutorial 7: Multithreading and Pad Availability\n\n## 目标\n\nGStreamer自动处理多线程,但是在某些情况下,用户可能需要手动解耦线程。这篇教程将展示如何解耦线程以及完善"
  },
  {
    "path": "basic_theory/basic_tutorial/short_cutting_pipeline.md",
    "chars": 16916,
    "preview": "# Basic tutorial 8: Short-cutting the pipeline\n\n## 目标\n\nGStreamer构造的pipeline不需要完全封闭,有几种方式允许用户在任意时间向pipeline注入或提取数据。本教程将展示"
  },
  {
    "path": "basic_theory/playback/hardware_decode.md",
    "chars": 2665,
    "preview": "# Playback tutorial 8: Hardware-accelerated video decodingHardware-accelerated video decoding.\n\n### Goal\n\n随着低功耗设备变得越来越普遍"
  },
  {
    "path": "basic_theory/playback/playbin.md",
    "chars": 16452,
    "preview": "# Playback tutorial 1: Playbin usage\n\n## Goal\n\n使用`playbin`,我们可以很方便的构建一个完整的播放pipeline而不需要做太多工作。这篇教程将展示如何进一步定制`playbin`以防它"
  },
  {
    "path": "basic_theory/playback/playbin_sink.md",
    "chars": 4841,
    "preview": "# Playback tutorial 7: Custom playbin sinks\n\n## Goal\n\n`playbin`可以通过手动选择其音频和视频sink进行进一步定制。这允许应用程序仅依赖`playbin`提取和解码媒体数据然后自"
  },
  {
    "path": "basic_theory/playback/progressive_stream.md",
    "chars": 10323,
    "preview": "# Playback tutorial 4: Progressive streaming\n\n## Goal\n\nBasic tutorial 12: Streaming展示了如何在糟糕的网络情况下提高用户体验,通过使用缓冲机制。这篇教程是Ba"
  },
  {
    "path": "basic_theory/playback/shortcut_pipeline.md",
    "chars": 7573,
    "preview": "# Playback tutorial 3: Short-cutting the pipeline\n\n## Goal\n\n在[Basic tutorial 8: Short-cutting the pipeline]()中展示了一个应用程序如"
  },
  {
    "path": "basic_theory/playback/subtitle.md",
    "chars": 10361,
    "preview": "# Playback tutorial 2: Subtitle management\n\n## Goal\n\n这篇教程与上一篇非常相似,但是我们将切换字幕流而不是音频流。我们将学到:\n\n- 如何选择字幕流。\n- 如何添加外挂字幕。\n- 如何自定"
  },
  {
    "path": "deepstream/DeepStreamSample.md",
    "chars": 8406,
    "preview": "# DeepStream学习拾遗\n\n## nvstreammux\n\n![Gst-nvstreammux](images/DeepStreamSample/DS_plugin_gst-nvstreammux.png)\n\n`nvstreammu"
  },
  {
    "path": "deepstream/nvdsosd.md",
    "chars": 7686,
    "preview": "# nvdsosd\n\n## Overview\n\nNvidia的[nvdsosd](https://docs.nvidia.com/metropolis/deepstream/dev-guide/text/DS_plugin_gst-nvds"
  },
  {
    "path": "postscript.md",
    "chars": 1224,
    "preview": "# 后记\n\n从想法诞生到实践到写下这篇后记不过短短半个月,虽然在翻译Basic tutorial的过程中规划了越来越多的翻译内容,但其中很多Core Library API Reference已经超出了我现在的理解范围,大多数时候我都是有针"
  },
  {
    "path": "qti_gst_plugins/qtioverlay/README.md",
    "chars": 9163,
    "preview": "# qtioverlay\n\n## Overview\n\n高通的[qtioverlay](https://developer.qualcomm.com/qualcomm-robotics-rb5-kit/software-reference-m"
  },
  {
    "path": "qti_gst_plugins/qtioverlay/qtimlmeta/CMakeLists.txt",
    "chars": 1146,
    "preview": "cmake_minimum_required(VERSION 3.8.2)\nproject(GST_PLUGIN_QTI_OSS_OVERLAY\n  VERSION ${GST_PLUGINS_QTI_OSS_VERSION}\n  LANG"
  },
  {
    "path": "qti_gst_plugins/qtioverlay/qtimlmeta/ml_meta.c",
    "chars": 10448,
    "preview": "/*\n* Copyright (c) 2020, The Linux Foundation. All rights reserved.\n*\n* Redistribution and use in source and binary form"
  },
  {
    "path": "qti_gst_plugins/qtioverlay/qtimlmeta/ml_meta.h",
    "chars": 8679,
    "preview": "/*\n* Copyright (c) 2020, The Linux Foundation. All rights reserved.\n*\n* Redistribution and use in source and binary form"
  },
  {
    "path": "qti_gst_plugins/qtioverlay/qtioverlay/CMakeLists.txt",
    "chars": 1614,
    "preview": "cmake_minimum_required(VERSION 3.8.2)\nproject(GST_PLUGIN_QTI_OSS_OVERLAY\n  VERSION ${GST_PLUGINS_QTI_OSS_VERSION}\n  LANG"
  },
  {
    "path": "qti_gst_plugins/qtioverlay/qtioverlay/config.h.in",
    "chars": 1873,
    "preview": "/*\n * Copyright (c) 2020, The Linux Foundation. All rights reserved.\n *\n * Redistribution and use in source and binary f"
  },
  {
    "path": "qti_gst_plugins/qtioverlay/qtioverlay/gstoverlay.cc",
    "chars": 83789,
    "preview": "/*\n* Copyright (c) 2020, The Linux Foundation. All rights reserved.\n*\n* Redistribution and use in source and binary form"
  },
  {
    "path": "qti_gst_plugins/qtioverlay/qtioverlay/gstoverlay.h",
    "chars": 7025,
    "preview": "/*\n* Copyright (c) 2020, The Linux Foundation. All rights reserved.\n*\n* Redistribution and use in source and binary form"
  },
  {
    "path": "qti_gst_plugins/qtioverlay/qtiqmmf_overlay/CMakeLists.txt",
    "chars": 1267,
    "preview": "cmake_minimum_required(VERSION 3.1)\n\nproject(qmmf_overlay)\n\nset(CMAKE_CXX_FLAGS \"${CMAKE_CXX_FLAGS} -DUSE_SKIA=0 -DUSE_C"
  },
  {
    "path": "qti_gst_plugins/qtioverlay/qtiqmmf_overlay/overlay_blit_kernel.cl",
    "chars": 3728,
    "preview": "/*\n* Copyright (c) 2020, The Linux Foundation. All rights reserved.\n*\n* Redistribution and use in source and binary form"
  },
  {
    "path": "qti_gst_plugins/qtioverlay/qtiqmmf_overlay/qmmf_overlay.cc",
    "chars": 111821,
    "preview": "/*\n* Copyright (c) 2016-2020, The Linux Foundation. All rights reserved.\n*\n* Redistribution and use in source and binary"
  },
  {
    "path": "qti_gst_plugins/qtioverlay/qtiqmmf_overlay/qmmf_overlay_item.h",
    "chars": 16322,
    "preview": "/*\n* Copyright (c) 2016-2020, The Linux Foundation. All rights reserved.\n*\n* Redistribution and use in source and binary"
  },
  {
    "path": "useful_tricks/rtspsrc_1.md",
    "chars": 4356,
    "preview": "# GStreamer源码剖析之——rtspsrc(1)\n\n>  RTSP URL密码中包含'@'的解决方法。\n\n**前言:**为了让Application的Stream Pipeline具有更好的适应性,能够处理不同格式的输入流,在设计P"
  },
  {
    "path": "useful_tricks/uridecodebin_1.md",
    "chars": 3384,
    "preview": "# GStreamer源码剖析——uridecodebin(1)\n\n> How to enable software decoder with uridecedebin.\n\n**前言:**Gstreamer中的`uridecodebin`插"
  }
]

// ... and 2 more files omitted from this preview

About this extraction

This page contains the full source code of the gesanqiu/gstreamer-example GitHub repository, extracted and formatted as plain text for AI agents and large language models (LLMs). The extraction covers 94 files (39.2 MB, approximately 209.9k tokens) and includes a symbol index with 212 extracted functions, classes, methods, constants, and types. The output is suitable for OpenClaw, Claude, ChatGPT, Cursor, Windsurf, or any other AI tool that accepts text input.

Extracted by GitExtract, a free GitHub-repo-to-text converter for AI, built by Nikandr Surkov.