[
  {
    "path": ".gitattributes",
    "content": "*.aar filter=lfs diff=lfs merge=lfs -text\n"
  },
  {
    "path": ".gitignore",
    "content": ".idea/\n\n.DS_Store\n.dart_tool/\n\n.packages\n.pub/\n\nbuild/\n\npubspec.lock"
  },
  {
    "path": ".metadata",
    "content": "# This file tracks properties of this Flutter project.\n# Used by Flutter tool to assess capabilities and perform upgrades etc.\n#\n# This file should be version controlled and should not be manually edited.\n\nversion:\n  revision: 9f5ff2306bb3e30b2b98eee79cd231b1336f41f4\n  channel: stable\n\nproject_type: plugin\n"
  },
  {
    "path": "CHANGELOG.md",
    "content": "## 0.0.1\n\n* TODO: Describe initial release.\n"
  },
  {
    "path": "LICENSE",
    "content": "                                 Apache License\n                           Version 2.0, January 2004\n                        http://www.apache.org/licenses/\n\n   TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION\n\n   1. Definitions.\n\n      \"License\" shall mean the terms and conditions for use, reproduction,\n      and distribution as defined by Sections 1 through 9 of this document.\n\n      \"Licensor\" shall mean the copyright owner or entity authorized by\n      the copyright owner that is granting the License.\n\n      \"Legal Entity\" shall mean the union of the acting entity and all\n      other entities that control, are controlled by, or are under common\n      control with that entity. For the purposes of this definition,\n      \"control\" means (i) the power, direct or indirect, to cause the\n      direction or management of such entity, whether by contract or\n      otherwise, or (ii) ownership of fifty percent (50%) or more of the\n      outstanding shares, or (iii) beneficial ownership of such entity.\n\n      \"You\" (or \"Your\") shall mean an individual or Legal Entity\n      exercising permissions granted by this License.\n\n      \"Source\" form shall mean the preferred form for making modifications,\n      including but not limited to software source code, documentation\n      source, and configuration files.\n\n      \"Object\" form shall mean any form resulting from mechanical\n      transformation or translation of a Source form, including but\n      not limited to compiled object code, generated documentation,\n      and conversions to other media types.\n\n      \"Work\" shall mean the work of authorship, whether in Source or\n      Object form, made available under the License, as indicated by a\n      copyright notice that is included in or attached to the work\n      (an example is provided in the Appendix below).\n\n      \"Derivative Works\" shall mean any work, whether in Source or Object\n      
form, that is based on (or derived from) the Work and for which the\n      editorial revisions, annotations, elaborations, or other modifications\n      represent, as a whole, an original work of authorship. For the purposes\n      of this License, Derivative Works shall not include works that remain\n      separable from, or merely link (or bind by name) to the interfaces of,\n      the Work and Derivative Works thereof.\n\n      \"Contribution\" shall mean any work of authorship, including\n      the original version of the Work and any modifications or additions\n      to that Work or Derivative Works thereof, that is intentionally\n      submitted to Licensor for inclusion in the Work by the copyright owner\n      or by an individual or Legal Entity authorized to submit on behalf of\n      the copyright owner. For the purposes of this definition, \"submitted\"\n      means any form of electronic, verbal, or written communication sent\n      to the Licensor or its representatives, including but not limited to\n      communication on electronic mailing lists, source code control systems,\n      and issue tracking systems that are managed by, or on behalf of, the\n      Licensor for the purpose of discussing and improving the Work, but\n      excluding communication that is conspicuously marked or otherwise\n      designated in writing by the copyright owner as \"Not a Contribution.\"\n\n      \"Contributor\" shall mean Licensor and any individual or Legal Entity\n      on behalf of whom a Contribution has been received by Licensor and\n      subsequently incorporated within the Work.\n\n   2. Grant of Copyright License. 
Subject to the terms and conditions of\n      this License, each Contributor hereby grants to You a perpetual,\n      worldwide, non-exclusive, no-charge, royalty-free, irrevocable\n      copyright license to reproduce, prepare Derivative Works of,\n      publicly display, publicly perform, sublicense, and distribute the\n      Work and such Derivative Works in Source or Object form.\n\n   3. Grant of Patent License. Subject to the terms and conditions of\n      this License, each Contributor hereby grants to You a perpetual,\n      worldwide, non-exclusive, no-charge, royalty-free, irrevocable\n      (except as stated in this section) patent license to make, have made,\n      use, offer to sell, sell, import, and otherwise transfer the Work,\n      where such license applies only to those patent claims licensable\n      by such Contributor that are necessarily infringed by their\n      Contribution(s) alone or by combination of their Contribution(s)\n      with the Work to which such Contribution(s) was submitted. If You\n      institute patent litigation against any entity (including a\n      cross-claim or counterclaim in a lawsuit) alleging that the Work\n      or a Contribution incorporated within the Work constitutes direct\n      or contributory patent infringement, then any patent licenses\n      granted to You under this License for that Work shall terminate\n      as of the date such litigation is filed.\n\n   4. Redistribution. 
You may reproduce and distribute copies of the\n      Work or Derivative Works thereof in any medium, with or without\n      modifications, and in Source or Object form, provided that You\n      meet the following conditions:\n\n      (a) You must give any other recipients of the Work or\n          Derivative Works a copy of this License; and\n\n      (b) You must cause any modified files to carry prominent notices\n          stating that You changed the files; and\n\n      (c) You must retain, in the Source form of any Derivative Works\n          that You distribute, all copyright, patent, trademark, and\n          attribution notices from the Source form of the Work,\n          excluding those notices that do not pertain to any part of\n          the Derivative Works; and\n\n      (d) If the Work includes a \"NOTICE\" text file as part of its\n          distribution, then any Derivative Works that You distribute must\n          include a readable copy of the attribution notices contained\n          within such NOTICE file, excluding those notices that do not\n          pertain to any part of the Derivative Works, in at least one\n          of the following places: within a NOTICE text file distributed\n          as part of the Derivative Works; within the Source form or\n          documentation, if provided along with the Derivative Works; or,\n          within a display generated by the Derivative Works, if and\n          wherever such third-party notices normally appear. The contents\n          of the NOTICE file are for informational purposes only and\n          do not modify the License. 
You may add Your own attribution\n          notices within Derivative Works that You distribute, alongside\n          or as an addendum to the NOTICE text from the Work, provided\n          that such additional attribution notices cannot be construed\n          as modifying the License.\n\n      You may add Your own copyright statement to Your modifications and\n      may provide additional or different license terms and conditions\n      for use, reproduction, or distribution of Your modifications, or\n      for any such Derivative Works as a whole, provided Your use,\n      reproduction, and distribution of the Work otherwise complies with\n      the conditions stated in this License.\n\n   5. Submission of Contributions. Unless You explicitly state otherwise,\n      any Contribution intentionally submitted for inclusion in the Work\n      by You to the Licensor shall be under the terms and conditions of\n      this License, without any additional terms or conditions.\n      Notwithstanding the above, nothing herein shall supersede or modify\n      the terms of any separate license agreement you may have executed\n      with Licensor regarding such Contributions.\n\n   6. Trademarks. This License does not grant permission to use the trade\n      names, trademarks, service marks, or product names of the Licensor,\n      except as required for reasonable and customary use in describing the\n      origin of the Work and reproducing the content of the NOTICE file.\n\n   7. Disclaimer of Warranty. Unless required by applicable law or\n      agreed to in writing, Licensor provides the Work (and each\n      Contributor provides its Contributions) on an \"AS IS\" BASIS,\n      WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or\n      implied, including, without limitation, any warranties or conditions\n      of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A\n      PARTICULAR PURPOSE. 
You are solely responsible for determining the\n      appropriateness of using or redistributing the Work and assume any\n      risks associated with Your exercise of permissions under this License.\n\n   8. Limitation of Liability. In no event and under no legal theory,\n      whether in tort (including negligence), contract, or otherwise,\n      unless required by applicable law (such as deliberate and grossly\n      negligent acts) or agreed to in writing, shall any Contributor be\n      liable to You for damages, including any direct, indirect, special,\n      incidental, or consequential damages of any character arising as a\n      result of this License or out of the use or inability to use the\n      Work (including but not limited to damages for loss of goodwill,\n      work stoppage, computer failure or malfunction, or any and all\n      other commercial damages or losses), even if such Contributor\n      has been advised of the possibility of such damages.\n\n   9. Accepting Warranty or Additional Liability. While redistributing\n      the Work or Derivative Works thereof, You may choose to offer,\n      and charge a fee for, acceptance of support, warranty, indemnity,\n      or other liability obligations and/or rights consistent with this\n      License. However, in accepting such obligations, You may act only\n      on Your own behalf and on Your sole responsibility, not on behalf\n      of any other Contributor, and only if You agree to indemnify,\n      defend, and hold each Contributor harmless for any liability\n      incurred by, or claims asserted against, such Contributor by reason\n      of your accepting any such warranty or additional liability.\n\n   END OF TERMS AND CONDITIONS\n\n   APPENDIX: How to apply the Apache License to your work.\n\n      To apply the Apache License to your work, attach the following\n      boilerplate notice, with the fields enclosed by brackets \"[]\"\n      replaced with your own identifying information. 
(Don't include\n      the brackets!)  The text should be enclosed in the appropriate\n      comment syntax for the file format. We also recommend that a\n      file or class name and description of purpose be included on the\n      same \"printed page\" as the copyright notice for easier\n      identification within third-party archives.\n\n   Copyright [yyyy] [name of copyright owner]\n\n   Licensed under the Apache License, Version 2.0 (the \"License\");\n   you may not use this file except in compliance with the License.\n   You may obtain a copy of the License at\n\n       http://www.apache.org/licenses/LICENSE-2.0\n\n   Unless required by applicable law or agreed to in writing, software\n   distributed under the License is distributed on an \"AS IS\" BASIS,\n   WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n   See the License for the specific language governing permissions and\n   limitations under the License.\n"
  },
  {
    "path": "README.md",
    "content": "# Flutter Hand Tracking Plugin\n\n> 开发时间比较古早，支持的版本已经很老了，慎用！\n\n这个 `Flutter Hand Tracking Plugin` 是为了实现调用 `Andorid` 设备摄像头精确追踪并识别十指的运动路径/轨迹和手势动作, 且输出22个手部关键点以支持更多手势自定义. 基于这个包可以编写业务逻辑将手势信息实时转化为指令信息: 一二三四五, rock, spiderman...同时基于 `Flutter` 可以对不同手势编写不同特效. 可用于短视频直播特效, 智能硬件等领域, 为人机互动带来更自然丰富的体验.\n\n![demo1](img/demo1.gif)![demo2](img/demo2.gif)\n\n> 源码托管于 Github: [https://github.com/zhouzaihang/flutter_hand_tracking_plugin](https://github.com/zhouzaihang/flutter_hand_tracking_plugin)\n\n> [Bilibili 演示](https://www.bilibili.com/video/av92842489/)\n\n## 使用\n\n> 项目中使用的 `android/libs/hand_tracking_aar.aar` 托管在 `git-lfs`, 在你下载之后需要确认 `.aar` 文件是否存在(且文件超过100MB). 如果你没有安装 `git-lfs` 你可能需要手动下载, 然后替换到你的项目路径中.\n\nThis project is a starting point for a Flutter\n[plug-in package](https://flutter.dev/developing-packages/),\na specialized package that includes platform-specific implementation code for\nAndroid.\n\nFor help getting started with Flutter, view our \n[online documentation](https://flutter.dev/docs), which offers tutorials, \nsamples, guidance on mobile development, and a full API reference.\n\n## 涉及到的技术\n\n1. 编写一个 `Flutter Plugin Package`\n1. 使用 `Docker` 配置 `MediaPipe` 开发环境\n1. 在 `Gradle` 中使用 `MediaPipe`\n1. `Flutter` 程序运行 `MediaPipe` 图\n1. `Flutter` 页面中嵌入原生视图\n1. `protobuf` 的使用\n\n## 什么是 `Flutter Package`\n\n`Flutter Package` 有以下两种类型:\n\n`Dart Package`: 完全用 `Dart` 编写的包, 例如 `path` 包. 其中一些可能包含 `Flutter` 特定的功能, 这类包完全依赖于 `Flutter` 框架.\n\n`Plugin Package`: 一类依赖于运行平台的包, 其中包含用 `Dart` 代码编写的 `API`, 并结合了针对 `Android` (使用 `Java` 或 `Kotlin`）和 `iOS` (使用 `ObjC` 或 `Swift`)平台特定的实现. 比如说 `battery` 包.\n\n## 为什么需要 `Flutter Plugin Package`\n\n`Flutter` 作为一个跨平台的 `UI` 框架, 本身是不能够直接调用原生的功能的. 如果需要使用原生系统的功能, 就需要对平台特定实现, 然后在 `Flutter` 的 `Dart` 层进行兼容. \n此处需要使用调用摄像头和 `GPU` 实现业务. 
So a `Flutter Plugin Package` is used.\n\n## How a `Flutter Plugin Package` works\n\nThe directory layout of a `Flutter Plugin Package` project looks like this:\n\n![Flutter Plugin Directory Structure](img/directory_structure.png)\n\n- `pubspec.yaml` declares the dependencies and resources (images, fonts, etc.) that the `Plugin` may use\n\n- The `example` directory contains a complete `Flutter APP` used to test the `Plugin`\n\n- In addition, both a `Flutter app` project and a `Flutter Plugin` project contain the three directories `android`, `ios` and `lib`. The `lib` directory holds the `Dart` code, while the other two hold the platform-specific implementations. `Flutter` runs the code for the platform it is actually running on and then uses [Platform Channels](https://flutter.dev/docs/development/platform-integration/platform-channels?tab=android-channel-kotlin-tab) to pass the results back to the `Dart` layer.\n\nHere is the official `Flutter` architecture diagram:\n\n![Flutter System Overview](img/flutter_system_overview.png)\n\nThe diagram shows why `Flutter` is cross-platform: the `Embedder` is an operating-system adaptation layer, the `Engine` implements the rendering engine and related functionality, and the `Framework` is a `UI SDK` written in `Dart`. A `Flutter Plugin Package` therefore provides a native, platform-specific implementation at the `Embedder` layer and wraps it as a `UI API` at the `Dart` layer, which is what makes it cross-platform. The `Embedder` layer cannot talk to the `Framework` directly; it has to go through the `Platform Channels` of the `Engine` layer.\n\nThe figure below shows how messages travel between the client (`UI`) and the host (platform) over `Platform Channels`:\n\n![PlatformChannels](img/PlatformChannels.png)\n\n## Creating a `Flutter Plugin Package`\n\n1. Open `Android Studio` and click `New Flutter Project`\n1. Choose the `Flutter Plugin` option\n1. Enter the project name, description and other details\n\n## Writing the `Android` platform `view`\n\nFirst create two `Kotlin` files in the `android/src/main/kotlin/xyz/zhzh/flutter_hand_tracking_plugin` directory: `FlutterHandTrackingPlugin.kt` and `HandTrackingViewFactory.kt`.\n\n### Writing the `Factory` class\n\nIn `HandTrackingViewFactory.kt`, write a `HandTrackingViewFactory` class that extends the abstract class `PlatformViewFactory`. Every `Android` platform component written later is created through this `Factory` class. 
When a view is created, an `id` parameter is passed in to identify it (the `id` is created by `Flutter` and handed to the `Factory`):\n\n``` kotlin\npackage xyz.zhzh.flutter_hand_tracking_plugin\n\nimport android.content.Context\nimport io.flutter.plugin.common.PluginRegistry\nimport io.flutter.plugin.common.StandardMessageCodec\nimport io.flutter.plugin.platform.PlatformView\nimport io.flutter.plugin.platform.PlatformViewFactory\n\nclass HandTrackingViewFactory(private val registrar: PluginRegistry.Registrar) :\n        PlatformViewFactory(StandardMessageCodec.INSTANCE) {\n    override fun create(context: Context?, viewId: Int, args: Any?): PlatformView {\n        return FlutterHandTrackingPlugin(registrar, viewId)\n    }\n}\n```\n\n### Writing the `AndroidView` class\n\nIn `FlutterHandTrackingPlugin.kt`, write a `FlutterHandTrackingPlugin` class implementing the `PlatformView` interface. The interface requires two methods, `getView` and `dispose`.\n\n`getView` returns the view that will be embedded into the `Flutter` UI.\n\n`dispose` performs cleanup when the view is closed.\n\nFirst add a `SurfaceView`:\n\n``` kotlin\nclass FlutterHandTrackingPlugin(r: Registrar, id: Int) : PlatformView, MethodCallHandler {\n    companion object {\n        private const val TAG = \"HandTrackingPlugin\"\n        private const val NAMESPACE = \"plugins.zhzh.xyz/flutter_hand_tracking_plugin\"\n\n        @JvmStatic\n        fun registerWith(registrar: Registrar) {\n            registrar.platformViewRegistry().registerViewFactory(\n                    \"$NAMESPACE/view\",\n                    HandTrackingViewFactory(registrar))\n        }\n\n        init { // Load all native libraries needed by the app.\n            System.loadLibrary(\"mediapipe_jni\")\n            System.loadLibrary(\"opencv_java3\")\n        }\n    }\n    private var previewDisplayView: SurfaceView = SurfaceView(r.context())\n}\n```\n\nThen return the added `SurfaceView` from `getView`:\n\n``` kotlin\nclass FlutterHandTrackingPlugin(r: Registrar, id: Int) : PlatformView, MethodCallHandler {\n    // ...\n    override fun getView(): SurfaceView? 
{\n        return previewDisplayView\n    }\n\n    override fun dispose() {\n        // TODO: ViewDispose()\n    }\n}\n```\n\n## Calling the native `View` from `Dart`\n\n> Open `lib/flutter_hand_tracking_plugin.dart` in the `plugin package` project for editing (the exact file name depends on the package name chosen when the project was created).\n\nTo use a native `Android` component in `Flutter`, create an `AndroidView` and give it the registered name of the component. When the `AndroidView` is created, the component is assigned an `id`, which can be obtained through the `onPlatformViewCreated` callback parameter:\n\n``` dart\nAndroidView(\n    viewType: '$NAMESPACE/view',\n    onPlatformViewCreated: (id) => _id = id,\n)\n```\n\nSince only the `Android` component is implemented and it cannot be used on other systems, we also need to check `defaultTargetPlatform` to determine which platform we are running on:\n\n``` dart\nimport 'dart:async';\n\nimport 'package:flutter/cupertino.dart';\nimport 'package:flutter/foundation.dart';\nimport 'package:flutter/services.dart';\nimport 'package:flutter_hand_tracking_plugin/gen/landmark.pb.dart';\n\nconst NAMESPACE = \"plugins.zhzh.xyz/flutter_hand_tracking_plugin\";\n\ntypedef void HandTrackingViewCreatedCallback(\n    HandTrackingViewController controller);\n\nclass HandTrackingView extends StatelessWidget {\n  const HandTrackingView({@required this.onViewCreated})\n      : assert(onViewCreated != null);\n\n  final HandTrackingViewCreatedCallback onViewCreated;\n\n  @override\n  Widget build(BuildContext context) {\n    switch (defaultTargetPlatform) {\n      case TargetPlatform.android:\n        return AndroidView(\n          viewType: \"$NAMESPACE/view\",\n          onPlatformViewCreated: (int id) => onViewCreated == null\n              ? 
null\n              : onViewCreated(HandTrackingViewController._(id)),\n        );\n      case TargetPlatform.fuchsia:\n      case TargetPlatform.iOS:\n      default:\n        throw UnsupportedError(\n            \"Trying to use the default view implementation for\"\n            \" $defaultTargetPlatform but there isn't a default one\");\n    }\n  }\n}\n```\n\nThe `typedef` above defines a `HandTrackingViewCreatedCallback` whose argument is a `HandTrackingViewController`; this `controller` manages the `id` of the corresponding `AndroidView`:\n\n``` dart\nclass HandTrackingViewController {\n  final MethodChannel _methodChannel;\n  final EventChannel _eventChannel;\n\n  HandTrackingViewController._(int id)\n      : _methodChannel = MethodChannel(\"$NAMESPACE/$id\"),\n        _eventChannel = EventChannel(\"$NAMESPACE/$id/landmarks\");\n\n  Future<String> get platformVersion async =>\n      await _methodChannel.invokeMethod(\"getPlatformVersion\");\n}\n```\n\nThe `MethodChannel` here is used to invoke methods of the `Flutter Plugin Package`; it is not needed for this walkthrough, so you can ignore it.\n\n## Building the `MediaPipe AAR` with `Docker` and adding it to the project\n\n`MediaPipe` is a cross-platform framework released by `Google` that uses `ML pipelines` to chain several models together (`Machine learning pipelines`: in short, a set of `API`s for moving data between models, algorithms and `workflow`s). `MediaPipe` supports video, audio and any other `time series data` ([WiKi--Time Series](https://en.wikipedia.org/wiki/Time_series)).\n\nHere `MediaPipe` is used to feed the camera data into the hand-detection `TFLite` models, and the whole pipeline is then built into an `Android archive library`.\n\nThe `MediaPipe Android archive library` is the way to use `MediaPipe` together with `Gradle`. `MediaPipe` does not publish a general-purpose AAR that works for every project, so developers have to build it themselves. See the official [MediaPipe installation guide](https://github.com/google/mediapipe/blob/master/mediapipe/docs/install.md). 
The author is on `Ubuntu` and chose the `Docker` installation (if the network is unstable during `git clone` and `docker pull`, configure a `proxy` or switch to a mirror).\n\nOnce installed, enter the container shell with `docker exec -it mediapipe /bin/bash`.\n\n### Creating a `mediapipe_aar()` target\n\nFirst create a `BUILD` file in `mediapipe/examples/android/src/java/com/google/mediapipe/apps/aar_example` and add the following content to it.\n\n``` build\nload(\"//mediapipe/java/com/google/mediapipe:mediapipe_aar.bzl\", \"mediapipe_aar\")\n\nmediapipe_aar(\n    name = \"hand_tracking_aar\",\n    calculators = [\"//mediapipe/graphs/hand_tracking:mobile_calculators\"],\n)\n```\n\n### Building the `aar`\n\nWith the `BUILD` file above, running the `bazel build` command produces an `AAR`. The `--action_env=HTTP_PROXY=$HTTP_PROXY` flags configure a proxy (the build downloads many dependencies from `GitHub`):\n\n``` bash\nbazel build -c opt --action_env=HTTP_PROXY=$HTTP_PROXY --action_env=HTTPS_PROXY=$HTTPS_PROXY --fat_apk_cpu=arm64-v8a,armeabi-v7a mediapipe/examples/android/src/java/com/google/mediapipe/apps/aar_example:hand_tracking_aar\n```\n\nInside the container, under `/mediapipe/examples/android/src/java/com/google/mediapipe/apps/aar_example`, you will find the freshly built `hand_tracking_aar.aar`; copy it out of the container into the project's `android/libs` with `docker cp`.\n\n### Building the `binary graph`\n\nThe `aar` above also depends on a `MediaPipe binary graph` at run time, which can be built with the following command:\n\n``` bash\nbazel build -c opt mediapipe/examples/android/src/java/com/google/mediapipe/apps/handtrackinggpu:binary_graph\n```\n\nCopy the freshly built `binary graph` from `bazel-bin/mediapipe/examples/android/src/java/com/google/mediapipe/apps/handtrackinggpu` into `android/src/main/assets`.\n\n### Adding `assets` and the `OpenCV library`\n\nIn the container's `/mediapipe/models` directory you will find `hand_landmark.tflite`, `palm_detection.tflite` and `palm_detection_labelmap.txt`; copy these into `android/src/main/assets` as well.\n\n`MediaPipe` also depends on `OpenCV`, so you need to download the prebuilt `OpenCV` `JNI libraries` and place them under `android/src/main/jniLibs`. 
You can download the official `OpenCV Android SDK` from [here](https://github.com/opencv/opencv/releases/download/3.4.3/opencv-3.4.3-android-sdk.zip) and copy it into place with `cp`:\n\n``` bash\ncp -R ~/Downloads/OpenCV-android-sdk/sdk/native/libs/arm* /path/to/your/plugin/android/src/main/jniLibs/\n```\n\n![add aar in plugin](img/mediapipe.png)\n\nThe `MediaPipe` framework uses `OpenCV`, so to load `MediaPipe` the `Flutter Plugin` has to load `OpenCV` first. Load both dependencies in the `companion object` of `FlutterHandTrackingPlugin`:\n\n``` kotlin\nclass FlutterHandTrackingPlugin(r: Registrar, id: Int) : PlatformView, MethodCallHandler {\n    companion object {\n        init { // Load all native libraries needed by the app.\n            System.loadLibrary(\"mediapipe_jni\")\n            System.loadLibrary(\"opencv_java3\")\n        }\n    }\n}\n```\n\n### Modifying `build.gradle`\n\nOpen `android/build.gradle` and add the `MediaPipe` dependencies and the `MediaPipe AAR`:\n\n```\ndependencies {\n    implementation \"org.jetbrains.kotlin:kotlin-stdlib-jdk7:$kotlin_version\"\n    implementation fileTree(dir: 'libs', include: ['*.aar'])\n    // MediaPipe deps\n    implementation 'com.google.flogger:flogger:0.3.1'\n    implementation 'com.google.flogger:flogger-system-backend:0.3.1'\n    implementation 'com.google.code.findbugs:jsr305:3.0.2'\n    implementation 'com.google.guava:guava:27.0.1-android'\n    implementation 'com.google.protobuf:protobuf-lite:3.0.1'\n    // CameraX core library\n    def camerax_version = \"1.0.0-alpha06\"\n    implementation \"androidx.camera:camera-core:$camerax_version\"\n    implementation \"androidx.camera:camera-camera2:$camerax_version\"\n    implementation \"androidx.core:core-ktx:1.2.0\"\n}\n```\n\n## Accessing the camera with `CameraX`\n\n### Requesting camera permission\n\nTo use the camera in our application, we need to ask the user for permission to access it. 
To request camera permission, add the following to `android/src/main/AndroidManifest.xml`:\n\n``` xml\n<!-- For using the camera -->\n<uses-permission android:name=\"android.permission.CAMERA\" />\n\n<uses-feature android:name=\"android.hardware.camera\" />\n<uses-feature android:name=\"android.hardware.camera.autofocus\" />\n<!-- For MediaPipe -->\n<uses-feature\n    android:glEsVersion=\"0x00020000\"\n    android:required=\"true\" />\n```\n\nAt the same time, change the minimum `SDK` version to 21 or higher and the target `SDK` version to 27 or higher in `build.gradle`:\n\n``` Gradle Script\ndefaultConfig {\n    // TODO: Specify your own unique Application ID (https://developer.android.com/studio/build/application-id.html).\n    applicationId \"xyz.zhzh.flutter_hand_tracking_plugin_example\"\n    minSdkVersion 21\n    targetSdkVersion 27\n    versionCode flutterVersionCode.toInteger()\n    versionName flutterVersionName\n    testInstrumentationRunner \"androidx.test.runner.AndroidJUnitRunner\"\n}\n```\n\nTo make sure the user is prompted for camera permission, so that the `CameraX` library can access the camera, use the `PermissionHelper` component that ships with `MediaPipe`. To use it, 
first add the permission-request code inside the component's `init` block:\n\n``` kotlin\nPermissionHelper.checkAndRequestCameraPermissions(activity)\n```\n\nThis shows the user a dialog asking for permission to use the camera in this application.\n\nThen add the following code to handle the user's response:\n\n``` kotlin\nclass FlutterHandTrackingPlugin(r: Registrar, id: Int) : PlatformView, MethodCallHandler {\n    private val activity: Activity = r.activity()\n\n    init {\n        // ...\n        r.addRequestPermissionsResultListener(CameraRequestPermissionsListener())\n        PermissionHelper.checkAndRequestCameraPermissions(activity)\n        if (PermissionHelper.cameraPermissionsGranted(activity)) onResume()\n    }\n\n    private inner class CameraRequestPermissionsListener :\n            PluginRegistry.RequestPermissionsResultListener {\n        override fun onRequestPermissionsResult(requestCode: Int,\n                                                permissions: Array<out String>?,\n                                                grantResults: IntArray?): Boolean {\n            return if (requestCode != 0) false\n            else {\n                for (result in grantResults!!) {\n                    if (result == PERMISSION_GRANTED) onResume()\n                    else Toast.makeText(activity, \"Please grant camera permission\", Toast.LENGTH_LONG).show()\n                }\n                true\n            }\n        }\n    }\n\n    private fun onResume() {\n        // ...\n        if (PermissionHelper.cameraPermissionsGranted(activity)) startCamera()\n    }\n    private fun startCamera() {}\n}\n```\n\nLeave the `startCamera()` method empty for now. Once the user responds to the prompt, the `onResume()` method is called. The code confirms that camera permission has been granted and then starts the camera.\n\n### Starting the camera\n\nNow add a `SurfaceTexture` and a `SurfaceView` to the plugin:\n\n``` kotlin\n// {@link SurfaceTexture} where the camera-preview frames can be accessed.\nprivate var previewFrameTexture: SurfaceTexture? 
= null\n// {@link SurfaceView} that displays the camera-preview frames processed by a MediaPipe graph.\nprivate var previewDisplayView: SurfaceView = SurfaceView(r.context())\n```\n\nIn the component's `init` block, add a call to `setupPreviewDisplayView()` before the camera permission is requested:\n\n``` kotlin\ninit {\n    r.addRequestPermissionsResultListener(CameraRequestPermissionsListener())\n\n    setupPreviewDisplayView()\n    PermissionHelper.checkAndRequestCameraPermissions(activity)\n    if (PermissionHelper.cameraPermissionsGranted(activity)) onResume()\n}\n```\n\nThen write the `setupPreviewDisplayView` method:\n\n``` kotlin\nprivate fun setupPreviewDisplayView() {\n    previewDisplayView.visibility = View.GONE\n    // TODO\n}\n```\n\nFor `previewDisplayView` to receive camera data we can use `CameraX`; `MediaPipe` provides a class named `CameraXPreviewHelper` that wraps `CameraX`. When the camera starts, it notifies the listener callback `onCameraStarted(@Nullable SurfaceTexture)`.\n\nNow declare a `CameraXPreviewHelper`:\n\n``` kotlin\nclass FlutterHandTrackingPlugin(r: Registrar, id: Int) : PlatformView, MethodCallHandler {\n    // ...\n    private var cameraHelper: CameraXPreviewHelper? = null\n    // ...\n}\n```\n\nThen implement the `startCamera()` method from before:\n\n``` kotlin\nprivate fun startCamera() {\n    cameraHelper = CameraXPreviewHelper()\n    cameraHelper!!.setOnCameraStartedListener { surfaceTexture: SurfaceTexture? ->\n        previewFrameTexture = surfaceTexture\n        // Make the display view visible to start showing the preview. This triggers the\n        // SurfaceHolder.Callback added to (the holder of) previewDisplayView.\n        previewDisplayView.visibility = View.VISIBLE\n    }\n    cameraHelper!!.startCamera(activity, CAMERA_FACING,  /*surfaceTexture=*/null)\n}\n```\n\nThis creates a `CameraXPreviewHelper` and attaches an anonymous listener to it. When `cameraHelper` detects that the camera has started, the `surfaceTexture` that captures the camera frames is handed to `previewFrameTexture`, and `previewDisplayView` is made visible.\n\nWhen starting the camera you also have to decide which camera to use. `CameraXPreviewHelper` inherits two options from `CameraHelper`, `FRONT` and `BACK`, passed as the `CAMERA_FACING` argument to the `cameraHelper!!.startCamera` method. 
Here the front camera is configured as a static variable:\n\n``` kotlin\nclass FlutterHandTrackingPlugin(r: Registrar, id: Int) : PlatformView, MethodCallHandler {\n    companion object {\n        private val CAMERA_FACING = CameraHelper.CameraFacing.FRONT\n        // ...\n    }\n    //...\n}\n```\n\n## Converting camera frames with `ExternalTextureConverter`\n\nAbove, a `SurfaceTexture` captures the camera frames from the stream into an `OpenGL ES texture` object. To use a `MediaPipe graph`, however, the captured frames need to be stored in a regular `OpenGL texture` object. `MediaPipe` provides the `ExternalTextureConverter` class to convert the frames stored in a `SurfaceTexture` into regular `OpenGL texture` objects.\n\nTo use `ExternalTextureConverter` you also need an `EGLContext`, which is created and managed by an `EglManager` object. Add the following declarations to the plugin:\n\n``` kotlin\n// Creates and manages an {@link EGLContext}.\nprivate var eglManager: EglManager = EglManager(null)\n\n// Converts the GL_TEXTURE_EXTERNAL_OES texture from Android camera into a regular texture to be\n// consumed by {@link FrameProcessor} and the underlying MediaPipe graph.\nprivate var converter: ExternalTextureConverter? = null\n```\n\nModify the `onResume()` method written earlier to initialize the `converter` object:\n\n``` kotlin\nprivate fun onResume() {\n    converter = ExternalTextureConverter(eglManager.context)\n    converter!!.setFlipY(FLIP_FRAMES_VERTICALLY)\n    if (PermissionHelper.cameraPermissionsGranted(activity)) {\n        startCamera()\n    }\n}\n```\n\nTo feed `previewFrameTexture` into the `converter` for conversion, add the following block to `setupPreviewDisplayView()`:\n\n``` kotlin\nprivate fun setupPreviewDisplayView() {\n    previewDisplayView.visibility = View.GONE\n    previewDisplayView.holder.addCallback(\n            object : SurfaceHolder.Callback {\n                override fun surfaceCreated(holder: SurfaceHolder) {\n                    processor.videoSurfaceOutput.setSurface(holder.surface)\n                }\n\n                override fun surfaceChanged(holder: SurfaceHolder, format: Int, width: Int, height: Int) {\n                    // (Re-)Compute the ideal size of the camera-preview display (the area that the\n                    // camera-preview frames get rendered onto, potentially 
with scaling and rotation)\n                    // based on the size of the SurfaceView that contains the display.\n                    val viewSize = Size(width, height)\n                    val displaySize = cameraHelper!!.computeDisplaySizeFromViewSize(viewSize)\n                    val isCameraRotated = cameraHelper!!.isCameraRotated\n                    // Connect the converter to the camera-preview frames as its input (via\n                    // previewFrameTexture), and configure the output width and height as the computed\n                    // display size.\n                    converter!!.setSurfaceTextureAndAttachToGLContext(\n                            previewFrameTexture,\n                            if (isCameraRotated) displaySize.height else displaySize.width,\n                            if (isCameraRotated) displaySize.width else displaySize.height)\n                }\n\n                override fun surfaceDestroyed(holder: SurfaceHolder) {\n                    // TODO\n                }\n            })\n}\n```\n\nIn the code above, a custom `SurfaceHolder.Callback` is added to `previewDisplayView`, and its `surfaceChanged(SurfaceHolder holder, int format, int width, int height)` implementation:\n\n1. Computes an appropriate display size for the camera frames on the device screen\n2. 
传入 `previewFrameTexture` 和 `displaySize` 到 `converter`\n\n现在摄像头获取到的图像帧已经可以传入到 `MediaPipe graph` 中了.\n\n## 调用 `MediaPipe graph`\n\n首先需要加载所有 `MediaPipe graph` 需要的资源(之前从容器中拷贝出来的 `tflite` 模型, `binary graph` 等) 可以使用 `MediaPipe` 的组件 `AndroidAssetUtil` 类:\n\n``` kotlin\n// Initialize asset manager so that MediaPipe native libraries can access the app assets, e.g.,\n// binary graphs.\nAndroidAssetUtil.initializeNativeAssetManager(activity)\n```\n\n然后添加以下代码设置 `processor`:\n\n```\ninit {\n    setupProcess()\n}\n\nprivate fun setupProcess() {\n    processor.videoSurfaceOutput.setFlipY(FLIP_FRAMES_VERTICALLY)\n    // TODO\n}\n```\n\n然后根据用到的 `graph` 的名称声明静态变量, 这些静态变量用于之后使用 `graph`:\n\n```\nclass FlutterHandTrackingPlugin(r: Registrar, id: Int) : PlatformView, MethodCallHandler {\n    companion object {\n        private const val BINARY_GRAPH_NAME = \"handtrackinggpu.binarypb\"\n        private const val INPUT_VIDEO_STREAM_NAME = \"input_video\"\n        private const val OUTPUT_VIDEO_STREAM_NAME = \"output_video\"\n        private const val OUTPUT_HAND_PRESENCE_STREAM_NAME = \"hand_presence\"\n        private const val OUTPUT_LANDMARKS_STREAM_NAME = \"hand_landmarks\"\n    }\n}\n```\n\n现在设置一个 `FrameProcessor` 对象, 把之前 `converter` 转换好的的摄像头图像帧发送到 `MediaPipe graph` 并运行该图获得输出的图像帧, 然后更新 `previewDisplayView` 来显示输出. 
Declare the `FrameProcessor` by adding the following code:\n\n``` kotlin\n// Sends camera-preview frames into a MediaPipe graph for processing, and displays the processed\n// frames onto a {@link Surface}.\nprivate var processor: FrameProcessor = FrameProcessor(\n        activity,\n        eglManager.nativeContext,\n        BINARY_GRAPH_NAME,\n        INPUT_VIDEO_STREAM_NAME,\n        OUTPUT_VIDEO_STREAM_NAME)\n```\n\nThen edit `onResume()` and call `converter!!.setConsumer(processor)` so that the `converter` delivers the converted frames to the `processor`:\n\n``` kotlin\nprivate fun onResume() {\n    converter = ExternalTextureConverter(eglManager.context)\n    converter!!.setFlipY(FLIP_FRAMES_VERTICALLY)\n    converter!!.setConsumer(processor)\n    if (PermissionHelper.cameraPermissionsGranted(activity)) {\n        startCamera()\n    }\n}\n```\n\nNext, route the frames processed by the `processor` to `previewDisplayView`. Edit `setupPreviewDisplayView()` again and update the `SurfaceHolder.Callback` defined earlier:\n\n``` kotlin\nprivate fun setupPreviewDisplayView() {\n    previewDisplayView.visibility = View.GONE\n    previewDisplayView.holder.addCallback(\n            object : SurfaceHolder.Callback {\n                override fun surfaceCreated(holder: SurfaceHolder) {\n                    processor.videoSurfaceOutput.setSurface(holder.surface)\n                }\n\n                override fun surfaceChanged(holder: SurfaceHolder, format: Int, width: Int, height: Int) {\n                    // (Re-)Compute the ideal size of the camera-preview display (the area that the\n                    // camera-preview frames get rendered onto, potentially with scaling and rotation)\n                    // based on the size of the SurfaceView that contains the display.\n                    val viewSize = Size(width, height)\n                    val displaySize = cameraHelper!!.computeDisplaySizeFromViewSize(viewSize)\n                    val isCameraRotated = cameraHelper!!.isCameraRotated\n                    // Connect the converter to the camera-preview frames as its input (via\n                    // previewFrameTexture), and configure the output width and height as the computed\n                    // display size.\n                    converter!!.setSurfaceTextureAndAttachToGLContext(\n                            previewFrameTexture,\n                            if (isCameraRotated) displaySize.height else displaySize.width,\n                            if (isCameraRotated) displaySize.width else displaySize.height)\n                }\n\n                override fun surfaceDestroyed(holder: SurfaceHolder) {\n                    processor.videoSurfaceOutput.setSurface(null)\n                }\n            })\n}\n```\n\nOnce the `SurfaceHolder` is created, camera frames flow through the `converter` into the `processor`, which then renders its output to a `Surface` via `videoSurfaceOutput`.\n\n## Communicating between the native component and `Flutter` via `EventChannel`\n\nSo far only the image frames have been handled; besides processing images, the `processor` can also produce the coordinates of the hand landmarks. An `EventChannel` can carry this data to `Flutter`'s `Dart` layer, where business logic can translate gesture information into commands in real time: counting one to five, rock, spiderman... or attaching different effects to different gestures, bringing a more natural and expressive human-computer interaction.\n\n### Opening the `EventChannel` on the `Android` side\n\nFirst define an `EventChannel` and an `EventChannel.EventSink`:\n\n``` kotlin\nprivate val eventChannel: EventChannel = EventChannel(r.messenger(), \"$NAMESPACE/$id/landmarks\")\nprivate var eventSink: EventChannel.EventSink? = null\n```\n\nThe `EventChannel.EventSink` is used later to emit messages. Initialize the `eventChannel` in the `init` block:\n\n``` kotlin\ninit {\n    this.eventChannel.setStreamHandler(landMarksStreamHandler())\n}\n\nprivate fun landMarksStreamHandler(): EventChannel.StreamHandler {\n    return object : EventChannel.StreamHandler {\n\n        override fun onListen(arguments: Any?, events: EventChannel.EventSink) {\n            eventSink = events\n            // Log.e(TAG, \"Listen Event Channel\")\n        }\n\n        override fun onCancel(arguments: Any?) {\n            eventSink = null\n        }\n    }\n}\n```\n\nWith the message channel in place, edit the `setupProcess()` method from before. 
Extend it with packet callbacks that obtain the hand landmark positions and send them through the `EventChannel.EventSink` into the `eventChannel` opened above:\n\n``` kotlin\nprivate val uiThreadHandler: Handler = Handler(Looper.getMainLooper())\n\nprivate fun setupProcess() {\n    processor.videoSurfaceOutput.setFlipY(FLIP_FRAMES_VERTICALLY)\n    processor.addPacketCallback(\n            OUTPUT_HAND_PRESENCE_STREAM_NAME\n    ) { packet: Packet ->\n        val handPresence = PacketGetter.getBool(packet)\n        if (!handPresence) Log.d(TAG, \"[TS:\" + packet.timestamp + \"] Hand presence is false, no hands detected.\")\n    }\n    processor.addPacketCallback(\n            OUTPUT_LANDMARKS_STREAM_NAME\n    ) { packet: Packet ->\n        val landmarksRaw = PacketGetter.getProtoBytes(packet)\n        if (eventSink == null) try {\n            val landmarks = LandmarkProto.NormalizedLandmarkList.parseFrom(landmarksRaw)\n            if (landmarks == null) {\n                Log.d(TAG, \"[TS:\" + packet.timestamp + \"] No hand landmarks.\")\n                return@addPacketCallback\n            }\n            // Note: If hand_presence is false, these landmarks are useless.\n            Log.d(TAG, \"[TS: ${packet.timestamp}] #Landmarks for hand: ${landmarks.landmarkCount}\\n ${getLandmarksString(landmarks)}\")\n        } catch (e: InvalidProtocolBufferException) {\n            Log.e(TAG, \"Couldn't Exception received - $e\")\n            return@addPacketCallback\n        }\n        else uiThreadHandler.post { eventSink?.success(landmarksRaw) }\n    }\n}\n```\n\nNote the branching: while no `eventSink` is attached, the raw bytes are parsed with `LandmarkProto.NormalizedLandmarkList.parseFrom()` and logged locally; once a listener exists, the raw bytes are posted to the `Dart` side unchanged. `parseFrom()` is needed to decode the landmarks from their byte array form, because all landmark data is serialized with `protobuf`. 
`Protocol buffers` are Google's cross-platform, language-neutral mechanism for serializing structured data; see the [official site](https://developers.google.com/protocol-buffers) for details.\n\nNote also that the data is finally sent via `uiThreadHandler`: the `processor` callbacks run on a background thread, but the `Flutter` framework requires messages to be sent into the `eventChannel` from the `UI` thread, hence the `post`.\n\nThe complete `FlutterHandTrackingPlugin.kt` can be found on [github](https://github.com/zhouzaihang/flutter_hand_tracking_plugin).\n\n### Receiving the `eventChannel` data in the `Dart` layer\n\nOpen `lib/flutter_hand_tracking_plugin.dart` once more and edit the `HandTrackingViewController` class. Add an `EventChannel` keyed by `id`, then use `receiveBroadcastStream` to receive the channel's messages:\n\n``` dart\nclass HandTrackingViewController {\n  final MethodChannel _methodChannel;\n  final EventChannel _eventChannel;\n\n  HandTrackingViewController._(int id)\n      : _methodChannel = MethodChannel(\"$NAMESPACE/$id\"),\n        _eventChannel = EventChannel(\"$NAMESPACE/$id/landmarks\");\n\n  Future<String> get platformVersion async =>\n      await _methodChannel.invokeMethod(\"getPlatformVersion\");\n\n  Stream<NormalizedLandmarkList> get landMarksStream async* {\n    yield* _eventChannel\n        .receiveBroadcastStream()\n        .map((buffer) => NormalizedLandmarkList.fromBuffer(buffer));\n  }\n}\n```\n\nAs introduced above, the transmitted data is a structured byte array serialized with `protobuf`, so it is decoded with `NormalizedLandmarkList.fromBuffer()`. 
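Once parsed, a hand is just 21 normalized landmarks, so gesture ideas like the one-to-five counting mentioned earlier reduce to simple geometry over those points. The following self-contained Kotlin sketch is illustrative only: the `Landmark` data class and `countExtendedFingers` are hypothetical stand-ins, not part of the plugin or of `LandmarkProto`, and the heuristic assumes an upright hand in image coordinates (y grows downward):

```kotlin
// Illustrative stand-in for LandmarkProto.NormalizedLandmark:
// coordinates are normalized to [0, 1], with y growing downward.
data class Landmark(val x: Float, val y: Float)

// MediaPipe's 21-point hand model places tip/PIP joints at these
// indices: index 8/6, middle 12/10, ring 16/14, pinky 20/18.
val TIP_TO_PIP = listOf(8 to 6, 12 to 10, 16 to 14, 20 to 18)

// Counts non-thumb fingers whose tip sits above its PIP joint --
// a rough "extended finger" test for an upright hand.
fun countExtendedFingers(landmarks: List<Landmark>): Int {
    if (landmarks.size < 21) return 0 // not a full hand, bail out
    return TIP_TO_PIP.count { (tip, pip) -> landmarks[tip].y < landmarks[pip].y }
}
```

The thumb is deliberately ignored (its extension is better judged along x), and a real classifier should also gate on the `hand_presence` stream before trusting any landmarks.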
The `NormalizedLandmarkList.fromBuffer()` API is provided by the `.dart` files that `protobuf` generates from [protos/landmark.proto](https://github.com/zhouzaihang/flutter_hand_tracking_plugin/blob/master/protos/landmark.proto).\n\nFirst open `pubspec.yaml` and add the `protobuf` runtime dependency:\n\n``` yaml\ndependencies:\n  flutter:\n    sdk: flutter\n  protobuf: ^1.0.1\n```\n\nThen fetch the packages and activate the `protoc_plugin` compiler plugin:\n\n``` bash\npub get\npub global activate protoc_plugin\n```\n\nNext, install `protoc` itself by following the [protobuf installation guide](https://github.com/google/protobuf).\n\nRunning the `protoc` command then generates the `.dart` files:\n\n``` bash\nprotoc --dart_out=../lib/gen ./landmark.proto\n```\n\nAlternatively, use the pre-generated files in [flutter_hand_tracking_plugin/lib/gen/](https://github.com/zhouzaihang/flutter_hand_tracking_plugin/tree/master/lib/gen):\n\n![ProtoBufGen](img/protobuf_gen.png)\n\nOnce generated, the received data can be stored in a `NormalizedLandmarkList`, which provides `fromBuffer()`, `fromJson()`, and other methods for deserializing the data.\n"
  },
  {
    "path": "android/.gitignore",
    "content": "*.iml\n.gradle\n/local.properties\n/.idea/workspace.xml\n/.idea/libraries\n.DS_Store\n/build\n/captures\n"
  },
  {
    "path": "android/build.gradle",
"content": "group 'xyz.zhzh.flutter_hand_tracking_plugin'\nversion '1.0-SNAPSHOT'\n\nbuildscript {\n    ext.kotlin_version = '1.3.61'\n    repositories {\n        google()\n        jcenter()\n    }\n\n    dependencies {\n        classpath 'com.android.tools.build:gradle:3.5.3'\n        classpath \"org.jetbrains.kotlin:kotlin-gradle-plugin:$kotlin_version\"\n    }\n}\n\nrootProject.allprojects {\n    repositories {\n        google()\n        jcenter()\n    }\n}\n\napply plugin: 'com.android.library'\napply plugin: 'kotlin-android'\n\nandroid {\n    compileSdkVersion 29\n\n    sourceSets {\n        main.java.srcDirs += 'src/main/kotlin'\n    }\n    defaultConfig {\n        minSdkVersion 21\n        targetSdkVersion 29\n        testInstrumentationRunner \"androidx.test.runner.AndroidJUnitRunner\"\n    }\n    lintOptions {\n        disable 'InvalidPackage'\n    }\n}\n\ndependencies {\n    implementation \"org.jetbrains.kotlin:kotlin-stdlib-jdk7:$kotlin_version\"\n    implementation fileTree(dir: 'libs', include: ['*.aar'])\n    // MediaPipe deps\n    implementation 'com.google.flogger:flogger:0.3.1'\n    implementation 'com.google.flogger:flogger-system-backend:0.3.1'\n    implementation 'com.google.code.findbugs:jsr305:3.0.2'\n    implementation 'com.google.guava:guava:27.0.1-android'\n    implementation 'com.google.protobuf:protobuf-lite:3.0.1'\n    // CameraX core library\n    def camerax_version = \"1.0.0-alpha06\"\n    implementation \"androidx.camera:camera-core:$camerax_version\"\n    implementation \"androidx.camera:camera-camera2:$camerax_version\"\n    implementation \"androidx.core:core-ktx:1.2.0\"\n}\n"
  },
  {
    "path": "android/gradle/wrapper/gradle-wrapper.properties",
    "content": "distributionBase=GRADLE_USER_HOME\ndistributionPath=wrapper/dists\nzipStoreBase=GRADLE_USER_HOME\nzipStorePath=wrapper/dists\ndistributionUrl=https\\://services.gradle.org/distributions/gradle-4.10.2-all.zip\n"
  },
  {
    "path": "android/gradle.properties",
    "content": "org.gradle.jvmargs=-Xmx1536M\nandroid.enableR8=true\nandroid.useAndroidX=true\nandroid.enableJetifier=true\n"
  },
  {
    "path": "android/libs/hand_tracking_aar.aar",
    "content": "version https://git-lfs.github.com/spec/v1\noid sha256:e9e16b67719b7f6a53ec6295764d4c5a05f25b09ddfdefb165da2ebcfd60e9d2\nsize 105692902\n"
  },
  {
    "path": "android/settings.gradle",
    "content": "rootProject.name = 'flutter_hand_tracking_plugin'\n"
  },
  {
    "path": "android/src/main/AndroidManifest.xml",
    "content": "<manifest xmlns:android=\"http://schemas.android.com/apk/res/android\"\n    package=\"xyz.zhzh.flutter_hand_tracking_plugin\">\n    <!-- For using the camera -->\n    <uses-permission android:name=\"android.permission.CAMERA\" />\n\n    <uses-feature android:name=\"android.hardware.camera\" />\n    <uses-feature android:name=\"android.hardware.camera.autofocus\" />\n    <!-- For MediaPipe -->\n    <uses-feature\n        android:glEsVersion=\"0x00020000\"\n        android:required=\"true\" />\n</manifest>\n"
  },
  {
    "path": "android/src/main/assets/handtrackinggpu.binarypb",
    "content": "\n]\u0012\u0015FlowLimiterCalculator\u001a\u000binput_video\u001a\u0012FINISHED:hand_rect\"\u0015throttled_input_videoj\f\n\bFINISHED\u0010\u0001\nt\u0012\u001aPreviousLoopbackCalculator\u001a\u001aMAIN:throttled_input_video\u001a\u0012LOOP:hand_presence\"\u001cPREV_LOOP:prev_hand_presencej\b\n\u0004LOOP\u0010\u0001\n\u0001\u0012\u000eGateCalculator\u001a\u0015throttled_input_video\u001a\u001bDISALLOW:prev_hand_presence\"\u001ahand_detection_input_videoB9\n3type.googleapis.com/mediapipe.GateCalculatorOptions\u0012\u0002\b\u0001\ny\u0012\u0015HandDetectionSubgraph\u001a\u001ahand_detection_input_video\"\u001aDETECTIONS:palm_detections\"(NORM_RECT:hand_rect_from_palm_detections\n\u0001\u0012\u0014HandLandmarkSubgraph\u001a\u001bIMAGE:throttled_input_video\u001a\u0013NORM_RECT:hand_rect\"\u0018LANDMARKS:hand_landmarks\"\"NORM_RECT:hand_rect_from_landmarks\"\u0016PRESENCE:hand_presence\n\u0001\u0012\u001aPreviousLoopbackCalculator\u001a\u001aMAIN:throttled_input_video\u001a\u001dLOOP:hand_rect_from_landmarks\"'PREV_LOOP:prev_hand_rect_from_landmarksj\b\n\u0004LOOP\u0010\u0001\n[\u0012\u000fMergeCalculator\u001a\u001ehand_rect_from_palm_detections\u001a\u001dprev_hand_rect_from_landmarks\"\thand_rect\n\u0001\u0012\u0010RendererSubgraph\u001a\u001bIMAGE:throttled_input_video\u001a\u0018LANDMARKS:hand_landmarks\u001a\u0013NORM_RECT:hand_rect\u001a\u001aDETECTIONS:palm_detections\"\u0012IMAGE:output_videoR\u000binput_videoz\foutput_video"
  },
  {
    "path": "android/src/main/assets/palm_detection_labelmap.txt",
    "content": "Palm\n"
  },
  {
    "path": "android/src/main/kotlin/xyz/zhzh/flutter_hand_tracking_plugin/FlutterHandTrackingPlugin.kt",
    "content": "package xyz.zhzh.flutter_hand_tracking_plugin\n\nimport android.app.Activity\nimport android.content.pm.PackageManager.PERMISSION_GRANTED\nimport android.graphics.SurfaceTexture\nimport android.os.Handler\nimport android.os.Looper\nimport android.util.Log\nimport android.util.Size\nimport android.view.SurfaceHolder\nimport android.view.SurfaceView\nimport android.view.View\nimport android.widget.Toast\nimport androidx.annotation.NonNull\nimport com.google.mediapipe.components.*\nimport com.google.mediapipe.formats.proto.LandmarkProto\nimport com.google.mediapipe.framework.AndroidAssetUtil\nimport com.google.mediapipe.framework.Packet\nimport com.google.mediapipe.framework.PacketGetter\nimport com.google.mediapipe.glutil.EglManager\nimport com.google.protobuf.InvalidProtocolBufferException\nimport io.flutter.plugin.common.EventChannel\nimport io.flutter.plugin.common.MethodCall\nimport io.flutter.plugin.common.MethodChannel\nimport io.flutter.plugin.common.MethodChannel.MethodCallHandler\nimport io.flutter.plugin.common.MethodChannel.Result\nimport io.flutter.plugin.common.PluginRegistry\nimport io.flutter.plugin.common.PluginRegistry.Registrar\nimport io.flutter.plugin.platform.PlatformView\n\n/** FlutterHandTrackingPlugin */\nclass FlutterHandTrackingPlugin(r: Registrar, id: Int) : PlatformView, MethodCallHandler {\n    companion object {\n        private const val TAG = \"HandTrackingPlugin\"\n        private const val NAMESPACE = \"plugins.zhzh.xyz/flutter_hand_tracking_plugin\"\n        private const val BINARY_GRAPH_NAME = \"handtrackinggpu.binarypb\"\n        private const val INPUT_VIDEO_STREAM_NAME = \"input_video\"\n        private const val OUTPUT_VIDEO_STREAM_NAME = \"output_video\"\n        private const val OUTPUT_HAND_PRESENCE_STREAM_NAME = \"hand_presence\"\n        private const val OUTPUT_LANDMARKS_STREAM_NAME = \"hand_landmarks\"\n        private val CAMERA_FACING = CameraHelper.CameraFacing.FRONT\n        // Flips the 
camera-preview frames vertically before sending them into FrameProcessor to be\n        // processed in a MediaPipe graph, and flips the processed frames back when they are displayed.\n        // This is needed because OpenGL represents images assuming the image origin is at the bottom-left\n        // corner, whereas MediaPipe in general assumes the image origin is at top-left.\n        private const val FLIP_FRAMES_VERTICALLY = true\n\n        private fun getLandmarksString(landmarks: LandmarkProto.NormalizedLandmarkList): String {\n            var landmarksString = \"\"\n            for ((landmarkIndex, landmark) in landmarks.landmarkList.withIndex()) {\n                landmarksString += (\"\\t\\tLandmark[\"\n                        + landmarkIndex\n                        + \"]: (\"\n                        + landmark.x\n                        + \", \"\n                        + landmark.y\n                        + \", \"\n                        + landmark.z\n                        + \")\\n\")\n            }\n            return landmarksString\n        }\n\n        @JvmStatic\n        fun registerWith(registrar: Registrar) {\n            registrar.platformViewRegistry().registerViewFactory(\n                    \"$NAMESPACE/view\",\n                    HandTrackingViewFactory(registrar))\n        }\n\n        init { // Load all native libraries needed by the app.\n            System.loadLibrary(\"mediapipe_jni\")\n            System.loadLibrary(\"opencv_java3\")\n        }\n    }\n\n    private val activity: Activity = r.activity()\n    private val methodChannel: MethodChannel = MethodChannel(r.messenger(), \"$NAMESPACE/$id\")\n    private val eventChannel: EventChannel = EventChannel(r.messenger(), \"$NAMESPACE/$id/landmarks\")\n    private var eventSink: EventChannel.EventSink? 
= null\n    private val uiThreadHandler: Handler = Handler(Looper.getMainLooper())\n    // {@link SurfaceTexture} where the camera-preview frames can be accessed.\n    private var previewFrameTexture: SurfaceTexture? = null\n    // {@link SurfaceView} that displays the camera-preview frames processed by a MediaPipe graph.\n    private var previewDisplayView: SurfaceView = SurfaceView(r.context())\n    // Creates and manages an {@link EGLContext}.\n    private var eglManager: EglManager = EglManager(null)\n    // Sends camera-preview frames into a MediaPipe graph for processing, and displays the processed\n    // frames onto a {@link Surface}.\n    private var processor: FrameProcessor = FrameProcessor(\n            activity,\n            eglManager.nativeContext,\n            BINARY_GRAPH_NAME,\n            INPUT_VIDEO_STREAM_NAME,\n            OUTPUT_VIDEO_STREAM_NAME)\n    // Converts the GL_TEXTURE_EXTERNAL_OES texture from Android camera into a regular texture to be\n    // consumed by {@link FrameProcessor} and the underlying MediaPipe graph.\n    private var converter: ExternalTextureConverter? = null\n    // Handles camera access via the {@link CameraX} Jetpack support library.\n    private var cameraHelper: CameraXPreviewHelper? 
= null\n\n    init {\n        r.addRequestPermissionsResultListener(CameraRequestPermissionsListener())\n\n        this.methodChannel.setMethodCallHandler(this)\n        this.eventChannel.setStreamHandler(landMarksStreamHandler())\n        setupPreviewDisplayView()\n        // Initialize asset manager so that MediaPipe native libraries can access the app assets, e.g.,\n        // binary graphs.\n        AndroidAssetUtil.initializeNativeAssetManager(activity)\n        setupProcess()\n        PermissionHelper.checkAndRequestCameraPermissions(activity)\n\n        if (PermissionHelper.cameraPermissionsGranted(activity)) onResume()\n    }\n\n    override fun onMethodCall(@NonNull call: MethodCall, @NonNull result: Result) {\n        if (call.method == \"getPlatformVersion\") {\n            result.success(\"Android ${android.os.Build.VERSION.RELEASE}\")\n        } else {\n            result.notImplemented()\n        }\n    }\n\n    override fun getView(): SurfaceView? {\n        return previewDisplayView\n    }\n\n    override fun dispose() {\n        converter?.close()\n    }\n\n    private inner class CameraRequestPermissionsListener :\n            PluginRegistry.RequestPermissionsResultListener {\n        override fun onRequestPermissionsResult(requestCode: Int,\n                                                permissions: Array<out String>?,\n                                                grantResults: IntArray?): Boolean {\n            return if (requestCode != 0) false\n            else {\n                for (result in grantResults!!) 
{\n                    if (result == PERMISSION_GRANTED) onResume()\n                    else Toast.makeText(activity, \"请授予摄像头权限\", Toast.LENGTH_LONG).show()\n                }\n                true\n            }\n        }\n\n    }\n\n    private fun onResume() {\n        converter = ExternalTextureConverter(eglManager.context)\n        converter!!.setFlipY(FLIP_FRAMES_VERTICALLY)\n        converter!!.setConsumer(processor)\n        if (PermissionHelper.cameraPermissionsGranted(activity)) {\n            startCamera()\n        }\n    }\n\n    private fun setupPreviewDisplayView() {\n        previewDisplayView.visibility = View.GONE\n        previewDisplayView.holder.addCallback(\n                object : SurfaceHolder.Callback {\n                    override fun surfaceCreated(holder: SurfaceHolder) {\n                        processor.videoSurfaceOutput.setSurface(holder.surface)\n                    }\n\n                    override fun surfaceChanged(holder: SurfaceHolder, format: Int, width: Int, height: Int) { // (Re-)Compute the ideal size of the camera-preview display (the area that the\n                        // camera-preview frames get rendered onto, potentially with scaling and rotation)\n                        // based on the size of the SurfaceView that contains the display.\n                        val viewSize = Size(width, height)\n                        val displaySize = cameraHelper!!.computeDisplaySizeFromViewSize(viewSize)\n                        val isCameraRotated = cameraHelper!!.isCameraRotated\n                        // Connect the converter to the camera-preview frames as its input (via\n                        // previewFrameTexture), and configure the output width and height as the computed\n                        // display size.\n                        converter!!.setSurfaceTextureAndAttachToGLContext(\n                                previewFrameTexture,\n                                if (isCameraRotated) displaySize.height 
else displaySize.width,\n                                if (isCameraRotated) displaySize.width else displaySize.height)\n                    }\n\n                    override fun surfaceDestroyed(holder: SurfaceHolder) {\n                        processor.videoSurfaceOutput.setSurface(null)\n                    }\n                })\n    }\n\n    private fun setupProcess() {\n        processor.videoSurfaceOutput.setFlipY(FLIP_FRAMES_VERTICALLY)\n        processor.addPacketCallback(\n                OUTPUT_HAND_PRESENCE_STREAM_NAME\n        ) { packet: Packet ->\n            val handPresence = PacketGetter.getBool(packet)\n            if (!handPresence)\n//                Toast.makeText(\n//                        activity,\n//                        \"[TS: ${packet.timestamp}] No hands detected.\",\n//                        Toast.LENGTH_SHORT).show()\n                Log.d(TAG, \"[TS:\" + packet.timestamp + \"] Hand presence is false, no hands detected.\")\n        }\n        processor.addPacketCallback(\n                OUTPUT_LANDMARKS_STREAM_NAME\n        ) { packet: Packet ->\n            val landmarksRaw = PacketGetter.getProtoBytes(packet)\n            if (eventSink == null) try {\n                val landmarks = LandmarkProto.NormalizedLandmarkList.parseFrom(landmarksRaw)\n                if (landmarks == null) {\n                    Log.d(TAG, \"[TS:\" + packet.timestamp + \"] No hand landmarks.\")\n                    return@addPacketCallback\n                }\n                // Note: If hand_presence is false, these landmarks are useless.\n                Log.d(TAG, \"[TS: ${packet.timestamp}] #Landmarks for hand: ${landmarks.landmarkCount}\\n ${getLandmarksString(landmarks)}\")\n            } catch (e: InvalidProtocolBufferException) {\n                Log.e(TAG, \"Couldn't Exception received - $e\")\n                return@addPacketCallback\n            }\n            else uiThreadHandler.post { eventSink?.success(landmarksRaw) }\n        }\n    
}\n\n    private fun landMarksStreamHandler(): EventChannel.StreamHandler {\n        return object : EventChannel.StreamHandler {\n\n            override fun onListen(arguments: Any?, events: EventChannel.EventSink) {\n                eventSink = events\n                // Log.e(TAG, \"Listen Event Channel\")\n            }\n\n            override fun onCancel(arguments: Any?) {\n                eventSink = null\n            }\n        }\n    }\n\n    private fun startCamera() {\n        cameraHelper = CameraXPreviewHelper()\n        cameraHelper!!.setOnCameraStartedListener { surfaceTexture: SurfaceTexture? ->\n            previewFrameTexture = surfaceTexture\n            // Make the display view visible to start showing the preview. This triggers the\n            // SurfaceHolder.Callback added to (the holder of) previewDisplayView.\n            previewDisplayView.visibility = View.VISIBLE\n        }\n        cameraHelper!!.startCamera(activity, CAMERA_FACING,  /*surfaceTexture=*/null)\n    }\n}\n"
  },
  {
    "path": "android/src/main/kotlin/xyz/zhzh/flutter_hand_tracking_plugin/HandTrackingViewFactory.kt",
    "content": "package xyz.zhzh.flutter_hand_tracking_plugin\n\nimport android.content.Context\nimport io.flutter.plugin.common.PluginRegistry\nimport io.flutter.plugin.common.StandardMessageCodec\nimport io.flutter.plugin.platform.PlatformView\nimport io.flutter.plugin.platform.PlatformViewFactory\n\nclass HandTrackingViewFactory(private val registrar: PluginRegistry.Registrar) :\n        PlatformViewFactory(StandardMessageCodec.INSTANCE) {\n    override fun create(context: Context?, viewId: Int, args: Any?): PlatformView {\n        return FlutterHandTrackingPlugin(registrar, viewId)\n    }\n}"
  },
  {
    "path": "example/.gitignore",
    "content": "# Miscellaneous\n*.class\n*.log\n*.pyc\n*.swp\n.DS_Store\n.atom/\n.buildlog/\n.history\n.svn/\n\n# IntelliJ related\n*.iml\n*.ipr\n*.iws\n.idea/\n\n# The .vscode folder contains launch configuration and tasks you configure in\n# VS Code which you may wish to be included in version control, so this line\n# is commented out by default.\n#.vscode/\n\n# Flutter/Dart/Pub related\n**/doc/api/\n.dart_tool/\n.flutter-plugins\n.flutter-plugins-dependencies\n.packages\n.pub-cache/\n.pub/\n/build/\n\n# Web related\nlib/generated_plugin_registrant.dart\n\n# Exceptions to above rules.\n!/packages/flutter_tools/test/data/dart_dependencies_test/**/.packages\n"
  },
  {
    "path": "example/.metadata",
    "content": "# This file tracks properties of this Flutter project.\n# Used by Flutter tool to assess capabilities and perform upgrades etc.\n#\n# This file should be version controlled and should not be manually edited.\n\nversion:\n  revision: 9f5ff2306bb3e30b2b98eee79cd231b1336f41f4\n  channel: stable\n\nproject_type: app\n"
  },
  {
    "path": "example/README.md",
    "content": "# flutter_hand_tracking_plugin_example\n\nDemonstrates how to use the flutter_hand_tracking_plugin plugin.\n\n## Getting Started\n\nThis project is a starting point for a Flutter application.\n\nA few resources to get you started if this is your first Flutter project:\n\n- [Lab: Write your first Flutter app](https://flutter.dev/docs/get-started/codelab)\n- [Cookbook: Useful Flutter samples](https://flutter.dev/docs/cookbook)\n\nFor help getting started with Flutter, view our\n[online documentation](https://flutter.dev/docs), which offers tutorials,\nsamples, guidance on mobile development, and a full API reference.\n"
  },
  {
    "path": "example/android/.gitignore",
    "content": "gradle-wrapper.jar\n/.gradle\n/captures/\n/gradlew\n/gradlew.bat\n/local.properties\nGeneratedPluginRegistrant.java\n"
  },
  {
    "path": "example/android/app/build.gradle",
    "content": "def localProperties = new Properties()\ndef localPropertiesFile = rootProject.file('local.properties')\nif (localPropertiesFile.exists()) {\n    localPropertiesFile.withReader('UTF-8') { reader ->\n        localProperties.load(reader)\n    }\n}\n\ndef flutterRoot = localProperties.getProperty('flutter.sdk')\nif (flutterRoot == null) {\n    throw new GradleException(\"Flutter SDK not found. Define location with flutter.sdk in the local.properties file.\")\n}\n\ndef flutterVersionCode = localProperties.getProperty('flutter.versionCode')\nif (flutterVersionCode == null) {\n    flutterVersionCode = '1'\n}\n\ndef flutterVersionName = localProperties.getProperty('flutter.versionName')\nif (flutterVersionName == null) {\n    flutterVersionName = '1.0'\n}\n\napply plugin: 'com.android.application'\napply plugin: 'kotlin-android'\napply from: \"$flutterRoot/packages/flutter_tools/gradle/flutter.gradle\"\n\nandroid {\n    compileSdkVersion 29\n\n    sourceSets {\n        main.java.srcDirs += 'src/main/kotlin'\n    }\n\n    lintOptions {\n        disable 'InvalidPackage'\n    }\n\n    defaultConfig {\n        // TODO: Specify your own unique Application ID (https://developer.android.com/studio/build/application-id.html).\n        applicationId \"xyz.zhzh.flutter_hand_tracking_plugin_example\"\n        minSdkVersion 21\n        targetSdkVersion 29\n        versionCode flutterVersionCode.toInteger()\n        versionName flutterVersionName\n        testInstrumentationRunner \"androidx.test.runner.AndroidJUnitRunner\"\n    }\n\n    buildTypes {\n        release {\n            // TODO: Add your own signing config for the release build.\n            // Signing with the debug keys for now, so `flutter run --release` works.\n            signingConfig signingConfigs.debug\n        }\n    }\n}\n\nflutter {\n    source '../..'\n}\n\ndependencies {\n    implementation \"org.jetbrains.kotlin:kotlin-stdlib-jdk7:$kotlin_version\"\n    testImplementation 'junit:junit:4.12'\n  
  androidTestImplementation 'androidx.test:runner:1.2.0'\n    androidTestImplementation 'androidx.test.espresso:espresso-core:3.2.0'\n}\n"
  },
  {
    "path": "example/android/app/src/debug/AndroidManifest.xml",
    "content": "<manifest xmlns:android=\"http://schemas.android.com/apk/res/android\"\n    package=\"xyz.zhzh.flutter_hand_tracking_plugin_example\">\n    <!-- Flutter needs it to communicate with the running application\n         to allow setting breakpoints, to provide hot reload, etc.\n    -->\n    <uses-permission android:name=\"android.permission.INTERNET\"/>\n</manifest>\n"
  },
  {
    "path": "example/android/app/src/main/AndroidManifest.xml",
    "content": "<manifest xmlns:android=\"http://schemas.android.com/apk/res/android\"\n    package=\"xyz.zhzh.flutter_hand_tracking_plugin_example\">\n    <!-- io.flutter.app.FlutterApplication is an android.app.Application that\n         calls FlutterMain.startInitialization(this); in its onCreate method.\n         In most cases you can leave this as-is, but you if you want to provide\n         additional functionality it is fine to subclass or reimplement\n         FlutterApplication and put your custom class here. -->\n    <application\n        android:name=\"io.flutter.app.FlutterApplication\"\n        android:label=\"flutter_hand_tracking_plugin_example\"\n        android:icon=\"@mipmap/ic_launcher\">\n        <activity\n            android:name=\".MainActivity\"\n            android:launchMode=\"singleTop\"\n            android:theme=\"@style/LaunchTheme\"\n            android:configChanges=\"orientation|keyboardHidden|keyboard|screenSize|smallestScreenSize|locale|layoutDirection|fontScale|screenLayout|density|uiMode\"\n            android:hardwareAccelerated=\"true\"\n            android:windowSoftInputMode=\"adjustResize\">\n            <intent-filter>\n                <action android:name=\"android.intent.action.MAIN\"/>\n                <category android:name=\"android.intent.category.LAUNCHER\"/>\n            </intent-filter>\n        </activity>\n        <!-- Don't delete the meta-data below.\n             This is used by the Flutter tool to generate GeneratedPluginRegistrant.java -->\n        <meta-data\n            android:name=\"flutterEmbedding\"\n            android:value=\"2\" />\n    </application>\n</manifest>\n"
  },
  {
    "path": "example/android/app/src/main/kotlin/xyz/zhzh/flutter_hand_tracking_plugin_example/MainActivity.kt",
    "content": "package xyz.zhzh.flutter_hand_tracking_plugin_example\n\nimport androidx.annotation.NonNull\nimport android.util.Log // 导入 Android 日志工具\n\nimport io.flutter.embedding.android.FlutterActivity\nimport io.flutter.embedding.engine.FlutterEngine\nimport io.flutter.plugins.GeneratedPluginRegistrant\n\nclass MainActivity: FlutterActivity() {\n    // 当 Activity 启动时，Flutter 将会配置 FlutterEngine\n    override fun configureFlutterEngine(@NonNull flutterEngine: FlutterEngine) {\n        // 使用 GeneratedPluginRegistrant 注册所有 Flutter 插件\n        GeneratedPluginRegistrant.registerWith(flutterEngine)\n\n        // 添加日志打印，以便在配置 FlutterEngine 时进行调试\n        Log.d(\"MainActivity\", \"FlutterEngine 已配置\")\n    }\n}\n"
  },
  {
    "path": "example/android/app/src/main/res/drawable/launch_background.xml",
    "content": "<?xml version=\"1.0\" encoding=\"utf-8\"?>\n<!-- Modify this file to customize your launch splash screen -->\n<layer-list xmlns:android=\"http://schemas.android.com/apk/res/android\">\n    <item android:drawable=\"@android:color/white\" />\n\n    <!-- You can insert your own image assets here -->\n    <!-- <item>\n        <bitmap\n            android:gravity=\"center\"\n            android:src=\"@mipmap/launch_image\" />\n    </item> -->\n</layer-list>\n"
  },
  {
    "path": "example/android/app/src/main/res/values/styles.xml",
    "content": "<?xml version=\"1.0\" encoding=\"utf-8\"?>\n<resources>\n    <style name=\"LaunchTheme\" parent=\"@android:style/Theme.Black.NoTitleBar\">\n        <!-- Show a splash screen on the activity. Automatically removed when\n             Flutter draws its first frame -->\n        <item name=\"android:windowBackground\">@drawable/launch_background</item>\n    </style>\n</resources>\n"
  },
  {
    "path": "example/android/app/src/profile/AndroidManifest.xml",
    "content": "<manifest xmlns:android=\"http://schemas.android.com/apk/res/android\"\n    package=\"xyz.zhzh.flutter_hand_tracking_plugin_example\">\n    <!-- Flutter needs it to communicate with the running application\n         to allow setting breakpoints, to provide hot reload, etc.\n    -->\n    <uses-permission android:name=\"android.permission.INTERNET\"/>\n</manifest>\n"
  },
  {
    "path": "example/android/build.gradle",
    "content": "buildscript {\n    ext.kotlin_version = '1.3.50'\n    repositories {\n        google()\n        jcenter()\n    }\n\n    dependencies {\n        classpath 'com.android.tools.build:gradle:3.6.0'\n        classpath \"org.jetbrains.kotlin:kotlin-gradle-plugin:$kotlin_version\"\n    }\n}\n\nallprojects {\n    repositories {\n        google()\n        jcenter()\n    }\n}\n\nrootProject.buildDir = '../build'\nsubprojects {\n    project.buildDir = \"${rootProject.buildDir}/${project.name}\"\n}\nsubprojects {\n    project.evaluationDependsOn(':app')\n}\n\ntask clean(type: Delete) {\n    delete rootProject.buildDir\n}\n"
  },
  {
    "path": "example/android/gradle/wrapper/gradle-wrapper.properties",
    "content": "#Sun Mar 01 12:17:42 CST 2020\ndistributionBase=GRADLE_USER_HOME\ndistributionPath=wrapper/dists\nzipStoreBase=GRADLE_USER_HOME\nzipStorePath=wrapper/dists\ndistributionUrl=https\\://services.gradle.org/distributions/gradle-5.6.4-all.zip\n"
  },
  {
    "path": "example/android/gradle.properties",
    "content": "org.gradle.jvmargs=-Xmx1536M\nandroid.enableR8=true\nandroid.useAndroidX=true\nandroid.enableJetifier=true\n"
  },
  {
    "path": "example/android/settings.gradle",
    "content": "include ':app'\n\ndef flutterProjectRoot = rootProject.projectDir.parentFile.toPath()\n\ndef plugins = new Properties()\ndef pluginsFile = new File(flutterProjectRoot.toFile(), '.flutter-plugins')\nif (pluginsFile.exists()) {\n    pluginsFile.withReader('UTF-8') { reader -> plugins.load(reader) }\n}\n\nplugins.each { name, path ->\n    def pluginDirectory = flutterProjectRoot.resolve(path).resolve('android').toFile()\n    include \":$name\"\n    project(\":$name\").projectDir = pluginDirectory\n}\n"
  },
  {
    "path": "example/ios/.gitignore",
    "content": "*.mode1v3\n*.mode2v3\n*.moved-aside\n*.pbxuser\n*.perspectivev3\n**/*sync/\n.sconsign.dblite\n.tags*\n**/.vagrant/\n**/DerivedData/\nIcon?\n**/Pods/\n**/.symlinks/\nprofile\nxcuserdata\n**/.generated/\nFlutter/App.framework\nFlutter/Flutter.framework\nFlutter/Flutter.podspec\nFlutter/Generated.xcconfig\nFlutter/app.flx\nFlutter/app.zip\nFlutter/flutter_assets/\nFlutter/flutter_export_environment.sh\nServiceDefinitions.json\nRunner/GeneratedPluginRegistrant.*\n\n# Exceptions to above rules.\n!default.mode1v3\n!default.mode2v3\n!default.pbxuser\n!default.perspectivev3\n"
  },
  {
    "path": "example/ios/Flutter/AppFrameworkInfo.plist",
    "content": "<?xml version=\"1.0\" encoding=\"UTF-8\"?>\n<!DOCTYPE plist PUBLIC \"-//Apple//DTD PLIST 1.0//EN\" \"http://www.apple.com/DTDs/PropertyList-1.0.dtd\">\n<plist version=\"1.0\">\n<dict>\n  <key>CFBundleDevelopmentRegion</key>\n  <string>$(DEVELOPMENT_LANGUAGE)</string>\n  <key>CFBundleExecutable</key>\n  <string>App</string>\n  <key>CFBundleIdentifier</key>\n  <string>io.flutter.flutter.app</string>\n  <key>CFBundleInfoDictionaryVersion</key>\n  <string>6.0</string>\n  <key>CFBundleName</key>\n  <string>App</string>\n  <key>CFBundlePackageType</key>\n  <string>FMWK</string>\n  <key>CFBundleShortVersionString</key>\n  <string>1.0</string>\n  <key>CFBundleSignature</key>\n  <string>????</string>\n  <key>CFBundleVersion</key>\n  <string>1.0</string>\n  <key>MinimumOSVersion</key>\n  <string>8.0</string>\n</dict>\n</plist>\n"
  },
  {
    "path": "example/ios/Flutter/Debug.xcconfig",
    "content": "#include \"Generated.xcconfig\"\n"
  },
  {
    "path": "example/ios/Flutter/Release.xcconfig",
    "content": "#include \"Generated.xcconfig\"\n"
  },
  {
    "path": "example/ios/Runner/AppDelegate.h",
    "content": "#import <Flutter/Flutter.h>\n#import <UIKit/UIKit.h>\n\n@interface AppDelegate : FlutterAppDelegate\n\n@end\n"
  },
  {
    "path": "example/ios/Runner/AppDelegate.m",
    "content": "#import \"AppDelegate.h\"\n#import \"GeneratedPluginRegistrant.h\"\n\n@implementation AppDelegate\n\n- (BOOL)application:(UIApplication *)application\n    didFinishLaunchingWithOptions:(NSDictionary *)launchOptions {\n  [GeneratedPluginRegistrant registerWithRegistry:self];\n  // Override point for customization after application launch.\n  return [super application:application didFinishLaunchingWithOptions:launchOptions];\n}\n\n@end\n"
  },
  {
    "path": "example/ios/Runner/Assets.xcassets/AppIcon.appiconset/Contents.json",
    "content": "{\n  \"images\" : [\n    {\n      \"size\" : \"20x20\",\n      \"idiom\" : \"iphone\",\n      \"filename\" : \"Icon-App-20x20@2x.png\",\n      \"scale\" : \"2x\"\n    },\n    {\n      \"size\" : \"20x20\",\n      \"idiom\" : \"iphone\",\n      \"filename\" : \"Icon-App-20x20@3x.png\",\n      \"scale\" : \"3x\"\n    },\n    {\n      \"size\" : \"29x29\",\n      \"idiom\" : \"iphone\",\n      \"filename\" : \"Icon-App-29x29@1x.png\",\n      \"scale\" : \"1x\"\n    },\n    {\n      \"size\" : \"29x29\",\n      \"idiom\" : \"iphone\",\n      \"filename\" : \"Icon-App-29x29@2x.png\",\n      \"scale\" : \"2x\"\n    },\n    {\n      \"size\" : \"29x29\",\n      \"idiom\" : \"iphone\",\n      \"filename\" : \"Icon-App-29x29@3x.png\",\n      \"scale\" : \"3x\"\n    },\n    {\n      \"size\" : \"40x40\",\n      \"idiom\" : \"iphone\",\n      \"filename\" : \"Icon-App-40x40@2x.png\",\n      \"scale\" : \"2x\"\n    },\n    {\n      \"size\" : \"40x40\",\n      \"idiom\" : \"iphone\",\n      \"filename\" : \"Icon-App-40x40@3x.png\",\n      \"scale\" : \"3x\"\n    },\n    {\n      \"size\" : \"60x60\",\n      \"idiom\" : \"iphone\",\n      \"filename\" : \"Icon-App-60x60@2x.png\",\n      \"scale\" : \"2x\"\n    },\n    {\n      \"size\" : \"60x60\",\n      \"idiom\" : \"iphone\",\n      \"filename\" : \"Icon-App-60x60@3x.png\",\n      \"scale\" : \"3x\"\n    },\n    {\n      \"size\" : \"20x20\",\n      \"idiom\" : \"ipad\",\n      \"filename\" : \"Icon-App-20x20@1x.png\",\n      \"scale\" : \"1x\"\n    },\n    {\n      \"size\" : \"20x20\",\n      \"idiom\" : \"ipad\",\n      \"filename\" : \"Icon-App-20x20@2x.png\",\n      \"scale\" : \"2x\"\n    },\n    {\n      \"size\" : \"29x29\",\n      \"idiom\" : \"ipad\",\n      \"filename\" : \"Icon-App-29x29@1x.png\",\n      \"scale\" : \"1x\"\n    },\n    {\n      \"size\" : \"29x29\",\n      \"idiom\" : \"ipad\",\n      \"filename\" : \"Icon-App-29x29@2x.png\",\n      \"scale\" : \"2x\"\n    },\n    {\n      
\"size\" : \"40x40\",\n      \"idiom\" : \"ipad\",\n      \"filename\" : \"Icon-App-40x40@1x.png\",\n      \"scale\" : \"1x\"\n    },\n    {\n      \"size\" : \"40x40\",\n      \"idiom\" : \"ipad\",\n      \"filename\" : \"Icon-App-40x40@2x.png\",\n      \"scale\" : \"2x\"\n    },\n    {\n      \"size\" : \"76x76\",\n      \"idiom\" : \"ipad\",\n      \"filename\" : \"Icon-App-76x76@1x.png\",\n      \"scale\" : \"1x\"\n    },\n    {\n      \"size\" : \"76x76\",\n      \"idiom\" : \"ipad\",\n      \"filename\" : \"Icon-App-76x76@2x.png\",\n      \"scale\" : \"2x\"\n    },\n    {\n      \"size\" : \"83.5x83.5\",\n      \"idiom\" : \"ipad\",\n      \"filename\" : \"Icon-App-83.5x83.5@2x.png\",\n      \"scale\" : \"2x\"\n    },\n    {\n      \"size\" : \"1024x1024\",\n      \"idiom\" : \"ios-marketing\",\n      \"filename\" : \"Icon-App-1024x1024@1x.png\",\n      \"scale\" : \"1x\"\n    }\n  ],\n  \"info\" : {\n    \"version\" : 1,\n    \"author\" : \"xcode\"\n  }\n}\n"
  },
  {
    "path": "example/ios/Runner/Assets.xcassets/LaunchImage.imageset/Contents.json",
    "content": "{\n  \"images\" : [\n    {\n      \"idiom\" : \"universal\",\n      \"filename\" : \"LaunchImage.png\",\n      \"scale\" : \"1x\"\n    },\n    {\n      \"idiom\" : \"universal\",\n      \"filename\" : \"LaunchImage@2x.png\",\n      \"scale\" : \"2x\"\n    },\n    {\n      \"idiom\" : \"universal\",\n      \"filename\" : \"LaunchImage@3x.png\",\n      \"scale\" : \"3x\"\n    }\n  ],\n  \"info\" : {\n    \"version\" : 1,\n    \"author\" : \"xcode\"\n  }\n}\n"
  },
  {
    "path": "example/ios/Runner/Assets.xcassets/LaunchImage.imageset/README.md",
    "content": "# Launch Screen Assets\n\nYou can customize the launch screen with your own desired assets by replacing the image files in this directory.\n\nYou can also do it by opening your Flutter project's Xcode project with `open ios/Runner.xcworkspace`, selecting `Runner/Assets.xcassets` in the Project Navigator and dropping in the desired images."
  },
  {
    "path": "example/ios/Runner/Base.lproj/LaunchScreen.storyboard",
    "content": "<?xml version=\"1.0\" encoding=\"UTF-8\" standalone=\"no\"?>\n<document type=\"com.apple.InterfaceBuilder3.CocoaTouch.Storyboard.XIB\" version=\"3.0\" toolsVersion=\"12121\" systemVersion=\"16G29\" targetRuntime=\"iOS.CocoaTouch\" propertyAccessControl=\"none\" useAutolayout=\"YES\" launchScreen=\"YES\" colorMatched=\"YES\" initialViewController=\"01J-lp-oVM\">\n    <dependencies>\n        <deployment identifier=\"iOS\"/>\n        <plugIn identifier=\"com.apple.InterfaceBuilder.IBCocoaTouchPlugin\" version=\"12089\"/>\n    </dependencies>\n    <scenes>\n        <!--View Controller-->\n        <scene sceneID=\"EHf-IW-A2E\">\n            <objects>\n                <viewController id=\"01J-lp-oVM\" sceneMemberID=\"viewController\">\n                    <layoutGuides>\n                        <viewControllerLayoutGuide type=\"top\" id=\"Ydg-fD-yQy\"/>\n                        <viewControllerLayoutGuide type=\"bottom\" id=\"xbc-2k-c8Z\"/>\n                    </layoutGuides>\n                    <view key=\"view\" contentMode=\"scaleToFill\" id=\"Ze5-6b-2t3\">\n                        <autoresizingMask key=\"autoresizingMask\" widthSizable=\"YES\" heightSizable=\"YES\"/>\n                        <subviews>\n                            <imageView opaque=\"NO\" clipsSubviews=\"YES\" multipleTouchEnabled=\"YES\" contentMode=\"center\" image=\"LaunchImage\" translatesAutoresizingMaskIntoConstraints=\"NO\" id=\"YRO-k0-Ey4\">\n                            </imageView>\n                        </subviews>\n                        <color key=\"backgroundColor\" red=\"1\" green=\"1\" blue=\"1\" alpha=\"1\" colorSpace=\"custom\" customColorSpace=\"sRGB\"/>\n                        <constraints>\n                            <constraint firstItem=\"YRO-k0-Ey4\" firstAttribute=\"centerX\" secondItem=\"Ze5-6b-2t3\" secondAttribute=\"centerX\" id=\"1a2-6s-vTC\"/>\n                            <constraint firstItem=\"YRO-k0-Ey4\" firstAttribute=\"centerY\" 
secondItem=\"Ze5-6b-2t3\" secondAttribute=\"centerY\" id=\"4X2-HB-R7a\"/>\n                        </constraints>\n                    </view>\n                </viewController>\n                <placeholder placeholderIdentifier=\"IBFirstResponder\" id=\"iYj-Kq-Ea1\" userLabel=\"First Responder\" sceneMemberID=\"firstResponder\"/>\n            </objects>\n            <point key=\"canvasLocation\" x=\"53\" y=\"375\"/>\n        </scene>\n    </scenes>\n    <resources>\n        <image name=\"LaunchImage\" width=\"168\" height=\"185\"/>\n    </resources>\n</document>\n"
  },
  {
    "path": "example/ios/Runner/Base.lproj/Main.storyboard",
    "content": "<?xml version=\"1.0\" encoding=\"UTF-8\" standalone=\"no\"?>\n<document type=\"com.apple.InterfaceBuilder3.CocoaTouch.Storyboard.XIB\" version=\"3.0\" toolsVersion=\"10117\" systemVersion=\"15F34\" targetRuntime=\"iOS.CocoaTouch\" propertyAccessControl=\"none\" useAutolayout=\"YES\" useTraitCollections=\"YES\" initialViewController=\"BYZ-38-t0r\">\n    <dependencies>\n        <deployment identifier=\"iOS\"/>\n        <plugIn identifier=\"com.apple.InterfaceBuilder.IBCocoaTouchPlugin\" version=\"10085\"/>\n    </dependencies>\n    <scenes>\n        <!--Flutter View Controller-->\n        <scene sceneID=\"tne-QT-ifu\">\n            <objects>\n                <viewController id=\"BYZ-38-t0r\" customClass=\"FlutterViewController\" sceneMemberID=\"viewController\">\n                    <layoutGuides>\n                        <viewControllerLayoutGuide type=\"top\" id=\"y3c-jy-aDJ\"/>\n                        <viewControllerLayoutGuide type=\"bottom\" id=\"wfy-db-euE\"/>\n                    </layoutGuides>\n                    <view key=\"view\" contentMode=\"scaleToFill\" id=\"8bC-Xf-vdC\">\n                        <rect key=\"frame\" x=\"0.0\" y=\"0.0\" width=\"600\" height=\"600\"/>\n                        <autoresizingMask key=\"autoresizingMask\" widthSizable=\"YES\" heightSizable=\"YES\"/>\n                        <color key=\"backgroundColor\" white=\"1\" alpha=\"1\" colorSpace=\"custom\" customColorSpace=\"calibratedWhite\"/>\n                    </view>\n                </viewController>\n                <placeholder placeholderIdentifier=\"IBFirstResponder\" id=\"dkx-z0-nzr\" sceneMemberID=\"firstResponder\"/>\n            </objects>\n        </scene>\n    </scenes>\n</document>\n"
  },
  {
    "path": "example/ios/Runner/Info.plist",
    "content": "<?xml version=\"1.0\" encoding=\"UTF-8\"?>\n<!DOCTYPE plist PUBLIC \"-//Apple//DTD PLIST 1.0//EN\" \"http://www.apple.com/DTDs/PropertyList-1.0.dtd\">\n<plist version=\"1.0\">\n<dict>\n\t<key>CFBundleDevelopmentRegion</key>\n\t<string>$(DEVELOPMENT_LANGUAGE)</string>\n\t<key>CFBundleExecutable</key>\n\t<string>$(EXECUTABLE_NAME)</string>\n\t<key>CFBundleIdentifier</key>\n\t<string>$(PRODUCT_BUNDLE_IDENTIFIER)</string>\n\t<key>CFBundleInfoDictionaryVersion</key>\n\t<string>6.0</string>\n\t<key>CFBundleName</key>\n\t<string>flutter_hand_tracking_plugin_example</string>\n\t<key>CFBundlePackageType</key>\n\t<string>APPL</string>\n\t<key>CFBundleShortVersionString</key>\n\t<string>$(FLUTTER_BUILD_NAME)</string>\n\t<key>CFBundleSignature</key>\n\t<string>????</string>\n\t<key>CFBundleVersion</key>\n\t<string>$(FLUTTER_BUILD_NUMBER)</string>\n\t<key>LSRequiresIPhoneOS</key>\n\t<true/>\n\t<key>UILaunchStoryboardName</key>\n\t<string>LaunchScreen</string>\n\t<key>UIMainStoryboardFile</key>\n\t<string>Main</string>\n\t<key>UISupportedInterfaceOrientations</key>\n\t<array>\n\t\t<string>UIInterfaceOrientationPortrait</string>\n\t\t<string>UIInterfaceOrientationLandscapeLeft</string>\n\t\t<string>UIInterfaceOrientationLandscapeRight</string>\n\t</array>\n\t<key>UISupportedInterfaceOrientations~ipad</key>\n\t<array>\n\t\t<string>UIInterfaceOrientationPortrait</string>\n\t\t<string>UIInterfaceOrientationPortraitUpsideDown</string>\n\t\t<string>UIInterfaceOrientationLandscapeLeft</string>\n\t\t<string>UIInterfaceOrientationLandscapeRight</string>\n\t</array>\n\t<key>UIViewControllerBasedStatusBarAppearance</key>\n\t<false/>\n</dict>\n</plist>\n"
  },
  {
    "path": "example/ios/Runner/main.m",
    "content": "#import <Flutter/Flutter.h>\n#import <UIKit/UIKit.h>\n#import \"AppDelegate.h\"\n\nint main(int argc, char* argv[]) {\n  @autoreleasepool {\n    return UIApplicationMain(argc, argv, nil, NSStringFromClass([AppDelegate class]));\n  }\n}\n"
  },
  {
    "path": "example/ios/Runner.xcodeproj/project.pbxproj",
    "content": "// !$*UTF8*$!\n{\n\tarchiveVersion = 1;\n\tclasses = {\n\t};\n\tobjectVersion = 46;\n\tobjects = {\n\n/* Begin PBXBuildFile section */\n\t\t1498D2341E8E89220040F4C2 /* GeneratedPluginRegistrant.m in Sources */ = {isa = PBXBuildFile; fileRef = 1498D2331E8E89220040F4C2 /* GeneratedPluginRegistrant.m */; };\n\t\t3B3967161E833CAA004F5970 /* AppFrameworkInfo.plist in Resources */ = {isa = PBXBuildFile; fileRef = 3B3967151E833CAA004F5970 /* AppFrameworkInfo.plist */; };\n\t\t3B80C3941E831B6300D905FE /* App.framework in Frameworks */ = {isa = PBXBuildFile; fileRef = 3B80C3931E831B6300D905FE /* App.framework */; };\n\t\t3B80C3951E831B6300D905FE /* App.framework in Embed Frameworks */ = {isa = PBXBuildFile; fileRef = 3B80C3931E831B6300D905FE /* App.framework */; settings = {ATTRIBUTES = (CodeSignOnCopy, RemoveHeadersOnCopy, ); }; };\n\t\t9705A1C61CF904A100538489 /* Flutter.framework in Frameworks */ = {isa = PBXBuildFile; fileRef = 9740EEBA1CF902C7004384FC /* Flutter.framework */; };\n\t\t9705A1C71CF904A300538489 /* Flutter.framework in Embed Frameworks */ = {isa = PBXBuildFile; fileRef = 9740EEBA1CF902C7004384FC /* Flutter.framework */; settings = {ATTRIBUTES = (CodeSignOnCopy, RemoveHeadersOnCopy, ); }; };\n\t\t978B8F6F1D3862AE00F588F7 /* AppDelegate.m in Sources */ = {isa = PBXBuildFile; fileRef = 7AFFD8EE1D35381100E5BB4D /* AppDelegate.m */; };\n\t\t97C146F31CF9000F007C117D /* main.m in Sources */ = {isa = PBXBuildFile; fileRef = 97C146F21CF9000F007C117D /* main.m */; };\n\t\t97C146FC1CF9000F007C117D /* Main.storyboard in Resources */ = {isa = PBXBuildFile; fileRef = 97C146FA1CF9000F007C117D /* Main.storyboard */; };\n\t\t97C146FE1CF9000F007C117D /* Assets.xcassets in Resources */ = {isa = PBXBuildFile; fileRef = 97C146FD1CF9000F007C117D /* Assets.xcassets */; };\n\t\t97C147011CF9000F007C117D /* LaunchScreen.storyboard in Resources */ = {isa = PBXBuildFile; fileRef = 97C146FF1CF9000F007C117D /* LaunchScreen.storyboard */; };\n/* End PBXBuildFile section 
*/\n\n/* Begin PBXCopyFilesBuildPhase section */\n\t\t9705A1C41CF9048500538489 /* Embed Frameworks */ = {\n\t\t\tisa = PBXCopyFilesBuildPhase;\n\t\t\tbuildActionMask = 2147483647;\n\t\t\tdstPath = \"\";\n\t\t\tdstSubfolderSpec = 10;\n\t\t\tfiles = (\n\t\t\t\t3B80C3951E831B6300D905FE /* App.framework in Embed Frameworks */,\n\t\t\t\t9705A1C71CF904A300538489 /* Flutter.framework in Embed Frameworks */,\n\t\t\t);\n\t\t\tname = \"Embed Frameworks\";\n\t\t\trunOnlyForDeploymentPostprocessing = 0;\n\t\t};\n/* End PBXCopyFilesBuildPhase section */\n\n/* Begin PBXFileReference section */\n\t\t1498D2321E8E86230040F4C2 /* GeneratedPluginRegistrant.h */ = {isa = PBXFileReference; lastKnownFileType = sourcecode.c.h; path = GeneratedPluginRegistrant.h; sourceTree = \"<group>\"; };\n\t\t1498D2331E8E89220040F4C2 /* GeneratedPluginRegistrant.m */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.objc; path = GeneratedPluginRegistrant.m; sourceTree = \"<group>\"; };\n\t\t3B3967151E833CAA004F5970 /* AppFrameworkInfo.plist */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = text.plist.xml; name = AppFrameworkInfo.plist; path = Flutter/AppFrameworkInfo.plist; sourceTree = \"<group>\"; };\n\t\t3B80C3931E831B6300D905FE /* App.framework */ = {isa = PBXFileReference; lastKnownFileType = wrapper.framework; name = App.framework; path = Flutter/App.framework; sourceTree = \"<group>\"; };\n\t\t7AFA3C8E1D35360C0083082E /* Release.xcconfig */ = {isa = PBXFileReference; lastKnownFileType = text.xcconfig; name = Release.xcconfig; path = Flutter/Release.xcconfig; sourceTree = \"<group>\"; };\n\t\t7AFFD8ED1D35381100E5BB4D /* AppDelegate.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = AppDelegate.h; sourceTree = \"<group>\"; };\n\t\t7AFFD8EE1D35381100E5BB4D /* AppDelegate.m */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.objc; path = AppDelegate.m; sourceTree = \"<group>\"; 
};\n\t\t9740EEB21CF90195004384FC /* Debug.xcconfig */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = text.xcconfig; name = Debug.xcconfig; path = Flutter/Debug.xcconfig; sourceTree = \"<group>\"; };\n\t\t9740EEB31CF90195004384FC /* Generated.xcconfig */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = text.xcconfig; name = Generated.xcconfig; path = Flutter/Generated.xcconfig; sourceTree = \"<group>\"; };\n\t\t9740EEBA1CF902C7004384FC /* Flutter.framework */ = {isa = PBXFileReference; lastKnownFileType = wrapper.framework; name = Flutter.framework; path = Flutter/Flutter.framework; sourceTree = \"<group>\"; };\n\t\t97C146EE1CF9000F007C117D /* Runner.app */ = {isa = PBXFileReference; explicitFileType = wrapper.application; includeInIndex = 0; path = Runner.app; sourceTree = BUILT_PRODUCTS_DIR; };\n\t\t97C146F21CF9000F007C117D /* main.m */ = {isa = PBXFileReference; lastKnownFileType = sourcecode.c.objc; path = main.m; sourceTree = \"<group>\"; };\n\t\t97C146FB1CF9000F007C117D /* Base */ = {isa = PBXFileReference; lastKnownFileType = file.storyboard; name = Base; path = Base.lproj/Main.storyboard; sourceTree = \"<group>\"; };\n\t\t97C146FD1CF9000F007C117D /* Assets.xcassets */ = {isa = PBXFileReference; lastKnownFileType = folder.assetcatalog; path = Assets.xcassets; sourceTree = \"<group>\"; };\n\t\t97C147001CF9000F007C117D /* Base */ = {isa = PBXFileReference; lastKnownFileType = file.storyboard; name = Base; path = Base.lproj/LaunchScreen.storyboard; sourceTree = \"<group>\"; };\n\t\t97C147021CF9000F007C117D /* Info.plist */ = {isa = PBXFileReference; lastKnownFileType = text.plist.xml; path = Info.plist; sourceTree = \"<group>\"; };\n/* End PBXFileReference section */\n\n/* Begin PBXFrameworksBuildPhase section */\n\t\t97C146EB1CF9000F007C117D /* Frameworks */ = {\n\t\t\tisa = PBXFrameworksBuildPhase;\n\t\t\tbuildActionMask = 2147483647;\n\t\t\tfiles = (\n\t\t\t\t9705A1C61CF904A100538489 /* Flutter.framework in Frameworks 
*/,\n\t\t\t\t3B80C3941E831B6300D905FE /* App.framework in Frameworks */,\n\t\t\t);\n\t\t\trunOnlyForDeploymentPostprocessing = 0;\n\t\t};\n/* End PBXFrameworksBuildPhase section */\n\n/* Begin PBXGroup section */\n\t\t9740EEB11CF90186004384FC /* Flutter */ = {\n\t\t\tisa = PBXGroup;\n\t\t\tchildren = (\n\t\t\t\t3B80C3931E831B6300D905FE /* App.framework */,\n\t\t\t\t3B3967151E833CAA004F5970 /* AppFrameworkInfo.plist */,\n\t\t\t\t9740EEBA1CF902C7004384FC /* Flutter.framework */,\n\t\t\t\t9740EEB21CF90195004384FC /* Debug.xcconfig */,\n\t\t\t\t7AFA3C8E1D35360C0083082E /* Release.xcconfig */,\n\t\t\t\t9740EEB31CF90195004384FC /* Generated.xcconfig */,\n\t\t\t);\n\t\t\tname = Flutter;\n\t\t\tsourceTree = \"<group>\";\n\t\t};\n\t\t97C146E51CF9000F007C117D = {\n\t\t\tisa = PBXGroup;\n\t\t\tchildren = (\n\t\t\t\t9740EEB11CF90186004384FC /* Flutter */,\n\t\t\t\t97C146F01CF9000F007C117D /* Runner */,\n\t\t\t\t97C146EF1CF9000F007C117D /* Products */,\n\t\t\t\tCF3B75C9A7D2FA2A4C99F110 /* Frameworks */,\n\t\t\t);\n\t\t\tsourceTree = \"<group>\";\n\t\t};\n\t\t97C146EF1CF9000F007C117D /* Products */ = {\n\t\t\tisa = PBXGroup;\n\t\t\tchildren = (\n\t\t\t\t97C146EE1CF9000F007C117D /* Runner.app */,\n\t\t\t);\n\t\t\tname = Products;\n\t\t\tsourceTree = \"<group>\";\n\t\t};\n\t\t97C146F01CF9000F007C117D /* Runner */ = {\n\t\t\tisa = PBXGroup;\n\t\t\tchildren = (\n\t\t\t\t7AFFD8ED1D35381100E5BB4D /* AppDelegate.h */,\n\t\t\t\t7AFFD8EE1D35381100E5BB4D /* AppDelegate.m */,\n\t\t\t\t97C146FA1CF9000F007C117D /* Main.storyboard */,\n\t\t\t\t97C146FD1CF9000F007C117D /* Assets.xcassets */,\n\t\t\t\t97C146FF1CF9000F007C117D /* LaunchScreen.storyboard */,\n\t\t\t\t97C147021CF9000F007C117D /* Info.plist */,\n\t\t\t\t97C146F11CF9000F007C117D /* Supporting Files */,\n\t\t\t\t1498D2321E8E86230040F4C2 /* GeneratedPluginRegistrant.h */,\n\t\t\t\t1498D2331E8E89220040F4C2 /* GeneratedPluginRegistrant.m */,\n\t\t\t);\n\t\t\tpath = Runner;\n\t\t\tsourceTree = 
\"<group>\";\n\t\t};\n\t\t97C146F11CF9000F007C117D /* Supporting Files */ = {\n\t\t\tisa = PBXGroup;\n\t\t\tchildren = (\n\t\t\t\t97C146F21CF9000F007C117D /* main.m */,\n\t\t\t);\n\t\t\tname = \"Supporting Files\";\n\t\t\tsourceTree = \"<group>\";\n\t\t};\n/* End PBXGroup section */\n\n/* Begin PBXNativeTarget section */\n\t\t97C146ED1CF9000F007C117D /* Runner */ = {\n\t\t\tisa = PBXNativeTarget;\n\t\t\tbuildConfigurationList = 97C147051CF9000F007C117D /* Build configuration list for PBXNativeTarget \"Runner\" */;\n\t\t\tbuildPhases = (\n\t\t\t\t9740EEB61CF901F6004384FC /* Run Script */,\n\t\t\t\t97C146EA1CF9000F007C117D /* Sources */,\n\t\t\t\t97C146EB1CF9000F007C117D /* Frameworks */,\n\t\t\t\t97C146EC1CF9000F007C117D /* Resources */,\n\t\t\t\t9705A1C41CF9048500538489 /* Embed Frameworks */,\n\t\t\t\t3B06AD1E1E4923F5004D2608 /* Thin Binary */,\n\t\t\t);\n\t\t\tbuildRules = (\n\t\t\t);\n\t\t\tdependencies = (\n\t\t\t);\n\t\t\tname = Runner;\n\t\t\tproductName = Runner;\n\t\t\tproductReference = 97C146EE1CF9000F007C117D /* Runner.app */;\n\t\t\tproductType = \"com.apple.product-type.application\";\n\t\t};\n/* End PBXNativeTarget section */\n\n/* Begin PBXProject section */\n\t\t97C146E61CF9000F007C117D /* Project object */ = {\n\t\t\tisa = PBXProject;\n\t\t\tattributes = {\n\t\t\t\tLastUpgradeCheck = 1020;\n\t\t\t\tORGANIZATIONNAME = \"The Chromium Authors\";\n\t\t\t\tTargetAttributes = {\n\t\t\t\t\t97C146ED1CF9000F007C117D = {\n\t\t\t\t\t\tCreatedOnToolsVersion = 7.3.1;\n\t\t\t\t\t};\n\t\t\t\t};\n\t\t\t};\n\t\t\tbuildConfigurationList = 97C146E91CF9000F007C117D /* Build configuration list for PBXProject \"Runner\" */;\n\t\t\tcompatibilityVersion = \"Xcode 3.2\";\n\t\t\tdevelopmentRegion = en;\n\t\t\thasScannedForEncodings = 0;\n\t\t\tknownRegions = (\n\t\t\t\ten,\n\t\t\t\tBase,\n\t\t\t);\n\t\t\tmainGroup = 97C146E51CF9000F007C117D;\n\t\t\tproductRefGroup = 97C146EF1CF9000F007C117D /* Products */;\n\t\t\tprojectDirPath = \"\";\n\t\t\tprojectRoot = 
\"\";\n\t\t\ttargets = (\n\t\t\t\t97C146ED1CF9000F007C117D /* Runner */,\n\t\t\t);\n\t\t};\n/* End PBXProject section */\n\n/* Begin PBXResourcesBuildPhase section */\n\t\t97C146EC1CF9000F007C117D /* Resources */ = {\n\t\t\tisa = PBXResourcesBuildPhase;\n\t\t\tbuildActionMask = 2147483647;\n\t\t\tfiles = (\n\t\t\t\t97C147011CF9000F007C117D /* LaunchScreen.storyboard in Resources */,\n\t\t\t\t3B3967161E833CAA004F5970 /* AppFrameworkInfo.plist in Resources */,\n\t\t\t\t97C146FE1CF9000F007C117D /* Assets.xcassets in Resources */,\n\t\t\t\t97C146FC1CF9000F007C117D /* Main.storyboard in Resources */,\n\t\t\t);\n\t\t\trunOnlyForDeploymentPostprocessing = 0;\n\t\t};\n/* End PBXResourcesBuildPhase section */\n\n/* Begin PBXShellScriptBuildPhase section */\n\t\t3B06AD1E1E4923F5004D2608 /* Thin Binary */ = {\n\t\t\tisa = PBXShellScriptBuildPhase;\n\t\t\tbuildActionMask = 2147483647;\n\t\t\tfiles = (\n\t\t\t);\n\t\t\tinputPaths = (\n\t\t\t);\n\t\t\tname = \"Thin Binary\";\n\t\t\toutputPaths = (\n\t\t\t);\n\t\t\trunOnlyForDeploymentPostprocessing = 0;\n\t\t\tshellPath = /bin/sh;\n\t\t\tshellScript = \"/bin/sh \\\"$FLUTTER_ROOT/packages/flutter_tools/bin/xcode_backend.sh\\\" thin\";\n\t\t};\n\t\t9740EEB61CF901F6004384FC /* Run Script */ = {\n\t\t\tisa = PBXShellScriptBuildPhase;\n\t\t\tbuildActionMask = 2147483647;\n\t\t\tfiles = (\n\t\t\t);\n\t\t\tinputPaths = (\n\t\t\t);\n\t\t\tname = \"Run Script\";\n\t\t\toutputPaths = (\n\t\t\t);\n\t\t\trunOnlyForDeploymentPostprocessing = 0;\n\t\t\tshellPath = /bin/sh;\n\t\t\tshellScript = \"/bin/sh \\\"$FLUTTER_ROOT/packages/flutter_tools/bin/xcode_backend.sh\\\" build\";\n\t\t};\n/* End PBXShellScriptBuildPhase section */\n\n/* Begin PBXSourcesBuildPhase section */\n\t\t97C146EA1CF9000F007C117D /* Sources */ = {\n\t\t\tisa = PBXSourcesBuildPhase;\n\t\t\tbuildActionMask = 2147483647;\n\t\t\tfiles = (\n\t\t\t\t978B8F6F1D3862AE00F588F7 /* AppDelegate.m in Sources */,\n\t\t\t\t97C146F31CF9000F007C117D /* main.m in Sources 
*/,\n\t\t\t\t1498D2341E8E89220040F4C2 /* GeneratedPluginRegistrant.m in Sources */,\n\t\t\t);\n\t\t\trunOnlyForDeploymentPostprocessing = 0;\n\t\t};\n/* End PBXSourcesBuildPhase section */\n\n/* Begin PBXVariantGroup section */\n\t\t97C146FA1CF9000F007C117D /* Main.storyboard */ = {\n\t\t\tisa = PBXVariantGroup;\n\t\t\tchildren = (\n\t\t\t\t97C146FB1CF9000F007C117D /* Base */,\n\t\t\t);\n\t\t\tname = Main.storyboard;\n\t\t\tsourceTree = \"<group>\";\n\t\t};\n\t\t97C146FF1CF9000F007C117D /* LaunchScreen.storyboard */ = {\n\t\t\tisa = PBXVariantGroup;\n\t\t\tchildren = (\n\t\t\t\t97C147001CF9000F007C117D /* Base */,\n\t\t\t);\n\t\t\tname = LaunchScreen.storyboard;\n\t\t\tsourceTree = \"<group>\";\n\t\t};\n/* End PBXVariantGroup section */\n\n/* Begin XCBuildConfiguration section */\n\t\t249021D3217E4FDB00AE95B9 /* Profile */ = {\n\t\t\tisa = XCBuildConfiguration;\n\t\t\tbaseConfigurationReference = 7AFA3C8E1D35360C0083082E /* Release.xcconfig */;\n\t\t\tbuildSettings = {\n\t\t\t\tALWAYS_SEARCH_USER_PATHS = NO;\n\t\t\t\tCLANG_ANALYZER_NONNULL = YES;\n\t\t\t\tCLANG_CXX_LANGUAGE_STANDARD = \"gnu++0x\";\n\t\t\t\tCLANG_CXX_LIBRARY = \"libc++\";\n\t\t\t\tCLANG_ENABLE_MODULES = YES;\n\t\t\t\tCLANG_ENABLE_OBJC_ARC = YES;\n\t\t\t\tCLANG_WARN_BLOCK_CAPTURE_AUTORELEASING = YES;\n\t\t\t\tCLANG_WARN_BOOL_CONVERSION = YES;\n\t\t\t\tCLANG_WARN_COMMA = YES;\n\t\t\t\tCLANG_WARN_CONSTANT_CONVERSION = YES;\n\t\t\t\tCLANG_WARN_DEPRECATED_OBJC_IMPLEMENTATIONS = YES;\n\t\t\t\tCLANG_WARN_DIRECT_OBJC_ISA_USAGE = YES_ERROR;\n\t\t\t\tCLANG_WARN_EMPTY_BODY = YES;\n\t\t\t\tCLANG_WARN_ENUM_CONVERSION = YES;\n\t\t\t\tCLANG_WARN_INFINITE_RECURSION = YES;\n\t\t\t\tCLANG_WARN_INT_CONVERSION = YES;\n\t\t\t\tCLANG_WARN_NON_LITERAL_NULL_CONVERSION = YES;\n\t\t\t\tCLANG_WARN_OBJC_IMPLICIT_RETAIN_SELF = YES;\n\t\t\t\tCLANG_WARN_OBJC_LITERAL_CONVERSION = YES;\n\t\t\t\tCLANG_WARN_OBJC_ROOT_CLASS = YES_ERROR;\n\t\t\t\tCLANG_WARN_RANGE_LOOP_ANALYSIS = YES;\n\t\t\t\tCLANG_WARN_STRICT_PROTOTYPES = 
YES;\n\t\t\t\tCLANG_WARN_SUSPICIOUS_MOVE = YES;\n\t\t\t\tCLANG_WARN_UNREACHABLE_CODE = YES;\n\t\t\t\tCLANG_WARN__DUPLICATE_METHOD_MATCH = YES;\n\t\t\t\t\"CODE_SIGN_IDENTITY[sdk=iphoneos*]\" = \"iPhone Developer\";\n\t\t\t\tCOPY_PHASE_STRIP = NO;\n\t\t\t\tDEBUG_INFORMATION_FORMAT = \"dwarf-with-dsym\";\n\t\t\t\tENABLE_NS_ASSERTIONS = NO;\n\t\t\t\tENABLE_STRICT_OBJC_MSGSEND = YES;\n\t\t\t\tGCC_C_LANGUAGE_STANDARD = gnu99;\n\t\t\t\tGCC_NO_COMMON_BLOCKS = YES;\n\t\t\t\tGCC_WARN_64_TO_32_BIT_CONVERSION = YES;\n\t\t\t\tGCC_WARN_ABOUT_RETURN_TYPE = YES_ERROR;\n\t\t\t\tGCC_WARN_UNDECLARED_SELECTOR = YES;\n\t\t\t\tGCC_WARN_UNINITIALIZED_AUTOS = YES_AGGRESSIVE;\n\t\t\t\tGCC_WARN_UNUSED_FUNCTION = YES;\n\t\t\t\tGCC_WARN_UNUSED_VARIABLE = YES;\n\t\t\t\tIPHONEOS_DEPLOYMENT_TARGET = 8.0;\n\t\t\t\tMTL_ENABLE_DEBUG_INFO = NO;\n\t\t\t\tSDKROOT = iphoneos;\n\t\t\t\tSUPPORTED_PLATFORMS = iphoneos;\n\t\t\t\tTARGETED_DEVICE_FAMILY = \"1,2\";\n\t\t\t\tVALIDATE_PRODUCT = YES;\n\t\t\t};\n\t\t\tname = Profile;\n\t\t};\n\t\t249021D4217E4FDB00AE95B9 /* Profile */ = {\n\t\t\tisa = XCBuildConfiguration;\n\t\t\tbaseConfigurationReference = 7AFA3C8E1D35360C0083082E /* Release.xcconfig */;\n\t\t\tbuildSettings = {\n\t\t\t\tASSETCATALOG_COMPILER_APPICON_NAME = AppIcon;\n\t\t\t\tCURRENT_PROJECT_VERSION = \"$(FLUTTER_BUILD_NUMBER)\";\n\t\t\t\tENABLE_BITCODE = NO;\n\t\t\t\tFRAMEWORK_SEARCH_PATHS = (\n\t\t\t\t\t\"$(inherited)\",\n\t\t\t\t\t\"$(PROJECT_DIR)/Flutter\",\n\t\t\t\t);\n\t\t\t\tINFOPLIST_FILE = Runner/Info.plist;\n\t\t\t\tLD_RUNPATH_SEARCH_PATHS = \"$(inherited) @executable_path/Frameworks\";\n\t\t\t\tLIBRARY_SEARCH_PATHS = (\n\t\t\t\t\t\"$(inherited)\",\n\t\t\t\t\t\"$(PROJECT_DIR)/Flutter\",\n\t\t\t\t);\n\t\t\t\tPRODUCT_BUNDLE_IDENTIFIER = xyz.zhzh.flutterHandTrackingPluginExample;\n\t\t\t\tPRODUCT_NAME = \"$(TARGET_NAME)\";\n\t\t\t\tVERSIONING_SYSTEM = \"apple-generic\";\n\t\t\t};\n\t\t\tname = Profile;\n\t\t};\n\t\t97C147031CF9000F007C117D /* Debug */ = {\n\t\t\tisa = 
XCBuildConfiguration;\n\t\t\tbaseConfigurationReference = 9740EEB21CF90195004384FC /* Debug.xcconfig */;\n\t\t\tbuildSettings = {\n\t\t\t\tALWAYS_SEARCH_USER_PATHS = NO;\n\t\t\t\tCLANG_ANALYZER_NONNULL = YES;\n\t\t\t\tCLANG_CXX_LANGUAGE_STANDARD = \"gnu++0x\";\n\t\t\t\tCLANG_CXX_LIBRARY = \"libc++\";\n\t\t\t\tCLANG_ENABLE_MODULES = YES;\n\t\t\t\tCLANG_ENABLE_OBJC_ARC = YES;\n\t\t\t\tCLANG_WARN_BLOCK_CAPTURE_AUTORELEASING = YES;\n\t\t\t\tCLANG_WARN_BOOL_CONVERSION = YES;\n\t\t\t\tCLANG_WARN_COMMA = YES;\n\t\t\t\tCLANG_WARN_CONSTANT_CONVERSION = YES;\n\t\t\t\tCLANG_WARN_DEPRECATED_OBJC_IMPLEMENTATIONS = YES;\n\t\t\t\tCLANG_WARN_DIRECT_OBJC_ISA_USAGE = YES_ERROR;\n\t\t\t\tCLANG_WARN_EMPTY_BODY = YES;\n\t\t\t\tCLANG_WARN_ENUM_CONVERSION = YES;\n\t\t\t\tCLANG_WARN_INFINITE_RECURSION = YES;\n\t\t\t\tCLANG_WARN_INT_CONVERSION = YES;\n\t\t\t\tCLANG_WARN_NON_LITERAL_NULL_CONVERSION = YES;\n\t\t\t\tCLANG_WARN_OBJC_IMPLICIT_RETAIN_SELF = YES;\n\t\t\t\tCLANG_WARN_OBJC_LITERAL_CONVERSION = YES;\n\t\t\t\tCLANG_WARN_OBJC_ROOT_CLASS = YES_ERROR;\n\t\t\t\tCLANG_WARN_RANGE_LOOP_ANALYSIS = YES;\n\t\t\t\tCLANG_WARN_STRICT_PROTOTYPES = YES;\n\t\t\t\tCLANG_WARN_SUSPICIOUS_MOVE = YES;\n\t\t\t\tCLANG_WARN_UNREACHABLE_CODE = YES;\n\t\t\t\tCLANG_WARN__DUPLICATE_METHOD_MATCH = YES;\n\t\t\t\t\"CODE_SIGN_IDENTITY[sdk=iphoneos*]\" = \"iPhone Developer\";\n\t\t\t\tCOPY_PHASE_STRIP = NO;\n\t\t\t\tDEBUG_INFORMATION_FORMAT = dwarf;\n\t\t\t\tENABLE_STRICT_OBJC_MSGSEND = YES;\n\t\t\t\tENABLE_TESTABILITY = YES;\n\t\t\t\tGCC_C_LANGUAGE_STANDARD = gnu99;\n\t\t\t\tGCC_DYNAMIC_NO_PIC = NO;\n\t\t\t\tGCC_NO_COMMON_BLOCKS = YES;\n\t\t\t\tGCC_OPTIMIZATION_LEVEL = 0;\n\t\t\t\tGCC_PREPROCESSOR_DEFINITIONS = (\n\t\t\t\t\t\"DEBUG=1\",\n\t\t\t\t\t\"$(inherited)\",\n\t\t\t\t);\n\t\t\t\tGCC_WARN_64_TO_32_BIT_CONVERSION = YES;\n\t\t\t\tGCC_WARN_ABOUT_RETURN_TYPE = YES_ERROR;\n\t\t\t\tGCC_WARN_UNDECLARED_SELECTOR = YES;\n\t\t\t\tGCC_WARN_UNINITIALIZED_AUTOS = YES_AGGRESSIVE;\n\t\t\t\tGCC_WARN_UNUSED_FUNCTION = 
YES;\n\t\t\t\tGCC_WARN_UNUSED_VARIABLE = YES;\n\t\t\t\tIPHONEOS_DEPLOYMENT_TARGET = 8.0;\n\t\t\t\tMTL_ENABLE_DEBUG_INFO = YES;\n\t\t\t\tONLY_ACTIVE_ARCH = YES;\n\t\t\t\tSDKROOT = iphoneos;\n\t\t\t\tTARGETED_DEVICE_FAMILY = \"1,2\";\n\t\t\t};\n\t\t\tname = Debug;\n\t\t};\n\t\t97C147041CF9000F007C117D /* Release */ = {\n\t\t\tisa = XCBuildConfiguration;\n\t\t\tbaseConfigurationReference = 7AFA3C8E1D35360C0083082E /* Release.xcconfig */;\n\t\t\tbuildSettings = {\n\t\t\t\tALWAYS_SEARCH_USER_PATHS = NO;\n\t\t\t\tCLANG_ANALYZER_NONNULL = YES;\n\t\t\t\tCLANG_CXX_LANGUAGE_STANDARD = \"gnu++0x\";\n\t\t\t\tCLANG_CXX_LIBRARY = \"libc++\";\n\t\t\t\tCLANG_ENABLE_MODULES = YES;\n\t\t\t\tCLANG_ENABLE_OBJC_ARC = YES;\n\t\t\t\tCLANG_WARN_BLOCK_CAPTURE_AUTORELEASING = YES;\n\t\t\t\tCLANG_WARN_BOOL_CONVERSION = YES;\n\t\t\t\tCLANG_WARN_COMMA = YES;\n\t\t\t\tCLANG_WARN_CONSTANT_CONVERSION = YES;\n\t\t\t\tCLANG_WARN_DEPRECATED_OBJC_IMPLEMENTATIONS = YES;\n\t\t\t\tCLANG_WARN_DIRECT_OBJC_ISA_USAGE = YES_ERROR;\n\t\t\t\tCLANG_WARN_EMPTY_BODY = YES;\n\t\t\t\tCLANG_WARN_ENUM_CONVERSION = YES;\n\t\t\t\tCLANG_WARN_INFINITE_RECURSION = YES;\n\t\t\t\tCLANG_WARN_INT_CONVERSION = YES;\n\t\t\t\tCLANG_WARN_NON_LITERAL_NULL_CONVERSION = YES;\n\t\t\t\tCLANG_WARN_OBJC_IMPLICIT_RETAIN_SELF = YES;\n\t\t\t\tCLANG_WARN_OBJC_LITERAL_CONVERSION = YES;\n\t\t\t\tCLANG_WARN_OBJC_ROOT_CLASS = YES_ERROR;\n\t\t\t\tCLANG_WARN_RANGE_LOOP_ANALYSIS = YES;\n\t\t\t\tCLANG_WARN_STRICT_PROTOTYPES = YES;\n\t\t\t\tCLANG_WARN_SUSPICIOUS_MOVE = YES;\n\t\t\t\tCLANG_WARN_UNREACHABLE_CODE = YES;\n\t\t\t\tCLANG_WARN__DUPLICATE_METHOD_MATCH = YES;\n\t\t\t\t\"CODE_SIGN_IDENTITY[sdk=iphoneos*]\" = \"iPhone Developer\";\n\t\t\t\tCOPY_PHASE_STRIP = NO;\n\t\t\t\tDEBUG_INFORMATION_FORMAT = \"dwarf-with-dsym\";\n\t\t\t\tENABLE_NS_ASSERTIONS = NO;\n\t\t\t\tENABLE_STRICT_OBJC_MSGSEND = YES;\n\t\t\t\tGCC_C_LANGUAGE_STANDARD = gnu99;\n\t\t\t\tGCC_NO_COMMON_BLOCKS = YES;\n\t\t\t\tGCC_WARN_64_TO_32_BIT_CONVERSION = 
YES;\n\t\t\t\tGCC_WARN_ABOUT_RETURN_TYPE = YES_ERROR;\n\t\t\t\tGCC_WARN_UNDECLARED_SELECTOR = YES;\n\t\t\t\tGCC_WARN_UNINITIALIZED_AUTOS = YES_AGGRESSIVE;\n\t\t\t\tGCC_WARN_UNUSED_FUNCTION = YES;\n\t\t\t\tGCC_WARN_UNUSED_VARIABLE = YES;\n\t\t\t\tIPHONEOS_DEPLOYMENT_TARGET = 8.0;\n\t\t\t\tMTL_ENABLE_DEBUG_INFO = NO;\n\t\t\t\tSDKROOT = iphoneos;\n\t\t\t\tSUPPORTED_PLATFORMS = iphoneos;\n\t\t\t\tTARGETED_DEVICE_FAMILY = \"1,2\";\n\t\t\t\tVALIDATE_PRODUCT = YES;\n\t\t\t};\n\t\t\tname = Release;\n\t\t};\n\t\t97C147061CF9000F007C117D /* Debug */ = {\n\t\t\tisa = XCBuildConfiguration;\n\t\t\tbaseConfigurationReference = 9740EEB21CF90195004384FC /* Debug.xcconfig */;\n\t\t\tbuildSettings = {\n\t\t\t\tASSETCATALOG_COMPILER_APPICON_NAME = AppIcon;\n\t\t\t\tCURRENT_PROJECT_VERSION = \"$(FLUTTER_BUILD_NUMBER)\";\n\t\t\t\tENABLE_BITCODE = NO;\n\t\t\t\tFRAMEWORK_SEARCH_PATHS = (\n\t\t\t\t\t\"$(inherited)\",\n\t\t\t\t\t\"$(PROJECT_DIR)/Flutter\",\n\t\t\t\t);\n\t\t\t\tINFOPLIST_FILE = Runner/Info.plist;\n\t\t\t\tLD_RUNPATH_SEARCH_PATHS = \"$(inherited) @executable_path/Frameworks\";\n\t\t\t\tLIBRARY_SEARCH_PATHS = (\n\t\t\t\t\t\"$(inherited)\",\n\t\t\t\t\t\"$(PROJECT_DIR)/Flutter\",\n\t\t\t\t);\n\t\t\t\tPRODUCT_BUNDLE_IDENTIFIER = xyz.zhzh.flutterHandTrackingPluginExample;\n\t\t\t\tPRODUCT_NAME = \"$(TARGET_NAME)\";\n\t\t\t\tVERSIONING_SYSTEM = \"apple-generic\";\n\t\t\t};\n\t\t\tname = Debug;\n\t\t};\n\t\t97C147071CF9000F007C117D /* Release */ = {\n\t\t\tisa = XCBuildConfiguration;\n\t\t\tbaseConfigurationReference = 7AFA3C8E1D35360C0083082E /* Release.xcconfig */;\n\t\t\tbuildSettings = {\n\t\t\t\tASSETCATALOG_COMPILER_APPICON_NAME = AppIcon;\n\t\t\t\tCURRENT_PROJECT_VERSION = \"$(FLUTTER_BUILD_NUMBER)\";\n\t\t\t\tENABLE_BITCODE = NO;\n\t\t\t\tFRAMEWORK_SEARCH_PATHS = (\n\t\t\t\t\t\"$(inherited)\",\n\t\t\t\t\t\"$(PROJECT_DIR)/Flutter\",\n\t\t\t\t);\n\t\t\t\tINFOPLIST_FILE = Runner/Info.plist;\n\t\t\t\tLD_RUNPATH_SEARCH_PATHS = \"$(inherited) 
@executable_path/Frameworks\";\n\t\t\t\tLIBRARY_SEARCH_PATHS = (\n\t\t\t\t\t\"$(inherited)\",\n\t\t\t\t\t\"$(PROJECT_DIR)/Flutter\",\n\t\t\t\t);\n\t\t\t\tPRODUCT_BUNDLE_IDENTIFIER = xyz.zhzh.flutterHandTrackingPluginExample;\n\t\t\t\tPRODUCT_NAME = \"$(TARGET_NAME)\";\n\t\t\t\tVERSIONING_SYSTEM = \"apple-generic\";\n\t\t\t};\n\t\t\tname = Release;\n\t\t};\n/* End XCBuildConfiguration section */\n\n/* Begin XCConfigurationList section */\n\t\t97C146E91CF9000F007C117D /* Build configuration list for PBXProject \"Runner\" */ = {\n\t\t\tisa = XCConfigurationList;\n\t\t\tbuildConfigurations = (\n\t\t\t\t97C147031CF9000F007C117D /* Debug */,\n\t\t\t\t97C147041CF9000F007C117D /* Release */,\n\t\t\t\t249021D3217E4FDB00AE95B9 /* Profile */,\n\t\t\t);\n\t\t\tdefaultConfigurationIsVisible = 0;\n\t\t\tdefaultConfigurationName = Release;\n\t\t};\n\t\t97C147051CF9000F007C117D /* Build configuration list for PBXNativeTarget \"Runner\" */ = {\n\t\t\tisa = XCConfigurationList;\n\t\t\tbuildConfigurations = (\n\t\t\t\t97C147061CF9000F007C117D /* Debug */,\n\t\t\t\t97C147071CF9000F007C117D /* Release */,\n\t\t\t\t249021D4217E4FDB00AE95B9 /* Profile */,\n\t\t\t);\n\t\t\tdefaultConfigurationIsVisible = 0;\n\t\t\tdefaultConfigurationName = Release;\n\t\t};\n/* End XCConfigurationList section */\n\t};\n\trootObject = 97C146E61CF9000F007C117D /* Project object */;\n}\n"
  },
  {
    "path": "example/ios/Runner.xcodeproj/project.xcworkspace/contents.xcworkspacedata",
    "content": "<?xml version=\"1.0\" encoding=\"UTF-8\"?>\n<Workspace\n   version = \"1.0\">\n   <FileRef\n      location = \"group:Runner.xcodeproj\">\n   </FileRef>\n</Workspace>\n"
  },
  {
    "path": "example/ios/Runner.xcodeproj/xcshareddata/xcschemes/Runner.xcscheme",
    "content": "<?xml version=\"1.0\" encoding=\"UTF-8\"?>\n<Scheme\n   LastUpgradeVersion = \"1020\"\n   version = \"1.3\">\n   <BuildAction\n      parallelizeBuildables = \"YES\"\n      buildImplicitDependencies = \"YES\">\n      <BuildActionEntries>\n         <BuildActionEntry\n            buildForTesting = \"YES\"\n            buildForRunning = \"YES\"\n            buildForProfiling = \"YES\"\n            buildForArchiving = \"YES\"\n            buildForAnalyzing = \"YES\">\n            <BuildableReference\n               BuildableIdentifier = \"primary\"\n               BlueprintIdentifier = \"97C146ED1CF9000F007C117D\"\n               BuildableName = \"Runner.app\"\n               BlueprintName = \"Runner\"\n               ReferencedContainer = \"container:Runner.xcodeproj\">\n            </BuildableReference>\n         </BuildActionEntry>\n      </BuildActionEntries>\n   </BuildAction>\n   <TestAction\n      buildConfiguration = \"Debug\"\n      selectedDebuggerIdentifier = \"Xcode.DebuggerFoundation.Debugger.LLDB\"\n      selectedLauncherIdentifier = \"Xcode.DebuggerFoundation.Launcher.LLDB\"\n      shouldUseLaunchSchemeArgsEnv = \"YES\">\n      <Testables>\n      </Testables>\n      <MacroExpansion>\n         <BuildableReference\n            BuildableIdentifier = \"primary\"\n            BlueprintIdentifier = \"97C146ED1CF9000F007C117D\"\n            BuildableName = \"Runner.app\"\n            BlueprintName = \"Runner\"\n            ReferencedContainer = \"container:Runner.xcodeproj\">\n         </BuildableReference>\n      </MacroExpansion>\n      <AdditionalOptions>\n      </AdditionalOptions>\n   </TestAction>\n   <LaunchAction\n      buildConfiguration = \"Debug\"\n      selectedDebuggerIdentifier = \"Xcode.DebuggerFoundation.Debugger.LLDB\"\n      selectedLauncherIdentifier = \"Xcode.DebuggerFoundation.Launcher.LLDB\"\n      launchStyle = \"0\"\n      useCustomWorkingDirectory = \"NO\"\n      ignoresPersistentStateOnLaunch = \"NO\"\n      
debugDocumentVersioning = \"YES\"\n      debugServiceExtension = \"internal\"\n      allowLocationSimulation = \"YES\">\n      <BuildableProductRunnable\n         runnableDebuggingMode = \"0\">\n         <BuildableReference\n            BuildableIdentifier = \"primary\"\n            BlueprintIdentifier = \"97C146ED1CF9000F007C117D\"\n            BuildableName = \"Runner.app\"\n            BlueprintName = \"Runner\"\n            ReferencedContainer = \"container:Runner.xcodeproj\">\n         </BuildableReference>\n      </BuildableProductRunnable>\n      <AdditionalOptions>\n      </AdditionalOptions>\n   </LaunchAction>\n   <ProfileAction\n      buildConfiguration = \"Profile\"\n      shouldUseLaunchSchemeArgsEnv = \"YES\"\n      savedToolIdentifier = \"\"\n      useCustomWorkingDirectory = \"NO\"\n      debugDocumentVersioning = \"YES\">\n      <BuildableProductRunnable\n         runnableDebuggingMode = \"0\">\n         <BuildableReference\n            BuildableIdentifier = \"primary\"\n            BlueprintIdentifier = \"97C146ED1CF9000F007C117D\"\n            BuildableName = \"Runner.app\"\n            BlueprintName = \"Runner\"\n            ReferencedContainer = \"container:Runner.xcodeproj\">\n         </BuildableReference>\n      </BuildableProductRunnable>\n   </ProfileAction>\n   <AnalyzeAction\n      buildConfiguration = \"Debug\">\n   </AnalyzeAction>\n   <ArchiveAction\n      buildConfiguration = \"Release\"\n      revealArchiveInOrganizer = \"YES\">\n   </ArchiveAction>\n</Scheme>\n"
  },
  {
    "path": "example/ios/Runner.xcworkspace/contents.xcworkspacedata",
    "content": "<?xml version=\"1.0\" encoding=\"UTF-8\"?>\n<Workspace\n   version = \"1.0\">\n   <FileRef\n      location = \"group:Runner.xcodeproj\">\n   </FileRef>\n</Workspace>\n"
  },
  {
    "path": "example/lib/main.dart",
    "content": "import 'dart:ui';\n\nimport 'package:flutter/material.dart';\nimport 'package:flutter_colorpicker/flutter_colorpicker.dart';\nimport 'package:flutter_hand_tracking_plugin/HandGestureRecognition.dart';\nimport 'package:flutter_hand_tracking_plugin/flutter_hand_tracking_plugin.dart';\nimport 'package:flutter_hand_tracking_plugin/gen/landmark.pb.dart';\n\nvoid main() => runApp(MaterialApp(home: MyApp()));\n\nclass MyApp extends StatefulWidget {\n  @override\n  _MyAppState createState() => _MyAppState();\n}\n\nclass _MyAppState extends State<MyApp> {\n  HandTrackingViewController _controller;\n  Gestures _gesture;\n\n  Color _selectedColor = Colors.black;\n  Color _pickerColor = Colors.black;\n  double _opacity = 1.0;\n  double _strokeWidth = 3.0;\n  double _canvasHeight = 300;\n  double _canvasWeight = 300;\n\n  bool _showBottomList = false;\n  List<DrawingPoints> _points = List();\n  SelectedMode _selectedMode = SelectedMode.StrokeWidth;\n\n  List<Color> _colors = [\n    Colors.red,\n    Colors.green,\n    Colors.blue,\n    Colors.amber,\n    Colors.black\n  ];\n\n  void continueDraw(landmark) => setState(() => _points.add(DrawingPoints(\n      points: Offset(landmark.x * _canvasWeight, landmark.y * _canvasHeight),\n      paint: Paint()\n        ..strokeCap = StrokeCap.butt\n        ..isAntiAlias = true\n        ..color = _selectedColor.withOpacity(_opacity)\n        ..strokeWidth = _strokeWidth)));\n\n  void finishDraw() => setState(() => _points.add(null));\n\n  void _onLandMarkStream(NormalizedLandmarkList landmarkList) {\n    if (landmarkList.landmark != null && landmarkList.landmark.length != 0) {\n      setState(() => _gesture =\n          HandGestureRecognition.handGestureRecognition(landmarkList.landmark));\n      if (_gesture == Gestures.ONE)\n        continueDraw(landmarkList.landmark[8]);\n      else if (_points.length != 0) finishDraw();\n    } else {\n      // Clear the gesture inside setState so the UI updates when the hand is lost.\n      setState(() => _gesture = null);\n    }\n  }\n\n  getColorList() {\n    List<Widget> listWidget = List();\n 
   for (Color color in _colors) {\n      listWidget.add(colorCircle(color));\n    }\n    Widget colorPicker = GestureDetector(\n      onTap: () {\n        showDialog(\n          context: context,\n          child: AlertDialog(\n            title: const Text('Choose a color'),\n            content: SingleChildScrollView(\n              child: ColorPicker(\n                pickerColor: _pickerColor,\n                onColorChanged: (color) => _pickerColor = color,\n//                enableLabel: true,\n                pickerAreaHeightPercent: 0.8,\n              ),\n            ),\n            actions: <Widget>[\n              FlatButton(\n                child: const Text('Save'),\n                onPressed: () {\n                  setState(() => _selectedColor = _pickerColor);\n                  Navigator.of(context).pop();\n                },\n              ),\n            ],\n          ),\n        );\n      },\n      child: ClipOval(\n        child: Container(\n          padding: const EdgeInsets.only(bottom: 16.0),\n          height: 36,\n          width: 36,\n          decoration: BoxDecoration(\n              gradient: LinearGradient(\n            colors: [Colors.red, Colors.green, Colors.blue],\n            begin: Alignment.topLeft,\n            end: Alignment.bottomRight,\n          )),\n        ),\n      ),\n    );\n    listWidget.add(colorPicker);\n    return listWidget;\n  }\n\n  Widget colorCircle(Color color) {\n    return GestureDetector(\n      onTap: () => setState(() => _selectedColor = color),\n      child: ClipOval(\n        child: Container(\n          padding: const EdgeInsets.only(bottom: 16.0),\n          height: 36,\n          width: 36,\n          color: color,\n        ),\n      ),\n    );\n  }\n\n  @override\n  Widget build(BuildContext context) {\n    return Scaffold(\n      appBar: AppBar(\n        title: const Text('Hand Tracking Example App'),\n      ),\n      body: SingleChildScrollView(\n        child: Column(\n          children: <Widget>[\n    
        Container(\n              height: 300,\n              child: HandTrackingView(\n                onViewCreated: (HandTrackingViewController c) => setState(() {\n                  _controller = c;\n                  if (_controller != null)\n                    _controller.landMarksStream.listen(_onLandMarkStream);\n                }),\n              ),\n            ),\n            _controller == null\n                ? Text(\n                    \"Please grant camera permissions and reopen the application.\")\n                : Column(\n                    children: <Widget>[\n                      Text(_gesture == null\n                          ? \"No hand landmarks.\"\n                          : _gesture.toString()),\n                      CustomPaint(\n                        size: Size(_canvasWeight, _canvasHeight),\n                        painter: DrawingPainter(\n                          pointsList: _points,\n                        ),\n                      )\n                    ],\n                  )\n          ],\n        ),\n      ),\n      bottomNavigationBar: Padding(\n        padding: const EdgeInsets.all(8.0),\n        child: Container(\n          padding: const EdgeInsets.only(left: 8.0, right: 8.0),\n          decoration: BoxDecoration(\n              borderRadius: BorderRadius.circular(50.0),\n              color: Colors.greenAccent),\n          child: Padding(\n            padding: const EdgeInsets.all(8.0),\n            child: Column(\n              mainAxisSize: MainAxisSize.min,\n              children: <Widget>[\n                Row(\n                  mainAxisAlignment: MainAxisAlignment.spaceBetween,\n                  children: <Widget>[\n                    IconButton(\n                      icon: Icon(Icons.album),\n                      onPressed: () => setState(() {\n                        if (_selectedMode == SelectedMode.StrokeWidth)\n                          _showBottomList = !_showBottomList;\n                        
_selectedMode = SelectedMode.StrokeWidth;\n                      }),\n                    ),\n                    IconButton(\n                      icon: Icon(Icons.opacity),\n                      onPressed: () => setState(() {\n                        if (_selectedMode == SelectedMode.Opacity)\n                          _showBottomList = !_showBottomList;\n                        _selectedMode = SelectedMode.Opacity;\n                      }),\n                    ),\n                    IconButton(\n                      icon: Icon(Icons.color_lens),\n                      onPressed: () => setState(() {\n                        if (_selectedMode == SelectedMode.Color)\n                          _showBottomList = !_showBottomList;\n                        _selectedMode = SelectedMode.Color;\n                      }),\n                    ),\n                    IconButton(\n                      icon: Icon(Icons.clear),\n                      onPressed: () => setState(() {\n                        _showBottomList = false;\n                        _points.clear();\n                      }),\n                    ),\n                  ],\n                ),\n                Visibility(\n                  child: (_selectedMode == SelectedMode.Color)\n                      ? Row(\n                          mainAxisAlignment: MainAxisAlignment.spaceEvenly,\n                          children: getColorList(),\n                        )\n                      : Slider(\n                          value: (_selectedMode == SelectedMode.StrokeWidth)\n                              ? _strokeWidth\n                              : _opacity,\n                          max: (_selectedMode == SelectedMode.StrokeWidth)\n                              ? 
50.0\n                              : 1.0,\n                          min: 0.0,\n                          onChanged: (val) {\n                            setState(() {\n                              if (_selectedMode == SelectedMode.StrokeWidth)\n                                _strokeWidth = val;\n                              else\n                                _opacity = val;\n                            });\n                          }),\n                  visible: _showBottomList,\n                ),\n              ],\n            ),\n          ),\n        ),\n      ),\n    );\n  }\n}\n\nclass DrawingPainter extends CustomPainter {\n  DrawingPainter({this.pointsList});\n\n  List<DrawingPoints> pointsList;\n  List<Offset> offsetPoints = List();\n\n  @override\n  void paint(Canvas canvas, Size size) {\n    for (int i = 0; i < pointsList.length - 1; i++) {\n      if (pointsList[i] != null && pointsList[i + 1] != null) {\n        canvas.drawLine(pointsList[i].points, pointsList[i + 1].points,\n            pointsList[i].paint);\n      } else if (pointsList[i] != null && pointsList[i + 1] == null) {\n        offsetPoints.clear();\n        offsetPoints.add(pointsList[i].points);\n        offsetPoints.add(Offset(\n            pointsList[i].points.dx + 0.1, pointsList[i].points.dy + 0.1));\n        canvas.drawPoints(PointMode.points, offsetPoints, pointsList[i].paint);\n      }\n    }\n  }\n\n  @override\n  bool shouldRepaint(DrawingPainter oldDelegate) => true;\n}\n\nclass DrawingPoints {\n  Paint paint;\n  Offset points;\n\n  DrawingPoints({this.points, this.paint});\n}\n\nenum SelectedMode { StrokeWidth, Opacity, Color }\n"
  },
  {
    "path": "example/pubspec.yaml",
    "content": "name: flutter_hand_tracking_plugin_example\ndescription: Demonstrates how to use the flutter_hand_tracking_plugin plugin.\npublish_to: 'none'\n\nenvironment:\n  sdk: \">=2.1.0 <3.0.0\"\n\ndependencies:\n  flutter:\n    sdk: flutter\n\n  # The following adds the Cupertino Icons font to your application.\n  # Use with the CupertinoIcons class for iOS style icons.\n  cupertino_icons: ^0.1.2\n\ndev_dependencies:\n  flutter_test:\n    sdk: flutter\n\n  flutter_hand_tracking_plugin:\n    path: ../\n\n  flutter_colorpicker: any\n\n# For information on the generic Dart part of this file, see the\n# following page: https://dart.dev/tools/pub/pubspec\n\n# The following section is specific to Flutter.\nflutter:\n\n  # The following line ensures that the Material Icons font is\n  # included with your application, so that you can use the icons in\n  # the material Icons class.\n  uses-material-design: true\n\n  # To add assets to your application, add an assets section, like this:\n  # assets:\n  #  - images/a_dot_burr.jpeg\n  #  - images/a_dot_ham.jpeg\n\n  # An image asset can refer to one or more resolution-specific \"variants\", see\n  # https://flutter.dev/assets-and-images/#resolution-aware.\n\n  # For details regarding adding assets from package dependencies, see\n  # https://flutter.dev/assets-and-images/#from-packages\n\n  # To add custom fonts to your application, add a fonts section here,\n  # in this \"flutter\" section. Each entry in this list should have a\n  # \"family\" key with the font family name, and a \"fonts\" key with a\n  # list giving the asset and other descriptors for the font. 
For\n  # example:\n  # fonts:\n  #   - family: Schyler\n  #     fonts:\n  #       - asset: fonts/Schyler-Regular.ttf\n  #       - asset: fonts/Schyler-Italic.ttf\n  #         style: italic\n  #   - family: Trajan Pro\n  #     fonts:\n  #       - asset: fonts/TrajanPro.ttf\n  #       - asset: fonts/TrajanPro_Bold.ttf\n  #         weight: 700\n  #\n  # For details regarding fonts from package dependencies,\n  # see https://flutter.dev/custom-fonts/#from-packages\n"
  },
  {
    "path": "example/test/widget_test.dart",
    "content": "// This is a basic Flutter widget test.\n//\n// To perform an interaction with a widget in your test, use the WidgetTester\n// utility that Flutter provides. For example, you can send tap and scroll\n// gestures. You can also use WidgetTester to find child widgets in the widget\n// tree, read text, and verify that the values of widget properties are correct.\n\nimport 'package:flutter/material.dart';\nimport 'package:flutter_test/flutter_test.dart';\n\nimport 'package:flutter_hand_tracking_plugin_example/main.dart';\n\nvoid main() {\n  testWidgets('App builds and shows its title', (WidgetTester tester) async {\n    // Build our app and trigger a frame. MyApp returns a Scaffold, so it\n    // needs a MaterialApp ancestor, just as in main().\n    await tester.pumpWidget(MaterialApp(home: MyApp()));\n\n    // The template's 'Running on:' check never matched anything this app\n    // renders; verify the AppBar title instead.\n    expect(find.text('Hand Tracking Example App'), findsOneWidget);\n  });\n}\n"
  },
  {
    "path": "flutter_hand_tracking_plugin/android/flutter_hand_tracking_plugin.iml",
    "content": "<?xml version=\"1.0\" encoding=\"UTF-8\"?>\n<module external.linked.project.id=\":flutter_hand_tracking_plugin\" external.linked.project.path=\"$MODULE_DIR$\" external.root.project.path=\"$MODULE_DIR$/../../example/android\" external.system.id=\"GRADLE\" type=\"JAVA_MODULE\" version=\"4\">\n  <component name=\"FacetManager\">\n    <facet type=\"android-gradle\" name=\"Android-Gradle\">\n      <configuration>\n        <option name=\"GRADLE_PROJECT_PATH\" value=\":flutter_hand_tracking_plugin\" />\n        <option name=\"LAST_SUCCESSFUL_SYNC_AGP_VERSION\" />\n        <option name=\"LAST_KNOWN_AGP_VERSION\" />\n      </configuration>\n    </facet>\n    <facet type=\"java-gradle\" name=\"Java-Gradle\">\n      <configuration>\n        <option name=\"BUILD_FOLDER_PATH\" value=\"$MODULE_DIR$/../../example/build/flutter_hand_tracking_plugin\" />\n        <option name=\"BUILDABLE\" value=\"false\" />\n      </configuration>\n    </facet>\n  </component>\n  <component name=\"NewModuleRootManager\" LANGUAGE_LEVEL=\"JDK_1_8\" inherit-compiler-output=\"true\">\n    <exclude-output />\n    <content url=\"file://$MODULE_DIR$\">\n      <excludeFolder url=\"file://$MODULE_DIR$/.gradle\" />\n    </content>\n    <orderEntry type=\"inheritedJdk\" />\n    <orderEntry type=\"sourceFolder\" forTests=\"false\" />\n  </component>\n</module>"
  },
  {
    "path": "flutter_hand_tracking_plugin.iml",
    "content": "<?xml version=\"1.0\" encoding=\"UTF-8\"?>\n<module type=\"JAVA_MODULE\" version=\"4\">\n  <component name=\"NewModuleRootManager\" inherit-compiler-output=\"true\">\n    <exclude-output />\n    <content url=\"file://$MODULE_DIR$\">\n      <sourceFolder url=\"file://$MODULE_DIR$/lib\" isTestSource=\"false\" />\n      <excludeFolder url=\"file://$MODULE_DIR$/.dart_tool\" />\n      <excludeFolder url=\"file://$MODULE_DIR$/.idea\" />\n      <excludeFolder url=\"file://$MODULE_DIR$/.pub\" />\n      <excludeFolder url=\"file://$MODULE_DIR$/build\" />\n      <excludeFolder url=\"file://$MODULE_DIR$/example/.dart_tool\" />\n      <excludeFolder url=\"file://$MODULE_DIR$/example/.pub\" />\n      <excludeFolder url=\"file://$MODULE_DIR$/example/build\" />\n    </content>\n    <orderEntry type=\"sourceFolder\" forTests=\"false\" />\n    <orderEntry type=\"library\" name=\"Dart SDK\" level=\"project\" />\n    <orderEntry type=\"library\" name=\"Flutter Plugins\" level=\"project\" />\n    <orderEntry type=\"library\" name=\"Bundled Protobuf Distribution\" level=\"application\" />\n  </component>\n</module>"
  },
  {
    "path": "ios/.gitignore",
    "content": ".idea/\n.vagrant/\n.sconsign.dblite\n.svn/\n\n.DS_Store\n*.swp\nprofile\n\nDerivedData/\nbuild/\nGeneratedPluginRegistrant.h\nGeneratedPluginRegistrant.m\n\n.generated/\n\n*.pbxuser\n*.mode1v3\n*.mode2v3\n*.perspectivev3\n\n!default.pbxuser\n!default.mode1v3\n!default.mode2v3\n!default.perspectivev3\n\nxcuserdata\n\n*.moved-aside\n\n*.pyc\n*sync/\nIcon?\n.tags*\n\n/Flutter/Generated.xcconfig\n/Flutter/flutter_export_environment.sh"
  },
  {
    "path": "ios/Assets/.gitkeep",
    "content": ""
  },
  {
    "path": "ios/Classes/FlutterHandTrackingPlugin.h",
    "content": "#import <Flutter/Flutter.h>\n\n@interface FlutterHandTrackingPlugin : NSObject<FlutterPlugin>\n@end\n"
  },
  {
    "path": "ios/Classes/FlutterHandTrackingPlugin.m",
    "content": "#import \"FlutterHandTrackingPlugin.h\"\n\n@implementation FlutterHandTrackingPlugin\n+ (void)registerWithRegistrar:(NSObject<FlutterPluginRegistrar>*)registrar {\n  FlutterMethodChannel* channel = [FlutterMethodChannel\n      methodChannelWithName:@\"flutter_hand_tracking_plugin\"\n            binaryMessenger:[registrar messenger]];\n  FlutterHandTrackingPlugin* instance = [[FlutterHandTrackingPlugin alloc] init];\n  [registrar addMethodCallDelegate:instance channel:channel];\n}\n\n- (void)handleMethodCall:(FlutterMethodCall*)call result:(FlutterResult)result {\n  if ([@\"getPlatformVersion\" isEqualToString:call.method]) {\n    result([@\"iOS \" stringByAppendingString:[[UIDevice currentDevice] systemVersion]]);\n  } else {\n    result(FlutterMethodNotImplemented);\n  }\n}\n\n@end\n"
  },
  {
    "path": "ios/flutter_hand_tracking_plugin.podspec",
    "content": "#\n# To learn more about a Podspec see http://guides.cocoapods.org/syntax/podspec.html.\n# Run `pod lib lint flutter_hand_tracking_plugin.podspec' to validate before publishing.\n#\nPod::Spec.new do |s|\n  s.name             = 'flutter_hand_tracking_plugin'\n  s.version          = '0.0.1'\n  s.summary          = 'A new Flutter hand tracking plugin.'\n  s.description      = <<-DESC\nA Flutter plugin that shows a camera preview and streams MediaPipe hand-tracking landmarks to Dart.\n                       DESC\n  s.homepage         = 'http://example.com'\n  s.license          = { :file => '../LICENSE' }\n  s.author           = { 'Your Company' => 'email@example.com' }\n  s.source           = { :path => '.' }\n  s.source_files = 'Classes/**/*'\n  s.public_header_files = 'Classes/**/*.h'\n  s.dependency 'Flutter'\n  s.platform = :ios, '8.0'\n\n  # Flutter.framework does not contain an i386 slice. Only x86_64 simulators are supported.\n  s.pod_target_xcconfig = { 'DEFINES_MODULE' => 'YES', 'VALID_ARCHS[sdk=iphonesimulator*]' => 'x86_64' }\nend\n"
  },
  {
    "path": "lib/HandGestureRecognition.dart",
    "content": "import 'dart:math';\n\nimport 'package:flutter_hand_tracking_plugin/gen/landmark.pb.dart';\n\nenum Gestures {\n  FIVE,\n  FOUR,\n  TREE,\n  TWO,\n  ONE,\n  YEAH,\n  ROCK,\n  SPIDERMAN,\n  FIST,\n  OK,\n  UNKNOWN\n}\n\nclass HandGestureRecognition {\n  static bool fingerIsOpen(\n          double pseudoFixKeyPoint, double point1, double point2) =>\n      point1 < pseudoFixKeyPoint && point2 < pseudoFixKeyPoint;\n\n  static bool thumbIsOpen(List landmarks) =>\n      fingerIsOpen(landmarks[2].x, landmarks[3].x, landmarks[4].x);\n\n  static bool firstFingerIsOpen(List landmarks) =>\n      fingerIsOpen(landmarks[6].y, landmarks[7].y, landmarks[8].y);\n\n  static bool secondFingerIsOpen(List landmarks) =>\n      fingerIsOpen(landmarks[10].y, landmarks[11].y, landmarks[12].y);\n\n  static bool thirdFingerIsOpen(List landmarks) =>\n      fingerIsOpen(landmarks[14].y, landmarks[15].y, landmarks[16].y);\n\n  static bool fourthFingerIsOpen(List landmarks) =>\n      fingerIsOpen(landmarks[18].y, landmarks[19].y, landmarks[20].y);\n\n  static double getEuclideanDistanceAB(\n          double aX, double aY, double bX, double bY) =>\n      sqrt(pow(aX - bX, 2) + pow(aY - bY, 2));\n\n  static bool isThumbNearFirstFinger(\n          NormalizedLandmark point1, NormalizedLandmark point2) =>\n      getEuclideanDistanceAB(point1.x, point1.y, point2.x, point2.y) < 0.1;\n\n  static Gestures handGestureRecognition(List landmarks) {\n    if (landmarks.length == 0) return Gestures.UNKNOWN;\n    // finger states\n    bool thumbIsOpen = HandGestureRecognition.thumbIsOpen(landmarks);\n    bool firstFingerIsOpen =\n        HandGestureRecognition.firstFingerIsOpen(landmarks);\n    bool secondFingerIsOpen =\n        HandGestureRecognition.secondFingerIsOpen(landmarks);\n    bool thirdFingerIsOpen =\n        HandGestureRecognition.thirdFingerIsOpen(landmarks);\n    bool fourthFingerIsOpen =\n        HandGestureRecognition.fourthFingerIsOpen(landmarks);\n    if (thumbIsOpen &&\n       
 firstFingerIsOpen &&\n        secondFingerIsOpen &&\n        thirdFingerIsOpen &&\n        fourthFingerIsOpen)\n      return Gestures.FIVE;\n    else if (!thumbIsOpen &&\n        firstFingerIsOpen &&\n        secondFingerIsOpen &&\n        thirdFingerIsOpen &&\n        fourthFingerIsOpen)\n      return Gestures.FOUR;\n    else if (thumbIsOpen &&\n        firstFingerIsOpen &&\n        secondFingerIsOpen &&\n        !thirdFingerIsOpen &&\n        !fourthFingerIsOpen)\n      return Gestures.TREE;\n    else if (thumbIsOpen &&\n        firstFingerIsOpen &&\n        !secondFingerIsOpen &&\n        !thirdFingerIsOpen &&\n        !fourthFingerIsOpen)\n      return Gestures.TWO;\n    else if (!thumbIsOpen &&\n        firstFingerIsOpen &&\n        !secondFingerIsOpen &&\n        !thirdFingerIsOpen &&\n        !fourthFingerIsOpen)\n      return Gestures.ONE;\n    else if (!thumbIsOpen &&\n        firstFingerIsOpen &&\n        secondFingerIsOpen &&\n        !thirdFingerIsOpen &&\n        !fourthFingerIsOpen)\n      return Gestures.YEAH;\n    else if (!thumbIsOpen &&\n        firstFingerIsOpen &&\n        !secondFingerIsOpen &&\n        !thirdFingerIsOpen &&\n        fourthFingerIsOpen)\n      return Gestures.ROCK;\n    else if (thumbIsOpen &&\n        firstFingerIsOpen &&\n        !secondFingerIsOpen &&\n        !thirdFingerIsOpen &&\n        fourthFingerIsOpen)\n      return Gestures.SPIDERMAN;\n    else if (!thumbIsOpen &&\n        !firstFingerIsOpen &&\n        !secondFingerIsOpen &&\n        !thirdFingerIsOpen &&\n        !fourthFingerIsOpen)\n      return Gestures.FIST;\n    else if (!firstFingerIsOpen &&\n        secondFingerIsOpen &&\n        thirdFingerIsOpen &&\n        fourthFingerIsOpen &&\n        isThumbNearFirstFinger(landmarks[4], landmarks[8]))\n      return Gestures.OK;\n    else\n      return Gestures.UNKNOWN;\n  }\n}\n"
  },
  {
    "path": "lib/flutter_hand_tracking_plugin.dart",
    "content": "import 'dart:async';\n\nimport 'package:flutter/cupertino.dart';\nimport 'package:flutter/foundation.dart';\nimport 'package:flutter/services.dart';\nimport 'package:flutter_hand_tracking_plugin/gen/landmark.pb.dart';\n\nconst NAMESPACE = \"plugins.zhzh.xyz/flutter_hand_tracking_plugin\";\n\ntypedef void HandTrackingViewCreatedCallback(\n    HandTrackingViewController controller);\n\nclass HandTrackingView extends StatelessWidget {\n  const HandTrackingView({@required this.onViewCreated})\n      : assert(onViewCreated != null);\n\n  final HandTrackingViewCreatedCallback onViewCreated;\n\n  @override\n  Widget build(BuildContext context) {\n    switch (defaultTargetPlatform) {\n      case TargetPlatform.android:\n        return AndroidView(\n          viewType: \"$NAMESPACE/view\",\n          onPlatformViewCreated: (int id) => onViewCreated == null\n              ? null\n              : onViewCreated(HandTrackingViewController._(id)),\n        );\n      case TargetPlatform.fuchsia:\n      case TargetPlatform.iOS:\n      default:\n        throw UnsupportedError(\n            \"Trying to use the default webview implementation for\"\n            \" $defaultTargetPlatform but there isn't a default one\");\n    }\n  }\n}\n\nclass HandTrackingViewController {\n  final MethodChannel _methodChannel;\n  final EventChannel _eventChannel;\n\n  HandTrackingViewController._(int id)\n      : _methodChannel = MethodChannel(\"$NAMESPACE/$id\"),\n        _eventChannel = EventChannel(\"$NAMESPACE/$id/landmarks\");\n\n  Future<String> get platformVersion async =>\n      await _methodChannel.invokeMethod(\"getPlatformVersion\");\n\n  Stream<NormalizedLandmarkList> get landMarksStream async* {\n    yield* _eventChannel\n        .receiveBroadcastStream()\n        .map((buffer) => NormalizedLandmarkList.fromBuffer(buffer));\n  }\n}\n"
  },
  {
    "path": "lib/gen/landmark.pb.dart",
    "content": "///\n//  Generated code. Do not modify.\n//  source: landmark.proto\n//\n// @dart = 2.3\n// ignore_for_file: camel_case_types,non_constant_identifier_names,library_prefixes,unused_import,unused_shown_name,return_of_invalid_type\n\nimport 'dart:core' as $core;\n\nimport 'package:protobuf/protobuf.dart' as $pb;\n\nclass Landmark extends $pb.GeneratedMessage {\n  static final $pb.BuilderInfo _i = $pb.BuilderInfo('Landmark', package: const $pb.PackageName('mediapipe'), createEmptyInstance: create)\n    ..a<$core.double>(1, 'x', $pb.PbFieldType.OF)\n    ..a<$core.double>(2, 'y', $pb.PbFieldType.OF)\n    ..a<$core.double>(3, 'z', $pb.PbFieldType.OF)\n    ..hasRequiredFields = false\n  ;\n\n  Landmark._() : super();\n  factory Landmark() => create();\n  factory Landmark.fromBuffer($core.List<$core.int> i, [$pb.ExtensionRegistry r = $pb.ExtensionRegistry.EMPTY]) => create()..mergeFromBuffer(i, r);\n  factory Landmark.fromJson($core.String i, [$pb.ExtensionRegistry r = $pb.ExtensionRegistry.EMPTY]) => create()..mergeFromJson(i, r);\n  Landmark clone() => Landmark()..mergeFromMessage(this);\n  Landmark copyWith(void Function(Landmark) updates) => super.copyWith((message) => updates(message as Landmark));\n  $pb.BuilderInfo get info_ => _i;\n  @$core.pragma('dart2js:noInline')\n  static Landmark create() => Landmark._();\n  Landmark createEmptyInstance() => create();\n  static $pb.PbList<Landmark> createRepeated() => $pb.PbList<Landmark>();\n  @$core.pragma('dart2js:noInline')\n  static Landmark getDefault() => _defaultInstance ??= $pb.GeneratedMessage.$_defaultFor<Landmark>(create);\n  static Landmark _defaultInstance;\n\n  @$pb.TagNumber(1)\n  $core.double get x => $_getN(0);\n  @$pb.TagNumber(1)\n  set x($core.double v) { $_setFloat(0, v); }\n  @$pb.TagNumber(1)\n  $core.bool hasX() => $_has(0);\n  @$pb.TagNumber(1)\n  void clearX() => clearField(1);\n\n  @$pb.TagNumber(2)\n  $core.double get y => $_getN(1);\n  @$pb.TagNumber(2)\n  set y($core.double v) { 
$_setFloat(1, v); }\n  @$pb.TagNumber(2)\n  $core.bool hasY() => $_has(1);\n  @$pb.TagNumber(2)\n  void clearY() => clearField(2);\n\n  @$pb.TagNumber(3)\n  $core.double get z => $_getN(2);\n  @$pb.TagNumber(3)\n  set z($core.double v) { $_setFloat(2, v); }\n  @$pb.TagNumber(3)\n  $core.bool hasZ() => $_has(2);\n  @$pb.TagNumber(3)\n  void clearZ() => clearField(3);\n}\n\nclass LandmarkList extends $pb.GeneratedMessage {\n  static final $pb.BuilderInfo _i = $pb.BuilderInfo('LandmarkList', package: const $pb.PackageName('mediapipe'), createEmptyInstance: create)\n    ..pc<Landmark>(1, 'landmark', $pb.PbFieldType.PM, subBuilder: Landmark.create)\n    ..hasRequiredFields = false\n  ;\n\n  LandmarkList._() : super();\n  factory LandmarkList() => create();\n  factory LandmarkList.fromBuffer($core.List<$core.int> i, [$pb.ExtensionRegistry r = $pb.ExtensionRegistry.EMPTY]) => create()..mergeFromBuffer(i, r);\n  factory LandmarkList.fromJson($core.String i, [$pb.ExtensionRegistry r = $pb.ExtensionRegistry.EMPTY]) => create()..mergeFromJson(i, r);\n  LandmarkList clone() => LandmarkList()..mergeFromMessage(this);\n  LandmarkList copyWith(void Function(LandmarkList) updates) => super.copyWith((message) => updates(message as LandmarkList));\n  $pb.BuilderInfo get info_ => _i;\n  @$core.pragma('dart2js:noInline')\n  static LandmarkList create() => LandmarkList._();\n  LandmarkList createEmptyInstance() => create();\n  static $pb.PbList<LandmarkList> createRepeated() => $pb.PbList<LandmarkList>();\n  @$core.pragma('dart2js:noInline')\n  static LandmarkList getDefault() => _defaultInstance ??= $pb.GeneratedMessage.$_defaultFor<LandmarkList>(create);\n  static LandmarkList _defaultInstance;\n\n  @$pb.TagNumber(1)\n  $core.List<Landmark> get landmark => $_getList(0);\n}\n\nclass NormalizedLandmark extends $pb.GeneratedMessage {\n  static final $pb.BuilderInfo _i = $pb.BuilderInfo('NormalizedLandmark', package: const $pb.PackageName('mediapipe'), createEmptyInstance: create)\n    
..a<$core.double>(1, 'x', $pb.PbFieldType.OF)\n    ..a<$core.double>(2, 'y', $pb.PbFieldType.OF)\n    ..a<$core.double>(3, 'z', $pb.PbFieldType.OF)\n    ..hasRequiredFields = false\n  ;\n\n  NormalizedLandmark._() : super();\n  factory NormalizedLandmark() => create();\n  factory NormalizedLandmark.fromBuffer($core.List<$core.int> i, [$pb.ExtensionRegistry r = $pb.ExtensionRegistry.EMPTY]) => create()..mergeFromBuffer(i, r);\n  factory NormalizedLandmark.fromJson($core.String i, [$pb.ExtensionRegistry r = $pb.ExtensionRegistry.EMPTY]) => create()..mergeFromJson(i, r);\n  NormalizedLandmark clone() => NormalizedLandmark()..mergeFromMessage(this);\n  NormalizedLandmark copyWith(void Function(NormalizedLandmark) updates) => super.copyWith((message) => updates(message as NormalizedLandmark));\n  $pb.BuilderInfo get info_ => _i;\n  @$core.pragma('dart2js:noInline')\n  static NormalizedLandmark create() => NormalizedLandmark._();\n  NormalizedLandmark createEmptyInstance() => create();\n  static $pb.PbList<NormalizedLandmark> createRepeated() => $pb.PbList<NormalizedLandmark>();\n  @$core.pragma('dart2js:noInline')\n  static NormalizedLandmark getDefault() => _defaultInstance ??= $pb.GeneratedMessage.$_defaultFor<NormalizedLandmark>(create);\n  static NormalizedLandmark _defaultInstance;\n\n  @$pb.TagNumber(1)\n  $core.double get x => $_getN(0);\n  @$pb.TagNumber(1)\n  set x($core.double v) { $_setFloat(0, v); }\n  @$pb.TagNumber(1)\n  $core.bool hasX() => $_has(0);\n  @$pb.TagNumber(1)\n  void clearX() => clearField(1);\n\n  @$pb.TagNumber(2)\n  $core.double get y => $_getN(1);\n  @$pb.TagNumber(2)\n  set y($core.double v) { $_setFloat(1, v); }\n  @$pb.TagNumber(2)\n  $core.bool hasY() => $_has(1);\n  @$pb.TagNumber(2)\n  void clearY() => clearField(2);\n\n  @$pb.TagNumber(3)\n  $core.double get z => $_getN(2);\n  @$pb.TagNumber(3)\n  set z($core.double v) { $_setFloat(2, v); }\n  @$pb.TagNumber(3)\n  $core.bool hasZ() => $_has(2);\n  @$pb.TagNumber(3)\n  void clearZ() 
=> clearField(3);\n}\n\nclass NormalizedLandmarkList extends $pb.GeneratedMessage {\n  static final $pb.BuilderInfo _i = $pb.BuilderInfo('NormalizedLandmarkList', package: const $pb.PackageName('mediapipe'), createEmptyInstance: create)\n    ..pc<NormalizedLandmark>(1, 'landmark', $pb.PbFieldType.PM, subBuilder: NormalizedLandmark.create)\n    ..hasRequiredFields = false\n  ;\n\n  NormalizedLandmarkList._() : super();\n  factory NormalizedLandmarkList() => create();\n  factory NormalizedLandmarkList.fromBuffer($core.List<$core.int> i, [$pb.ExtensionRegistry r = $pb.ExtensionRegistry.EMPTY]) => create()..mergeFromBuffer(i, r);\n  factory NormalizedLandmarkList.fromJson($core.String i, [$pb.ExtensionRegistry r = $pb.ExtensionRegistry.EMPTY]) => create()..mergeFromJson(i, r);\n  NormalizedLandmarkList clone() => NormalizedLandmarkList()..mergeFromMessage(this);\n  NormalizedLandmarkList copyWith(void Function(NormalizedLandmarkList) updates) => super.copyWith((message) => updates(message as NormalizedLandmarkList));\n  $pb.BuilderInfo get info_ => _i;\n  @$core.pragma('dart2js:noInline')\n  static NormalizedLandmarkList create() => NormalizedLandmarkList._();\n  NormalizedLandmarkList createEmptyInstance() => create();\n  static $pb.PbList<NormalizedLandmarkList> createRepeated() => $pb.PbList<NormalizedLandmarkList>();\n  @$core.pragma('dart2js:noInline')\n  static NormalizedLandmarkList getDefault() => _defaultInstance ??= $pb.GeneratedMessage.$_defaultFor<NormalizedLandmarkList>(create);\n  static NormalizedLandmarkList _defaultInstance;\n\n  @$pb.TagNumber(1)\n  $core.List<NormalizedLandmark> get landmark => $_getList(0);\n}\n\n"
  },
  {
    "path": "lib/gen/landmark.pbenum.dart",
    "content": "///\n//  Generated code. Do not modify.\n//  source: landmark.proto\n//\n// @dart = 2.3\n// ignore_for_file: camel_case_types,non_constant_identifier_names,library_prefixes,unused_import,unused_shown_name,return_of_invalid_type\n\n"
  },
  {
    "path": "lib/gen/landmark.pbjson.dart",
    "content": "///\n//  Generated code. Do not modify.\n//  source: landmark.proto\n//\n// @dart = 2.3\n// ignore_for_file: camel_case_types,non_constant_identifier_names,library_prefixes,unused_import,unused_shown_name,return_of_invalid_type\n\nconst Landmark$json = const {\n  '1': 'Landmark',\n  '2': const [\n    const {'1': 'x', '3': 1, '4': 1, '5': 2, '10': 'x'},\n    const {'1': 'y', '3': 2, '4': 1, '5': 2, '10': 'y'},\n    const {'1': 'z', '3': 3, '4': 1, '5': 2, '10': 'z'},\n  ],\n};\n\nconst LandmarkList$json = const {\n  '1': 'LandmarkList',\n  '2': const [\n    const {'1': 'landmark', '3': 1, '4': 3, '5': 11, '6': '.mediapipe.Landmark', '10': 'landmark'},\n  ],\n};\n\nconst NormalizedLandmark$json = const {\n  '1': 'NormalizedLandmark',\n  '2': const [\n    const {'1': 'x', '3': 1, '4': 1, '5': 2, '10': 'x'},\n    const {'1': 'y', '3': 2, '4': 1, '5': 2, '10': 'y'},\n    const {'1': 'z', '3': 3, '4': 1, '5': 2, '10': 'z'},\n  ],\n};\n\nconst NormalizedLandmarkList$json = const {\n  '1': 'NormalizedLandmarkList',\n  '2': const [\n    const {'1': 'landmark', '3': 1, '4': 3, '5': 11, '6': '.mediapipe.NormalizedLandmark', '10': 'landmark'},\n  ],\n};\n\n"
  },
  {
    "path": "lib/gen/landmark.pbserver.dart",
    "content": "///\n//  Generated code. Do not modify.\n//  source: landmark.proto\n//\n// @dart = 2.3\n// ignore_for_file: camel_case_types,non_constant_identifier_names,library_prefixes,unused_import,unused_shown_name,return_of_invalid_type\n\nexport 'landmark.pb.dart';\n\n"
  },
  {
    "path": "protos/landmark.proto",
    "content": "// Copyright 2019 The MediaPipe Authors.\n//\n// Licensed under the Apache License, Version 2.0 (the \"License\");\n// you may not use this file except in compliance with the License.\n// You may obtain a copy of the License at\n//\n//      http://www.apache.org/licenses/LICENSE-2.0\n//\n// Unless required by applicable law or agreed to in writing, software\n// distributed under the License is distributed on an \"AS IS\" BASIS,\n// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n// See the License for the specific language governing permissions and\n// limitations under the License.\n\nsyntax = \"proto2\";\n\npackage mediapipe;\n\noption java_package = \"com.google.mediapipe.formats.proto\";\noption java_outer_classname = \"LandmarkProto\";\n\n// A landmark that can have 1 to 3 dimensions. Use x for 1D points, (x, y) for\n// 2D points and (x, y, z) for 3D points. For more dimensions, consider using\n// matrix_data.proto.\nmessage Landmark {\n  optional float x = 1;\n  optional float y = 2;\n  optional float z = 3;\n}\n\n// Group of Landmark protos.\nmessage LandmarkList {\n  repeated Landmark landmark = 1;\n}\n\n// A normalized version of above Landmark proto. All coordiates should be within\n// [0, 1].\nmessage NormalizedLandmark {\n  optional float x = 1;\n  optional float y = 2;\n  optional float z = 3;\n}\n\n// Group of NormalizedLandmark protos.\nmessage NormalizedLandmarkList {\n  repeated NormalizedLandmark landmark = 1;\n}\n"
  },
  {
    "path": "protos/regenerate.md",
    "content": "# Generate protobuf files in Dart\n1. If upgrading, delete all proto files from /home/.pub-cache/bin\n1. Clone the latest dart-protoc-plugin from https://github.com/dart-lang/protobuf\n1. Run `pub install` inside protobuf/protoc_plugin\n1. Run `pub global activate protoc_plugin` to get .dart files into /home/.pub-cache/bin/\n1. Get the latest linux protoc compiler from https://github.com/google/protobuf/releases/ (protoc-X.X.X-linux-x86_64.zip)\n1. Copy /bin/protoc into /home/.pub-cache/bin/\n1. Run the following commands from this project's protos folder\n\n    ```protoc --dart_out=../lib/gen ./landmark.proto```\n    \n    ```protoc --objc_out=../ios/gen ./landmark.proto```"
  },
  {
    "path": "pubspec.yaml",
    "content": "name: flutter_hand_tracking_plugin\ndescription: A new Flutter hand tracking plugin.\nversion: 0.0.1\nauthor:\nhomepage:\n\nenvironment:\n  sdk: \">=2.1.0 <3.0.0\"\n\ndependencies:\n  flutter:\n    sdk: flutter\n  protobuf: ^1.0.1\n\ndev_dependencies:\n  flutter_test:\n    sdk: flutter\n\n# For information on the generic Dart part of this file, see the\n# following page: https://dart.dev/tools/pub/pubspec\n\n# The following section is specific to Flutter.\nflutter:\n  # This section identifies this Flutter project as a plugin project.\n  # The androidPackage and pluginClass identifiers should not ordinarily\n  # be modified. They are used by the tooling to maintain consistency when\n  # adding or updating assets for this project.\n  plugin:\n    androidPackage: xyz.zhzh.flutter_hand_tracking_plugin\n    pluginClass: FlutterHandTrackingPlugin\n\n  # To add assets to your plugin package, add an assets section, like this:\n  # assets:\n  #  - images/a_dot_burr.jpeg\n  #  - images/a_dot_ham.jpeg\n  #\n  # For details regarding assets in packages, see\n  # https://flutter.dev/assets-and-images/#from-packages\n  #\n  # An image asset can refer to one or more resolution-specific \"variants\", see\n  # https://flutter.dev/assets-and-images/#resolution-aware.\n\n  # To add custom fonts to your plugin package, add a fonts section here,\n  # in this \"flutter\" section. Each entry in this list should have a\n  # \"family\" key with the font family name, and a \"fonts\" key with a\n  # list giving the asset and other descriptors for the font. 
For\n  # example:\n  # fonts:\n  #   - family: Schyler\n  #     fonts:\n  #       - asset: fonts/Schyler-Regular.ttf\n  #       - asset: fonts/Schyler-Italic.ttf\n  #         style: italic\n  #   - family: Trajan Pro\n  #     fonts:\n  #       - asset: fonts/TrajanPro.ttf\n  #       - asset: fonts/TrajanPro_Bold.ttf\n  #         weight: 700\n  #\n  # For details regarding fonts in packages, see\n  # https://flutter.dev/custom-fonts/#from-packages\n"
  }
]