MediaPipe Hand-tracking Usage Notes


  • Blog: Face and hand tracking in the browser with MediaPipe and TensorFlow.js
  • Blog (Chinese): Google's TensorFlow team and MediaPipe jointly release two packages for tracking facial and hand features
  • Blog: On-Device, Real-Time Hand Tracking with MediaPipe
  • Web demo: https://storage.googleapis.com/tfjs-models/demos/handpose/index.html
  • Source code: https://github.com/tensorflow/tfjs-models/tree/master/handpose

Related documentation

  • Building MediaPipe Examples
  • hand_tracking_mobile_gpu.md
  • multi_hand_tracking_mobile_gpu.md
  • MediaPipe
  • tensorflow/tfjs-models

Environment setup

  • OS: macOS Catalina version 10.15.5

bazel

$ brew install bazel
$ bazel --version
$ brew upgrade bazel

Reference: Installing Bazel on macOS

opencv

1. Install OpenCV

$ brew install opencv  # installs under /usr/local/Cellar/opencv

Reference: Installing OpenCV on macOS with Homebrew and the pitfalls involved (in Chinese)

2. fatal error: opencv2/core/version.hpp: No such file or directory

Fix by symlinking the opencv4 header directory into the expected location (adjust 4.3.0_1 to the installed version):

ln -s /usr/local/Cellar/opencv/4.3.0_1/include/opencv4/opencv2 /usr/local/Cellar/opencv/4.3.0_1/include/opencv2

Then edit ./mediapipe/third_party/opencv_macos.BUILD:

# Description:
#   OpenCV libraries for video/image processing on MacOS

licenses(["notice"])  # BSD license

exports_files(["LICENSE"])

# The following build rule assumes that OpenCV is installed by
# 'brew install opencv@3' command on macos.
# If you install OpenCV separately, please modify the build rule accordingly.
cc_library(
    name = "opencv",
    srcs = glob(
        [
            # "local/opt/opencv@3/lib/libopencv_core.dylib",
            # "local/opt/opencv@3/lib/libopencv_calib3d.dylib",
            # "local/opt/opencv@3/lib/libopencv_features2d.dylib",
            # "local/opt/opencv@3/lib/libopencv_highgui.dylib",
            # "local/opt/opencv@3/lib/libopencv_imgcodecs.dylib",
            # "local/opt/opencv@3/lib/libopencv_imgproc.dylib",
            # "local/opt/opencv@3/lib/libopencv_video.dylib",
            # "local/opt/opencv@3/lib/libopencv_videoio.dylib",

            "local/opt/opencv@4/lib/libopencv_core.dylib",
            "local/opt/opencv@4/lib/libopencv_calib3d.dylib",
            "local/opt/opencv@4/lib/libopencv_features2d.dylib",
            "local/opt/opencv@4/lib/libopencv_highgui.dylib",
            "local/opt/opencv@4/lib/libopencv_imgcodecs.dylib",
            "local/opt/opencv@4/lib/libopencv_imgproc.dylib",
            "local/opt/opencv@4/lib/libopencv_video.dylib",
            "local/opt/opencv@4/lib/libopencv_videoio.dylib",
            
        ],
    ),
    # hdrs = glob(["local/opt/opencv@3/include/opencv2/**/*.h*"]),
    # includes = ["local/opt/opencv@3/include/"],
    hdrs = glob(["local/opt/opencv@4/include/opencv2/**/*.h*"]),
    includes = ["local/opt/opencv@4/include/"],
    linkstatic = 1,
    visibility = ["//visibility:public"],
)

Reference: OpenCV 2 headers not found on Arch Linux with OpenCV 4 #496

Usage

Option 1: Running on CPU

hand_tracking_cpu
$ bazel build -c opt --define MEDIAPIPE_DISABLE_GPU=1 mediapipe/examples/desktop/hand_tracking:hand_tracking_cpu
# $ GLOG_logtostderr=1 bazel-bin/mediapipe/examples/desktop/hand_tracking/hand_tracking_cpu --calculator_graph_config_file=mediapipe/graphs/hand_tracking/hand_tracking_desktop_live.pbtxt --input_video_path=
$ GLOG_logtostderr=1 bazel-bin/mediapipe/examples/desktop/hand_tracking/hand_tracking_cpu --calculator_graph_config_file=mediapipe/graphs/hand_tracking/hand_tracking_desktop_live.pbtxt --input_video_path=/Users/snorlaxse/Desktop/multi-hand-demo.mp4 --output_video_path=/Users/snorlaxse/Desktop/multi-hand-demo-output.mp4

The first build downloads some dependencies; subsequent builds do not.
After the build finishes, four symlink folders (alias → origin) appear in the project root:

1. bazel-bin: /private/var/tmp/_bazel_snorlaxse/8b6206b871f2e541142be86f99764a24/execroot/mediapipe/bazel-out/darwin-opt/bin
2. bazel-mediapipe-master: /private/var/tmp/_bazel_snorlaxse/8b6206b871f2e541142be86f99764a24/execroot/mediapipe
3. bazel-out: /private/var/tmp/_bazel_snorlaxse/8b6206b871f2e541142be86f99764a24/execroot/mediapipe/bazel-out
4. bazel-testlogs: /private/var/tmp/_bazel_snorlaxse/8b6206b871f2e541142be86f99764a24/execroot/mediapipe/bazel-out/darwin-opt/testlogs

Reference: Macos Catalina Hand tracking · Issue #477 · google/mediapipe

multi_hand_tracking_cpu
$ bazel build -c opt --define MEDIAPIPE_DISABLE_GPU=1 mediapipe/examples/desktop/multi_hand_tracking:multi_hand_tracking_cpu
$ GLOG_logtostderr=1 bazel-bin/mediapipe/examples/desktop/multi_hand_tracking/multi_hand_tracking_cpu --calculator_graph_config_file=mediapipe/graphs/hand_tracking/multi_hand_tracking_desktop_live.pbtxt --input_video_path=<input_video_path> --output_video_path=<output_video_path>
  

Attempting to run detection on a still image:

$ GLOG_logtostderr=1 bazel-bin/mediapipe/examples/desktop/multi_hand_tracking/multi_hand_tracking_cpu --calculator_graph_config_file=mediapipe/graphs/hand_tracking/multi_hand_tracking_desktop_live.pbtxt --input_video_path=/Users/snorlaxse/Desktop/A-1.JPG --output_video_path=/Users/snorlaxse/Desktop/A-1-output.JPG

Doesn't work…

Option 2: Running on GPU

Not tried…

Extensions

  • Accessing landmarks, tracking multiple hands, and enabling depth on desktop #200

    @JECBello For the C++ desktop example, what you described is the correct way to get the result protos.
    I have created an example for Object Detection desktop CPU showing how to print out the detection protos:
    https://github.com/mgyong/mediapipe-issue200
    I copied demo_run_graph_main.cc into demo_run_graph_main_out.cc and:

    1. Created a listener for the detection stream
    2. Polled the detection stream kDetectionsStream using Next() for the detection packet
    3. Loaded detection_packet into the variable output_detections
    4. In my case, the output packet is of type std::vector<::mediapipe::Detection>. In your case, it should be the landmark proto

    Once you have an example of how to do this for hand tracking, we would appreciate it if you could share your implementation with the community.
    A. For multi-hands, we are working on an example that will hopefully be available in early-to-mid November.
    B. Can you create a separate issue on desktop implementations of hand tracking capturing depth (similar to how the Android/iOS 3D builds can output z coordinates)?

    (See the first sketch after this list for a minimal version of this polling approach adapted to hand landmarks.)

  • Extracting landmarks #703

    Call CalculatorGraph::ObserveOutputStream to register a packet callback, then use Packet::Get (i.e. Get<T>) to read the data.
    If you want to interfere with the graph's execution, write a calculator node and add it to the graph pbtxt.
    (A callback-based sketch is included after this list.)
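Below is a minimal sketch of the polling approach described in issue #200, adapted to hand landmarks. It is not the official example: the output stream name "multi_hand_landmarks" and the packet type std::vector<mediapipe::NormalizedLandmarkList> are assumptions that must be checked against the graph pbtxt you run (e.g. multi_hand_tracking_desktop_live.pbtxt), and the function would be wired into a copy of demo_run_graph_main.cc where the CalculatorGraph is already initialized.

// Sketch only: poll hand-landmark packets from a running CalculatorGraph.
// Stream name and packet type below are assumptions; adjust to your graph.
#include <vector>

#include "mediapipe/framework/calculator_framework.h"
#include "mediapipe/framework/formats/landmark.pb.h"
#include "mediapipe/framework/port/logging.h"
#include "mediapipe/framework/port/status.h"

::mediapipe::Status PrintHandLandmarks(mediapipe::CalculatorGraph* graph) {
  // Register the poller before StartRun, analogous to kDetectionsStream
  // in demo_run_graph_main_out.cc.
  ASSIGN_OR_RETURN(mediapipe::OutputStreamPoller poller,
                   graph->AddOutputStreamPoller("multi_hand_landmarks"));
  MP_RETURN_IF_ERROR(graph->StartRun({}));

  mediapipe::Packet packet;
  while (poller.Next(&packet)) {  // blocks until the next packet or stream end
    const auto& hands =
        packet.Get<std::vector<mediapipe::NormalizedLandmarkList>>();
    for (size_t h = 0; h < hands.size(); ++h) {
      for (const auto& lm : hands[h].landmark()) {
        LOG(INFO) << "hand " << h << " landmark: x=" << lm.x()
                  << " y=" << lm.y() << " z=" << lm.z();
      }
    }
  }
  return graph->WaitUntilDone();
}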
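And a sketch of the callback route from issue #703, under the same assumptions about the stream name and packet type. Like the poller, ObserveOutputStream must be registered before StartRun; the fragment is meant to be dropped into the graph-setup code where a local CalculatorGraph named graph exists.

// Alternative sketch: register a callback instead of polling.
MP_RETURN_IF_ERROR(graph.ObserveOutputStream(
    "multi_hand_landmarks",  // assumed stream name, match your pbtxt
    [](const mediapipe::Packet& packet) -> ::mediapipe::Status {
      const auto& hands =
          packet.Get<std::vector<mediapipe::NormalizedLandmarkList>>();
      LOG(INFO) << "got " << hands.size() << " hand(s) at timestamp "
                << packet.Timestamp().Value();
      return ::mediapipe::OkStatus();
    }));
MP_RETURN_IF_ERROR(graph.StartRun({}));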

Multi-hand landmark output

  • Multi-Hand Tracking via Live Webcam on CPU on Desktop: shows how to extract landmarks on desktop.

bazel build -c opt --define MEDIAPIPE_DISABLE_GPU=1 mediapipe/examples/desktop/multi_hand_tracking:multi_hand_tracking_cpu
# With --calculator_graph_config_file (fails, see the note below):
# GLOG_logtostderr=1 bazel-bin/mediapipe/examples/desktop/multi_hand_tracking/multi_hand_tracking_cpu --calculator_graph_config_file=mediapipe/graphs/hand_tracking/multi_hand_tracking_desktop_live.pbtxt --input_video_path=/Users/snorlaxse/Desktop/multi-hand-demo.mp4 --output_video_path=/Users/snorlaxse/Desktop/multi-hand-demo-output.mp4
GLOG_logtostderr=1 bazel-bin/mediapipe/examples/desktop/multi_hand_tracking/multi_hand_tracking_cpu --input_video_path=/Users/snorlaxse/Desktop/multi-hand-demo.mp4 --output_video_path=/Users/snorlaxse/Desktop/multi-hand-demo-output.mp4

  • Passing the graph config to this binary fails with "ERROR: unknown command line flag 'calculator_graph_config_file'", so the flag is dropped in the command above.


Final note: if this article helped you, please give it a like ٩(๑•̀ω•́๑)۶
