steb6 / ISBFSAR

Interactive Skeleton Based Few Shot Action Recognition

Interactive Open-Set Skeleton-Based One-Shot Action-Recognition


The aim of this project is to provide an efficient pipeline for Action Recognition in Human-Robot Interaction.

The full 3D human pose is estimated and used to recognize which action in the support set the human is performing. Actions can be added to or removed from the support set at any moment. The Open-Set score confirms or rejects the Few-Shot prediction to avoid false positives. The Mutual Gaze Constraint can be attached to an action as an additional filter. A visualizer is also included.
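As a rough sketch of the decision logic described above (all function and variable names here are illustrative assumptions, not the repository's API), a query skeleton embedding is compared against one embedding per support-set action, and the best match is kept only if its score clears an open-set threshold:

```python
# Illustrative sketch only: few-shot matching with open-set rejection.
# Names, shapes, and the threshold are assumptions, not the repo's API.
import torch
import torch.nn.functional as F

def recognize(query_emb: torch.Tensor,      # (dim,) embedding of the observed skeleton sequence
              support_embs: torch.Tensor,   # (num_actions, dim), one embedding per support action
              action_names: list,           # names of the actions in the support set
              os_threshold: float = 0.5):
    # Few-shot step: cosine similarity against every action in the support set.
    sims = F.cosine_similarity(query_emb.unsqueeze(0), support_embs, dim=1)
    best = int(sims.argmax())
    # Open-set step: accept the prediction only if the score is high enough,
    # otherwise reject it to avoid a false positive.
    if sims[best].item() < os_threshold:
        return None  # no known action recognized
    return action_names[best]
```

In a setup like this, adding or removing an action only means adding or removing one row of support_embs together with its name, which is what makes the support set editable at runtime.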

Modules

This repository contains different modules, including:

- hpe: Human Pose Estimation, accelerated with TensorRT
- ar: skeleton-based Few-Shot / Open-Set Action Recognition
- focus: gaze estimation for the Mutual Gaze Constraint

Installation

The program is divided into two parts: the Human Pose Estimation module, whose engines must be built natively on the target machine, and the rest of the pipeline, which runs with Docker.

Since the hpe module is accelerated with TensorRT engines that must be built on the target machine, the engine build is provided through the Dockerfile, which allows for a fast installation. Check here for the instructions to install the Human Pose Estimation module.
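For context, TensorRT engines are serialized for a specific GPU and TensorRT version, which is why they have to be built on the machine where they will run. As a minimal sketch (the file name below is a placeholder, not a path from this repository), a prebuilt engine is loaded with TensorRT's Python bindings like this:

```python
# Minimal sketch: deserializing a prebuilt TensorRT engine.
# "hpe.engine" is a placeholder file name, not a path from this repository.
import tensorrt as trt

logger = trt.Logger(trt.Logger.WARNING)
runtime = trt.Runtime(logger)
with open("hpe.engine", "rb") as f:
    engine = runtime.deserialize_cuda_engine(f.read())
context = engine.create_execution_context()  # used to run inference on the GPU
```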

Run with Docker

Follow the instructions inside the README.md of every module: hpe, ar, and focus. Install Vispy and pyrealsense2, then build the Docker image with:

docker build -t ecub .

To run, start two separate processes:

python manager.py

python source.py
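Purely as an illustration of what the frame-producing side of such a two-process setup can look like (this is not the repository's source.py; every name below is an assumption), pyrealsense2, which is listed among the dependencies, streams RGB frames like this:

```python
# Hypothetical frame-grabbing loop, NOT the repository's source.py:
# it only shows how pyrealsense2 streams RGB frames from a RealSense camera.
import numpy as np
import pyrealsense2 as rs

pipeline = rs.pipeline()
config = rs.config()
config.enable_stream(rs.stream.color, 640, 480, rs.format.bgr8, 30)
pipeline.start(config)

try:
    while True:
        frames = pipeline.wait_for_frames()
        color = frames.get_color_frame()
        if not color:
            continue
        frame = np.asanyarray(color.get_data())  # HxWx3 BGR image
        # ... hand the frame to the rest of the pipeline here ...
finally:
    pipeline.stop()
```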

Launch the main script with the following command (replace PATH with %cd% on Windows or $(pwd) on Ubuntu):

docker run -it --rm --gpus=all -v "PATH":/home/ecub ecub:latest python main.py