mmWave-gesture-dataset

OVERVIEW


To the best of our knowledge, this is the first mmWave gesture dataset, and it offers several advantages, described in the sections below.

In the following, we introduce the operating principle of mmWave radar, the advantages of mmWave sensing, and the implementation details of this dataset.

RADAR PRINCIPLE

We capture human gestures using FMCW (frequency-modulated continuous-wave) signals, a mature technique in the field of radar detection. FMCW radar can recover information about reflection points, such as their position and velocity, so it can record the characteristic patterns of different gestures by converting the signals reflected from different parts of the hand into motion information in three-dimensional space. Specifically, the radar obtains the distance of a reflector by mixing the received signal with the transmitted chirp and measuring the resulting beat frequency; it then obtains the velocity of the point at that distance by analyzing the phase change of its signal across consecutive chirps. Finally, angle information is obtained from the phase differences between the signals received at different receive antennas (Rx) arranged in a two-dimensional array.
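The distance step described above can be sketched numerically: after mixing, a reflector at range R produces a beat tone at frequency 2·R·S/c (S being the chirp slope), which a range FFT turns into a peak whose bin maps back to distance. The radar parameters below are illustrative placeholders, not the actual configuration of this dataset.

```python
import numpy as np

# Hedged sketch of FMCW range estimation. All parameters are assumed
# for illustration; they are not the dataset's radar settings.
c = 3e8       # speed of light (m/s)
S = 30e12     # chirp slope (Hz/s), assumed
fs = 2e6      # ADC sampling rate (Hz), assumed
N = 256       # ADC samples per chirp

# Simulate the mixed ("beat") signal for one reflector at range R.
R = 2.0                   # true target range (m)
f_beat = 2 * R * S / c    # beat frequency encodes range
t = np.arange(N) / fs
beat = np.exp(2j * np.pi * f_beat * t)

# Range FFT: the peak bin maps back to distance.
spectrum = np.abs(np.fft.fft(beat))
peak_bin = int(np.argmax(spectrum[: N // 2]))
R_est = peak_bin * fs / N * c / (2 * S)
print(R_est)  # close to the true 2.0 m, up to FFT bin quantization
```

Velocity would follow from a second FFT across chirps at this range bin, and angle from the same phase comparison applied across Rx antennas.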

ADVANTAGES of MMWAVE

Potential in 5G era

As is well known, the emerging 5G technology relies on mmWave frequency bands. Due to their short wavelength, mmWaves have two advantages for sensing the surrounding environment: high precision (cm-level ranging) and fine sensitivity (the ability to detect movements at the sub-millimeter level). Building on these advantages, recent work has not only "seen" indoor wall structures through a commercial 5G network card, but also "recognized" different human movements with a dedicated FMCW radar. Notably, the Google Pixel 4 smartphone is equipped with a mmWave FMCW radar module that recognizes relatively simple in-air gestures to control the music player and other functions. This shows that, beyond high-speed transmission, mmWave signals are gradually demonstrating powerful sensing capabilities in the 5G era. However, according to current research, some problems still urgently need to be solved, such as concurrent gestures and power consumption.

"See-through" ability

mmWave signals remain usable when blocked by non-conductive materials (such as cloth or plastic), which is determined by the physical nature of mmWaves: when the signal penetrates these objects, the materials only attenuate its amplitude and consume some of its energy, without affecting the signal pattern that records a given action. In other words, as long as a certain signal strength is maintained, blocking by these materials does not affect the ability of mmWave radar to capture motion. In addition, compared with cameras and infrared 3D structured light of similar recognition accuracy, mmWave radar cannot form fine-grained images of the detected objects, which protects personal privacy and supports safe use. Furthermore, compared with recognition techniques based on audible sound, ultrasound, and Wi-Fi-band signals, mmWave FMCW radar offers more accurate detection and can therefore recognize more delicate human actions.

Robustness to the surrounding environment

Compared with audible sound, ultrasound, and Wi-Fi-band signals, mmWave signals make it easy to eliminate the adverse effects of static objects in the surrounding environment.

First, there is no need to worry about reflections from static objects: after mixing, the FMCW signal reflected by a static object produces a constant beat frequency that does not change over time, so it is easily eliminated by simply subtracting adjacent time slices.
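The "subtraction of adjacent time slices" idea can be sketched as follows: a static reflector contributes an identical beat signal in every chirp, so subtracting consecutive chirps cancels it, while a moving target, whose phase changes from chirp to chirp, survives. The signal model and numbers here are illustrative, not taken from the dataset's processing chain.

```python
import numpy as np

N, M = 128, 16                 # samples per chirp, chirps per frame (assumed)
t = np.arange(N) / N

# Static clutter: the same tone (range bin 10) in every chirp.
static = np.exp(2j * np.pi * 10 * t)
frame = np.tile(static, (M, 1))

# Moving target: a weaker tone (range bin 25) whose phase advances
# from chirp to chirp due to Doppler.
phases = np.exp(1j * 0.5 * np.arange(M))[:, None]
frame = frame + 0.2 * np.exp(2j * np.pi * 25 * t) * phases

# Adjacent-chirp subtraction: static clutter cancels exactly.
filtered = frame[1:] - frame[:-1]

# The surviving spectrum peaks at the moving target's range bin.
spec = np.abs(np.fft.fft(filtered, axis=1)).mean(axis=0)
print(int(np.argmax(spec[: N // 2])))  # 25, the moving target's bin
```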

Second, there is also no need to worry about signal effects caused by multiple reflections. By basic electromagnetic physics, the attenuation of an electromagnetic wave along its propagation path is proportional to the square of the transmission distance. Therefore, after the signal reflected by a dynamic object has bounced off static objects several times, its energy is severely reduced, not to mention that each reflection itself causes serious additional attenuation (especially for mmWaves). As a result, these very weak "multipath reflections" are easily submerged in the general noise floor. To ensure they are excluded, we also applied the mature CFAR-CASO algorithm from the field of signal processing to filter out the weak effects of multipath reflections.
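CFAR-CASO belongs to the cell-averaging CFAR family, whose basic form can be sketched in a few lines: each cell is compared against a threshold scaled from the average power of its neighboring training cells, so weak multipath echoes near the noise floor are rejected while strong targets pass. This is a generic CA-CFAR sketch with illustrative window sizes, not the CASO variant or the parameters used for this dataset.

```python
import numpy as np

def ca_cfar(power, guard=2, train=8, scale=5.0):
    """Minimal 1-D cell-averaging CFAR sketch (illustrative parameters)."""
    n = len(power)
    hits = []
    for i in range(train + guard, n - train - guard):
        # Training cells on both sides of i, excluding the guard cells.
        left = power[i - guard - train : i - guard]
        right = power[i + guard + 1 : i + guard + 1 + train]
        noise = (left.sum() + right.sum()) / (2 * train)
        if power[i] > scale * noise:
            hits.append(i)
    return hits

rng = np.random.default_rng(0)
power = rng.exponential(1.0, 200)   # noise floor
power[60] = 40.0                    # strong target: detected
print(ca_cfar(power))
```

The CASO ("smallest-of") variant differs only in taking the smaller of the two one-sided averages instead of their combined mean, which behaves better near clutter edges.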

DATASET IMPLEMENTATION

In this mmWave gesture dataset, we utilize the TI-IWR1443 single-chip 76-GHz to 81-GHz mmWave sensor evaluation module.


Since the gestures were recorded in two scenarios, we set different radar parameters for each scene, as follows:

| Radar parameter | short range | long range |
| --- | --- | --- |
| Rx channels | 4 | 4 |
| Tx channels | 2 | 3 |
| chirp cycle time | 400 μs | 158 μs |
| ADC sampling rate | 2 kHz | 7.5 kHz |
| Rx gain | 48 dB | 42 dB |
| frame periodicity | 55 ms | 100 ms |
| dynamic point detection | CFAR-CA | CFAR-CASO |
| point energy threshold | 1200 dB | 1280 dB |
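As a back-of-the-envelope check, the chirp cycle times above bound the maximum unambiguous radial velocity via v_max = λ/(4·Tc). The carrier wavelength is assumed here to be that of a 77 GHz carrier (mid-band of the IWR1443's 76-81 GHz range); these derived numbers are not stated in the dataset description and ignore any effect of TDM across the Tx channels.

```python
# Derived, not measured: implied max unambiguous velocity per scenario,
# assuming a 77 GHz carrier for the IWR1443 (76-81 GHz band).
c = 3e8
wavelength = c / 77e9  # ~3.9 mm

for name, Tc in [("short range", 400e-6), ("long range", 158e-6)]:
    v_max = wavelength / (4 * Tc)
    print(f"{name}: v_max = {v_max:.2f} m/s")
```

Under these assumptions, the short-range configuration trades velocity headroom (~2.4 m/s) for the longer chirp, while the long-range configuration covers faster motion (~6.2 m/s).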

Short range scenario

Specifically, in the short-range scenario, as in M-Gesture, we invited 131 people (60 men and 71 women) to perform five groups of gestures, 30,360 traces in total. Besides the predesignated gestures, a number of unexpected motions are also provided, e.g. finger motions and the writing of English letters.

The dataset is divided into four parts, as follows:

Long range scenario


In the long-range scenario, we invited 100 people (44 men and 56 women) to perform the same five groups of gestures (i.e. left swiping, right swiping, knocking, rotating, and some unexpected motions), 26,060 traces in total.

Example demonstration

Before performing the gestures, volunteers were asked to watch an example demonstration of each gesture; these demonstrations are available in the folder "/gesture_dataset".

How to Cite?

H. Liu et al., "M-Gesture: Person-Independent Real-Time In-Air Gesture Recognition Using Commodity Millimeter Wave Radar," in IEEE Internet of Things Journal, doi: 10.1109/JIOT.2021.3098338.