Dataset for Vertical Jump Height Estimation from Depth Camera and Wearable Accelerometer Motion Data
Contributors
Contact person:
Data collectors:
Description
While vertical jump training benefits agility and performance in many amateur sports, objectively measuring jump height remains challenging compared to simpler assessments such as the broad jump distance in a sand pit. Aiming to estimate vertical jump height with easy-to-use and cost-efficient devices, we recorded a comprehensive dataset with an off-the-shelf depth camera and cost-efficient wearable motion sensors equipped with an onboard three-axis accelerometer. In our publication, we assessed the accuracy achievable at diverse fiducial positions: 7 skeletal joints from the depth camera and 10 wearing positions of the sensing devices. The user study was conducted with 44 subjects (33 male, 11 female, 23.1 ± 2.2 years), each performing five countermovement jumps. To gather ground truth information, a conventional digital camera documented the jumps and the vertical hip displacement along a measuring tape; Thales' theorem on proportionality was then applied to rectify the perspective displacement of the manual readings from the video footage.
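The rectification step above can be sketched with similar triangles: because the measuring tape sits at a different distance from the camera than the subject's hip, a raw reading off the tape is scaled by the ratio of the two distances. The following sketch is purely illustrative; all variable names and numeric values are assumptions, not values from the dataset or the publication.

```python
# Illustrative sketch of perspective rectification via Thales' (intercept)
# theorem. The distances below are hypothetical examples, not dataset values.

def rectify_reading(reading_mm: float,
                    dist_camera_tape_mm: float,
                    dist_camera_hip_mm: float) -> float:
    """Scale a raw tape reading to the hip plane using similar triangles."""
    return reading_mm * dist_camera_hip_mm / dist_camera_tape_mm

# Example: a 420 mm displacement read on a tape 3.1 m from the camera,
# with the hip plane 3.0 m away.
print(round(rectify_reading(420.0, 3100.0, 3000.0), 1))  # prints 406.5
```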
Context and Methodology
- Dataset for research on the estimation of vertical jump height and in adjacent fields such as human activity recognition
- The dataset provides recordings from two easy-to-use and cost-efficient sensing modalities: the off-the-shelf Microsoft Azure Kinect depth camera and 10 wearable 3-axis accelerometers
- The ground truth information was manually determined and rectified
- The dataset was used in a publication showing that the most accurate estimates from the depth camera were obtained from the pelvis and thoracic spine joints (errors of -15.8 ± 23.3 mm and 24.2 ± 35.1 mm), while the best estimates from the wearable motion sensor data were obtained from the neck position and the ankles (errors of 18.8 ± 29.0 mm and -4.8 ± 35.2 mm)
- We encourage researchers to improve on our findings, e.g., by applying advanced machine learning techniques on the provided dataset
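As a starting point for such improvements, a common baseline for jump height estimation from accelerometer data (not necessarily the method used in our publication) is the flight-time approach, h = g * t^2 / 8, where t is the airborne duration. The sketch below detects the flight phase as the longest near-free-fall run in the acceleration magnitude; the threshold value is an assumption for illustration.

```python
import numpy as np

def height_from_flight_time(t_flight_s: float, g: float = 9.81) -> float:
    """h = g * t^2 / 8 (ballistic flight, takeoff and landing at same height)."""
    return g * t_flight_s ** 2 / 8.0

def flight_time_from_accel(a_mag: np.ndarray, fs_hz: float,
                           freefall_thresh: float = 3.0) -> float:
    """Estimate flight time as the longest run where |a| is near free fall.

    a_mag is the accelerometer magnitude in m/s^2 (gravity included), so an
    ideal sensor in flight reads close to 0. The threshold is illustrative.
    """
    in_flight = a_mag < freefall_thresh
    best = run = 0
    for airborne in in_flight:          # longest run of consecutive True samples
        run = run + 1 if airborne else 0
        best = max(best, run)
    return best / fs_hz

# Synthetic example at the dataset's 100 Hz sampling rate: 0.4 s of free fall
a = np.concatenate([np.full(50, 9.81), np.full(40, 0.5), np.full(50, 9.81)])
t = flight_time_from_accel(a, fs_hz=100.0)
print(round(height_from_flight_time(t), 3))  # prints 0.196
```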
Technical Details
- A total of 220 recordings of countermovement jumps: 44 subjects × 5 jumps, captured with two sensing modalities and accompanied by manually determined and rectified ground truth
- 44 subjects: 33 male and 11 female with an average age of 23.1 ± 2.2 years
- Subjects gave written consent to provide the measurements for research purposes and publication (video recordings were deleted after ground truth determination)
- Depth camera: Microsoft Azure Kinect; 3D coordinates (x, y, and z) for all 32 joint positions recorded along with the corresponding timestamp; frame rate of 30 Hz
- Wearable motion sensors: 10 wearing positions (lower neck, chest (sternum), hips, thighs, ankles, and wrists, the latter four each left and right); 3-axis accelerometer data (x, y, and z) along with a timestamp for each sample; sampling rate of 100 Hz
- Two folders 'depthcamera' and 'wearables', each containing 44 subfolders labeled with the subject ids '01' to '44'
- Each subject's folder in turn contains 5 subfolders labeled with the jump ids 'j1' to 'j5', which hold the files of the countermovement jump recordings
- The recordings are provided both as Python pickle files (*.p) and as comma-separated value files (*.csv, ';' as separator); file names are composed of subject id and jump id for the *.p files, e.g., 's05_j3.p', plus the joint or wearing position for the *.csv files, e.g., 's05_j3_hip_left.csv'
- The pickle files have been tested successfully with Python 3.13.1 and NumPy 2.2.2
- Unfortunately, for subject 02, the accelerometer data of the neck and chest positions were not recorded successfully; hence, the associated lists in the *.p and *.csv files are empty
- Demographic information: a summary of the individual subjects' demographic information is available in the files 'subjects.p' and 'subjects.csv', providing the gender (female or male), age in years, height in cm, weight in kg, whether they were recruited via the lecture, and whether they were a student and, if so, of which degree programme
- Ground truth: the manually determined and rectified ground truth information is provided in the files 'groundtruth.p' and 'groundtruth.csv', associated with the individual subject ids and providing the jump number, jump height, and whether the jump is considered an overall well-executed jump
- The dataset description at hand is also provided in the 'README.txt' file of the dataset's *.zip file
Files
TUW_2024-JumpMetric_-_Vertical_Jump_Height_Estimation.zip
Additional details
Related works
- Is described by
- Conference Paper: 10.1145/3701571.3701607 (DOI)