Version 3 2021-06-03, 14:15
Version 1 2020-08-13, 00:00
Dataset
Posted on 2021-06-03, 14:15, authored by Jack Geissinger, Alan Asbeck, Mohammad Mehdi Alemi, S. Emily Chang
The Virginia Tech Natural Motion Dataset contains 40 hours of unscripted human motion (full-body kinematics) collected in the open world using an XSens MVN Link system. In total, there are data from 17 participants (13 on a college campus and 4 at a home improvement store). Participants performed a wide variety of activities, including: walking from one place to another; operating machinery; talking with others; manipulating objects; working at a desk; driving; eating; pushing and pulling carts and dollies; physical exercises such as jumping jacks, jogging, and push-ups; sweeping; vacuuming; and emptying a dishwasher. The code for processing and analyzing the data is freely available with this dataset and also at: https://github.com/ARLab-VT/VT-Natural-Motion-Processing. The portion of the dataset involving workers was funded by Lowe's, Inc.
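The authors' own processing tools live in the linked VT-Natural-Motion-Processing repository. As a quick orientation, the sketch below shows one way to inspect a raw XSens MVN export, assuming it is provided in the standard MVNX (XML) format; the file name is hypothetical, and the actual dataset files and layout may differ from this assumption.

```python
import xml.etree.ElementTree as ET

# Hypothetical file name; substitute an MVNX export from the dataset.
MVNX_PATH = "participant_01_session_01.mvnx"

def summarize_mvnx(path):
    """Print basic information about an XSens MVNX export (XML format)."""
    tree = ET.parse(path)
    root = tree.getroot()

    # MVNX files are namespaced XML; strip namespaces so tag names are easy to match.
    def local(tag):
        return tag.split("}", 1)[-1]

    segments = [el for el in root.iter() if local(el.tag) == "segment"]
    frames = [el for el in root.iter() if local(el.tag) == "frame"]

    print(f"body segments: {len(segments)}")
    print(f"motion frames: {len(frames)}")
    if frames:
        # Each frame typically carries timing metadata in its attributes.
        print(f"first frame attributes: {frames[0].attrib}")

if __name__ == "__main__":
    summarize_mvnx(MVNX_PATH)
```

This is only a first-look utility under the stated format assumption; for working with the dataset as intended, refer to the processing scripts in the GitHub repository above.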