Gait3D is a large-scale dataset for 3D representation-based gait recognition. It contains 4,000 subjects and over 25,000 sequences captured by 39 cameras in an unconstrained indoor scene.
35 PAPERS • 2 BENCHMARKS
CASIA-B is a large multi-view gait database created in January 2005. It contains 124 subjects, and the gait data was captured from 11 views. Three variations, namely view angle, clothing, and carrying condition, are considered separately. Besides the video files, human silhouettes extracted from the videos are also provided. Detailed information about Dataset B and an evaluation framework can be found in the accompanying paper. A minimal loading sketch follows this entry.
15 PAPERS • 1 BENCHMARK
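The CASIA-B silhouettes are commonly distributed as per-frame images grouped by subject, walking condition, and view. The sketch below enumerates sequences under that assumed layout; the root path, directory names, and file extension are illustrative assumptions, not details taken from the official release.

```python
import os

# Minimal sketch for enumerating CASIA-B silhouette sequences, assuming the
# commonly distributed layout root/<subject>/<condition-seq>/<view>/ with
# per-frame PNG silhouettes inside (e.g. 001/nm-01/090/...). All paths and
# naming conventions here are assumptions.
def list_casia_b_sequences(root):
    sequences = []
    for subject in sorted(os.listdir(root)):               # e.g. "001" .. "124"
        subject_dir = os.path.join(root, subject)
        if not os.path.isdir(subject_dir):
            continue
        for condition in sorted(os.listdir(subject_dir)):  # e.g. "nm-01", "bg-01", "cl-01"
            condition_dir = os.path.join(subject_dir, condition)
            if not os.path.isdir(condition_dir):
                continue
            for view in sorted(os.listdir(condition_dir)): # e.g. "000" .. "180"
                seq_dir = os.path.join(condition_dir, view)
                frames = sorted(f for f in os.listdir(seq_dir) if f.endswith(".png"))
                sequences.append((subject, condition, view, frames))
    return sequences

# Usage (hypothetical path):
# seqs = list_casia_b_sequences("/data/CASIA-B/silhouettes")
# print(len(seqs), "sequences found")
```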
The USF Human ID Gait Challenge Dataset is a dataset of videos for gait recognition. It contains videos of 122 subjects recorded under up to 32 possible combinations of variation factors.
10 PAPERS • NO BENCHMARKS YET
TUM-GAID (TUM Gait from Audio, Image and Depth) contains 305 subjects performing two walking trajectories in an indoor environment. The first trajectory is traversed from left to right and the second from right to left. Two recording sessions were performed: one in January, where subjects wore heavy jackets and mostly winter boots, and one in April, where subjects wore lighter clothes. The walks were captured with a Microsoft Kinect sensor, which provides a video stream at a resolution of 640×480 pixels and a frame rate of roughly 30 fps.
9 PAPERS • NO BENCHMARKS YET
The OU-ISIR Gait Database, Multi-View Large Population Dataset (OU-MVLP) is meant to aid research efforts in the general area of developing, testing and evaluating algorithms for cross-view gait recognition. The Institute of Scientific and Industrial Research (ISIR), Osaka University (OU) has copyright in the collection of gait video and associated data and serves as a distributor of the OU-ISIR Gait Database.
7 PAPERS • 1 BENCHMARK
Psychological trait estimation from external factors such as movement and appearance is a challenging and long-standing problem in psychology, principally based on the psychological theory of embodiment. To date, attempts to tackle this problem have used private, small-scale datasets with intrusive body-attached sensors. Potential applications of an automated system for psychological trait estimation include estimation of occupational fatigue and psychology, as well as marketing and advertisement. PsyMo (Psychological traits from Motion) is a novel, multi-purpose and multi-modal dataset for exploring psychological cues manifested in walking patterns. It gathers walking sequences from 312 subjects in 7 different walking variations and 6 camera angles. In conjunction with the walking sequences, participants filled in 6 psychological questionnaires, totalling 17 psychometric attributes related to personality, self-esteem, fatigue, aggressiveness and mental health. A sketch of a per-sample record follows this entry.
2 PAPERS • NO BENCHMARKS YET
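Since each PsyMo walking sequence is paired with subject-level psychometric scores, a per-sample record might look like the following sketch. All field names, value types, and score keys are hypothetical and do not reflect the dataset's official annotation format.

```python
from dataclasses import dataclass, field
from typing import Dict, List

# Hypothetical record for one PsyMo sample: a walking sequence plus the
# subject-level psychometric scores. Field names and score keys are
# illustrative assumptions only.
@dataclass
class PsyMoSample:
    subject_id: int                      # one of the 312 subjects
    walking_variation: str               # one of the 7 walking variations
    camera_angle: int                    # one of the 6 camera angles
    frame_paths: List[str] = field(default_factory=list)
    psychometric_scores: Dict[str, float] = field(default_factory=dict)  # up to 17 attributes

# Example construction with made-up values:
sample = PsyMoSample(
    subject_id=17,
    walking_variation="normal",
    camera_angle=90,
    frame_paths=["frames/00001.png", "frames/00002.png"],
    psychometric_scores={"self_esteem": 0.62, "fatigue": 0.31},
)
print(sample.subject_id, len(sample.psychometric_scores))
```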
CCGR (Cross-Covariate Gait Recognition), the first gait dataset for studying cross-covariate challenges, contains 970 subjects, approximately 1.6 million sequences, 53 covariates, and 33 views.
1 PAPER • NO BENCHMARKS YET
Gait3D-Parsing is a dataset for gait recognition in the wild. It is an extension of the large-scale and challenging Gait3D dataset, which was collected in an in-the-wild environment. The training set has 3,000 IDs and the test set has 1,000 IDs; 1,000 sequences from the test set are taken as the query set, and the remaining test sequences form the gallery set. A minimal sketch of this query/gallery protocol follows this entry.
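A minimal sketch of the query/gallery retrieval protocol described above, assuming per-sequence embeddings have already been extracted by some gait model (the model itself is outside the sketch): each query is matched against all gallery sequences and counted correct at rank-1 if its nearest gallery neighbour shares the subject ID.

```python
import numpy as np

# Rank-1 retrieval accuracy over a query/gallery split, given precomputed
# sequence embeddings and their subject IDs. How the embeddings are produced
# is an assumption; this only illustrates the evaluation step.
def rank1_accuracy(query_feats, query_ids, gallery_feats, gallery_ids):
    query_ids = np.asarray(query_ids)
    gallery_ids = np.asarray(gallery_ids)
    # L2-normalise so Euclidean distance behaves like cosine distance.
    q = query_feats / np.linalg.norm(query_feats, axis=1, keepdims=True)
    g = gallery_feats / np.linalg.norm(gallery_feats, axis=1, keepdims=True)
    dists = np.linalg.norm(q[:, None, :] - g[None, :, :], axis=2)  # (num_query, num_gallery)
    nearest = dists.argmin(axis=1)
    return float(np.mean(gallery_ids[nearest] == query_ids))

# Toy usage with random features and IDs:
rng = np.random.default_rng(0)
qf, gf = rng.normal(size=(5, 64)), rng.normal(size=(20, 64))
qi, gi = rng.integers(0, 4, size=5), rng.integers(0, 4, size=20)
print("rank-1:", rank1_accuracy(qf, qi, gf, gi))
```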
The OU-ISIR Gait Database, Multi-View Large Population Database with Pose Sequence (OUMVLP-Pose) is meant to aid research efforts in the general area of developing, testing and evaluating algorithms for model-based gait recognition.