Introducing HOT3D: an Egocentric Dataset for 3D Hand and Object Tracking

arXiv (Cornell University), 2024

Abstract
We introduce HOT3D, a publicly available dataset for egocentric hand and object tracking in 3D. The dataset offers over 833 minutes (more than 3.7M images) of multi-view RGB/monochrome image streams showing 19 subjects interacting with 33 diverse rigid objects, multi-modal signals such as eye gaze or scene point clouds, as well as comprehensive ground-truth annotations including 3D poses of objects, hands, and cameras, and 3D models of hands and objects. In addition to simple pick-up/observe/put-down actions, HOT3D contains scenarios resembling typical actions in kitchen, office, and living-room environments. The dataset is recorded by two head-mounted devices from Meta: Project Aria, a research prototype of lightweight AR/AI glasses, and Quest 3, a production VR headset sold in millions of units. Ground-truth poses were obtained by a professional motion-capture system using small optical markers attached to hands and objects. Hand annotations are provided in the UmeTrack and MANO formats, and objects are represented by 3D meshes with PBR materials obtained by an in-house scanner. We aim to accelerate research on egocentric hand-object interaction by making the HOT3D dataset publicly available and by co-organizing public challenges on the dataset at ECCV 2024. The dataset can be downloaded from the project website: https://facebookresearch.github.io/hot3d/.
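To make the annotation structure concrete, below is a minimal Python sketch of how the per-frame ground truth described above could be organized. It assumes the standard MANO parameterization (10 shape coefficients and 45 axis-angle pose parameters for 15 hand joints); the class and field names (`SE3`, `ManoHand`, `FrameAnnotation`) are hypothetical illustrations, not the official HOT3D data format or API.

```python
# Hypothetical layout for one frame of HOT3D-style ground truth.
# Names and fields are illustrative; consult the project website for the
# actual data format and loading tools.
from __future__ import annotations

from dataclasses import dataclass

import numpy as np


@dataclass
class SE3:
    """Rigid 6-DoF pose: rotation plus translation in world coordinates."""
    rotation: np.ndarray     # (3, 3) rotation matrix
    translation: np.ndarray  # (3,) translation vector, in meters


@dataclass
class ManoHand:
    """Hand annotation in the standard MANO parameterization."""
    betas: np.ndarray  # (10,) shape coefficients, fixed per subject
    theta: np.ndarray  # (45,) axis-angle pose, 3 values per hand joint
    wrist: SE3         # global wrist pose


@dataclass
class FrameAnnotation:
    """Ground truth for one multi-view frame."""
    timestamp_ns: int                   # capture time of the frame
    camera_poses: dict[str, SE3]        # pose of each RGB/monochrome stream
    object_poses: dict[str, SE3]        # 6-DoF pose per visible rigid object
    left_hand: ManoHand | None = None   # None when the hand is not annotated
    right_hand: ManoHand | None = None
    gaze: np.ndarray | None = None      # (3,) gaze direction, when available
```

Storing all poses as world-frame SE(3) transforms keeps such a representation device-agnostic: rendering an object's scanned mesh into any of the camera streams then only requires composing its pose with the inverse camera pose and that stream's intrinsics.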