Get a Grip: Reconstructing Hand-Object Stable Grasps in Egocentric Videos

We propose the task of Hand-Object Stable Grasp Reconstruction (HO-SGR), the reconstruction of frames during which the hand is stably holding the object. We first develop the stable grasp definition based on the intuition that the in-contact area between the hand and object should remain stable. By analysing the 3D ARCTIC dataset, we identify stable grasp durations and showcase that objects in stable grasps move within a single degree of freedom (1-DoF). We thereby propose a method to jointly optimise all frames within a stable grasp, minimising object motions to a latent 1-DoF. Finally, we extend the knowledge to in-the-wild videos by labelling 2.4K clips of stable grasps. Our proposed EPIC-Grasps dataset includes 390 object instances of 9 categories, featuring stable grasps from videos of daily interactions in 141 environments. Without 3D ground truth, we use stable contact areas and 2D projection masks to assess the HO-SGR task in the wild. We evaluate relevant methods and our approach preserves ...

Webpage: https://zhifanzhu.github.io/getagrip
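The abstract's key constraint is that, within a stable grasp, the object's motion relative to the hand reduces to a single latent degree of freedom. A minimal sketch of what such a parameterisation could look like, assuming (hypothetically) that the 1-DoF is a rotation of all frames about one shared latent axis, with one angle per frame; the function names here are illustrative, not the paper's API:

```python
import numpy as np

def axis_angle_to_matrix(axis, angle):
    """Rodrigues' formula: rotation matrix for a (normalised) axis and angle."""
    axis = axis / np.linalg.norm(axis)
    K = np.array([[0.0, -axis[2], axis[1]],
                  [axis[2], 0.0, -axis[0]],
                  [-axis[1], axis[0], 0.0]])
    return np.eye(3) + np.sin(angle) * K + (1.0 - np.cos(angle)) * (K @ K)

def one_dof_poses(axis, base_rotation, angles):
    """Per-frame object rotations constrained to one shared latent axis:
    R_t = R_base @ R(axis, theta_t). The whole clip is then described by one
    shared axis, one base pose, and a single scalar per frame -- far fewer
    parameters than an unconstrained 6-DoF pose per frame, which is the kind
    of reduction a joint optimisation over a stable grasp can exploit."""
    return [base_rotation @ axis_angle_to_matrix(axis, a) for a in angles]
```

Under this sketch, a joint optimiser would fit the shared axis, the base pose, and the per-frame angles to image evidence, rather than a free pose per frame.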


Bibliographic Details
Main Authors: Zhu, Zhifan, Damen, Dima
Format: Article in Journal/Newspaper
Language: unknown
Published: arXiv 2023
Subjects: Computer Vision and Pattern Recognition (cs.CV); FOS: Computer and information sciences
Online Access:https://dx.doi.org/10.48550/arxiv.2312.15719
https://arxiv.org/abs/2312.15719
institution Open Polar
collection DataCite Metadata Store (German National Library of Science and Technology)
topic Computer Vision and Pattern Recognition cs.CV
FOS Computer and information sciences
description We propose the task of Hand-Object Stable Grasp Reconstruction (HO-SGR), the reconstruction of frames during which the hand is stably holding the object. We first develop the stable grasp definition based on the intuition that the in-contact area between the hand and object should remain stable. By analysing the 3D ARCTIC dataset, we identify stable grasp durations and showcase that objects in stable grasps move within a single degree of freedom (1-DoF). We thereby propose a method to jointly optimise all frames within a stable grasp, minimising object motions to a latent 1-DoF. Finally, we extend the knowledge to in-the-wild videos by labelling 2.4K clips of stable grasps. Our proposed EPIC-Grasps dataset includes 390 object instances of 9 categories, featuring stable grasps from videos of daily interactions in 141 environments. Without 3D ground truth, we use stable contact areas and 2D projection masks to assess the HO-SGR task in the wild. We evaluate relevant methods and our approach preserves ... : webpage: https://zhifanzhu.github.io/getagrip ...
op_rights Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International (CC BY-NC-SA 4.0)
https://creativecommons.org/licenses/by-nc-sa/4.0/legalcode
op_doi https://doi.org/10.48550/arxiv.2312.15719