18 datasets found

Tags: hand-object interaction

  • OakInk

    The OakInk dataset is a large-scale dataset for understanding hand-object interaction.
  • ContactDB

    A dataset of human grasps with 3D annotations of hand pose, object pose, and contact area.
  • FPHA Dataset

    The FPHA (First-Person Hand Action) dataset is a real-world egocentric dataset for studying hand-object interaction.
  • HO-3D and FPHA Datasets

    The HO-3D and FPHA datasets are real-world datasets for studying hand-object interaction.
  • ObMan Dataset

    The ObMan dataset is a synthetic dataset containing hand-object mesh pairs. The hands are generated by GraspIt!, a non-learning-based grasp planner, and are parameterized by the MANO model.
  • EPIC-KITCHENS

    EPIC-KITCHENS is a large-scale egocentric video benchmark recorded by 32 participants in their native kitchen environments. The videos depict non-scripted daily activities.
  • RGB2Hands

    The RGB2Hands dataset contains RGB videos of two interacting hands, without shape annotations.
  • ObMan

    The ObMan dataset is a large synthetic dataset produced by rendering hand meshes with selected objects from ShapeNet.
  • First-person hand benchmark (FHB)

    The FHB dataset contains egocentric RGB-D videos covering a wide range of hand-object interactions.
  • GRAB dataset

    The GRAB dataset captures whole-body human grasping of objects.
  • HO3D dataset

    The HO3D dataset is a real-world dataset capturing 10 subjects performing various fine-grained manipulations of 10 objects from the YCB dataset.
  • ContactDescribe dataset

    The ContactDescribe dataset is an enhanced dataset built on ContactPose, which combines hand-object contact information with hand pose, object pose, and RGB-D images.
  • HOI4D

    HOI4D is a large-scale 4D egocentric dataset with rich annotations for category-level human-object interaction.
  • G-HOP: Generative Hand-Object Prior for Interaction Reconstruction and Grasp ...

    We propose G-HOP, a denoising diffusion based generative prior for hand-object interactions that allows modeling both the 3D object and a human hand, conditioned on the object...
  • InterHand2.6M

    The InterHand2.6M dataset contains 366K training samples, 110K validation samples, and 261K test samples. It is the only interacting two-hand dataset with dense shape annotations.
  • Arctic

    The ARCTIC dataset is a benchmark for dexterous bimanual hand-object manipulation.
  • DexYCB

    The DexYCB dataset is a large-scale dataset of hand-grasping postures captured using a synchronized setup of 8 cameras.
  • GazeHOI

    The GazeHOI dataset is a comprehensive resource for studying the complex interaction between human hands, objects, and gaze.
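Several entries above (ObMan in particular) store hands parameterized by the MANO model. As a rough orientation for newcomers, the sketch below illustrates the typical dimensionality of a MANO sample: 10 shape coefficients (beta) and axis-angle pose for 15 hand joints plus a global rotation. The container class here is purely illustrative, not an API from any of the listed datasets.

```python
# Illustrative container for MANO hand parameters (hypothetical class,
# not part of any listed dataset's API). Dimensions follow the MANO model:
# 10 shape coefficients, and 3D axis-angle rotations for 15 joints + 1 global.
from dataclasses import dataclass, field

N_SHAPE = 10                    # shape (beta) coefficients
N_JOINTS = 15                   # articulated hand joints in MANO
POSE_DIM = 3 * (N_JOINTS + 1)   # per-joint axis-angle + global rotation = 48

@dataclass
class ManoParams:
    """One hand sample, stored as flat coefficient lists (illustrative)."""
    betas: list = field(default_factory=lambda: [0.0] * N_SHAPE)
    pose: list = field(default_factory=lambda: [0.0] * POSE_DIM)

    def __post_init__(self):
        # Guard against malformed samples when loading from disk.
        if len(self.betas) != N_SHAPE or len(self.pose) != POSE_DIM:
            raise ValueError("unexpected MANO parameter dimensions")

neutral = ManoParams()  # all-zero parameters give the mean-shape, flat hand
```

Datasets that ship MANO annotations (e.g. ObMan) typically provide these coefficients per frame; a MANO layer implementation then maps them to a 3D hand mesh.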
You can also access this registry using the API (see API Docs).
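For programmatic access, a query like the tag filter used for this page might be issued as below. This is a minimal sketch only: the base URL, endpoint path, and parameter names are assumptions, so consult the API Docs for the actual schema.

```python
# Hypothetical sketch of querying the registry API by tag.
# BASE, the endpoint path, and the parameter names are assumptions;
# see the registry's API Docs for the real interface.
from urllib.parse import urlencode

BASE = "https://example.org/api/datasets"  # placeholder base URL

def search_url(tag: str, page: int = 1) -> str:
    """Build a query URL that filters datasets by tag."""
    return f"{BASE}?{urlencode({'tag': tag, 'page': page})}"

print(search_url("hand-object interaction"))
# https://example.org/api/datasets?tag=hand-object+interaction&page=1
```

The returned JSON would then contain the same entries as this listing, one record per dataset.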