Legged Locomotion in Challenging Terrains using Egocentric Vision
The dataset used in this paper to train and test a legged locomotion system driven by egocentric vision.
Ego-4D dataset
Ego-4D is a large-scale egocentric video dataset capturing thousands of hours of unscripted daily-life activity recorded by participants around the world.
Epic Kitchens dataset
Epic Kitchens is a dataset for egocentric vision and action recognition.
Sewing Machine Operation Task Dataset
The dataset contains 40 recordings of a sewing machine operation task performed by amateur operators.
EPIC-KITCHENS
EPIC-KITCHENS is a large-scale egocentric video benchmark recorded by 32 participants in their native kitchen environments; the videos depict non-scripted daily activities.
Anticipating Next Active Object (ANACTO)
The ANACTO task combines two sub-tasks: (1) identifying the next active object (NAO) from a past observed segment, and (2) modeling the person's motion to estimate the NAO's location.
Epic-tent: an egocentric video dataset for camping tent assembly
An outdoor egocentric video dataset in which participants assemble a camping tent while wearing head-mounted cameras, annotated with action labels.
Real-Time Hand Tracking
Real-time hand tracking under occlusion from an egocentric RGB-D sensor.
EGTEA Gaze+
The EGTEA Gaze+ dataset offers approximately 10,000 samples of 106 non-scripted daily activities that occur in a kitchen.
EPIC-KITCHENS-55
EPIC-KITCHENS-55 is a dataset of 432 videos acquired by 32 subjects and labeled with 39,595 action segments.
Towards Streaming Egocentric Action Anticipation
Egocentric action anticipation is the task of predicting the future actions a camera wearer will likely perform based on past video observations.
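As a rough formal sketch of this setting (the notation below is an assumption, not drawn from the entry above), the common fixed-gap formulation is

\hat{a} = f\bigl(V_{[\tau_s - \tau_o - \tau_a,\; \tau_s - \tau_a]}\bigr), \qquad \hat{a} \approx a_{\tau_s},

where \tau_s is the start time of the upcoming action, \tau_o the length of the observed video segment V, and \tau_a the anticipation time (e.g., 1 s); the streaming variant additionally accounts for the model's own runtime, so the prediction must be produced online before the action begins.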