17 results for “topic:egocentric-videos”
A curated list of egocentric (first-person) vision and related area resources
[NeurIPS 2024] Official code for HourVideo: 1-Hour Video Language Understanding
A repo for training and fine-tuning models for hand segmentation.
[CVPR 2022] Joint hand motion and interaction hotspots prediction from egocentric videos
Action Scene Graphs for Long-Form Understanding of Egocentric Videos (CVPR 2024)
[MICCAI2023 Oral] POV-Surgery: A Dataset for Egocentric Hand and Tool Pose Estimation During Surgical Activities
Code for the paper "Differentiable Task Graph Learning: Procedural Activity Representation and Online Mistake Detection from Egocentric Videos" [NeurIPS (spotlight), 2024]
Learning Precise Affordances from Egocentric Videos for Robotic Manipulation (ICCV 2025)
The official code and data for paper "VidEgoThink: Assessing Egocentric Video Understanding Capabilities for Embodied AI"
This is a third party implementation of the paper "The Audio-Visual Conversational Graph: From an Egocentric-Exocentric Perspective".
Making a long story short: A multi-importance fast-forwarding egocentric videos with the emphasis on relevant objects @ Journal of Visual Communication and Image Representation 53 (2018)
A Weighted Sparse Sampling and Smoothing Frame Transition Approach for Semantic Fast-Forward First-Person Videos @ IEEE Conference on Computer Vision and Pattern Recognition (CVPR) 2018
Collect related papers and datasets for research
Official code release for "Generative Adversarial Network for Future Hand Segmentation from Egocentric Video" (ECCV 2022)
CVMHAT: Multiple Human Association and Tracking from Egocentric and Complementary Top Views, IEEE TPAMI.
Public shell for a spatial memory benchmark from egocentric video. Documentation-only: protocol, schema, and data card—no code or datasets.
Identification of daily activities based on computer vision and a head-mounted camera.