Ropedia Xperience-10M: A Large-Scale Egocentric Multimodal Dataset for Embodied AI
==================================================================================

### AK
@_akhaliq
Ropedia Xperience-10M is out on Hugging Face: a large-scale egocentric multimodal dataset of human experience for embodied AI, robotics, world models, and spatial intelligence.
It contains 10 million experiences (interactions) and 10,000 hours of synchronized first-person recordings with six video streams, audio, stereo depth, camera pose, hand mocap, full-body mocap, IMU, and hierarchical language annotations.
dataset: huggingface.co/datasets/roped…
Mar 17, 2026, 4:28 PM
7 replies · 7 retweets · 36 likes · 3,957 views
One Sentence Summary
Ropedia Xperience-10M, a new large-scale egocentric multimodal dataset, has been released on Hugging Face, offering 10 million experiences for embodied AI, robotics, world models, and spatial intelligence research.
Summary
This tweet announces the release of Ropedia Xperience-10M, a significant new dataset for AI research. It is described as a large-scale egocentric multimodal dataset designed to capture human experience, comprising 10 million interactions and 10,000 hours of synchronized first-person recordings. The dataset includes diverse data streams such as six video feeds, audio, stereo depth, camera pose, hand and full-body motion capture, IMU data, and hierarchical language annotations, making it highly valuable for developing embodied AI, robotics, world models, and spatial intelligence systems.
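To make the modality list above concrete, here is a minimal sketch of how one synchronized sample from such a dataset might be represented in code. All field names, shapes, and the `validate` helper are illustrative assumptions; the actual schema is defined on the dataset's Hugging Face page.

```python
from dataclasses import dataclass
from typing import List

# Hypothetical record for one synchronized moment in an egocentric recording,
# mirroring the modalities the announcement lists. Not the official schema.
@dataclass
class ExperienceFrame:
    timestamp_s: float        # time offset within the recording
    rgb_streams: List[bytes]  # six first-person video frames (encoded)
    audio_chunk: bytes        # synchronized audio segment
    stereo_depth: bytes       # stereo depth map (encoded)
    camera_pose: List[float]  # assumed 4x4 extrinsics, flattened to 16 values
    hand_mocap: List[float]   # hand motion-capture channels
    body_mocap: List[float]   # full-body motion-capture channels
    imu: List[float]          # accelerometer + gyroscope readings
    language: str = ""        # hierarchical language annotation for this span

def validate(frame: ExperienceFrame) -> bool:
    """Sanity-check the two counts the announcement makes explicit:
    six video streams and a full camera pose."""
    return len(frame.rgb_streams) == 6 and len(frame.camera_pose) == 16

frame = ExperienceFrame(
    timestamp_s=0.0,
    rgb_streams=[b""] * 6,
    audio_chunk=b"",
    stereo_depth=b"",
    camera_pose=[0.0] * 16,
    hand_mocap=[],
    body_mocap=[],
    imu=[],
    language="pick up the mug",
)
print(validate(frame))  # True
```

A wrapper like this is mainly useful for keeping the many streams aligned by timestamp when iterating over a recording; the real dataset may instead expose each modality as a separate column or file.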
AI Score: 83
Influence Score: 13
Published: Today
Language: English
Tags
Ropedia Xperience-10M
Multimodal Dataset
Egocentric AI
Embodied AI
Robotics