MicroAGI00: MicroAGI Egocentric Dataset (2025)
License: MicroAGI00 Open Use, No-Resale v1.0 (see
LICENSE). No resale: You may not sell or paywall this dataset or derivative data. Trained models/outputs may be released under any terms.
Overview
MicroAGI00 is a large-scale egocentric RGB+D dataset of human manipulation in BEHAVIOR Challenge tasks (https://behavior.stanford.edu/challenge/index.html).
Quick facts
- Modality: synchronized RGB + 16‑bit depth + IMU + annotations
- Resolution & rate (RGB): 1920×1080 @ 30 FPS (in MCAP)
- Depth: 16‑bit, losslessly compressed inside MCAP
- Scale: ≈1,000,000 synchronized RGB frames and ≈1,000,000 depth frames (≈1M frame pairs)
- Container: .mcap (all signals + annotations)
- Previews: per-sequence .mp4 files for a sample of bags (annotated RGB; visualized native depth)
- Annotations: available for roughly 5% of the dataset; hand landmarks and short action text
What’s included per sequence
- One large MCAP file containing:
  - RGB frames (1080p / 30 fps)
  - 16-bit depth stream (lossless compression)
  - IMU data (as available)
  - Embedded annotations (hands, action text), for some sequences
- MP4 preview videos:
  - Annotated RGB (for quick review)
  - Visualized native depth map (for quick review)
 
Note: MP4 previews may be lower quality than MCAP due to compression and post‑processing. Research use should read from MCAP.
Annotations
Annotations are generated by our in-house pipeline.
Hand annotations (per frame): 21 joints per hand; only the wrist landmark is shown below for brevity — JSON schema example
{
  "frame_number": 9,
  "timestamp_seconds": 0.3,
  "resolution": { "width": 1920, "height": 1080 },
  "hands": [
    {
      "hand_index": 0,
      "landmarks": [
        { "id": 0,  "name": "WRIST", "x": 0.7124036550521851, "y": 0.7347621917724609, "z": -1.444301744868426e-07, "visibility": 0.0 }
 
      ],
      "hand": "Left",
      "confidence": 0.9268525838851929
    },
    {
      "hand_index": 1,
      "landmarks": [
        { "id": 0,  "name": "WRIST", "x": 0.4461262822151184, "y": 0.35183972120285034, "z": -1.2342320587777067e-07, "visibility": 0.0 }

      ],
      "hand": "Right",
      "confidence": 0.908446729183197
    }
  ],
  "frame_idx": 9,
  "exact_frame_timestamp": 1758122341583104000,
  "exact_frame_timestamp_sec": 1758122341.583104
}
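The per-frame hand JSON can be parsed with standard tooling. The sketch below (using a trimmed copy of the example above) extracts each hand's wrist position; it assumes, MediaPipe-style, that landmark x/y are normalized to [0, 1] and must be scaled by the "resolution" field to get pixel coordinates.

```python
import json

# Sketch: parse one per-frame hand annotation (fields as in the example above).
# Assumption: landmark x/y are normalized to [0, 1]; pixel coordinates come
# from scaling by the "resolution" field (MediaPipe-style convention).
annotation = json.loads("""{
  "frame_number": 9,
  "resolution": {"width": 1920, "height": 1080},
  "hands": [
    {"hand": "Left",
     "confidence": 0.93,
     "landmarks": [{"id": 0, "name": "WRIST", "x": 0.71, "y": 0.73, "z": 0.0, "visibility": 0.0}]}
  ]
}""")

w = annotation["resolution"]["width"]
h = annotation["resolution"]["height"]
for hand in annotation["hands"]:
    wrist = next(lm for lm in hand["landmarks"] if lm["name"] == "WRIST")
    px, py = wrist["x"] * w, wrist["y"] * h
    print(hand["hand"], round(px, 1), round(py, 1))  # Left 1363.2 788.4
```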
Text (action) annotations (per frame/window) — JSON schema example
{
  "schema_version": "v1.0",
  "action_text": "Right hand, holding a knife, is chopping cooked meat held by the left hand on the red cutting board.",
  "confidence": 1.0,
  "source": { "model": "MicroAGI, MAGI01" },
  "exact_frame_timestamp": 1758122341583104000,
  "exact_frame_timestamp_sec": 1758122341.583104
}
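Both annotation types carry the same exact_frame_timestamp key (nanoseconds), which can be used to pair action text with hand frames. A minimal sketch, assuming one action record per matching timestamp (the inline dicts are stand-ins for records read from the MCAP):

```python
# Sketch: align action-text annotations to hand annotations via the shared
# exact_frame_timestamp key (nanoseconds). Field names follow the schema
# examples above; the records themselves are illustrative stand-ins.
hand_frames = [
    {"exact_frame_timestamp": 1758122341583104000, "hands": ["Left", "Right"]},
]
actions = [
    {"exact_frame_timestamp": 1758122341583104000, "action_text": "chopping"},
]

text_by_ts = {a["exact_frame_timestamp"]: a["action_text"] for a in actions}
for frame in hand_frames:
    text = text_by_ts.get(frame["exact_frame_timestamp"], "")
    print(frame["hands"], "->", text)
```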
Data access and structure
- Each top-level sample folder contains: one folder with the raw (heavy) MCAP dump, one folder with the annotated MCAP dump, and one folder with MP4 previews
- All authoritative signals and annotations are inside the MCAP. Use the MP4s for quick visual QA only.
Getting started
- Inspect an MCAP: mcap info your_sequence.mcap
- Extract messages: mcap cat --topics <topic> your_sequence.mcap > out.bin
- Python readers: pip install mcap (see the MCAP Python docs) or any MCAP-compatible tooling. Typical topics include RGB, depth, IMU, and annotation channels.
Intended uses
- Policy and skill learning (robotics/VLA)
- Action detection and segmentation
- Hand/pose estimation and grasp analysis
- Depth-based reconstruction, SLAM, scene understanding
- World-model pre-training and post-training
Services and custom data
MicroAGI provides on-demand:
- Real‑to‑Sim pipelines
- ML‑enhanced 3D point clouds and SLAM reconstructions
- New data capture via our network of skilled tradespeople and factory workers (often below typical market cost)
- Enablement for your workforce to wear our device and run through our processing pipeline
Typical lead times: under two weeks (up to four weeks for large jobs).
How to order more
Email [email protected] with:
- Task description
- Desired hours or frame counts
- Proposed price
We will reply within one business day with lead time and final pricing.
Questions: [email protected]
License
This dataset is released under the MicroAGI00 Open Use, No‑Resale License v1.0 (custom). See LICENSE. Redistribution must be free‑of‑charge under the same license. Required credit: "This work uses the MicroAGI00 dataset (MicroAGI, 2025)."
Attribution reminder
Public uses of the Dataset or Derivative Data must include the credit line above in a reasonable location for the medium (papers, repos, product docs, dataset pages, demo descriptions). Attribution is appreciated but not required for Trained Models or Outputs.