NaviTrace
NaviTrace is a novel VQA benchmark for VLMs that evaluates models on their embodiment-specific understanding of navigation across challenging real-world scenarios.
Key Features
- ✏️ Core Task: Given a real-world image in first-person perspective, a language instruction, and an embodiment type, models should predict a 2D navigation path in image space that solves the instruction.
- 🤖 Embodiments: Four embodiment types capturing distinct physical and spatial constraints (human, legged robot, wheeled robot, or bicycle).
- 📏 Scale: 1,002 diverse real-world scenarios and over 3,000 expert-annotated traces.
- ⚖️ Splits:
- Validation split (~50%) for experimentation and model fine-tuning.
- Test split (~50%) with hidden ground truths for public leaderboard evaluation.
- 🔎 Annotation Quality: All images and traces manually collected and labeled by human experts.
- 🏅 Evaluation Metric: Semantic-aware Trace Score, combining Dynamic Time Warping distance, goal endpoint error, and embodiment-conditioned semantic penalties.
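The full Semantic-aware Trace Score is not reproduced here. As a rough illustration only, the sketch below shows the two geometric ingredients named above, a Dynamic Time Warping distance and a goal endpoint error, for two 2D traces given as (x, y) image coordinates. The normalization is an assumption, and the embodiment-conditioned semantic penalties are omitted entirely.

```python
import numpy as np


def dtw_distance(pred: np.ndarray, gt: np.ndarray) -> float:
    """DTW distance between two point sequences of shape (N, 2) and (M, 2).

    Simplified, assumed formulation: normalized by the longer trace length.
    """
    n, m = len(pred), len(gt)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = np.linalg.norm(pred[i - 1] - gt[j - 1])
            cost[i, j] = d + min(cost[i - 1, j], cost[i, j - 1], cost[i - 1, j - 1])
    return float(cost[n, m] / max(n, m))


def endpoint_error(pred: np.ndarray, gt: np.ndarray) -> float:
    """Euclidean distance between the final points of the two traces."""
    return float(np.linalg.norm(pred[-1] - gt[-1]))


# Toy example with made-up traces in image coordinates.
pred = np.array([[682.5, 972.8], [700.0, 900.0], [750.0, 820.0]])
gt = np.array([[682.5, 972.8], [761.9, 826.7], [786.8, 700.0]])
print(dtw_distance(pred, gt), endpoint_error(pred, gt))
```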
Uses
Run Benchmark
We provide a notebook with example code showing how to run this benchmark with an API model. You can use it as a template for your own model. Additionally, we host a public leaderboard where you can submit your model's results.
Model Training
You can use the validation split to fine-tune models for this task.
Load the dataset with `dataset = load_dataset("leggedrobotics/NaviTrace")` and use `dataset["validation"]` for training your model.
See the next section for details on the dataset columns.
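As a minimal illustration, the snippet below loads the dataset and inspects one validation sample; the column names match the table in the next section.

```python
from datasets import load_dataset

# Load the benchmark from the Hugging Face Hub.
dataset = load_dataset("leggedrobotics/NaviTrace")

# The validation split has public ground truths and can be used for fine-tuning.
val = dataset["validation"]
sample = val[0]

print(sample["task"])          # language instruction, e.g. "Go to the garden"
print(sample["embodiments"])   # embodiments the task applies to
print(sample["category"])      # main challenge categories
print(sample["image"].size)    # first-person image (PIL.Image)
```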
Structure
| Column | Type | Description |
|---|---|---|
| sample_id | str | Unique identifier of a scenario. |
| task | str | Language instruction (English) solvable purely from the visual information, emphasizing cases where different embodiments behave differently, while still reflecting everyday scenarios. |
| embodiments | List[str] | All embodiments ("Human", "Legged Robot", "Wheeled Robot", "Bicycle") suitable for the task. |
| image | PIL.Image | First-person image of a real-world environment with blurred faces and license plates. |
| segmentation_mask | List[List[int]] | Semantic segmentation mask of the image, generated with the Mask2Former model. |
| ground_truth | dict[str, Optional[List[List[List[float]]]]] | A dict mapping an embodiment name to a sequence of 2D points in image coordinates that describes a navigation path solving the task. There is one path per suitable embodiment, and multiple paths if equally valid alternatives exist (e.g., avoiding an obstacle from the left or the right). If an embodiment is not suitable for the task, the value is None. |
| category | List[str] | One or more categories ("Semantic Terrain", "Geometric Terrain", "Stationary Obstacle", "Dynamic Obstacle", "Accessibility", "Visibility", "Social Norms") describing the main challenges of the navigation task. |
| context | str | Short description of the scene as bullet points separated by ";", including the location, ongoing activities, and key elements needed to solve the task. |
| metadata | dict[str, str] | Additional information about the scenario: "country": the image's country of origin; "city": the image's city of origin, or "GrandTour Dataset" if the image comes from the Grand Tour dataset; "urban_rural": "Urban", "Rural", or "Mixed" depending on the image's setting; "natural_structured": "Structured", "Natural", or "Mixed" depending on the image's environment; "lighting_conditions": "Night", "Daylight", "Indoor Lighting", or "Low Light" depending on the image's lighting; "weather_conditions": "Cloudy", "Clear", "Rainy", "Foggy", "Snowy", "Windy", or "Unknown" depending on the image's weather; "task_type": the instruction style. Goal-directed tasks ("Goal") specify the target explicitly (e.g., "Go straight to the painting."), while directional tasks ("Directions") emphasize the movement leading to it (e.g., "Move forward until you see the painting."). Since this is sometimes ambiguous, there are also mixed tasks ("Mixed"). |
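For illustration, here is a small sketch of how the ground_truth column could be unpacked into per-embodiment NumPy paths. The nested-list layout follows the type given above; the helper name and variables are just for this example.

```python
import numpy as np


def extract_paths(ground_truth: dict) -> dict:
    """Convert a ground_truth dict into {embodiment: [array of shape (N, 2), ...]}.

    Embodiments that are not suitable for the task (value None) are skipped.
    Each suitable embodiment may have multiple equally valid paths.
    """
    paths = {}
    for embodiment, candidates in ground_truth.items():
        if candidates is None:
            continue  # this embodiment cannot solve the task
        paths[embodiment] = [np.asarray(points, dtype=float) for points in candidates]
    return paths


# Usage with a loaded sample (see the loading snippet above):
# paths = extract_paths(sample["ground_truth"])
# for emb, traces in paths.items():
#     print(emb, [t.shape for t in traces])
```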
Citation
If you find this dataset helpful for your work, please cite us with:
BibTeX:
@article{Windecker2025NaviTrace,
author = {Tim Windecker and Manthan Patel and Moritz Reuss and Richard Schwarzkopf and Cesar Cadena and Rudolf Lioutikov and Marco Hutter and Jonas Frey},
title = {NaviTrace: Evaluating Embodied Navigation of Vision-Language Models},
year = {2025},
month = {October},
note = {Awaiting peer review and journal submission.},
}