---
task_categories:
- robotics
language:
- en
tags:
- RDT
- rdt
- RDT 2
- manipulation
- bimanual
- ur5e
- webdataset
- vision-language-action
---

## Dataset Summary

This dataset provides shards in the **WebDataset** format for training [RDT-2](https://github.com/thu-ml/RDT2) or other policy models on **bimanual manipulation**.
Each sample packs:

* a **binocular RGB image** (left and right wrist cameras concatenated horizontally)
* a **relative action chunk** (continuous control; 0.8 s at 30 Hz, i.e. a 24-step chunk)
* a **discrete action token sequence** (e.g., from a [Residual VQ action tokenizer](https://huggingface.co/robotics-diffusion-transformer/RVQActionTokenizer))
* a **metadata JSON** whose `sub_task_instruction_key` indexes the corresponding instruction in `instructions.json`

> Data were collected on a **bimanual UR5e** setup.

---

## Supported Tasks

* **Instruction-conditioned bimanual manipulation**, including:
  - Folding cloths

---

## Data Structure

### Shard layout

Shards are named `shard-*.tar`. Inside each shard:

```
shard-000000.tar
├── 0.image.jpg         # binocular RGB, H=384, W=768, C=3, uint8
├── 0.action.npy        # relative actions, shape (24, 20), float32
├── 0.action_token.npy  # action tokens, shape (27,), int16 ∈ [0, 1024]
├── 0.meta.json         # metadata; includes "sub_task_instruction_key"
├── 1.image.jpg
├── 1.action.npy
├── 1.action_token.npy
├── 1.meta.json
└── ...
shard-000001.tar
shard-000002.tar
...
```

> **Image:** binocular wrist cameras concatenated horizontally → `np.ndarray` of shape `(384, 768, 3)` with `dtype=uint8` (stored as JPEG).
> **Action (continuous):** `np.ndarray` of shape `(24, 20)`, `dtype=float32` (24-step chunk, 20-D control).
> **Action tokens (discrete):** `np.ndarray` of shape `(27,)`, `dtype=int16`, values in `[0, 1024]`.
> **Metadata:** `meta.json` contains at least `sub_task_instruction_key`, pointing to an entry in the top-level `instructions.json`.
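
For a quick sanity check without any data loader, you can peek at one sample with the Python standard library (a minimal sketch; the shard and member names follow the layout above):

```python
import io
import json
import tarfile

import numpy as np
from PIL import Image

# Read the first sample of a shard and verify shapes/dtypes.
with tarfile.open("shard-000000.tar") as tar:
    image = Image.open(io.BytesIO(tar.extractfile("0.image.jpg").read()))
    action = np.load(io.BytesIO(tar.extractfile("0.action.npy").read()))
    tokens = np.load(io.BytesIO(tar.extractfile("0.action_token.npy").read()))
    meta = json.loads(tar.extractfile("0.meta.json").read())

print(image.size)    # (768, 384): W x H of the concatenated wrist views
print(action.shape)  # (24, 20)
print(tokens.shape)  # (27,)
print(meta["sub_task_instruction_key"])
```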

---

## Example Data Instance

```json
{
  "image": "0.image.jpg",
  "action": "0.action.npy",
  "action_token": "0.action_token.npy",
  "meta": {
    "sub_task_instruction_key": "fold_cloth_step_3"
  }
}
```
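
At training time, `sub_task_instruction_key` is used to look up the natural-language instruction. A minimal sketch, assuming `instructions.json` is a flat key-to-instruction mapping (the exact schema may differ per release; the key below is the illustrative one from the instance above):

```python
import json

with open("instructions.json") as fp:
    instructions = json.load(fp)

# Assumed flat mapping: instruction key -> instruction string.
print(instructions["fold_cloth_step_3"])
```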

---

## How to Use

### 1) Official guidelines for fine-tuning RDT-2

Use the example [scripts](https://github.com/thu-ml/RDT2/blob/cf71b69927f726426c928293e37c63c4881b0165/data/utils.py#L48) and [guidelines](https://github.com/thu-ml/RDT2/blob/cf71b69927f726426c928293e37c63c4881b0165/data/utils.py#L48) from the RDT2 repository.

### 2) Minimal loading example with `webdataset`

```python
import os
import glob
import json
import random

import webdataset as wds


def no_split(src):
    # Pass shards through unchanged (used when there are too few shards to split).
    yield from src


def get_train_dataset(shards_dir):
    shards = sorted(glob.glob(os.path.join(shards_dir, "shard-*.tar")))
    assert shards, f"No shards under {shards_dir}"
    random.shuffle(shards)

    # Split shards across DataLoader workers only when there are enough shards;
    # otherwise, let every worker read every shard.
    num_workers = wds.utils.pytorch_worker_info()[-1]
    workersplitter = wds.split_by_worker if len(shards) > num_workers else no_split

    dataset = (
        wds.WebDataset(
            shards,
            shardshuffle=False,  # the shard list was already shuffled above
            nodesplitter=no_split,
            workersplitter=workersplitter,
            resampled=True,
        )
        .repeat()
        .shuffle(8192, initial=8192)
        .decode("pil")
        .map(
            lambda sample: {
                "image": sample["image.jpg"],
                "action_token": sample["action_token.npy"],
                "meta": sample["meta.json"],
            }
        )
        # One nominal "epoch" = 2048 hours of data at 30 Hz.
        .with_epoch(nsamples=(2048 * 30 * 60 * 60))
    )

    return dataset


with open(os.path.join("<Dataset Directory>", "instructions.json")) as fp:
    instructions = json.load(fp)
dataset = get_train_dataset("<Dataset Directory>")
```
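
Continuing from the snippet above, a typical loop wraps the pipeline in a PyTorch `DataLoader` and resolves each sample's instruction on the fly (a sketch; `num_workers=4` is an arbitrary choice, and `instructions` is again assumed to be a flat key-to-instruction mapping):

```python
from torch.utils.data import DataLoader

# The pipeline is an IterableDataset: shuffling happens inside it, so the
# loader only parallelizes reading and decoding across workers.
loader = DataLoader(dataset, batch_size=None, num_workers=4)

for sample in loader:
    instruction = instructions[sample["meta"]["sub_task_instruction_key"]]
    image = sample["image"]          # PIL.Image, 768x384 binocular frame
    tokens = sample["action_token"]  # np.ndarray, shape (27,), int16
    break  # sanity check only
```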

---

## Ethical Considerations

* Contains robot teleoperation/automation data. No PII is present by design.
* Ensure safe deployment/testing on real robots; follow lab safety and manufacturer guidelines.

---

## Citation

If you use this dataset, please cite the dataset and your project appropriately. For example:

```bibtex
TBD
```

---

## License

* **Dataset license:** Apache-2.0 (unless otherwise noted by the maintainers of your fork/release).
* Ensure compliance when redistributing derived data or models.

---

## Maintainers & Contributions

We welcome fixes and improvements to the conversion scripts and docs (see https://github.com/thu-ml/RDT2/tree/main#troubleshooting).
Please open issues/PRs with:

* OS + Python versions
* Minimal repro code
* Error tracebacks
* Any other helpful context