# ColliderML: Higgs Boson Production from Gluon-Gluon Fusion (No Pileup)
## Dataset Description

This dataset contains simulated high-energy physics collision events for Higgs boson production from gluon-gluon fusion with no pileup (a single interaction per event). Events were generated with the Open Data Detector (ODD) geometry within the Key4hep and ACTS (A Common Tracking Software) frameworks; the ODD represents a generic collider detector similar to those at the HL-LHC.
## Dataset Summary

- Campaign: `hard_scatter`
- Process: Higgs boson production from gluon-gluon fusion
- Version: v1
- Number of Events: ~100,000
- Pileup: 0 (no additional interactions)
- Detector: Open Data Detector (ODD)
- Format: Apache Parquet with list columns for variable-length data
- License: cc-by-4.0
## Supported Tasks
This dataset is designed for machine learning tasks in high-energy physics, including:
- Particle tracking: Reconstruct charged particle trajectories from detector hits
- Track-to-particle matching: Associate reconstructed tracks with truth particles
- Jet tagging: Identify jets originating from top quarks, b-quarks, or light quarks
- Energy reconstruction: Predict particle energies from calorimeter deposits
- Physics analysis: Event classification (signal vs. background discrimination)
- Representation learning: Study hierarchical information at different detector levels
## Languages
N/A (Physics data)
## Quick Start

### Installation

```bash
pip install datasets pyarrow
```
### Load First 100 Events (All Objects)

```python
from datasets import load_dataset

# Load the first 100 rows of each configuration
particles = load_dataset("OpenDataDetector/ColliderML_higgs_pu0", "particles", split="train[:100]")
tracker_hits = load_dataset("OpenDataDetector/ColliderML_higgs_pu0", "tracker_hits", split="train[:100]")
calo_hits = load_dataset("OpenDataDetector/ColliderML_higgs_pu0", "calo_hits", split="train[:100]")
tracks = load_dataset("OpenDataDetector/ColliderML_higgs_pu0", "tracks", split="train[:100]")

print(f"Loaded {len(particles)} particle events")
print(f"Loaded {len(tracker_hits)} tracker hit events")
print(f"Loaded {len(calo_hits)} calo hit events")
print(f"Loaded {len(tracks)} track events")
```
### Load Specific Columns from First 100 Events

```python
from datasets import load_dataset
import numpy as np

# Load only specific columns from particles
particles = load_dataset(
    "OpenDataDetector/ColliderML_higgs_pu0",
    "particles",
    split="train[:100]",
    columns=["event_id", "px", "py", "pz", "energy", "pdg_id"],
)

# Access data
for event in particles:
    event_id = event['event_id']
    # Convert to numpy arrays
    px = np.array(event['px'])
    py = np.array(event['py'])
    pz = np.array(event['pz'])
    # Calculate transverse momentum
    pt = np.sqrt(px**2 + py**2)
    print(f"Event {event_id}: {len(px)} particles, mean pt = {pt.mean():.2f} GeV")

# Load only specific columns from tracks
tracks = load_dataset(
    "OpenDataDetector/ColliderML_higgs_pu0",
    "tracks",
    split="train[:100]",
    columns=["event_id", "qop", "theta", "phi"],
)

# Calculate derived quantities
for event in tracks:
    qop = np.array(event['qop'])
    theta = np.array(event['theta'])
    # Compute transverse momentum and pseudorapidity from track parameters
    pt = np.abs(1.0 / qop) * np.sin(theta)
    eta = -np.log(np.tan(theta / 2.0))
    print(f"Event {event['event_id']}: {len(qop)} tracks, pt range [{pt.min():.2f}, {pt.max():.2f}] GeV")
```
## Dataset Structure

### Data Instances

Each row in the Parquet files represents a single collision event. Variable-length quantities (e.g., lists of particles, hits, tracks) are stored as Parquet list columns.

Example event structure:

```python
{
    'event_id': 42,
    'particle_id': [0, 1, 2, 3, ...],  # List of particle IDs
    'pdg_id': [11, -11, 211, ...],     # Particle type codes
    'px': [1.2, -0.5, 3.4, ...],       # Momentum components (GeV)
    'py': [0.8, 1.1, -0.3, ...],
    'pz': [5.2, -2.1, 10.5, ...],
    'energy': [5.5, 2.3, 11.2, ...],
    # ... additional fields
}
```
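Since the full split is large, it may be preferable to stream events rather than download everything up front. A minimal sketch using the `datasets` streaming mode (repository and configuration names as in the Quick Start above; the five-event cutoff is just for illustration):

```python
from datasets import load_dataset

# Stream events one at a time instead of downloading the whole split
stream = load_dataset(
    "OpenDataDetector/ColliderML_higgs_pu0",
    "particles",
    split="train",
    streaming=True,
)

for i, event in enumerate(stream):
    # Each row is one collision event; list columns arrive as plain Python lists
    print(f"Event {event['event_id']}: {len(event['particle_id'])} particles")
    if i >= 4:  # inspect only the first five events
        break
```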
### Data Fields

The dataset contains 4 data types organized by detector hierarchy:

#### 1. particles (Truth-level)

Truth information about generated particles before detector simulation.

| Field | Type | Description |
|---|---|---|
| `event_id` | int64 | Unique event identifier |
| `particle_id` | list | Unique particle ID within event |
| `pdg_id` | list | PDG particle code (e.g., 11 = electron, 13 = muon, 211 = pion) |
| `mass` | list | Particle rest mass (GeV/c²) |
| `energy` | list | Particle total energy (GeV) |
| `charge` | list | Electric charge (in units of e) |
| `px`, `py`, `pz` | list | Momentum components (GeV/c) |
| `vx`, `vy`, `vz` | list | Vertex position (mm) |
| `time` | list | Production time (ns) |
| `num_tracker_hits` | list | Number of hits in tracker |
| `num_calo_hits` | list | Number of hits in calorimeter |
| `vertex_primary` | list | Primary vertex flag (1 = hard scatter, 2,...,N = pileup) |
| `parent_id` | list | ID of parent particle |

Typical event: ~200-500 particles per event
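As an example of combining the truth-level fields, the sketch below selects charged hard-scatter particles via `vertex_primary == 1` and `charge != 0` (following the field definitions in the table above) and computes their transverse momentum; the specific cuts are illustrative assumptions, not an official selection:

```python
import numpy as np
from datasets import load_dataset

particles = load_dataset(
    "OpenDataDetector/ColliderML_higgs_pu0",
    "particles",
    split="train[:10]",
    columns=["event_id", "px", "py", "charge", "vertex_primary"],
)

for event in particles:
    px = np.array(event["px"])
    py = np.array(event["py"])
    charge = np.array(event["charge"])
    primary = np.array(event["vertex_primary"]) == 1  # hard-scatter flag (see table)
    # Charged particles from the hard scatter are the usual tracking targets
    sel = primary & (charge != 0)
    pt = np.hypot(px[sel], py[sel])
    mean_pt = pt.mean() if pt.size else float("nan")
    print(f"Event {event['event_id']}: {sel.sum()} charged hard-scatter particles, "
          f"mean pt = {mean_pt:.2f} GeV")
```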
#### 2. tracker_hits (Detector-level)

Digitized spatial measurements from the tracking detector (silicon sensors).

| Field | Type | Description |
|---|---|---|
| `event_id` | int64 | Unique event identifier |
| `x`, `y`, `z` | list | Measured hit position (mm) |
| `true_x`, `true_y`, `true_z` | list | True (simulated) hit position before digitization (mm) |
| `time` | list | Hit time (ns) |
| `particle_id` | list | Truth particle that created this hit |
| `volume_id` | list | Detector volume identifier |
| `layer_id` | list | Detector layer number |
| `surface_id` | list | Sensor surface identifier |
| `cell_id` | list | Cell/pixel identifier |
| `detector` | list | Detector subsystem code |

Typical event: ~2,000-5,000 hits per event
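Because every hit stores the `particle_id` of the truth particle that produced it, hits can be grouped per particle to build truth track candidates, a common starting point for the particle-tracking task. A minimal sketch under that reading of the fields above:

```python
import numpy as np
from datasets import load_dataset

hits = load_dataset(
    "OpenDataDetector/ColliderML_higgs_pu0",
    "tracker_hits",
    split="train[:1]",
    columns=["event_id", "x", "y", "z", "particle_id"],
)[0]

pid = np.array(hits["particle_id"])

# Group hit indices by the truth particle that produced them
order = np.argsort(pid, kind="stable")
unique_pids, first = np.unique(pid[order], return_index=True)
groups = dict(zip(unique_pids, np.split(order, first[1:])))

n_hits = [len(g) for g in groups.values()]
print(f"Event {hits['event_id']}: {len(groups)} truth particles with hits, "
      f"max {max(n_hits)} hits on one particle")
```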
#### 3. calo_hits (Calorimeter-level)

Energy deposits in the calorimeter system (electromagnetic + hadronic).

| Field | Type | Description |
|---|---|---|
| `event_id` | int64 | Unique event identifier |
| `detector` | list | Calorimeter subsystem name |
| `cell_id` | list | Calorimeter cell identifier |
| `total_energy` | list | Total energy deposited in cell (GeV) |
| `x`, `y`, `z` | list | Cell center position (mm) |
| `contrib_particle_ids` | list<list> | IDs of particles contributing to this cell |
| `contrib_energies` | list<list> | Energy contribution from each particle (GeV) |
| `contrib_times` | list<list> | Time of each contribution (ns) |

Note: Nested lists for contributions (one cell can have multiple particle deposits).

Typical event: ~500-1,000 calorimeter cells with deposits
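Since the contribution fields are nested lists, computing a particle's total calorimeter energy means flattening them across cells. A minimal sketch (field names as in the table above; the top-5 printout is just illustrative):

```python
from collections import defaultdict
from datasets import load_dataset

calo = load_dataset(
    "OpenDataDetector/ColliderML_higgs_pu0",
    "calo_hits",
    split="train[:1]",
    columns=["event_id", "contrib_particle_ids", "contrib_energies"],
)[0]

# Sum every particle's deposits over all cells it contributed to
energy_per_particle = defaultdict(float)
for cell_pids, cell_energies in zip(calo["contrib_particle_ids"], calo["contrib_energies"]):
    for pid, e in zip(cell_pids, cell_energies):
        energy_per_particle[pid] += e

top5 = sorted(energy_per_particle.items(), key=lambda kv: kv[1], reverse=True)[:5]
print(f"Event {calo['event_id']}: top 5 depositing particles (id, GeV): {top5}")
```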
#### 4. tracks (Reconstruction-level)

Reconstructed particle tracks from pattern recognition and track fitting algorithms.

| Field | Type | Description |
|---|---|---|
| `event_id` | int64 | Unique event identifier |
| `track_id` | list | Unique track identifier within event |
| `majority_particle_id` | list | Truth particle with the most hits on this track |
| `d0` | list | Transverse impact parameter (mm) |
| `z0` | list | Longitudinal impact parameter (mm) |
| `phi` | list | Azimuthal angle (radians) |
| `theta` | list | Polar angle (radians) |
| `qop` | list | Charge divided by momentum (e/GeV) |
| `hit_ids` | list<list> | List of tracker hit IDs assigned to this track |

Track parameters: Standard ACTS track representation (perigee parameters at the origin).

Derived quantities:

- Transverse momentum: `pt = abs(1/qop) * sin(theta)`
- Pseudorapidity: `eta = -ln(tan(theta/2))`
- Total momentum: `p = abs(1/qop)`

Typical event: ~50-150 reconstructed tracks per event
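The `majority_particle_id` field links each track to truth, so a simple reconstructed-vs-truth momentum comparison can be made by joining the `tracks` and `particles` configurations on particle ID. A hedged sketch, assuming the IDs are consistent across configurations within an event:

```python
import numpy as np
from datasets import load_dataset

repo = "OpenDataDetector/ColliderML_higgs_pu0"
tracks = load_dataset(repo, "tracks", split="train[:1]",
                      columns=["event_id", "majority_particle_id", "qop", "theta"])[0]
particles = load_dataset(repo, "particles", split="train[:1]",
                         columns=["event_id", "particle_id", "px", "py"])[0]

# Truth pt lookup keyed by particle_id
truth_pt = {pid: np.hypot(px, py)
            for pid, px, py in zip(particles["particle_id"], particles["px"], particles["py"])}

qop = np.array(tracks["qop"])
theta = np.array(tracks["theta"])
reco_pt = np.abs(1.0 / qop) * np.sin(theta)  # derived quantity from the list above

for mpid, pt in zip(tracks["majority_particle_id"], reco_pt):
    if mpid in truth_pt:
        print(f"particle {mpid}: reco pt {pt:.2f} GeV vs truth pt {truth_pt[mpid]:.2f} GeV")
```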
### Data Splits

Currently, the dataset does not have predefined train/validation/test splits. Users should implement their own splitting strategy based on their use case. Recommended approach:

```python
from sklearn.model_selection import train_test_split

# Example: 70% train, 15% validation, 15% test
all_events = list(range(100000))
train_val, test = train_test_split(all_events, test_size=0.15, random_state=42)
train, val = train_test_split(train_val, test_size=0.176, random_state=42)  # 0.176 * 0.85 ≈ 0.15
```
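Equivalently, if the events are already loaded as a `datasets.Dataset`, its built-in `train_test_split` method yields the same 70/15/15 split in two steps; a sketch under that assumption:

```python
from datasets import load_dataset

ds = load_dataset("OpenDataDetector/ColliderML_higgs_pu0", "particles", split="train")

# First carve off the 15% test set, then split validation out of the remainder
step1 = ds.train_test_split(test_size=0.15, seed=42)
step2 = step1["train"].train_test_split(test_size=0.176, seed=42)  # 0.176 * 0.85 ≈ 0.15

train, val, test = step2["train"], step2["test"], step1["test"]
print(len(train), len(val), len(test))
```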
## Support

For questions, issues, or feature requests:

- Email: [email protected]
- You can also open a discussion in the Hugging Face Community tab for this dataset.
## Acknowledgments

This work was supported by:

- NERSC computing resources
- U.S. Department of Energy, Office of Science
- Danish Data Science Academy (DDSA)

Last updated: October 2025
Dataset version: v1