# Model Card for CrystaLLM-pi_SLME

## Model Details

### Model Description
CrystaLLM-pi_SLME is a conditional generative model designed for the discovery of high-performance photovoltaic materials. It is a fine-tuned version of the CrystaLLM-pi framework, based on a GPT-2 decoder-only architecture. This variant employs the Property-Key-Value (PKV) attention mechanism to condition the generation of Crystallographic Information Files (CIFs) on the Spectroscopic Limited Maximum Efficiency (SLME) metric.
The model generates crystal structures based on a single target scalar property:
- SLME (%): a theoretical maximum efficiency metric for photovoltaic absorbers.
- Developed by: Bone et al. (University College London)
- Model type: Autoregressive Transformer with Prefix Attention Conditioning
- Language(s): CIF (Crystallographic Information File) syntax
- License: MIT
- Finetuned from model: c-bone/CrystaLLM-pi_base
### Model Sources
- Repository: GitHub: CrystaLLM-pi
- Paper: Discovery and recovery of crystalline materials with property-conditioned transformers (arXiv:2511.21299)
- Dataset: HuggingFace: c-bone/mpdb-slme-full
## Uses

### Direct Use
The model is intended for the exploration of chemical space for new photovoltaic candidates. Users can condition generation on high SLME values (e.g., >25%) to discover novel materials with optimal optical and electronic properties for solar energy conversion.
### Out-of-Scope Use
- Large Unit Cells: structures whose CIF representation exceeds the ~1024-token context window cannot be generated reliably (a length-check sketch follows this list).
- Production Deployment: generated structures are theoretical predictions; verification via hybrid DFT calculations and experimental synthesis is required.
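As a concrete guard, the sketch below screens a CIF string against the context limit before generation; the `tokenizer` object and its `encode` method are stand-ins for whatever CIF tokenizer the repository ships, not the actual API.

```python
# Hypothetical pre-generation check against the ~1024-token context window.
# `tokenizer` is a placeholder for the repository's CIF tokenizer; the
# .encode() interface is an assumption for illustration.

MAX_CONTEXT_TOKENS = 1024

def fits_in_context(cif_text: str, tokenizer) -> bool:
    """Return True if the tokenized CIF fits inside the model's context window."""
    return len(tokenizer.encode(cif_text)) <= MAX_CONTEXT_TOKENS
```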
## Bias, Risks, and Limitations
- Implicit Learning: The model was not explicitly trained on band gap data, but implicitly learned to target the optimal Shockley-Queisser range (1.2-1.4 eV) via the SLME metric (the SLME formulation is summarized after this list). It may be less effective at targeting SLME values driven by mechanisms outside the primary training distribution.
- Data Scarcity: The model was fine-tuned on a relatively small dataset (~5.3K materials).
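For orientation (this summary follows Yu and Zunger's 2012 formulation and is not restated from the card itself), SLME is the maximum power point of a detailed-balance current-voltage curve:

$$
\eta_{\mathrm{SLME}} = \frac{\max_{V}\, J(V)\,V}{P_{\mathrm{in}}}, \qquad
J(V) = J_{\mathrm{sc}} - J_0\left(e^{eV/k_B T} - 1\right),
$$

where $J_{\mathrm{sc}}$ integrates the absorptivity $a(E) = 1 - e^{-2\alpha(E) L}$ against the AM1.5G solar spectrum, and the dark current $J_0$ is enhanced by a factor $e^{\Delta/k_B T}$ with $\Delta = E_g^{\mathrm{da}} - E_g$, penalizing materials whose fundamental gap is not direct-allowed.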
## How to Get Started with the Model
For instructions on how to load and run generation with this model, please refer to the _load_and_generate.py script in the CrystaLLM-pi GitHub Repository.
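For orientation only, below is a minimal sketch of what conditioned generation might look like; `load_model`, `generate_cifs`, and the `target_slme` keyword are hypothetical names for illustration, not the script's actual interface.

```python
# Illustrative sketch only -- see the repository script for the real entry
# point. The import, function names, and keyword below are hypothetical.
from crystallm_pi import load_model, generate_cifs

model = load_model("c-bone/CrystaLLM-pi_SLME")

# Condition sampling on a high target SLME (%) to bias generation toward
# efficient photovoltaic absorbers.
cifs = generate_cifs(model, target_slme=27.5, num_samples=64)

print(cifs[0])  # each sample is a CIF string
```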
## Training Details

### Training Data
The model was fine-tuned on the MP SLME dataset, containing inorganic structures labeled with their calculated Spectroscopic Limited Maximum Efficiency.
- Source: Materials Project, with SLME labels derived from Walker and Butler (via c-bone/mpdb-slme-full)
- Preprocessing: CIFs are augmented and tokenized, and SLME values are normalized (a sketch of the normalization step follows this list)
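The card does not specify the normalization scheme; below is a minimal sketch assuming z-score standardization of the SLME labels.

```python
import numpy as np

def normalize_slme(values: np.ndarray) -> tuple[np.ndarray, float, float]:
    """Standardize SLME labels to zero mean and unit variance.

    Returns the normalized labels plus (mean, std), so conditioning targets
    supplied at generation time can be scaled the same way.
    """
    mean, std = float(values.mean()), float(values.std())
    return (values - mean) / std, mean, std
```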
### Training Procedure
- Architecture: GPT-2 with Property-Key-Value (PKV) encoder layers (~38.7M parameters).
- Mechanism: Prefix tuning (PKV) injects the SLME target directly into the attention mechanism (see the sketch below).
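A minimal PyTorch sketch of the idea, assuming PKV maps the scalar property to per-layer key/value prefixes that each attention block prepends to its token keys and values; the dimensions and MLP shape are illustrative, not taken from the paper.

```python
import math
import torch
import torch.nn as nn

class PKVEncoder(nn.Module):
    """Map a normalized scalar property to per-layer attention K/V prefixes."""

    def __init__(self, n_layers: int, n_heads: int, head_dim: int, prefix_len: int = 1):
        super().__init__()
        # One (key, value) pair per layer, each (n_heads, prefix_len, head_dim).
        self.shape = (n_layers, 2, n_heads, prefix_len, head_dim)
        self.mlp = nn.Sequential(
            nn.Linear(1, 256),
            nn.GELU(),
            nn.Linear(256, math.prod(self.shape)),
        )

    def forward(self, prop: torch.Tensor) -> torch.Tensor:
        # prop: (batch, 1) normalized SLME targets. Each layer concatenates its
        # prefix in front of the token keys/values, so every generated token
        # can attend to the property signal.
        return self.mlp(prop).view(prop.size(0), *self.shape)
```

One appeal of this design is that the conditioning signal reaches every layer's attention without consuming prompt tokens from the CIF context window.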
## Evaluation

### Metrics
The model is evaluated based on:
- Hit-Rate: fraction of generated materials with predicted SLME values near the target (a minimal computation sketch follows this list).
- VSUN: Validity, Stability, Uniqueness, and Novelty of the generated candidates.
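A minimal sketch of the hit-rate computation; the tolerance window below is an assumption (the paper defines the exact criterion).

```python
import numpy as np

def hit_rate(predicted_slme: np.ndarray, target: float, tol: float = 2.5) -> float:
    """Fraction of candidates whose predicted SLME (%) falls within +/- tol
    of the conditioning target. The tolerance value is illustrative."""
    return float(np.mean(np.abs(predicted_slme - target) <= tol))
```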
### Results
The model successfully generated stable, novel candidates (e.g., $Rb_2(NbBr_3)_3$) with high predicted efficiencies, demonstrating the ability to map complex structure-property relationships from limited data.
## Citation
```bibtex
@misc{bone2025discoveryrecoverycrystallinematerials,
      title={Discovery and recovery of crystalline materials with property-conditioned transformers},
      author={Cyprien Bone and Matthew Walker and Kuangdai Leng and Luis M. Antunes and Ricardo Grau-Crespo and Amil Aligayev and Javier Dominguez and Keith T. Butler},
      year={2025},
      eprint={2511.21299},
      archivePrefix={arXiv},
      primaryClass={cond-mat.mtrl-sci},
      url={https://arxiv.org/abs/2511.21299},
}
```