mmBERT Checkpoints
This repository contains the raw training checkpoints for the mmBERT models. Each model has three subfolders, one per training phase: pretrain, ext (context extension), and decay. These files work with Composer and contain all of the state needed to resume pre-training; please see the ModernBERT repository for usage details.
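As a rough illustration, the sketch below shows where a raw Composer checkpoint plugs into Composer's `Trainer` via its `load_path` argument. The Hub model id, checkpoint filename, and toy dataloader are placeholders, not part of this repository; the faithful resume procedure (model wrapping, data pipeline, schedules) lives in the ModernBERT repository, and a checkpoint will only load cleanly if the model is wrapped the same way it was during training.

```python
# Hedged sketch: resuming from a raw Composer checkpoint with Trainer(load_path=...).
# The model id, checkpoint path, and toy batch are illustrative placeholders only;
# use the ModernBERT repository's training scripts for a faithful resume.
from composer import Trainer
from composer.models import HuggingFaceModel
from torch.utils.data import DataLoader
from transformers import AutoModelForMaskedLM, AutoTokenizer

model_id = "jhu-clsp/mmBERT-base"  # assumed Hub id; adjust to the variant you downloaded
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = HuggingFaceModel(AutoModelForMaskedLM.from_pretrained(model_id), tokenizer=tokenizer)

# Toy MLM-style batch so the script runs end to end; swap in the real pre-training data.
enc = tokenizer(["an example sentence"] * 8, padding="max_length", max_length=32,
                truncation=True, return_tensors="pt")
enc["labels"] = enc["input_ids"].clone()
samples = [{k: v[i] for k, v in enc.items()} for i in range(8)]
train_dataloader = DataLoader(samples, batch_size=4)

trainer = Trainer(
    model=model,
    train_dataloader=train_dataloader,
    max_duration="10ba",                   # a few batches as a smoke test
    load_path="pretrain/latest-rank0.pt",  # hypothetical local path to a downloaded checkpoint
)
trainer.fit()
```

Because these are full Composer checkpoints, `load_path` restores not just the model weights but also optimizer, scheduler, timing, and RNG state, which is what makes a true resumption of pre-training possible.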
		Citation
@misc{marone2025mmbertmodernmultilingualencoder,
      title={mmBERT: A Modern Multilingual Encoder with Annealed Language Learning}, 
      author={Marc Marone and Orion Weller and William Fleshman and Eugene Yang and Dawn Lawrie and Benjamin Van Durme},
      year={2025},
      eprint={2509.06888},
      archivePrefix={arXiv},
      primaryClass={cs.CL},
      url={https://arxiv.org/abs/2509.06888}, 
}