license: cc0-1.0
base_model: mlx-community/Qwen2.5-Coder-7B-Instruct-4bit
tags:
  - mlx
  - cybersecurity
  - nist
  - security-controls
  - compliance
  - fine-tuned
language:
  - en

HackIDLE-NIST-Coder v1.1 (MLX 4-bit)

The most comprehensive NIST cybersecurity model: fine-tuned on 530,912 examples from 596 NIST publications.

Model Overview

This is an MLX-optimized 4-bit quantized model fine-tuned specifically for NIST cybersecurity expertise. Version 1.1 includes significant improvements over v1.0:

  • +7,206 training examples (530,912 total)
  • +28 new documents (596 NIST publications)
  • CSWP series added: CSF 2.0, Zero Trust Architecture, Post-Quantum Cryptography
  • Improved quality: Fixed 6,150 malformed DOI links

Training Results

  • Training iterations: 1,000 (+ 200 checkpoint recovery)
  • Best validation loss: 1.512 (12.5% improvement)
  • Training loss: 1.420 (final)
  • Trainable parameters: 11.5M (0.151% of 7.6B total)
  • Training time: ~5 hours on M4 Max
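The small trainable-parameter share (0.151%) points to LoRA-style adapter training. For reference, a comparable run can be launched with the mlx-lm LoRA command line; the data directory and any hyperparameters beyond the iteration count are illustrative assumptions, not the exact configuration used for this model:

# Assumptions: the dataset lives in ./data as train.jsonl / valid.jsonl;
# only the iteration count is taken from the training results above.
python -m mlx_lm.lora \
  --model mlx-community/Qwen2.5-Coder-7B-Instruct-4bit \
  --train \
  --data ./data \
  --iters 1000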

Installation

pip install mlx-lm
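To verify the installation, mlx-lm also ships a command-line generator; the prompt below is just an example:

# Quick sanity check from the command line (example prompt).
python -m mlx_lm.generate \
  --model ethanolivertroy/HackIDLE-NIST-Coder-v1.1-MLX-4bit \
  --prompt "List the six functions of NIST CSF 2.0." \
  --max-tokens 200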

Usage

from mlx_lm import load, generate

model, tokenizer = load("ethanolivertroy/HackIDLE-NIST-Coder-v1.1-MLX-4bit")

prompt = "What is Zero Trust Architecture according to NIST SP 800-207?"
response = generate(model, tokenizer, prompt=prompt, max_tokens=500)
print(response)
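Because the base model is instruction-tuned, applying the chat template before generation typically produces better-structured answers. This is a minimal sketch assuming the loaded tokenizer exposes the standard Hugging Face apply_chat_template method:

from mlx_lm import load, generate

model, tokenizer = load("ethanolivertroy/HackIDLE-NIST-Coder-v1.1-MLX-4bit")

# Build a chat-formatted prompt (assumes the wrapped tokenizer exposes the
# standard Hugging Face apply_chat_template method).
messages = [{"role": "user", "content": "Map NIST SP 800-53 AC-2 to the CSF 2.0 Protect function."}]
prompt = tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)

response = generate(model, tokenizer, prompt=prompt, max_tokens=500)
print(response)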

Other Formats

License

CC0 1.0 Universal (Public Domain). All NIST publications are in the public domain.


Version: 1.1
Release Date: October 2025