bird-of-paradise committed
Commit fe9be17 · verified · 1 Parent(s): a90af63

cross-referencing extended work.

Files changed (1): README.md (+8 -0)
README.md CHANGED

```diff
@@ -27,6 +27,7 @@ datasets:
 8. [Optimization Techniques](#8-optimization-techniques)
 9. [Lessons Learned](#9-lessons-learned-from-implementing-muon)
 10. [Conclusion](#10-conclusion)
+11. [Extended Work](#11-extended-work)
 
 ## 🧪 Try It Yourself
 
@@ -253,7 +254,14 @@ Muon represents a significant advancement in neural network optimization by addr
 
 As you build and train your own models, consider Muon for hidden layer optimization, especially during pre-training phases where building new capabilities is the priority.
 
+
+## 11. Extended Work
+For the distributed (DP × TP) implementation built for CPU/Gloo environments, see:
+
+[🧩 The "Muon is Scalable" Blueprint: A Distributed Muon Engineering Breakdown (CPU-Friendly, Tutorial Style)](https://huggingface.co/datasets/bird-of-paradise/muon-distributed)
+
 ---
+
 👉 If you find this useful, please ⭐️ the repo or cite it in your work — it helps support the project.
 
 ## 📖 Citation
```