Quantization made by Richard Erkhov.

[Github](https://github.com/RichardErkhov)

[Discord](https://discord.gg/pvy7H8DZMG)

[Request more models](https://github.com/RichardErkhov/quant_request)


Llama3-KALE-LM-Chem-1.5-8B - bnb 8bits
- Model creator: https://huggingface.co/USTC-KnowledgeComputingLab/
- Original model: https://huggingface.co/USTC-KnowledgeComputingLab/Llama3-KALE-LM-Chem-1.5-8B/

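A minimal loading sketch for this 8-bit build, assuming a recent `transformers` release with `bitsandbytes` installed; the repository id below is a hypothetical placeholder for this upload, not a confirmed name:

```python
# Minimal sketch: load the bitsandbytes 8-bit checkpoint and run one prompt.
# The repo id is a hypothetical placeholder -- substitute the id of this upload.
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "RichardErkhov/Llama3-KALE-LM-Chem-1.5-8B-bnb-8bits"  # placeholder

tokenizer = AutoTokenizer.from_pretrained(repo_id)
# The checkpoint stores its bnb 8-bit quantization config, so from_pretrained
# picks it up automatically; a CUDA device and the bitsandbytes package are required.
model = AutoModelForCausalLM.from_pretrained(repo_id, device_map="auto")

prompt = "What is the molar mass of water?"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```
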
Original model description:
---
license: llama3
language:
- en
base_model:
- meta-llama/Meta-Llama-3-8B-Instruct
tags:
- KALE-LM
- science
- chemistry
pipeline_tag: text-generation
---

# Llama3-KALE-LM-Chem-1.5-8B

## Introduction

We are thrilled to present Llama3-KALE-LM-Chem-1.5-8B, a new version of our open-source KALE-LM for science, which specializes in chemistry.

Compared with the previous release, this version has been trained on a larger amount of data.

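As a minimal, non-authoritative usage sketch, assuming the repository ships the standard Llama 3 Instruct chat template inherited from its base model, a chemistry question can be formatted like this:

```python
# Sketch: format a chemistry question with the tokenizer's chat template.
# Assumes the repo ships the standard Llama 3 Instruct template from its base model.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("USTC-KnowledgeComputingLab/Llama3-KALE-LM-Chem-1.5-8B")

messages = [
    {"role": "user", "content": "Predict the major product of benzene reacting with Br2 over an FeBr3 catalyst."},
]
# add_generation_prompt appends the assistant header so the model answers as the assistant.
prompt = tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)
print(prompt)
```
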
## Benchmarks

### Open Benchmarks

| Models | ChemBench | MMLU | MMLU-Chem | SciQ | IE(Acc) | IE(LS) |
| ---- | ---- | ---- | ---- | ---- | ---- | ---- |
| GPT-3.5 | 47.15 | 69.75 | 53.32 | 89.60 | 52.98 | 68.28 |
| GPT-4 | 53.72 | 78.67 | 63.70 | 94.10 | 54.20 | 69.74 |
| Llama3-8B-Instruct | 46.02 | 68.30 | 51.10 | 93.30 | 45.83 | 61.22 |
| LlaSMol | 28.47 | 54.47 | 33.24 | 72.30 | 2.16 | 3.23 |
| ChemDFM | 44.44 | 58.11 | 45.60 | 86.70 | 7.61 | 11.49 |
| ChemLLM-7B-Chat | 34.16 | 61.79 | 48.39 | 94.00 | 29.66 | 39.17 |
| ChemLLM-7B-Chat-1.5-SFT | 42.75 | 63.56 | 49.63 | **95.10** | 14.96 | 19.61 |
| **Llama3-KALE-LM-Chem-1.5-8B** | **57.01** | 68.06 | **54.83** | 91.60 | **57.53** | **64.16** |

#### ChemBench Details (Evaluated By OpenCompass)

| Models | NC | PP | M2C | C2M | PP | RS | YP | TP | SP | Average |
| ------ | ------ | ------ | ------ | ------ | ------ | ------ | ------ | ------ | ------ | ------ |
| GPT-3.5 | 46.93 | 56.98 | 85.28 | 38.25 | 43.67 | 42.33 | 30.33 | 42.57 | 38.00 | 47.15 |
| GPT-4 | 54.82 | 65.02 | 92.64 | 52.88 | 62.67 | 52.67 | 42.33 | 24.75 | 35.67 | 53.72 |
| Llama3-8B-Instruct | 51.31 | 27.79 | 90.30 | 40.88 | 34.00 | 30.00 | 45.33 | 60.89 | 33.67 | 46.02 |
| LlaSMol | 27.78 | 29.34 | 31.44 | 23.38 | 25.67 | 24.00 | 37.33 | 34.65 | 22.67 | 28.47 |
| ChemDFM | 36.92 | 55.57 | 83.95 | 42.00 | 40.00 | 37.33 | 39.00 | 33.17 | 32.00 | 44.44 |
| ChemLLM-7B-Chat | 41.05 | 29.76 | 85.28 | 26.12 | 26.00 | 24.00 | 20.00 | 24.26 | 31.00 | 34.16 |
| ChemLLM-7B-Chat-1.5-SFT | 50.06 | 49.51 | 85.28 | 38.75 | 38.00 | 26.67 | 28.33 | 31.68 | 33.67 | 42.44 |
| Llama3-KALE-LM-Chem-1.5-8B | 61.33 | 43.44 | 90.30 | 53.62 | 72.67 | 53.67 | 46.00 | 47.03 | 45.00 | 57.01 |

## Cite This Work

```bibtex
@article{dai2024kale,
  title={KALE-LM: Unleash The Power Of AI For Science Via Knowledge And Logic Enhanced Large Model},
  author={Dai, Weichen and Chen, Yezeng and Dai, Zijie and Huang, Zhijie and Liu, Yubo and Pan, Yixuan and Song, Baiyang and Zhong, Chengli and Li, Xinhe and Wang, Zeyu and others},
  journal={arXiv preprint arXiv:2409.18695},
  year={2024}
}
```