Commit 12930ff · update usage
Parent(s): fbbb799

README.md CHANGED
---
license: mit
license_link: https://huggingface.co/microsoft/wavecoder-pro-6.7b/blob/main/LICENSE
language:
- en
library_name: transformers
datasets:
- humaneval
pipeline_tag: text-generation
tags:
- code
metrics:
- code_eval
---

<h1 align="center">
🌊 WaveCoder: Widespread And Versatile Enhanced Code LLM
</h1>

<p align="center">
<a href="https://arxiv.org/abs/2312.14187"><b>[📜 Paper]</b></a> •
<!-- <a href=""><b>[🤗 HF Models]</b></a> • -->
Repo for "<a href="https://arxiv.org/abs/2312.14187" target="_blank">WaveCoder: Widespread And Versatile Enhanced Instruction Tuning with Refined Data Generation</a>"
</p>

## 🔥 News

- [2024/04/10] 🔥🔥🔥 WaveCoder repo and models released on [🤗 HuggingFace](https://huggingface.co/microsoft/wavecoder-ultra-6.7b)!
- [2023/12/26] WaveCoder paper released.

## 💡 Introduction

WaveCoder 🌊 is a series of large language models (LLMs) for the coding domain, designed to solve code-related problems through instruction-following learning. Its training dataset was generated from a subset of CodeSearchNet data using an LLM-based generator-discriminator framework we propose, and covers four general code-related tasks: code generation, code summarization, code translation, and code repair.

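As a rough illustration of the generator-discriminator idea (a hypothetical sketch only; the function names, prompts, and filtering rule below are invented for illustration and are not taken from the paper or repo), one LLM proposes instruction data from raw code while another filters it:

```python
from typing import Callable, Iterable

# All names and prompts below are hypothetical illustrations.
TASKS = ["code generation", "code summarization", "code translation", "code repair"]

def build_dataset(llm: Callable[[str], str], snippets: Iterable[str]) -> list[dict]:
    """Hypothetical generator-discriminator loop over raw code snippets."""
    dataset = []
    for code in snippets:
        for task in TASKS:
            # Generator: turn a raw snippet into an instruction/answer pair
            instruction = llm(f"Write a {task} instruction for this code:\n{code}")
            answer = llm(f"{instruction}\n\n{code}")
            example = {"task": task, "instruction": instruction, "answer": answer}
            # Discriminator: keep only examples judged correct and useful
            verdict = llm(f"Answer yes or no: is this example correct and useful?\n{example}")
            if verdict.strip().lower().startswith("yes"):
                dataset.append(example)
    return dataset
```
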
| Model                                                                            | HumanEval | MBPP(500) | HumanEval<br>Fix(Avg.) | HumanEval<br>Explain(Avg.) |
| -------------------------------------------------------------------------------- | --------- | --------- | ---------------------- | -------------------------- |
| GPT-4                                                                            | 85.4      | -         | 47.8                   | 52.1                       |
| [🌊 WaveCoder-DS-6.7B](https://huggingface.co/microsoft/wavecoder-ds-6.7b)        | 65.8      | 63.0      | 49.5                   | 40.8                       |
| [🌊 WaveCoder-Pro-6.7B](https://huggingface.co/microsoft/wavecoder-pro-6.7b)      | 74.4      | 63.4      | 52.1                   | 43.0                       |
| [🌊 WaveCoder-Ultra-6.7B](https://huggingface.co/microsoft/wavecoder-ultra-6.7b)  | 79.9      | 64.6      | 52.3                   | 45.7                       |

## 🪁 Evaluation

Please refer to WaveCoder's [GitHub repo](https://github.com/microsoft/WaveCoder) for inference, evaluation, and training code.

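The HumanEval and MBPP numbers above are execution-based pass@k metrics. For background, here is a generic sketch of the standard unbiased pass@k estimator (the form popularized by the Codex paper, not code from the WaveCoder repo):

```python
import math

def pass_at_k(n: int, c: int, k: int) -> float:
    """Unbiased pass@k estimate from n sampled completions, c of which pass the tests."""
    if n - c < k:
        return 1.0
    # 1 - C(n - c, k) / C(n, k), computed stably as a running product
    return 1.0 - math.prod(1.0 - k / i for i in range(n - c + 1, n + 1))

# Example: 20 samples per problem, 4 pass the unit tests -> pass@1 = 0.2
print(pass_at_k(n=20, c=4, k=1))
```
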
## How to get started with the model

```python
# Load model directly from the Hugging Face Hub
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("microsoft/wavecoder-ds-6.7b")
model = AutoModelForCausalLM.from_pretrained("microsoft/wavecoder-ds-6.7b")
```

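As a minimal usage sketch building on the loading snippet above (the prompt and decoding settings here are illustrative assumptions, not part of the original card):

```python
import torch

# Illustrative prompt; WaveCoder models are instruction-tuned for code tasks
prompt = "Write a Python function that checks whether a string is a palindrome."
inputs = tokenizer(prompt, return_tensors="pt")

# Greedy decoding with a small budget; tune these settings for real use
with torch.no_grad():
    output_ids = model.generate(**inputs, max_new_tokens=128)

print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```
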
## 📖 License

This code repository is licensed under the MIT License. The use of DeepSeek Coder models is subject to its [License](https://github.com/deepseek-ai/DeepSeek-Coder/blob/main/LICENSE-MODEL).

## ☕️ Citation

If you find this repository helpful, please consider citing our paper:

```
...
  year={2023}
}
```

## Note

WaveCoder models are trained on synthetic data generated by OpenAI models. Please pay attention to OpenAI's [terms of use](https://openai.com/policies/terms-of-use) when using the models and the datasets.