Commit bf1c57e
Parent(s): 81cf1a5
Update model_cards/article.md

model_cards/article.md CHANGED (+17 -34)

Old version:

@@ -2,45 +2,33 @@

**Language model**: Type of language model to be used.

-**
-**
-**
-**Prefix**: A text prompt that will be passed to the model **before** the prompt.
-**Top-k**: Number of top-k probability tokens to keep.
-**
-**Model
-**Developers**: HuggingFace developers
-**Distributors**: HuggingFace developers' code integrated into GT4SD.
-**Model date**: Varies between models.
-**Model type**: Different types of `transformers` language models:
-  - CTRL: `CTRLLMHeadModel`
-  - GPT2: `GPT2LMHeadModel`
-  - XLNet: `XLNetLMHeadModel`
-  - OpenAIGPT: `OpenAIGPTLMHeadModel`
-  - TransfoXL: `TransfoXLLMHeadModel`
-  - XLM: `XLMWithLMHeadModel`
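
For orientation, here is a minimal sketch of how the removed parameters (a prefix passed to the model before the prompt, top-k sampling) would be wired up with one of the `transformers` classes listed above. The checkpoint name, prompt text and parameter values are illustrative assumptions, not part of the original card.

```python
# Minimal sketch, not the GT4SD integration itself: plain `transformers` usage of one of
# the LM-head classes listed above, with a prefix prepended to the prompt and top-k sampling.
# The checkpoint, prompt text and parameter values are illustrative assumptions.
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")

prefix = "Recent advances in organic chemistry have"  # passed to the model before the prompt
prompt = " shown that"
input_ids = tokenizer(prefix + prompt, return_tensors="pt").input_ids

output_ids = model.generate(
    input_ids,
    do_sample=True,                       # sampling-based decoding
    top_k=50,                             # keep only the top-k probability tokens at each step
    max_length=80,
    pad_token_id=tokenizer.eos_token_id,  # GPT-2 has no pad token; reuse EOS
)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```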

**Information about training algorithms, parameters, fairness constraints or other applied approaches, and features**:
N.A.

**Paper or other resource for more information**:
-

**License**: MIT

@@ -64,15 +52,10 @@ Model card prototype inspired by [Mitchell et al. (2019)](https://dl.acm.org/doi
## Citation
```bib
-@
-
-
-
-
-    year = "2020",
-    address = "Online",
-    publisher = "Association for Computational Linguistics",
-    url = "https://www.aclweb.org/anthology/2020.emnlp-demos.6",
-    pages = "38--45"
}
```

New version:

**Language model**: Type of language model to be used.

+**Prefix**: Task-specific prefix for task definition (see the provided examples for specific tasks).
+**Text prompt**: The text input to the model.
+**Num beams**: Number of beams to be used for text generation.
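
As a rough illustration of how these three inputs combine, the sketch below calls a seq2seq checkpoint directly through `transformers` with a task prefix, a text prompt and beam search. The checkpoint name, prefix wording and beam count are assumptions for illustration rather than values prescribed by the card; within GT4SD these fields are exposed as the parameters documented above rather than set in code.

```python
# Minimal sketch of prefix + text prompt + num beams, assuming a T5-style seq2seq checkpoint.
# The checkpoint name and the prefix wording are assumptions, not values prescribed by the card.
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

checkpoint = "GT4SD/multitask-text-and-chemistry-t5-base-augm"  # assumed checkpoint name
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForSeq2SeqLM.from_pretrained(checkpoint)

prefix = "Write in natural language the description of the molecule: "  # task-specific prefix (assumed wording)
text_prompt = "CC(=O)Oc1ccccc1C(=O)O"  # the text input of the model (here, a SMILES string)

inputs = tokenizer(prefix + text_prompt, return_tensors="pt")
outputs = model.generate(**inputs, num_beams=5, max_length=128)  # beam search with 5 beams
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```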

+# Model card -- Text+Chem T5

+**Model Details**: Text+Chem T5: a multi-domain, multi-task language model that solves a wide range of tasks in both the chemical and natural language domains. Published by [Christofidellis et al.](https://arxiv.org/pdf/2301.12586.pdf)
+**Developers**: Dimitrios Christofidellis, Giorgio Giannone, Jannis Born and Matteo Manica from IBM Research, and Ole Winther from the Technical University of Denmark.
+**Distributors**: Model natively integrated into GT4SD.
+**Model date**: 2022.
+**Model type**: A Transformer-based language model trained on a multi-domain, multi-task dataset assembled by aggregating available datasets for the tasks of forward reaction prediction, retrosynthesis, molecular captioning, text-conditional de novo generation and paragraph-to-actions.
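
To make the multi-task setup concrete, the sketch below steers the same checkpoint towards different tasks purely by swapping the prefix. The checkpoint name and every prefix string are hypothetical placeholders; the exact prompts used by Text+Chem T5 are defined in the paper and in the provided examples.

```python
# Sketch of the multi-task usage: one checkpoint, different behaviour per task prefix.
# All prefix strings and the checkpoint name are hypothetical placeholders.
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

checkpoint = "GT4SD/multitask-text-and-chemistry-t5-base-augm"  # assumed checkpoint name
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForSeq2SeqLM.from_pretrained(checkpoint)

# Hypothetical prefixes for the five tasks listed in the card.
task_prefixes = {
    "forward_prediction": "Predict the product of the following reaction: ",
    "retrosynthesis": "Predict the reactants that produce the following product: ",
    "molecular_captioning": "Caption the following molecule: ",
    "text_conditional_de_novo": "Write the SMILES of the molecule described by: ",
    "paragraph_to_actions": "Which actions are described in the following paragraph: ",
}

def run_task(task: str, text: str, num_beams: int = 5) -> str:
    """Prepend the task prefix and decode with beam search."""
    inputs = tokenizer(task_prefixes[task] + text, return_tensors="pt")
    outputs = model.generate(**inputs, num_beams=num_beams, max_length=256)
    return tokenizer.decode(outputs[0], skip_special_tokens=True)

print(run_task("forward_prediction", "CC(=O)Cl.OCC"))  # e.g. acetyl chloride + ethanol
```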

**Information about training algorithms, parameters, fairness constraints or other applied approaches, and features**:
N.A.

**Paper or other resource for more information**:
+The Text+Chem T5 paper by [Christofidellis et al.](https://arxiv.org/pdf/2301.12586.pdf)

**License**: MIT

## Citation
```bib
+@article{christofidellis2023unifying,
+    title={Unifying Molecular and Textual Representations via Multi-task Language Modelling},
+    author={Christofidellis, Dimitrios and Giannone, Giorgio and Born, Jannis and Winther, Ole and Laino, Teodoro and Manica, Matteo},
+    journal={arXiv preprint arXiv:2301.12586},
+    year={2023}
}
```