Upload README.md with huggingface_hub
README.md CHANGED

````diff
--- a/README.md
+++ b/README.md
@@ -40,7 +40,7 @@ Chocolate cake.
 ```
 Note that a beginning-of-sequence (BOS) token is automatically added by all Archangel models during tokenization and does not have to be added by you. No end-of-sequence (EOS) token is added to the prompt.
 
-For models trained with our conditional 
+For models trained with our conditional SFT method, the tokenizers have additional tokens `<|good|>` and `<|bad|>` included in the embeddings.
 
 To generate with these control tokens in the context, append either one to the prompt.
 
 Please refer to our [code repository](https://github.com/ContextualAI/HALOs) or [blog](https://contextual.ai/better-cheaper-faster-llm-alignment-with-kto/), which contains instructions for training your own HALOs and links to our model cards.
````
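The control-token usage described in the updated README can be sketched as follows. This is an illustrative assumption, not code from the Archangel release: the helper name is invented, and the `transformers`-style loading shown in the comments is only a pointer to how the result would typically be consumed.

```python
# Sketch (assumption): appending a conditional-SFT control token to a prompt.
# Per the README, Archangel tokenizers add the BOS token automatically, so the
# raw prompt string only needs the control token appended before tokenization.

GOOD, BAD = "<|good|>", "<|bad|>"

def with_control_token(prompt: str, desired: bool = True) -> str:
    """Append <|good|> (or <|bad|>) to steer a conditional-SFT model."""
    return prompt + (GOOD if desired else BAD)

# In practice you would then tokenize and generate, e.g. (illustrative only):
#   tok = AutoTokenizer.from_pretrained(model_id)  # a conditional-SFT checkpoint
#   ids = tok(with_control_token("Tell me a joke."), return_tensors="pt")
print(with_control_token("Tell me a joke."))  # → Tell me a joke.<|good|>
```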
