An infinite max input token length in the tokenizer
#146 · by kdduha · opened
Hi! Is this intentional? The tokenizer's max input token length (`model_max_length`) in tokenizer_config.json appears to be set to an effectively infinite value: https://huggingface.co/openai/gpt-oss-120b/blob/main/tokenizer_config.json#L180
The max context length is 131,072, as far as I know from the model card:
https://platform.openai.com/docs/models/gpt-oss-120b
Yeah, this seems off. As an alternative, consider reading the context length from the model config instead of the tokenizer:

from transformers import AutoConfig

# Pull the real context window from the model config rather than the tokenizer
config = AutoConfig.from_pretrained("openai/gpt-oss-120b")
context_length = config.max_position_embeddings  # 131072
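
In case it helps, here's a minimal sketch (assuming the standard transformers AutoTokenizer API; not tested against this exact repo) that overrides the tokenizer's sentinel value with the context length from the config:

from transformers import AutoConfig, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("openai/gpt-oss-120b")
print(tokenizer.model_max_length)  # the effectively infinite value from tokenizer_config.json

# Override with the real context window so truncation behaves sensibly
config = AutoConfig.from_pretrained("openai/gpt-oss-120b")
tokenizer.model_max_length = config.max_position_embeddings

encoded = tokenizer("some very long prompt", truncation=True)  # now capped at 131,072 tokens

This keeps truncation tied to the model's actual context window instead of the placeholder in the config file.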