Infinite max input token length in the tokenizer

#146
by kdduha - opened

Yeah, this seems off. As an alternative, consider reading the context length from the model config:

    from transformers import AutoConfig

    # read the model's context window from its config instead of the tokenizer
    config = AutoConfig.from_pretrained("openai/gpt-oss-120b")
    context_length = config.max_position_embeddings
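
If the goal is to cap tokenized inputs at the model's context window, a minimal sketch along these lines (assuming the standard `AutoTokenizer` API; the input string is just a placeholder) could be:

    from transformers import AutoConfig, AutoTokenizer

    config = AutoConfig.from_pretrained("openai/gpt-oss-120b")
    tokenizer = AutoTokenizer.from_pretrained("openai/gpt-oss-120b")

    # tokenizer.model_max_length can be a huge sentinel value when the checkpoint
    # leaves it unset, so truncate against the model's real context window instead
    inputs = tokenizer(
        "Some long input text ...",
        truncation=True,
        max_length=config.max_position_embeddings,
    )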
