# multi-lang_summay

A fine-tuned sequence-to-sequence (seq2seq) model for multilingual abstractive summarization.

## Usage

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM
import torch

repo_id = "vatsal18/multi-lang_summay"
tok = AutoTokenizer.from_pretrained(repo_id)
mdl = AutoModelForSeq2SeqLM.from_pretrained(repo_id).eval()

text = "Paste any article (any supported language) here."
# Tokenize, truncating inputs to the 1024-token limit.
enc = tok(text, return_tensors="pt", truncation=True, max_length=1024)
# Beam search with a mild length penalty favors concise summaries.
with torch.no_grad():
    out = mdl.generate(**enc, max_new_tokens=128, num_beams=4, length_penalty=0.8)
print(tok.decode(out[0], skip_special_tokens=True))
```
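Note that `truncation=True` silently discards everything past the 1024-token limit. For longer articles, one common workaround is to summarize overlapping token windows independently and then join or re-summarize the results. A minimal sketch of the windowing step (the `chunk_tokens` helper and its window sizes are illustrative, not part of this model's API):

```python
def chunk_tokens(ids, size=1024, overlap=128):
    """Split a list of token ids into overlapping windows of at most `size`.

    Consecutive windows share `overlap` tokens so sentences cut at a
    boundary still appear whole in at least one window.
    """
    if size <= overlap:
        raise ValueError("size must exceed overlap")
    step = size - overlap
    chunks = []
    for start in range(0, len(ids), step):
        chunks.append(ids[start:start + size])
        if start + size >= len(ids):
            break
    return chunks

# Each window can then be decoded and summarized with the snippet above, e.g.:
# windows = chunk_tokens(enc["input_ids"][0].tolist())
# partial = [summarize(tok.decode(w, skip_special_tokens=True)) for w in windows]
```

Joining the partial summaries (or feeding them back through the model once more) is a judgment call that depends on how redundant the chunks are.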
## Model details

- Format: Safetensors
- Model size: 0.6B params
- Tensor type: F32