Latest commit: "Upload 25 files" (ee215dd, verified)

All entries below belong to that commit. A "-" marks a value that was not captured in the listing.

File                                 Size
models--FacebookAI--roberta-large    -
-                                    9.06 kB
classifier_head.pt                   4.21 MB
-                                    482 Bytes
-                                    1.42 GB
merged_tsv.pt                        1.42 GB
merged_wudi.pt                       1.42 GB
-                                    456 kB
-                                    1.63 GB
-                                    1.42 GB
-                                    1.43 GB
-                                    1.63 GB
-                                    1.36 MB
-                                    25 Bytes
-                                    899 kB

classifier_head.pt: Detected Pickle imports (8)
- torch.FloatStorage
- torch._utils._rebuild_parameter
- transformers.models.roberta.modeling_roberta.RobertaClassificationHead
- torch.nn.modules.linear.Linear
- torch._utils._rebuild_tensor_v2
- __builtin__.set
- torch.nn.modules.dropout.Dropout
- collections.OrderedDict

merged_tsv.pt / merged_wudi.pt: Detected Pickle imports (25; the two files report an identical list)
- transformers.models.roberta.modeling_roberta.RobertaIntermediate
- torch.FloatStorage
- transformers.models.roberta.modeling_roberta.RobertaEncoder
- torch.nn.modules.linear.Linear
- transformers.models.roberta.configuration_roberta.RobertaConfig
- transformers.models.roberta.modeling_roberta.RobertaClassificationHead
- transformers.models.roberta.modeling_roberta.RobertaForSequenceClassification
- __builtin__.set
- transformers.models.roberta.modeling_roberta.RobertaModel
- torch.nn.modules.container.ModuleList
- collections.OrderedDict
- transformers.models.roberta.modeling_roberta.RobertaOutput
- torch.nn.modules.normalization.LayerNorm
- torch._utils._rebuild_tensor_v2
- transformers.models.roberta.modeling_roberta.RobertaSelfAttention
- transformers.models.roberta.modeling_roberta.RobertaLayer
- transformers.activations.GELUActivation
- transformers.models.roberta.modeling_roberta.RobertaSelfOutput
- torch._utils._rebuild_parameter
- torch.nn.modules.sparse.Embedding
- transformers.models.roberta.modeling_roberta.RobertaAttention
- torch.nn.modules.dropout.Dropout
- transformers.models.roberta.modeling_roberta.RobertaEmbeddings
- torch._C._nn.gelu
- torch.LongStorage
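The "Detected Pickle imports" lists above come from statically scanning the pickle byte stream of each `.pt` file: the `GLOBAL`/`STACK_GLOBAL` opcodes name every `module.attribute` the stream would import on load, which is exactly what makes loading untrusted pickles risky. The sketch below, using only the standard library, shows (a) a minimal scanner in the same spirit as such warnings (it is an assumption-laden simplification, not the actual scanner used on these files) and (b) the standard mitigation from the `pickle` documentation, an `Unpickler` whose `find_class` only admits an allowlist:

```python
import collections
import io
import pickle
import pickletools

STRING_OPS = {"SHORT_BINUNICODE", "BINUNICODE", "BINUNICODE8", "UNICODE"}


def scan_pickle_imports(data: bytes) -> set[str]:
    """List every module.name a pickle stream would import, WITHOUT loading it."""
    found = set()
    strings = []  # recently pushed string constants (feeds STACK_GLOBAL)
    for opcode, arg, _pos in pickletools.genops(data):
        if opcode.name == "GLOBAL":          # protocol <= 3: arg is "module name"
            found.add(arg.replace(" ", "."))
        elif opcode.name == "STACK_GLOBAL":  # protocol >= 4: module/name pushed earlier
            # Heuristic: the two most recent string constants are module and name.
            if len(strings) >= 2:
                found.add(f"{strings[-2]}.{strings[-1]}")
        elif opcode.name in STRING_OPS:
            strings.append(arg)
    return found


class AllowlistUnpickler(pickle.Unpickler):
    """Refuse any import outside an explicit allowlist (per the pickle docs' pattern)."""

    ALLOWED = {("collections", "OrderedDict")}  # extend for the classes you trust

    def find_class(self, module, name):
        if (module, name) not in self.ALLOWED:
            raise pickle.UnpicklingError(f"blocked import: {module}.{name}")
        return super().find_class(module, name)


# Scan before loading: a harmless state dict vs. a stream importing builtins.eval.
safe_blob = pickle.dumps(collections.OrderedDict(weight=1.0))
evil_blob = pickle.dumps(eval)  # pickles a *reference* to builtins.eval

print(scan_pickle_imports(safe_blob))  # includes 'collections.OrderedDict'
print(scan_pickle_imports(evil_blob))  # includes 'builtins.eval'

loaded = AllowlistUnpickler(io.BytesIO(safe_blob)).load()  # succeeds
try:
    AllowlistUnpickler(io.BytesIO(evil_blob)).load()
except pickle.UnpicklingError as exc:
    print(exc)  # blocked import: builtins.eval
```

For files like `classifier_head.pt`, whose scan shows whole `nn.Module` classes rather than bare tensors, the allowlist would have to admit every listed class; in practice the cleaner fix is to re-save only the tensor state (a state dict, or the safetensors format) so that loading needs no arbitrary imports at all.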