Coref Roberta Large by nielsr


Tags: arXiv:2004.06870 · Datasets: docred, fever, gap, glue, quoref, wikipedia, winogender, winograd_wsc · Language: en · PyTorch · Safetensors · ExBERT · Endpoints compatible · Region: US

Coref Roberta Large Benchmarks

nn.n% — how the model compares to the reference models: Anthropic Claude 3.5 Sonnet ("so35"), GPT-4o ("gpt4o"), or GPT-4 ("gpt4").
Coref Roberta Large (nielsr/coref-roberta-large)

Coref Roberta Large Parameters and Internals

Model Type 
Masked Language Model, Mention Reference Prediction
Additional Notes 
This model card was not provided by the CorefRoBERTa team; it was written by a third party.
Supported Languages 
en (English)
Training Details 
Data Sources:
wikipedia, quoref, docred, fever, gap, winograd_wsc, winogender, glue
Methodology:
Pretrained with two objectives: Masked Language Modeling (MLM) and Mention Reference Prediction (MRP), which trains the model to recover masked repeated mentions and thereby capture coreferential reasoning.
Model Architecture:
Transformers
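To illustrate the MLM pretraining objective named above, here is a minimal pure-Python sketch of BERT-style token masking (the standard 80/10/10 scheme: replace with a mask token, replace with a random token, or keep unchanged). The function name, the toy vocabulary, and the probabilities are illustrative assumptions, not the CorefRoBERTa training code.

```python
import random

MASK = "<mask>"
# Toy vocabulary for the "replace with a random token" branch (assumption).
VOCAB = ["the", "model", "predicts", "tokens", "mentions"]

def mask_tokens(tokens, mask_prob=0.15, seed=0):
    """BERT-style masking sketch: select ~15% of positions; of those,
    replace 80% with the mask token, 10% with a random token, and keep
    10% unchanged. Returns (masked_tokens, labels), where labels hold
    the original token at selected positions and None elsewhere."""
    rng = random.Random(seed)
    masked = list(tokens)
    labels = [None] * len(tokens)
    for i, tok in enumerate(tokens):
        if rng.random() < mask_prob:
            labels[i] = tok  # the model is trained to predict this token
            roll = rng.random()
            if roll < 0.8:
                masked[i] = MASK
            elif roll < 0.9:
                masked[i] = rng.choice(VOCAB)
            # else: keep the original token unchanged
    return masked, labels

tokens = "the model predicts masked tokens and repeated mentions".split()
masked, labels = mask_tokens(tokens)
```

The MRP objective works analogously, but specifically masks repeated mentions of an entity and asks the model to recover them by copying from their earlier occurrences in context.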
LLM Name: Coref Roberta Large
Repository: https://huggingface.co/nielsr/coref-roberta-large
Model Size: 356.5M parameters
Required VRAM: 1.4 GB
Updated: 2025-02-19
Maintainer: nielsr
Model Files: 1.4 GB
Supported Languages: en
Model Architecture: AutoModel
License: apache-2.0
Context Length: 514
Model Max Length: 514
Vocabulary Size: 50,265



Original data from Hugging Face, OpenCompass, and various public git repositories.
Release v20241227