Bart Base by facebook


Tags: Arxiv:1910.13461, Bart, En, Endpoints compatible, Feature-extraction, Jax, Pytorch, Region:us, Safetensors, Tf
Model Card on HF 🤗: https://huggingface.co/facebook/bart-base


Bart Base Parameters and Internals

Model Type 
Transformer encoder-decoder (seq2seq)
Use Cases
Primary Use Cases:
Text generation (e.g., summarization, translation) and comprehension tasks (e.g., text classification, question answering)
Limitations:
The pretrained model is mostly intended to be fine-tuned on a supervised dataset for a downstream task; see the usage sketch below.
Additional Notes 
The team releasing BART did not write a model card for this model; it was written by the Hugging Face team.
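For out-of-the-box use, the checkpoint works as a feature extractor. A minimal sketch using the Hugging Face Transformers API (class names follow the linked model card; the input sentence is an arbitrary placeholder):

```python
# Minimal sketch: extracting features from facebook/bart-base.
# Assumes the transformers and torch packages are installed.
from transformers import BartTokenizer, BartModel

tokenizer = BartTokenizer.from_pretrained("facebook/bart-base")
model = BartModel.from_pretrained("facebook/bart-base")

# Tokenize an example sentence; inputs longer than the 1024-token
# context length are truncated.
inputs = tokenizer("Hello, my dog is cute", return_tensors="pt",
                   truncation=True, max_length=1024)

# Forward pass through encoder and decoder; last_hidden_state can be
# used as token-level features for a downstream task.
outputs = model(**inputs)
features = outputs.last_hidden_state  # shape: (batch, seq_len, 768)
```

For the downstream tasks listed above (summarization, translation, classification, question answering), the usual route is to fine-tune a task head such as BartForConditionalGeneration or BartForSequenceClassification on labeled data rather than to use the raw pretrained model directly.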
LLM Name: Bart Base
Repository 🤗: https://huggingface.co/facebook/bart-base
Model Size: 139.4M
Required VRAM: 0.6 GB
Updated: 2025-02-22
Maintainer: facebook
Model Type: bart
Model Files: 0.6 GB
Supported Languages: en
Model Architecture: BartModel
License: apache-2.0
Context Length: 1024
Model Max Length: 1024
Transformers Version: 4.12.0.dev0
Vocabulary Size: 50265
Torch Data Type: float32
Activation Function: gelu
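
The architecture values listed above (context length, vocabulary size, activation function, parameter count) can be cross-checked against the published checkpoint. A sketch, assuming the standard transformers AutoConfig/BartModel API and that the 0.6 GB figure refers to the float32 weight files:

```python
# Sketch: verifying the listed specs against the published config and weights.
# Assumes transformers and torch are installed; attribute names follow BartConfig.
from transformers import AutoConfig, BartModel

config = AutoConfig.from_pretrained("facebook/bart-base")
print(config.max_position_embeddings)   # 1024  -> context length / model max length
print(config.vocab_size)                # 50265 -> vocabulary size
print(config.activation_function)       # "gelu"

model = BartModel.from_pretrained("facebook/bart-base")
n_params = sum(p.numel() for p in model.parameters())
print(f"{n_params / 1e6:.1f}M parameters")            # ~139M, matching "Model Size"
print(f"{n_params * 4 / 1024**3:.2f} GB in float32")  # ~0.5 GB of raw weights,
                                                      # in line with the ~0.6 GB files
```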


Original data from HuggingFace, OpenCompass and various public git repos.