Starling LM 7B Beta by Nexusflow


Tags: Arxiv:1909.08593 · Autotrain compatible · Conversational · Dataset:berkeley-nest/nectar · En · Endpoints compatible · Has space · License:apache-2.0 · Mistral · Region:us · Reward model · Rlaif · Rlhf · Safetensors · Sharded · Tensorflow

Starling LM 7B Beta Benchmarks

nn.n% — how the model compares to GPT-4.

Rank the Starling LM 7B Beta Capabilities

🆘 Have you tried this model? Rate its performance. Your feedback helps the ML community identify the most suitable models for their needs. Your contribution really does make a difference! 🌟

Instruction Following and Task Automation  
Factuality and Completeness of Knowledge  
Censorship and Alignment  
Data Analysis and Insight Generation  
Text Generation  
Text Summarization and Feature Extraction  
Code Generation  
Multi-Language Support and Translation  
Starling LM 7B Beta (Nexusflow/Starling-LM-7B-beta)
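Starling-LM-7B-beta is reported to use the OpenChat-style "GPT4 Correct" chat template, with the `<|end_of_turn|>` token (also listed below as the tokenizer's padding token) closing each turn. A minimal prompt-builder sketch under that assumption — the exact template should be confirmed against the model's tokenizer config:

```python
def build_prompt(user_message: str, history=None) -> str:
    """Assemble a prompt in the OpenChat-style 'GPT4 Correct' format that
    Starling-LM-7B-beta is reported to use. Each completed turn is closed
    with the model's <|end_of_turn|> token; the string ends with the
    assistant header so the model continues from there."""
    turns = []
    for user, assistant in (history or []):
        turns.append(f"GPT4 Correct User: {user}<|end_of_turn|>")
        turns.append(f"GPT4 Correct Assistant: {assistant}<|end_of_turn|>")
    turns.append(f"GPT4 Correct User: {user_message}<|end_of_turn|>")
    turns.append("GPT4 Correct Assistant:")
    return "".join(turns)

prompt = build_prompt("Hello, who are you?")
```

In practice, `tokenizer.apply_chat_template` in Transformers applies the template shipped with the model and is the safer choice; the sketch above just makes the turn structure explicit.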

Quantized Models of the Starling LM 7B Beta

Model | Likes | Downloads | VRAM
Starling LM 7B Beta GPTQ | 25 | 60 | 4 GB
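The ~4 GB VRAM figure for the GPTQ variant is consistent with 4-bit weight quantization of a ~7B-parameter model. A back-of-the-envelope check (the 7.24e9 parameter count is an approximation for Mistral-7B-class models, not taken from this listing):

```python
# Rough VRAM estimate for a 4-bit GPTQ quantization of a ~7B-parameter model.
params = 7.24e9          # approximate parameter count (assumption)
bits_per_weight = 4      # GPTQ 4-bit
weight_bytes = params * bits_per_weight / 8
gb = weight_bytes / 1024**3
# ~3.4 GB of weights; quantization metadata and unquantized layers
# plausibly account for the gap to the listed ~4 GB
```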

Best Alternatives to Starling LM 7B Beta

Best Alternatives | HF Rank | Context / RAM | Downloads | Likes
KAI 7B V0.1 | 74.45 | 32K / 14.4 GB | 37 | 10
Dolphin 2.2.1 Mistral 7B | 73.17 | 32K / 14.4 GB | 23457 | 184
Mistral 7B V0.1 | 68.53 | 32K / 14.4 GB | 2020231 | 3099
Mistral 7B Instruct V0.2 | 62.23 | 32K / 14.4 GB | 2981740 | 1932
Notus 7B V1 | 60.15 | 32K / 14.4 GB | 5906 | 111
...t 3.5 0106 128K 3.0bpw H6 EXL2 | 60.1 | 128K / 3 GB | 10 | 0
...t 3.5 0106 128K 4.0bpw H6 EXL2 | 60.1 | 128K / 3.9 GB | 13 | 1
...t 3.5 0106 128K 5.0bpw H6 EXL2 | 60.1 | 128K / 4.7 GB | 5 | 0
...t 3.5 0106 128K 6.0bpw H6 EXL2 | 60.1 | 128K / 5.6 GB | 7 | 0
...t 3.5 0106 128K 8.0bpw H8 EXL2 | 60.1 | 128K / 7.4 GB | 12 | 1
Note: a green score (e.g. "73.2") means the model outperforms Nexusflow/Starling-LM-7B-beta.
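The sizes of the EXL2 variants above track their average bits-per-weight (bpw): weight storage is roughly parameters × bpw / 8 bytes. A sketch of that estimate, assuming a ~7.24B parameter count for Mistral-7B-class models (not stated in the listing):

```python
def exl2_size_gb(params: float, bpw: float) -> float:
    """Approximate weight size for an EXL2 quantization at a given
    average bits-per-weight, in decimal gigabytes."""
    return params * bpw / 8 / 1e9

params = 7.24e9  # assumed parameter count
# (bpw, GB listed above): estimates land slightly below the listed sizes,
# plausibly because some tensors are kept at higher precision
for bpw, listed in [(3.0, 3.0), (4.0, 3.9), (5.0, 4.7), (6.0, 5.6), (8.0, 7.4)]:
    print(bpw, round(exl2_size_gb(params, bpw), 1), listed)
```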

Starling LM 7B Beta Parameters and Internals

LLM Name: Starling LM 7B Beta
Repository: Open on 🤗
Model Size: 7b
Required VRAM: 14.4 GB
Updated: 2024-04-18
Maintainer: Nexusflow
Model Type: mistral
Model Files: 4.9 GB (1-of-3), 5.0 GB (2-of-3), 4.5 GB (3-of-3)
Supported Languages: en
Model Architecture: MistralForCausalLM
License: apache-2.0
Context Length: 8192
Model Max Length: 8192
Transformers Version: 4.37.1
Tokenizer Class: LlamaTokenizer
Padding Token: <|end_of_turn|>
Vocabulary Size: 32002
Initializer Range: 0.02
Torch Data Type: bfloat16
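The listed figures are internally consistent: the three safetensors shards sum to the 14.4 GB required VRAM, and at bfloat16 (2 bytes per weight) that implies roughly 7.2B parameters, matching the 7b model size. A quick sanity check:

```python
# Cross-check the listed figures: shard sizes -> total VRAM -> parameter count.
shards_gb = [4.9, 5.0, 4.5]      # shard sizes from the model files list
total_gb = sum(shards_gb)        # ~14.4, matching "Required VRAM"
params = total_gb * 1e9 / 2      # bfloat16 stores 2 bytes per weight
print(round(total_gb, 1), round(params / 1e9, 1))  # 14.4 7.2
```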


Original data from HuggingFace, OpenCompass and various public git repos.
Release v2024040901