StrangeMerges 25 7B Dare Ties by Gille

Merged Model · Autotrain compatible · Base model: bardsai/jaskier-7b-dpo-v5.6 · Endpoints compatible · License: apache-2.0 · Mistral · Model-index · Region: us · Safetensors · Sharded · Tensorflow

Rank the StrangeMerges 25 7B Dare Ties Capabilities

🆘 Have you tried this model? Rate its performance. This feedback would greatly assist the ML community in identifying the most suitable model for their needs. Your contribution really does make a difference! 🌟

Instruction Following and Task Automation  
Factuality and Completeness of Knowledge  
Censorship and Alignment  
Data Analysis and Insight Generation  
Text Generation  
Text Summarization and Feature Extraction  
Code Generation  
Multi-Language Support and Translation  
StrangeMerges 25 7B Dare Ties (Gille/StrangeMerges_25-7B-dare_ties)

Best Alternatives to StrangeMerges 25 7B Dare Ties

Best Alternatives                     HF Rank   Context / RAM    Downloads   Likes
KAI 7B V0.1                           74.45     32K / 14.4 GB           67       9
Dolphin 2.2.1 Mistral 7B              73.17     32K / 14.4 GB        24943     182
Mistral 7B V0.1                       68.53     32K / 14.4 GB      2157514    3081
...andle Dolphin 2.2.1 Mistral 7B     64.2      32K / 14.4 GB           82       0
Mistral 7B Instruct V0.2              62.23     32K / 14.4 GB      2589211    1839
Notus 7B V1                           60.15     32K / 14.4 GB         5789     111
...t 3.5 0106 128K 3.0bpw H6 EXL2     60.1      128K / 3 GB              8       0
...t 3.5 0106 128K 4.0bpw H6 EXL2     60.1      128K / 3.9 GB           15       1
...t 3.5 0106 128K 5.0bpw H6 EXL2     60.1      128K / 4.7 GB            5       0
...t 3.5 0106 128K 6.0bpw H6 EXL2     60.1      128K / 5.6 GB            5       0
Note: a green score (e.g. "73.2") indicates that the model performs better than Gille/StrangeMerges_25-7B-dare_ties.
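
The download and like counts above are snapshots; current values can be read directly from the Hugging Face Hub. Below is a minimal sketch using the official huggingface_hub client, shown as one way to reproduce this kind of figure, not necessarily the pipeline behind this page:

```python
# Query live download/like counts from the Hugging Face Hub for models
# listed above, using the official huggingface_hub client.
from huggingface_hub import HfApi

api = HfApi()
for repo_id in [
    "Gille/StrangeMerges_25-7B-dare_ties",
    "mistralai/Mistral-7B-v0.1",
]:
    info = api.model_info(repo_id)
    print(f"{repo_id}: {info.downloads} downloads, {info.likes} likes")
```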

StrangeMerges 25 7B Dare Ties Parameters and Internals

LLM Name: StrangeMerges 25 7B Dare Ties
Repository: Gille/StrangeMerges_25-7B-dare_ties (Hugging Face)
Base Model(s): Gille/StrangeMerges_21-7B-slerp, bardsai/jaskier-7b-dpo-v5.6
Merged Model: Yes
Model Size: 7B
Required VRAM: 14.4 GB
Updated: 2024-04-13
Maintainer: Gille
Model Type: mistral
Model Files: 8 safetensors shards: 1-of-8 (2.0 GB), 2-of-8 (1.9 GB), 3-of-8 (2.0 GB), 4-of-8 (2.0 GB), 5-of-8 (1.9 GB), 6-of-8 (1.9 GB), 7-of-8 (1.9 GB), 8-of-8 (0.8 GB); 14.4 GB total
Model Architecture: MistralForCausalLM
License: apache-2.0
Context Length: 32768
Model Max Length: 32768
Transformers Version: 4.35.2
Tokenizer Class: LlamaTokenizer
Vocabulary Size: 32000
Initializer Range: 0.02
Torch Data Type: bfloat16
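
The listing marks this as a merged model built from Gille/StrangeMerges_21-7B-slerp and bardsai/jaskier-7b-dpo-v5.6 using the DARE-TIES method. The sketch below shows how such a merge is typically reproduced with the mergekit toolkit; the weight and density values and the choice of base model are illustrative assumptions, not the maintainer's published recipe:

```python
# Hypothetical reconstruction of a DARE-TIES merge config in mergekit's
# YAML format. The weight/density values and the choice of base_model are
# illustrative assumptions, NOT the maintainer's actual recipe.
import subprocess
from pathlib import Path

config = """\
models:
  - model: Gille/StrangeMerges_21-7B-slerp
    parameters:
      weight: 0.5   # assumed mixing weight
      density: 0.5  # assumed fraction of delta weights DARE keeps
  - model: bardsai/jaskier-7b-dpo-v5.6
    parameters:
      weight: 0.5
      density: 0.5
merge_method: dare_ties
base_model: Gille/StrangeMerges_21-7B-slerp  # assumed base; not confirmed
dtype: bfloat16
"""

Path("dare_ties.yml").write_text(config)
# mergekit-yaml is mergekit's CLI entry point: config in, merged model out.
subprocess.run(["mergekit-yaml", "dare_ties.yml", "./merged-model"], check=True)
```

In a DARE-TIES merge, DARE randomly drops a fraction of each source model's delta weights (controlled by density) and rescales the remainder, while TIES resolves sign conflicts between the surviving deltas before adding them back onto the base model.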
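
Given the architecture (MistralForCausalLM), the bfloat16 weights (roughly 7B parameters at 2 bytes each, which is where the 14.4 GB VRAM figure comes from), and the sharded safetensors files, the model loads through the standard transformers API. A minimal sketch, assuming transformers >= 4.35.2 (the version listed above) and the accelerate package for device placement:

```python
# Minimal loading sketch for Gille/StrangeMerges_25-7B-dare_ties.
# Assumes transformers >= 4.35.2 and accelerate installed, plus a GPU
# with roughly 14.4 GB of free VRAM for the bfloat16 weights.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "Gille/StrangeMerges_25-7B-dare_ties"

tokenizer = AutoTokenizer.from_pretrained(repo_id)  # LlamaTokenizer, 32000-entry vocab
model = AutoModelForCausalLM.from_pretrained(
    repo_id,
    torch_dtype=torch.bfloat16,  # matches the stored Torch data type
    device_map="auto",           # places the 8 shards across available devices
)

prompt = "Summarize the idea behind DARE-TIES model merging in one sentence."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

The 32768 context length above is the total window, so prompt and generated tokens together must stay under 32K.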

Original data from HuggingFace, OpenCompass and various public git repos.