Ramonda 7B DPO Ties by mayacinka


Tags: Merged Model · Autotrain compatible · Base model: bardsai/jaskier-7b-dpo-v4.3 · Base model: paulml/OGNO-7B · Endpoints compatible · License: apache-2.0 · Mistral · Model index · Region: US · Safetensors · Sharded · TensorFlow

Ramonda 7B DPO Ties Benchmarks

nn.n%: how the model compares to GPT-4.

Rank the Ramonda 7B DPO Ties Capabilities

🆘 Have you tried this model? Rate its performance. This feedback would greatly assist the ML community in identifying the most suitable model for their needs. Your contribution really does make a difference! 🌟

Instruction Following and Task Automation  
Factuality and Completeness of Knowledge  
Censorship and Alignment  
Data Analysis and Insight Generation  
Text Generation  
Text Summarization and Feature Extraction  
Code Generation  
Multi-Language Support and Translation  
Ramonda 7B DPO Ties (mayacinka/ramonda-7b-dpo-ties)

Best Alternatives to Ramonda 7B DPO Ties

Best Alternatives          | HF Rank | Context / RAM | Downloads | Likes
M7 7B                      | 76.82   | 32K / 14.4 GB | 5255      | 15
J4rviz V3.0                | 76.58   | 32K / 14.4 GB | 407       | 0
Nexim 7B                   | 76.53   | 32K / 14.4 GB | 474       | 0
Calme 7B Instruct V0.3     | 76.5    | 32K / 14.4 GB | 1443      | 5
TriFusionNexus 7B          | 76.32   | 32K / 14.4 GB | 473       | 0
OGNO 7B DPO Truthful       | 76.14   | 32K / 14.4 GB | 1677      | 1
Cyrax 7B                   | 75.98   | 32K / 14.4 GB | 1419      | 9
NeuralTrix 7B DPO Laser    | 75.92   | 32K / 14.4 GB | 2669      | 6
Prima LelantaclesV6.69 7B  | 75.7    | 32K / 14.5 GB | 451       | 3
NeuralTrix 7B DPO Relaser  | 75.66   | 32K / 14.4 GB | 1798      | 2
Note: a green score (e.g. "73.2") indicates that the model scores higher than mayacinka/ramonda-7b-dpo-ties.

Ramonda 7B DPO Ties Parameters and Internals

LLM Name             | Ramonda 7B DPO Ties
Repository           | mayacinka/ramonda-7b-dpo-ties (open on 🤗 Hugging Face)
Base Model(s)        | bardsai/jaskier-7b-dpo-v4.3, paulml/OGNO-7B
Merged Model         | Yes
Model Size           | 7B
Required VRAM        | 14.4 GB
Updated              | 2024-06-24
Maintainer           | mayacinka
Model Type           | mistral
Model Files          | 2.0 GB (1-of-8), 2.0 GB (2-of-8), 1.9 GB (3-of-8), 2.0 GB (4-of-8), 1.9 GB (5-of-8), 1.9 GB (6-of-8), 1.9 GB (7-of-8), 0.8 GB (8-of-8)
Model Architecture   | MistralForCausalLM
License              | apache-2.0
Context Length       | 32768
Model Max Length     | 32768
Transformers Version | 4.35.2
Tokenizer Class      | LlamaTokenizer
Padding Token        | <unk>
Vocabulary Size      | 32000
Initializer Range    | 0.02
Torch Data Type      | float16
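
Given the internals above (MistralForCausalLM, sharded float16 safetensors, 32768-token context), the model can typically be loaded with the Hugging Face transformers library. The snippet below is a minimal sketch, assuming the repository ID mayacinka/ramonda-7b-dpo-ties listed on this page, transformers >= 4.35.2 with accelerate and torch installed, and a GPU with roughly 14.4 GB of free VRAM; the prompt text is only an illustration.

```python
# Minimal loading sketch for mayacinka/ramonda-7b-dpo-ties.
# Assumptions: transformers, accelerate, and torch are installed, and there is
# enough GPU memory for ~14.4 GB of float16 weights.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "mayacinka/ramonda-7b-dpo-ties"  # repository ID as listed in the table above

# The config resolves to LlamaTokenizer with <unk> as the padding token.
tokenizer = AutoTokenizer.from_pretrained(model_id)

# float16 matches the stored torch_dtype; device_map="auto" lets accelerate place the 8 shards.
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,
    device_map="auto",
)

prompt = "Briefly explain what a TIES merge of two 7B models does."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

The 32768-token context length and float16 dtype come directly from the parameters listed above; on smaller GPUs, a quantized load (for example 4-bit via bitsandbytes) is the usual workaround.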


Original data from Hugging Face, OpenCompass, and various public git repos.
Release v2024042801