Laser Dolphin Mixtral 2x7b DPO by macadeliccc

Tags: Arxiv:2312.13558 · Autotrain compatible · Endpoints compatible · Mixtral · Model-index · MoE · Region:us · Safetensors · Sharded · Tensorflow

Laser Dolphin Mixtral 2x7b DPO Benchmarks

[Benchmark chart for macadeliccc/laser-dolphin-mixtral-2x7b-dpo; the evaluation score appears under Release Notes below.]

Laser Dolphin Mixtral 2x7b DPO Parameters and Internals

Model Type: MoE implementation, text generation

Use Cases:
Areas: research, commercial applications

Additional Notes:
Laser-Dolphin Mixtral is a medium-sized MoE model built from Dolphin-based experts and enhanced with layer-selective rank reduction (LASER, arXiv:2312.13558); quantized variants are available for better performance. A minimal sketch of the LASER idea follows this section.

Input / Output:
Accepted Modalities: text
Output Format: text generation outputs

Release Notes:
Date: 2024-01-31
Notes: evaluation score (v2) of 72.76, with increased performance over the previous release.
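For readers unfamiliar with the technique behind the "laser" in the name, the paper tagged above (arXiv:2312.13558) replaces selected weight matrices with low-rank approximations. The snippet below is a minimal, purely illustrative PyTorch sketch of that idea; it is not the exact procedure used to produce this model.

```python
# Minimal sketch of layer-selective rank reduction (LASER, arXiv:2312.13558):
# approximate a weight matrix by keeping only its top singular components.
# Illustrative only; not the exact recipe used for this model.
import torch

def rank_reduce(weight: torch.Tensor, keep_fraction: float = 0.05) -> torch.Tensor:
    """Return a low-rank SVD approximation of `weight`."""
    U, S, Vh = torch.linalg.svd(weight, full_matrices=False)
    k = max(1, int(S.numel() * keep_fraction))  # number of singular values to keep
    return U[:, :k] @ torch.diag(S[:k]) @ Vh[:k, :]

W = torch.randn(1024, 1024)        # stand-in for a transformer weight matrix
W_low_rank = rank_reduce(W, 0.05)  # heavily rank-reduced replacement
```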
LLM Name: Laser Dolphin Mixtral 2x7b DPO
Repository: https://huggingface.co/macadeliccc/laser-dolphin-mixtral-2x7b-dpo
Model Size: 12.9B
Required VRAM: 25.8 GB
Updated: 2025-02-05
Maintainer: macadeliccc
Model Type: mixtral
Model Files: 9.9 GB (1 of 3), 10.0 GB (2 of 3), 5.9 GB (3 of 3)
Model Architecture: MixtralForCausalLM
License: apache-2.0
Context Length: 32768
Model Max Length: 32768
Transformers Version: 4.37.0.dev0
Tokenizer Class: LlamaTokenizer
Padding Token: <s>
Vocabulary Size: 32000
Torch Data Type: bfloat16
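Given the metadata above (MixtralForCausalLM architecture, bfloat16 weights, Transformers 4.37+), loading the full-precision model might look like the following sketch. The prompt and generation settings are placeholders, not recommendations from the maintainer.

```python
# Minimal sketch, not an official recipe: load the full-precision model
# with Hugging Face Transformers. Assumes transformers >= 4.37 (per the
# listed version) and roughly 26 GB of VRAM for the bfloat16 weights.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "macadeliccc/laser-dolphin-mixtral-2x7b-dpo"

tokenizer = AutoTokenizer.from_pretrained(repo_id)  # LlamaTokenizer per the card
model = AutoModelForCausalLM.from_pretrained(
    repo_id,
    torch_dtype=torch.bfloat16,  # matches the listed torch data type
    device_map="auto",           # requires the `accelerate` package
)

prompt = "Explain mixture-of-experts routing in one paragraph."  # placeholder
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=200)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```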

Quantized Models of the Laser Dolphin Mixtral 2x7b DPO

Model                                 Likes   Downloads   VRAM
... Dolphin Mixtral 2x7b DPO GGUF     48      1819        4 GB
...r Dolphin Mixtral 2x7b DPO AWQ     9       73          7 GB
... Dolphin Mixtral 2x7b DPO GPTQ     10      17          7 GB
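As a hedged example of running one of the quantized variants, here is a sketch using llama-cpp-python. The repo id and filename pattern are assumptions for illustration; check the quantized repository for the files it actually ships.

```python
# Hedged sketch: run a GGUF quantization with llama-cpp-python.
# The repo id and filename pattern below are assumptions for illustration;
# verify them against the actual quantized repository.
from llama_cpp import Llama

llm = Llama.from_pretrained(
    repo_id="macadeliccc/laser-dolphin-mixtral-2x7b-dpo-GGUF",  # assumed repo name
    filename="*Q4_K_M.gguf",  # assumed quant level; pick one that fits your memory
    n_ctx=4096,               # the card lists 32768 max; smaller saves memory
)

out = llm("Q: What does a mixture-of-experts router do? A:", max_tokens=128)
print(out["choices"][0]["text"])
```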

Best Alternatives to Laser Dolphin Mixtral 2x7b DPO

Best Alternatives                  Context / RAM    Downloads   Likes
MixTAO 7Bx2 MoE V8.1               32K / 25.8 GB    10214       55
Inf Silent Kunoichi V0.1 2x7B      32K / 25.6 GB    5           0
MixTAO 7Bx2 MoE V8.1               32K / 25.8 GB    8760        52
Inf Silent Kunoichi V0.2 2x7B      32K / 25.6 GB    5           0
LogoS 7Bx2 MoE 13B V0.2            32K / 25.9 GB    3140        10
MultiMash8 13B Slerp               32K / 25.7 GB    8           0
MultiMash11 13B Slerp              32K / 25.7 GB    6           0
MultiMash9 13B Slerp               32K / 25.7 GB    8           0
MultiMash10 13B Slerp              32K / 25.7 GB    7           0
MixTaoTruthful 13B Slerp           32K / 25.7 GB    5           0

Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241227