LLM Explorer: A Curated Large Language Model Directory and Analytics

Laser Dolphin Mixtral 2x7b DPO GPTQ by TheBloke

Which open-source LLMs or SLMs are you looking for? 18,857 models in total.


Tags: 4-bit | Autotrain compatible | Base model: macadeliccc/laser-dolphin-mixtral-2x7b-dpo | GPTQ | License: apache-2.0 | Mixtral | MoE | Quantized | Region: us | Safetensors
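
The tags above advertise a 4-bit GPTQ build of a Mixtral-style MoE model. As a quick sketch (not from the original listing), the advertised settings can be checked without downloading the weights, assuming the repo's config.json carries a quantization_config block, as TheBloke's GPTQ uploads typically do:

```python
from transformers import AutoConfig

# Fetch only the model configuration, not the 7.1 GB of weights.
config = AutoConfig.from_pretrained("TheBloke/laser-dolphin-mixtral-2x7b-dpo-GPTQ")

print(config.model_type)               # expected: "mixtral"
print(config.max_position_embeddings)  # expected: 32768 (the listed context length)
print(getattr(config, "quantization_config", None))  # GPTQ bits, group size, etc., if present
```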

Laser Dolphin Mixtral 2x7b DPO GPTQ Benchmarks

Rank the Laser Dolphin Mixtral 2x7b DPO GPTQ Capabilities

🆘 Have you tried this model? Rate its performance. Your feedback helps the ML community identify the most suitable model for their needs. Your contribution really does make a difference! 🌟

Instruction Following and Task Automation  
Factuality and Completeness of Knowledge  
Censorship and Alignment  
Data Analysis and Insight Generation  
Text Generation  
Text Summarization and Feature Extraction  
Code Generation  
Multi-Language Support and Translation  
Laser Dolphin Mixtral 2x7b DPO GPTQ (TheBloke/laser-dolphin-mixtral-2x7b-dpo-GPTQ)

Best Alternatives to Laser Dolphin Mixtral 2x7b DPO GPTQ

Best Alternatives | HF Rank | Context / RAM | Downloads | Likes
Mixtral 7Bx2 MoE GPTQ | 64.4 | 32K / 7.1 GB | 116 | 8
Mixtral 7Bx2 MoE AWQ | 64.4 | 32K / 7.1 GB | 5 | 2
...r Dolphin Mixtral 2x7b DPO AWQ | 58.9 | 32K / 7.1 GB | 399 | 8
...r Dolphin Mixtral 2x7b DPO AWQ | n/a | 32K / 7.1 GB | 113 | 0
Manbasya 2x7b MoE | n/a | 32K / 7.1 GB | 14 | 0
Blue Orchid 2x7b AWQ | n/a | 8K / 7.1 GB | 12 | 1
Note: a green score (e.g. "73.2") means that the model is better than TheBloke/laser-dolphin-mixtral-2x7b-dpo-GPTQ.

Laser Dolphin Mixtral 2x7b DPO GPTQ Parameters and Internals

LLM Name: Laser Dolphin Mixtral 2x7b DPO GPTQ
Repository: TheBloke/laser-dolphin-mixtral-2x7b-dpo-GPTQ (Hugging Face)
Model Name: Laser Dolphin Mixtral 2X7B DPO
Model Creator: tim
Base Model(s): macadeliccc/laser-dolphin-mixtral-2x7b-dpo
Model Size: 2x7b
Required VRAM: 7.1 GB
Updated: 2024-02-28
Maintainer: TheBloke
Model Type: mixtral
Model Files: 7.1 GB
GPTQ Quantization: Yes
Quantization Type: gptq
Model Architecture: MixtralForCausalLM
License: apache-2.0
Context Length: 32768
Model Max Length: 32768
Transformers Version: 4.37.0.dev0
Tokenizer Class: LlamaTokenizer
Padding Token: <s>
Vocabulary Size: 32001
Initializer Range: 0.02
Torch Data Type: bfloat16
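
The fields above are enough to load the checkpoint. What follows is a minimal sketch, not from the original listing, of loading this GPTQ build with Hugging Face transformers; it assumes transformers with GPTQ support (optimum plus auto-gptq installed), accelerate for device_map="auto", and a CUDA GPU with roughly 8 GB free (the listing gives 7.1 GB required VRAM). The prompt string is purely illustrative.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "TheBloke/laser-dolphin-mixtral-2x7b-dpo-GPTQ"

# LlamaTokenizer with the listed vocabulary size of 32001.
tokenizer = AutoTokenizer.from_pretrained(model_id)

# transformers reads the GPTQ quantization_config from the repo (this needs
# optimum and auto-gptq); device_map="auto" places the weights on the GPU.
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

prompt = "Explain mixture-of-experts models in one paragraph."  # illustrative only
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

With the 32768-token context length listed above, much longer prompts are possible, memory permitting.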
Original data from Hugging Face, OpenCompass, and various public Git repositories.
Release v2024022003