Llama 3 8Bx2 MoE DPO by cloudyu


Tags: Autotrain compatible, Conversational, Endpoints compatible, Instruct, Mixtral, MoE, Region: us, Safetensors, Sharded, Tensorflow

Llama 3 8Bx2 MoE DPO Benchmarks

Benchmark scores (nn.n%) express how the model compares to the reference models: Anthropic Claude 3.5 Sonnet ("so35"), GPT-4o ("gpt4o"), or GPT-4 ("gpt4").

Llama 3 8Bx2 MoE DPO Parameters and Internals

LLM Name: Llama 3 8Bx2 MoE DPO
Repository: 🤗 https://huggingface.co/cloudyu/Llama-3-8Bx2-MOE-DPO
Model Size: 13.7b
Required VRAM: 27.4 GB
Updated: 2024-09-07
Maintainer: cloudyu
Model Type: mixtral
Instruction-Based: Yes
Model Files: 10.0 GB (1-of-3), 10.0 GB (2-of-3), 7.4 GB (3-of-3)
Model Architecture: MixtralForCausalLM
License: apache-2.0
Context Length: 8192
Model Max Length: 8192
Transformers Version: 4.40.1
Tokenizer Class: PreTrainedTokenizerFast
Padding Token: <|begin_of_text|>
Vocabulary Size: 128256
Torch Data Type: float16
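
The architecture, context length, and vocabulary figures above can be checked programmatically. The snippet below is only an illustrative sketch: it assumes the Transformers version listed above (4.40.1 or newer) is installed and fetches nothing beyond the repository's config.json.

# Quick sanity check of the "Parameters and Internals" table; only the
# config.json is downloaded, not the ~27 GB of weights.
from transformers import AutoConfig

repo_id = "cloudyu/Llama-3-8Bx2-MOE-DPO"
config = AutoConfig.from_pretrained(repo_id)

print(config.architectures)            # expected: ['MixtralForCausalLM']
print(config.max_position_embeddings)  # expected: 8192 (context length)
print(config.vocab_size)               # expected: 128256
print(config.num_local_experts)        # experts per MoE layer (the "x2" in the name suggests 2)
print(config.num_experts_per_tok)      # experts routed per token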
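
For actually running the model, a minimal loading-and-generation sketch follows. It assumes the accelerate package is available for device_map="auto" and that enough GPU/CPU memory exists for the 27.4 GB of float16 weights; the repository ID and dtype come from the table above, and the prompt is purely illustrative.

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "cloudyu/Llama-3-8Bx2-MOE-DPO"

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(
    repo_id,
    torch_dtype=torch.float16,  # matches the Torch Data Type listed above
    device_map="auto",          # spreads the three safetensors shards across available devices
)

prompt = "Explain what a mixture-of-experts language model is in two sentences."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128, do_sample=False)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))

Keep prompt plus generated tokens within the 8192-token context window listed above.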

Best Alternatives to Llama 3 8Bx2 MoE DPO

Best Alternatives                    Context / RAM    Downloads / Likes
...ma 3 2x8B Instruct MoE 64K Ctx    64K / 27.3 GB    64
Defne Llama3 2x8B                    8K / 27.4 GB     51165
Inixion 2x8B V2                      8K / 27.4 GB     42
MoE Llama3 8bx2 Rag                  8K / 27.3 GB     50
Inixion 2x8B                         8K / 27.5 GB     41
Llama 3 Chatty 2x8B                  8K / 27.3 GB     711
FinalFintetuning XVIII 2x8B          8K / 27.5 GB     72
Llama 3 Teal Instruct 2x8B MoE       8K / 27.3 GB     41
...lama3 2x8b MoE 41K Experiment1    8K / 27.3 GB     62

Original data from HuggingFace, OpenCompass and various public git repos.
Release v2024072803