Truthful DPO Cloudyu Mixtral 34Bx2 MoE 60B by cloudyu


Tags: Autotrain compatible, DPO, Endpoints compatible, Mixtral, MoE, Region: US, RL-tuned, Safetensors, Sharded, TensorFlow

Truthful DPO Cloudyu Mixtral 34Bx2 MoE 60B Benchmarks

Truthful DPO Cloudyu Mixtral 34Bx2 MoE 60B (cloudyu/Truthful_DPO_cloudyu_Mixtral_34Bx2_MoE_60B)

Truthful DPO Cloudyu Mixtral 34Bx2 MoE 60B Parameters and Internals

Model Type: MoE, DPO, RL-tuned
Additional Notes: Fine-tuned with the DPO Trainer, which optimizes a language model directly from preference data (a minimal training sketch follows below).
Training Details:
Data Sources: jondurbin/truthy-dpo-v0.1
Methodology: Direct Preference Optimization (DPO)
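The listing names the dataset and method but not the exact recipe. Below is a minimal, hypothetical sketch of DPO fine-tuning with Hugging Face TRL's `DPOTrainer` on the listed dataset; the base-model id, batch size, and `beta` value are illustrative assumptions, not the maintainer's actual configuration.

```python
# Hypothetical DPO fine-tuning sketch; NOT the maintainer's actual recipe.
from datasets import load_dataset
from transformers import AutoModelForCausalLM, AutoTokenizer, TrainingArguments
from trl import DPOTrainer

BASE = "cloudyu/Mixtral_34Bx2_MoE_60B"  # assumed base model, inferred from the repo name

tokenizer = AutoTokenizer.from_pretrained(BASE)
model = AutoModelForCausalLM.from_pretrained(BASE)

# truthy-dpo-v0.1 ships prompt/chosen/rejected columns, the format DPOTrainer expects.
train_dataset = load_dataset("jondurbin/truthy-dpo-v0.1", split="train")

args = TrainingArguments(
    output_dir="truthful-dpo",
    per_device_train_batch_size=1,  # illustrative; a 60B model needs heavy parallelism
    remove_unused_columns=False,    # DPOTrainer tokenizes the raw columns itself
)

trainer = DPOTrainer(
    model,
    ref_model=None,  # TRL clones `model` as the frozen reference when omitted
    args=args,
    beta=0.1,        # KL-penalty strength; TRL's default, assumed here
    train_dataset=train_dataset,
    tokenizer=tokenizer,
)
trainer.train()
```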
LLM Name: Truthful DPO Cloudyu Mixtral 34Bx2 MoE 60B
Repository 🤗: https://huggingface.co/cloudyu/Truthful_DPO_cloudyu_Mixtral_34Bx2_MoE_60B
Model Size: 60.8B
Required VRAM: 121.8 GB
Updated: 2025-02-05
Maintainer: cloudyu
Model Type: mixtral
Model Files: 4.9 GB (1-of-25), 4.8 GB (2-of-25), 4.9 GB (3-of-25), 4.8 GB (4-of-25), 4.9 GB (5-of-25), 4.8 GB (6-of-25), 4.9 GB (7-of-25), 5.0 GB each (8-of-25 through 24-of-25), 2.8 GB (25-of-25)
Model Architecture: MixtralForCausalLM
License: mit
Context Length: 200,000
Model Max Length: 200,000
Transformers Version: 4.37.1
Tokenizer Class: LlamaTokenizer
Padding Token: <s>
Vocabulary Size: 64,000
Torch Data Type: bfloat16
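Given the listed architecture (MixtralForCausalLM), dtype (bfloat16), and sharded safetensors files, the checkpoint loads through the standard `transformers` APIs. A minimal inference sketch follows, assuming enough combined GPU/CPU memory for the roughly 122 GB of weights; the prompt is arbitrary.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

REPO = "cloudyu/Truthful_DPO_cloudyu_Mixtral_34Bx2_MoE_60B"

tokenizer = AutoTokenizer.from_pretrained(REPO)
model = AutoModelForCausalLM.from_pretrained(
    REPO,
    torch_dtype=torch.bfloat16,  # matches the listed Torch data type
    device_map="auto",           # spreads the 25 shards across available devices
)

prompt = "What is the capital of France?"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```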

Best Alternatives to Truthful DPO Cloudyu Mixtral 34Bx2 MoE 60B

| Best Alternatives | Context / RAM | Downloads | Likes |
|---|---|---|---|
| Mixtral 34Bx2 MoE 60B | 195K / 121.9 GB | 4251 | 112 |
| Yi 34Bx2 MoE 60B DPO | 195K / 121.8 GB | 421 | 13 |
| Bagel Hermes 2x34B | 195K / 121.9 GB | 195 | 16 |
| Yi 34Bx2 MoE 200K | 195K / 121.9 GB | 419 | 22 |
| Yi 34Bx2 MoE 60B | 195K / 121.9 GB | 4255 | 65 |
| ...34Bx2 MoE V0.1 Full Linear DPO | 195K / 121.8 GB | 60 | 2 |
| FusionNet 34Bx2 MoE V0.1 | 195K / 121.2 GB | 55 | 8 |
| FusionNet 34Bx2 MoE | 32K / 121.2 GB | 1212 | 8 |
| ...DPO TomGrc FusionNet 34Bx2 MoE | 32K / 121.8 GB | 120 | 4 |
| Nous Hermes 2 MoE 2x34B | 4K / 121.9 GB | 1228 | 0 |

Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241227