Truthful DPO TomGrc FusionNet 34Bx2 MoE by cloudyu


Tags: Autotrain compatible, Conversational, DPO, Endpoints compatible, Mixtral, MoE, Region: US, RL-tuned, Safetensors, Sharded, Tensorflow

Truthful DPO TomGrc FusionNet 34Bx2 MoE Benchmarks

Truthful DPO TomGrc FusionNet 34Bx2 MoE (cloudyu/Truthful_DPO_TomGrc_FusionNet_34Bx2_MoE)

Truthful DPO TomGrc FusionNet 34Bx2 MoE Parameters and Internals

Model Type: DPO, RL-TUNED
Training Details:
Data Sources: jondurbin/truthy-dpo-v0.1
Methodology: DPO via TRL's DPOTrainer, which trains language models from preference data (see the sketch below).
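For context, DPO optimizes the preference objective

L_DPO(θ) = −E_{(x, y_w, y_l) ~ D} [ log σ( β ( log π_θ(y_w|x) − log π_ref(y_w|x) − log π_θ(y_l|x) + log π_ref(y_l|x) ) ) ]

where y_w and y_l are the chosen and rejected responses for prompt x. The sketch below shows roughly how such a run is wired up with TRL's DPOTrainer and the jondurbin/truthy-dpo-v0.1 dataset. It is a minimal sketch, not the maintainer's actual training script: the small base model is a hypothetical stand-in (the real base is the 60.8B FusionNet MoE, which needs multi-GPU infrastructure), and TRL argument names (DPOConfig, processing_class) vary across releases.

```python
# Minimal DPO fine-tuning sketch with TRL (assumes a recent TRL release; API names vary by version).
import torch
from datasets import load_dataset
from transformers import AutoModelForCausalLM, AutoTokenizer
from trl import DPOConfig, DPOTrainer

# Hypothetical stand-in base model for illustration; the real run used a 34Bx2 MoE base.
base_model = "Qwen/Qwen2-0.5B-Instruct"

model = AutoModelForCausalLM.from_pretrained(base_model, torch_dtype=torch.bfloat16)
tokenizer = AutoTokenizer.from_pretrained(base_model)

# truthy-dpo-v0.1 provides prompt/chosen/rejected columns, which DPOTrainer consumes directly;
# drop the extra metadata columns (id, source, system) to keep the trainer's input clean.
dataset = load_dataset("jondurbin/truthy-dpo-v0.1", split="train")
dataset = dataset.remove_columns(
    [c for c in dataset.column_names if c not in ("prompt", "chosen", "rejected")]
)

args = DPOConfig(
    output_dir="truthful-dpo-sketch",
    beta=0.1,                        # strength of the preference margin vs. the reference model
    per_device_train_batch_size=1,
    gradient_accumulation_steps=8,
    num_train_epochs=1,
)

trainer = DPOTrainer(
    model=model,                     # ref_model defaults to a frozen copy of the policy
    args=args,
    train_dataset=dataset,
    processing_class=tokenizer,      # older TRL versions take tokenizer= instead
)
trainer.train()
```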
LLM Name: Truthful DPO TomGrc FusionNet 34Bx2 MoE
Repository 🤗: https://huggingface.co/cloudyu/Truthful_DPO_TomGrc_FusionNet_34Bx2_MoE
Model Size: 60.8b
Required VRAM: 121.8 GB
Updated: 2025-06-01
Maintainer: cloudyu
Model Type: mixtral
Model Files (25 safetensors shards): 4.9 GB (1-of-25), 4.8 GB (2-of-25), 4.9 GB (3-of-25), 4.8 GB (4-of-25), 4.9 GB (5-of-25), 4.8 GB (6-of-25), 4.9 GB (7-of-25), 5.0 GB (8-of-25), 5.0 GB (9-of-25), 5.0 GB (10-of-25), 5.0 GB (11-of-25), 5.0 GB (12-of-25), 5.0 GB (13-of-25), 5.0 GB (14-of-25), 5.0 GB (15-of-25), 5.0 GB (16-of-25), 5.0 GB (17-of-25), 5.0 GB (18-of-25), 5.0 GB (19-of-25), 5.0 GB (20-of-25), 5.0 GB (21-of-25), 5.0 GB (22-of-25), 5.0 GB (23-of-25), 5.0 GB (24-of-25), 2.8 GB (25-of-25)
Model Architecture: MixtralForCausalLM
License: mit
Context Length: 32768
Model Max Length: 32768
Transformers Version: 4.37.1
Tokenizer Class: LlamaTokenizer
Padding Token: <s>
Vocabulary Size: 64000
Torch Data Type: bfloat16
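Given the metadata above (MixtralForCausalLM, bfloat16 weights, 32K context, ~121.8 GB of sharded safetensors), loading and generating looks roughly like the sketch below. This is a minimal sketch, assuming transformers >= 4.37 (the version listed on the card) and enough combined GPU memory for device_map="auto" to shard the weights; the prompt is an arbitrary example.

```python
# Minimal inference sketch; assumes ~122 GB of total GPU memory is available.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "cloudyu/Truthful_DPO_TomGrc_FusionNet_34Bx2_MoE"

tokenizer = AutoTokenizer.from_pretrained(model_id)  # LlamaTokenizer, 64000-token vocabulary
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # matches the card's bfloat16 shards
    device_map="auto",           # spreads the 25 safetensors shards across available GPUs
)

prompt = "What is DPO fine-tuning?"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```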

Best Alternatives to Truthful DPO TomGrc FusionNet 34Bx2 MoE

Best Alternatives                    Context / RAM      Downloads  Likes
Mixtral 34Bx2 MoE 60B                195K / 121.9 GB    3613       112
Yi 34Bx2 MoE 60B DPO                 195K / 121.8 GB    3623       3
Bagel Hermes 2x34B                   195K / 121.9 GB    73         16
Yi 34Bx2 MoE 200K                    195K / 121.9 GB    3606       2
Yi 34Bx2 MoE 60B                     195K / 121.9 GB    3614       65
...34Bx2 MoE V0.1 Full Linear DPO    195K / 121.8 GB    13         2
FusionNet 34Bx2 MoE V0.1             195K / 121.2 GB    17         8
... Cloudyu Mixtral 34Bx2 MoE 60B    195K / 121.8 GB    22         0
FusionNet 34Bx2 MoE                  32K / 121.2 GB     17         8
Nous Hermes 2 MoE 2x34B              4K / 121.9 GB      13         0
Note: a green score (e.g. "73.2") means the model outperforms cloudyu/Truthful_DPO_TomGrc_FusionNet_34Bx2_MoE.

Rank the Truthful DPO TomGrc FusionNet 34Bx2 MoE Capabilities

🆘 Have you tried this model? Rate its performance. Your feedback helps the ML community identify the most suitable model for their needs. Your contribution really does make a difference! 🌟

Instruction Following and Task Automation  
Factuality and Completeness of Knowledge  
Censorship and Alignment  
Data Analysis and Insight Generation  
Text Generation  
Text Summarization and Feature Extraction  
Code Generation  
Multi-Language Support and Translation  


Original data from HuggingFace, OpenCompass, and various public Git repos.
Release v20241227