19B MATH DPO by cloudyu


Autotrain compatible · Conversational · DPO · Endpoints compatible · Mixtral · MoE · Region: US · Safetensors · Sharded · TensorFlow
Model Card on HF 🤗: https://huggingface.co/cloudyu/19B_MATH_DPO

19B MATH DPO Benchmarks

19B MATH DPO (cloudyu/19B_MATH_DPO)

19B MATH DPO Parameters and Internals

Model Type: DPO, MoE
Additional Notes: This model combines DPO (Direct Preference Optimization) fine-tuning with a MoE (Mixture of Experts) architecture, totaling roughly 19 billion parameters.
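For reference, the DPO objective mentioned above can be sketched in a few lines. This is an illustrative implementation of the standard DPO loss, not code from this model's training run; the function and variable names are mine.

```python
import math

def dpo_loss(policy_chosen_logp, policy_rejected_logp,
             ref_chosen_logp, ref_rejected_logp, beta=0.1):
    """DPO loss for one preference pair.

    Inputs are the summed log-probabilities of the chosen and
    rejected completions under the trained policy and the frozen
    reference model. beta scales the implicit reward.
    """
    chosen_ratio = policy_chosen_logp - ref_chosen_logp
    rejected_ratio = policy_rejected_logp - ref_rejected_logp
    logits = beta * (chosen_ratio - rejected_ratio)
    # -log(sigmoid(logits)), written in a numerically stable form
    return math.log1p(math.exp(-logits))
```

The loss shrinks as the policy prefers the chosen answer more strongly than the reference model does, and grows when the preference is reversed.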
LLM Name: 19B MATH DPO
Repository 🤗: https://huggingface.co/cloudyu/19B_MATH_DPO
Model Size: 19.2B
Required VRAM: 38.4 GB
Updated: 2025-06-01
Maintainer: cloudyu
Model Type: mixtral
Model Files: 5.0 GB (1-of-8), 4.9 GB (2-of-8), 5.0 GB (3-of-8), 5.0 GB (4-of-8), 4.9 GB (5-of-8), 5.0 GB (6-of-8), 5.0 GB (7-of-8), 3.6 GB (8-of-8)
Model Architecture: MixtralForCausalLM
License: other
Context Length: 4096
Model Max Length: 4096
Transformers Version: 4.37.2
Tokenizer Class: LlamaTokenizer
Padding Token: <s>
Vocabulary Size: 32000
Torch Data Type: bfloat16
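As a quick sanity check on the figures above, 19.2B parameters stored in bfloat16 (2 bytes each) match both the listed VRAM requirement and the sum of the eight safetensors shards:

```python
# Sanity check: 19.2B bfloat16 parameters (2 bytes each) vs. the
# listed 38.4 GB VRAM figure and the eight shard sizes.
params = 19.2e9
bytes_per_param = 2  # bfloat16
vram_gb = params * bytes_per_param / 1e9
shards_gb = [5.0, 4.9, 5.0, 5.0, 4.9, 5.0, 5.0, 3.6]
print(vram_gb, sum(shards_gb))  # both come to 38.4
```

Note that 38.4 GB covers the weights alone; activations and the KV cache require additional memory at inference time.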

Best Alternatives to 19B MATH DPO

Best Alternatives                     Context / RAM    Downloads  Likes
MixTAO 19B Pass                       32K / 38.1 GB    19         2
Multimerge 19B Pass                   32K / 38 GB      10         0
Lorge 2x7B UAMM                       32K / 38.2 GB    16         0
Mistralmath 15B Pass                  32K / 38.5 GB    11         0
TaoPassthrough 15B S                  32K / 38.4 GB    13         0
Raccoon Small                         32K / 38.4 GB    19         1
Mixtral 11Bx2 MoE 19B                 4K / 38.4 GB     273        7
...oundary Solar Chat 2x10.7B MoE     4K / 38 GB       123        1
SunnyRain 2x10.7B                     4K / 38.4 GB     10         0
Truthful DPO MoE 19B                  4K / 38.4 GB     20         1

Rank the 19B MATH DPO Capabilities

🆘 Have you tried this model? Rate its performance. Your feedback helps the ML community identify the most suitable model for their needs. Your contribution really does make a difference! 🌟

Instruction Following and Task Automation  
Factuality and Completeness of Knowledge  
Censorship and Alignment  
Data Analysis and Insight Generation  
Text Generation  
Text Summarization and Feature Extraction  
Code Generation  
Multi-Language Support and Translation  

Looking for an open-source LLM or SLM? 47,753 models are indexed in total.

Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241227