4bit Quant TomGrc FusionNet 34Bx2 MoE V0.1 DPO by cloudyu

Tags: 4-bit, 4bit, Autotrain compatible, Bitsandbytes, Conversational, Endpoints compatible, Mixtral, MoE, Quantized, Region: US, Safetensors, Sharded, TensorFlow, Yi

4bit Quant TomGrc FusionNet 34Bx2 MoE V0.1 DPO Parameters and Internals

Model Type: language model
Training Details
Methodology: uses TRL's DPO Trainer (Direct Preference Optimization; see the sketch below)
Model Architecture: MoE (Mixture of Experts)
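The card states only that DPO was applied via TRL's DPO Trainer. As a hedged illustration of what that entails, here is a minimal sketch of generic DPOTrainer usage in the TRL ~0.7-era API (the era implied by Transformers 4.37.2). The base-model path, dataset file, and hyperparameters below are placeholders, not cloudyu's actual training configuration.

    # Hedged sketch: generic TRL DPOTrainer usage (TRL ~0.7-era API).
    # "path/to/base-model" and "preferences.json" are placeholders.
    from datasets import load_dataset
    from transformers import AutoModelForCausalLM, AutoTokenizer, TrainingArguments
    from trl import DPOTrainer

    base = "path/to/base-model"  # placeholder: the pre-DPO FusionNet checkpoint

    model = AutoModelForCausalLM.from_pretrained(base)      # policy to optimize
    ref_model = AutoModelForCausalLM.from_pretrained(base)  # frozen reference policy
    tokenizer = AutoTokenizer.from_pretrained(base)

    # DPO trains on preference triples: "prompt", "chosen", "rejected".
    dataset = load_dataset("json", data_files="preferences.json", split="train")

    trainer = DPOTrainer(
        model=model,
        ref_model=ref_model,
        args=TrainingArguments(
            output_dir="dpo-out",
            per_device_train_batch_size=1,
            remove_unused_columns=False,  # keep the preference columns
        ),
        beta=0.1,  # strength of the implicit KL penalty toward the reference
        train_dataset=dataset,
        tokenizer=tokenizer,
    )
    trainer.train()

The DPO objective pushes the policy to prefer "chosen" over "rejected" completions while the beta-weighted implicit KL term keeps it close to the reference model, avoiding a separate reward model.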
LLM Name: 4bit Quant TomGrc FusionNet 34Bx2 MoE V0.1 DPO
Repository: https://huggingface.co/cloudyu/4bit_quant_TomGrc_FusionNet_34Bx2_MoE_v0.1_DPO
Model Size: 31.8b
Required VRAM: 35.6 GB
Updated: 2024-11-13
Maintainer: cloudyu
Model Type: mixtral
Model Files: 5.0 GB (1-of-8), 5.0 GB (2-of-8), 5.0 GB (3-of-8), 5.0 GB (4-of-8), 5.0 GB (5-of-8), 5.0 GB (6-of-8), 4.7 GB (7-of-8), 0.9 GB (8-of-8); 35.6 GB total
Quantization Type: 4bit
Model Architecture: MixtralForCausalLM
License: apache-2.0
Context Length: 200000
Model Max Length: 200000
Transformers Version: 4.37.2
Tokenizer Class: LlamaTokenizer
Padding Token: <s>
Vocabulary Size: 64000
Torch Data Type: bfloat16
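Given the metadata above, a minimal loading sketch might look like the following. This is an assumption-laden illustration, not an official snippet from the model card: it assumes the eight safetensors shards are pre-quantized 4-bit bitsandbytes weights that Transformers >= 4.37 can deserialize directly, and the prompt string is invented for demonstration.

    # Hedged loading sketch: assumes the repo ships serialized 4-bit
    # bitsandbytes shards loadable directly by transformers >= 4.37.
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    repo_id = "cloudyu/4bit_quant_TomGrc_FusionNet_34Bx2_MoE_v0.1_DPO"

    tokenizer = AutoTokenizer.from_pretrained(repo_id)  # LlamaTokenizer per the card
    model = AutoModelForCausalLM.from_pretrained(
        repo_id,
        device_map="auto",           # spread the ~35.6 GB of shards across GPUs
        torch_dtype=torch.bfloat16,  # matches the card's Torch Data Type
    )

    prompt = "Briefly explain what a mixture-of-experts model is."
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    out = model.generate(**inputs, max_new_tokens=64)
    print(tokenizer.decode(out[0], skip_special_tokens=True))

Note that the 35.6 GB Required VRAM figure matches the summed shard sizes, so a single 40 GB-class GPU or a multi-GPU device map is needed despite the 4-bit quantization.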

Best Alternatives to 4bit Quant TomGrc FusionNet 34Bx2 MoE V0.1 DPO

Best Alternatives      Context / RAM      Downloads   Likes
60B MoE Coder V2       195K / 35.6 GB     76          1

Rank the 4bit Quant TomGrc FusionNet 34Bx2 MoE V0.1 DPO Capabilities

🆘 Have you tried this model? Rate its performance. This feedback helps the ML community identify the most suitable model for their needs. Your contribution really does make a difference! 🌟

Instruction Following and Task Automation  
Factuality and Completeness of Knowledge  
Censorship and Alignment  
Data Analysis and Insight Generation  
Text Generation  
Text Summarization and Feature Extraction  
Code Generation  
Multi-Language Support and Translation  

Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241110