FusionNet 34Bx2 MoE AWQ by TheBloke


4-bit · Autotrain compatible · AWQ · Base model: tomgrc/fusionnet 34... · Conversational · en · License: mit · Mixtral · MoE · Quantized · Region: us · Safetensors · Sharded · Tensorflow

Rank the FusionNet 34Bx2 MoE AWQ Capabilities

🆘 Have you tried this model? Rate its performance. This feedback would greatly assist the ML community in identifying the most suitable model for their needs. Your contribution really does make a difference! 🌟

Instruction Following and Task Automation  
Factuality and Completeness of Knowledge  
Censorship and Alignment  
Data Analysis and Insight Generation  
Text Generation  
Text Summarization and Feature Extraction  
Code Generation  
Multi-Language Support and Translation  
FusionNet 34Bx2 MoE AWQ (TheBloke/FusionNet_34Bx2_MoE-AWQ)

Best Alternatives to FusionNet 34Bx2 MoE AWQ

Best Alternatives               HF Rank   Context / RAM    Downloads   Likes
Mixtral 34Bx2 MoE 60B AWQ       —         195K / 32.8 GB   7           4

FusionNet 34Bx2 MoE AWQ Parameters and Internals

LLM Name: FusionNet 34Bx2 MoE AWQ
Repository: TheBloke/FusionNet_34Bx2_MoE-AWQ (open on 🤗 Hugging Face)
Model Name: FusionNet 34Bx2 MoE
Model Creator: Suqin Zhang
Base Model(s): TomGrc/FusionNet_34Bx2_MoE (FusionNet 34Bx2 MoE)
Model Size: 8.9b
Required VRAM: 32.8 GB
Updated: 2024-07-07
Maintainer: TheBloke
Model Type: mixtral
Model Files: 9.9 GB (1-of-4), 9.9 GB (2-of-4), 9.9 GB (3-of-4), 3.1 GB (4-of-4)
Supported Languages: en
AWQ Quantization: Yes
Quantization Type: awq
Model Architecture: MixtralForCausalLM
License: mit
Context Length: 32768
Model Max Length: 32768
Transformers Version: 4.37.0.dev0
Tokenizer Class: LlamaTokenizer
Padding Token: <s>
Vocabulary Size: 64000
Initializer Range: 0.02
Torch Data Type: float16
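
The listing above describes a 4-bit AWQ-quantized Mixtral-architecture checkpoint with a 32768-token context window, float16 compute dtype, and roughly 32.8 GB of sharded weights. The snippet below is a minimal sketch of how such a checkpoint could be loaded and queried with the Hugging Face transformers library; it assumes a recent transformers release with AWQ support plus the autoawq and accelerate packages installed, and a GPU (or GPUs) with about 33 GB of free VRAM. The prompt text and generation settings are illustrative only, not part of the listing.

```python
# Minimal sketch: loading the AWQ-quantized checkpoint listed above.
# Assumes `pip install transformers autoawq accelerate`; exact versions and
# the prompt format below are assumptions, not prescribed by this listing.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "TheBloke/FusionNet_34Bx2_MoE-AWQ"

tokenizer = AutoTokenizer.from_pretrained(model_id)  # LlamaTokenizer, vocab size 64000
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,   # matches the listed torch data type
    device_map="auto",           # spreads the ~32.8 GB of shards across available GPUs
)

prompt = "Explain what a mixture-of-experts model is in two sentences."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Note that the 32768-token context length reported in the listing is the model's maximum sequence length; longer prompts would need to be truncated or chunked before generation.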


Original data from HuggingFace, OpenCompass and various public git repos.