Mixnueza 6x32M MoE by Isotonic


Tags: Autotrain compatible · Conversational · Endpoints compatible · Instruct · Lazymergekit · Merge · Mergekit · Mixtral · MoE · Safetensors · Sharded · TensorFlow · License: apache-2.0 · Region: US
Base models: Felladrin/minueza-32m-base, Felladrin/minueza-32m-ultracha...
Datasets: c4, cohereforai/aya datase..., databricks/databricks-..., euclaise/reddit-instru..., felladrin/chatml-ultra..., huggingfaceh4/ultracha..., izumi-lab/open-text-bo..., skylion007/openwebtext, tiiuae/falcon-refinedw..., togethercomputer/redpa..., wikimedia/wikipedia
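The Mixtral and MoE tags mark this as a sparse mixture-of-experts checkpoint assembled with mergekit from the Minueza-32M base models. As a minimal sketch (assuming the checkpoint exposes the standard MixtralConfig fields), the expert layout can be inspected from the config alone, without downloading weights:

```python
from transformers import AutoConfig

# Reads only config.json; no weights are downloaded.
config = AutoConfig.from_pretrained("Isotonic/Mixnueza-6x32M-MoE")

# Standard MixtralConfig fields (assumed present on this checkpoint):
print(config.num_local_experts)        # expected: 6, per the "6x32M" name
print(config.num_experts_per_tok)      # experts routed per token (typically 2 for Mixtral-style models)
print(config.max_position_embeddings)  # expected: 2048, per the listing below
```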

Rank the Mixnueza 6x32M MoE Capabilities

🆘 Have you tried this model? Rate its performance. This feedback would greatly assist the ML community in identifying the most suitable model for their needs. Your contribution really does make a difference! 🌟

Instruction Following and Task Automation  
Factuality and Completeness of Knowledge  
Censorship and Alignment  
Data Analysis and Insight Generation  
Text Generation  
Text Summarization and Feature Extraction  
Code Generation  
Multi-Language Support and Translation  
Mixnueza 6x32M MoE (Isotonic/Mixnueza-6x32M-MoE)

Best Alternatives to Mixnueza 6x32M MoE

Best Alternatives            Context / Size    HF Rank
Mixnueza Chat 6x32M MoE      2K / 0.3 GB       8090

Mixnueza 6x32M MoE Parameters and Internals

LLM Name: Mixnueza 6x32M MoE
Repository: Isotonic/Mixnueza-6x32M-MoE (open on 🤗 Hugging Face)
Model Size: 83.9M parameters
Required VRAM: 0.3 GB
Model Type: mixtral
Model Files: 0.3 GB (1-of-1)
Model Architecture: MixtralForCausalLM
Context Length: 2048
Model Max Length: 2048
Transformers Version: 4.39.3
Tokenizer Class: LlamaTokenizer
Padding Token: <s>
Vocabulary Size: 32002
Initializer Range: 0.02
Torch Data Type: float32
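These internals map onto a standard transformers loading path. The following is a minimal sketch rather than an official usage example: the repository ID, dtype, and context length come from the listing above, while the chat template and generation settings are assumptions.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "Isotonic/Mixnueza-6x32M-MoE"  # repository ID from the listing

# LlamaTokenizer with <s> as the padding token, per the internals table.
tokenizer = AutoTokenizer.from_pretrained(repo_id)

# float32 weights at ~0.3 GB, so this fits comfortably on CPU.
model = AutoModelForCausalLM.from_pretrained(repo_id, torch_dtype=torch.float32)

# Assumption: the tokenizer ships a chat template (the ChatML datasets in the
# tags suggest one); fall back to a plain prompt if it does not.
messages = [{"role": "user", "content": "Briefly explain what a mixture of experts is."}]
if tokenizer.chat_template is not None:
    prompt = tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)
else:
    prompt = messages[0]["content"]

inputs = tokenizer(prompt, return_tensors="pt")
# Keep prompt plus completion within the 2048-token context length.
outputs = model.generate(**inputs, max_new_tokens=256, do_sample=True, temperature=0.7)
print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))
```

At 83.9M parameters in float32, the model runs on CPU without quantization, which is why the listing reports only 0.3 GB of required VRAM.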


Original data from Hugging Face, OpenCompass, and various public Git repositories.
Release v2024040901