Smol Llama 4x220M MoE by Isotonic

Tags: Autotrain compatible, Bee-spoke-data/beecoder-220m-p..., Bee-spoke-data/smol llama-220m..., Bee-spoke-data/zephyr-220m-dpo..., Bee-spoke-data/zephyr-220m-sft..., Dataset:bigcode/the-stack-smol..., Dataset:eleutherai/proof-pile-..., Dataset:huggingfaceh4/ultracha..., Dataset:huggingfaceh4/ultrafee..., Dataset:jeankaddour/minipile, Dataset:mattymchen/refinedweb-..., Dataset:pszemraj/simple wikipe..., Dataset:teknium/openhermes, Endpoints compatible, Lazymergekit, License:apache-2.0, Merge, Mergekit, Mixtral, MoE, Region:us, Safetensors, Sharded, TensorFlow

Smol Llama 4x220M MoE (Isotonic/smol_llama-4x220M-MoE)
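Per the tags above, this model is a Mixtral-style mixture-of-experts assembled with mergekit/LazyMergekit from four ~220M base models. A minimal sketch, assuming the `transformers` library and Hugging Face Hub access, that fetches only the model config (no weights) to confirm the architecture; the expected values are taken from the internals table further down this page:

```python
from transformers import AutoConfig

# Downloads only config.json, not the 1.2 GB of weights.
config = AutoConfig.from_pretrained("Isotonic/smol_llama-4x220M-MoE")

print(config.model_type)               # expected: "mixtral"
print(config.num_local_experts)        # expected: 4 (one per merged 220M expert)
print(config.max_position_embeddings)  # expected: 2048 (context length)
print(config.vocab_size)               # expected: 32128
```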

Best Alternatives to Smol Llama 4x220M MoE

Best Alternatives                   Context / RAM    HF Rank
...inyMixtral 4x220M UniversalNER   2K / 2.4 GB      130

Smol Llama 4x220M MoE Parameters and Internals

LLM Name: Smol Llama 4x220M MoE
Repository: Isotonic/smol_llama-4x220M-MoE (open on 🤗 Hugging Face)
Model Size: 595.4M parameters
Required VRAM: 1.2 GB
Model Type: mixtral
Model Files: 1.2 GB (shard 1 of 1)
Model Architecture: MixtralForCausalLM
Context Length: 2048
Model Max Length: 2048
Transformers Version: 4.37.2
Tokenizer Class: LlamaTokenizer
Padding Token: <s>
Vocabulary Size: 32128
Initializer Range: 0.02
Torch Data Type: bfloat16
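Given the internals above (MixtralForCausalLM, 2048-token context, bfloat16 weights in a single 1.2 GB shard), here is a minimal loading-and-generation sketch, assuming `transformers` >= 4.37.2 and `torch` are installed; the prompt and sampling settings are illustrative assumptions, not from the model card:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Isotonic/smol_llama-4x220M-MoE"

tokenizer = AutoTokenizer.from_pretrained(model_id)  # LlamaTokenizer per the card
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # matches the card's Torch Data Type
)

# Illustrative prompt; input plus output must fit the 2048-token context.
inputs = tokenizer("def fibonacci(n):", return_tensors="pt")
outputs = model.generate(
    **inputs,
    max_new_tokens=64,      # illustrative budget, well under the 2048 limit
    do_sample=True,
    temperature=0.7,        # assumed sampling settings, not from the card
    pad_token_id=tokenizer.pad_token_id,  # padding token is <s> per the card
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

At 595.4M parameters in bfloat16, the model fits comfortably on CPU or a small GPU, consistent with the 1.2 GB VRAM figure listed above.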

Original data from HuggingFace, OpenCompass and various public git repos.
Release v2024040901