Smol Llama 4x220M MoE by Isotonic

Tags: Autotrain compatible · Endpoints compatible · Lazymergekit · Merge · Mergekit · Mixtral · MoE · Region: US · Safetensors · Sharded · TensorFlow
Base models: Bee-spoke-data/beecoder-220m-p..., Bee-spoke-data/smol_llama-220m..., Bee-spoke-data/zephyr-220m-dpo..., Bee-spoke-data/zephyr-220m-sft...
Datasets: bigcode/the-stack-smol..., EleutherAI/proof-pile-..., HuggingFaceH4/ultracha..., HuggingFaceH4/ultrafee..., JeanKaddour/minipile, mattymchen/refinedweb-..., pszemraj/simple_wikipe..., teknium/openhermes
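The Merge, Mergekit, Lazymergekit, and MoE tags indicate that this model was built by combining the four Bee-spoke-data 220M checkpoints listed above into a Mixtral-style mixture-of-experts with mergekit. Below is a minimal sketch of what such a mergekit-moe configuration could look like; the repository IDs are the truncated names from the tag list, and the base model, gate mode, and positive prompts are illustrative assumptions rather than the maintainer's actual settings.

```yaml
# Hypothetical mergekit-moe config for a 4x220M Mixtral-style merge.
# Expert IDs are left truncated as in the tag list; base_model, gate_mode,
# and the positive prompts are illustrative assumptions.
base_model: Bee-spoke-data/smol_llama-220m...   # truncated ID from the tags
gate_mode: hidden          # route tokens by hidden-state similarity to the prompts
dtype: bfloat16            # matches the Torch Data Type listed below
experts:
  - source_model: Bee-spoke-data/smol_llama-220m...
    positive_prompts: ["general text"]
  - source_model: Bee-spoke-data/beecoder-220m-p...
    positive_prompts: ["code"]
  - source_model: Bee-spoke-data/zephyr-220m-sft...
    positive_prompts: ["instruction following"]
  - source_model: Bee-spoke-data/zephyr-220m-dpo...
    positive_prompts: ["chat"]
```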

Smol Llama 4x220M MoE Benchmarks

Smol Llama 4x220M MoE Parameters and Internals

LLM Name: Smol Llama 4x220M MoE
Repository: Isotonic/smol_llama-4x220M-MoE (open on 🤗 Hugging Face)
Model Size: 595.4M
Required VRAM: 1.2 GB
Updated: 2024-07-27
Maintainer: Isotonic
Model Type: mixtral
Model Files: 1.2 GB (shard 1 of 1)
Model Architecture: MixtralForCausalLM
License: apache-2.0
Context Length: 2048
Model Max Length: 2048
Transformers Version: 4.37.2
Tokenizer Class: LlamaTokenizer
Padding Token: <s>
Vocabulary Size: 32128
Torch Data Type: bfloat16
Smol Llama 4x220M MoE (Isotonic/smol_llama-4x220M-MoE)
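Given the parameters above (MixtralForCausalLM architecture, 2048-token context, bfloat16 weights, LlamaTokenizer, roughly 1.2 GB of VRAM), the checkpoint should load with a recent transformers release (4.37.2 or later per the listing). The snippet below is a minimal, untested loading sketch, not an official example from the maintainer.

```python
# Minimal loading sketch based on the parameters listed above.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Isotonic/smol_llama-4x220M-MoE"

tokenizer = AutoTokenizer.from_pretrained(model_id)           # LlamaTokenizer, pad token <s>
model = AutoModelForCausalLM.from_pretrained(                 # MixtralForCausalLM
    model_id,
    torch_dtype=torch.bfloat16,   # matches the listed Torch Data Type
    device_map="auto",            # ~1.2 GB of VRAM per the listing
)

prompt = "def fibonacci(n):"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)         # stay well under the 2048 context limit
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```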

Best Alternatives to Smol Llama 4x220M MoE

Best Alternatives | HF Rank | Context / RAM | Downloads | Likes
...inyMixtral 4x220M UniversalNER | 0.2 | 2K / 2.4 GB | 20 | 0

Original data from HuggingFace, OpenCompass and various public git repos.
Release v2024072501