Calme 4x7B MoE V0.2 by MaziyarPanahi


Tags: 7b, Autotrain compatible, Calme, Conversational, Generated from trainer, Mistral, Mixtral, MoE, Region: us, Safetensors, Sharded, Tensorflow

Calme 4x7B MoE V0.2 Benchmarks

Calme 4x7B MoE V0.2 Parameters and Internals

LLM Name: Calme 4x7B MoE V0.2
Repository 🤗: https://huggingface.co/MaziyarPanahi/Calme-4x7B-MoE-v0.2
Model Name: Calme-4x7B-MoE-v0.2
Model Creator: MaziyarPanahi
Model Size: 7b
Required VRAM: 48.3 GB
Updated: 2024-09-07
Maintainer: MaziyarPanahi
Model Type: mixtral
Model Files: 9.9 GB (1-of-5), 9.9 GB (2-of-5), 10.0 GB (3-of-5), 10.0 GB (4-of-5), 8.5 GB (5-of-5)
Model Architecture: MixtralForCausalLM
License: apache-2.0
Context Length: 32768
Model Max Length: 32768
Transformers Version: 4.37.2
Tokenizer Class: LlamaTokenizer
Padding Token: <s>
Vocabulary Size: 32000
Torch Data Type: bfloat16
Calme 4x7B MoE V0.2 (MaziyarPanahi/Calme-4x7B-MoE-v0.2)
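
Given the metadata above (MixtralForCausalLM architecture, bfloat16 weights, a 32768-token context, and five safetensors shards that sum to the listed 48.3 GB), the model should load through the standard transformers API. A minimal sketch, assuming transformers >= 4.37.2, accelerate installed for device_map, and enough GPU memory for the full-precision weights; the prompt is a placeholder:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "MaziyarPanahi/Calme-4x7B-MoE-v0.2"

# LlamaTokenizer with a 32000-token vocabulary, per the table above
tokenizer = AutoTokenizer.from_pretrained(model_id)

# bfloat16 weights; device_map="auto" spreads the ~48.3 GB of shards
# across whatever GPUs (and CPU RAM) are available
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,
    device_map="auto",
)

prompt = "Explain mixture-of-experts routing in one paragraph."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```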

Quantized Models of the Calme 4x7B MoE V0.2

Model | Likes | Downloads | VRAM
Calme 4x7B MoE V0.2 GGUF | 4 | 155 | 8 GB
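
For CPU or low-VRAM inference, the GGUF quantization can be run with llama.cpp-compatible tooling. A minimal sketch using llama-cpp-python; the repo id MaziyarPanahi/Calme-4x7B-MoE-v0.2-GGUF and the Q4_K_M quant filename are assumptions inferred from the model name above, not confirmed by this table:

```python
from llama_cpp import Llama  # requires llama-cpp-python and huggingface_hub

# Repo id and quant filename are assumed; check the GGUF repo for actual names.
llm = Llama.from_pretrained(
    repo_id="MaziyarPanahi/Calme-4x7B-MoE-v0.2-GGUF",  # assumed GGUF repo id
    filename="*Q4_K_M.gguf",                           # assumed quant; glob picks the match
    n_ctx=32768,                                       # matches the model's context length
)

out = llm("Q: What is a mixture-of-experts model? A:", max_tokens=128)
print(out["choices"][0]["text"])
```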

Best Alternatives to Calme 4x7B MoE V0.2

Best Alternatives | Context / RAM | Downloads | Likes
Multimaster 7B V6 | 32K / 142.5 GB | 3108 | 1
Mixtral 7B 8expert | 32K / 93.6 GB | 13195 | 263
Laserxtral | 32K / 48.3 GB | 5101 | 78
MultiverseBuddy 15B MoE | 32K / 25.8 GB | 6 | 0
Mini Mixtral V0.2 | 32K / 25.8 GB | 67 | 3
Merged Model MoE | 32K / 53.3 GB | 4 | 1
Multilingual Mistral | 32K / 93.5 GB | 659 | 2
Lumina 2 | 32K / 37.1 GB | 5 | 0
RogerWizard 12B MoE | 32K / 25.8 GB | 4 | 1
StarlingMaths 12B MoE | 32K / 25.8 GB | 5 | 0

Rank the Calme 4x7B MoE V0.2 Capabilities

🆘 Have you tried this model? Rate its performance. This feedback would greatly assist the ML community in identifying the most suitable model for their needs. Your contribution really does make a difference! 🌟

Instruction Following and Task Automation  
Factuality and Completeness of Knowledge  
Censorship and Alignment  
Data Analysis and Insight Generation  
Text Generation  
Text Summarization and Feature Extraction  
Code Generation  
Multi-Language Support and Translation  


Original data from HuggingFace, OpenCompass and various public git repos.
Release v2024072803