Codestral 21B Pruned by TroyDoesAI

Tags: Ablated, Ablation, Accelerator, Accurate, Ai, Alignment, Architecture, Augmented, Autotrain compatible, Bad, Chatbot, Code, Coder, Context, Context obedient, Copilot, Diagram, Efficient, En, Endpoints compatible, Enthusiast, Flow, Generation, Graph, Idocoolstuff, Knowledge, Knowledge graph, Llama, Local, Lol, Lookingforwork, Map, Mermaid, Mistral, Open, Open source, Openforhire, Personal assistant, Pruned, Quant, Quantize, Rag, Region:us, Retrieval, Retrieval augmented generation, Safetensors, Sequence, Sharded, Small, Smaller, Source, Story, Summarization, Tensorflow, Troy andrew schultz, Troydoesai, Unaligned, Uncensored

Codestral 21B Pruned Benchmarks

Scores (nn.n%) indicate how the model compares to the reference models: Anthropic Sonnet 3.5 ("so35"), GPT-4o ("gpt4o"), or GPT-4 ("gpt4").
Codestral 21B Pruned (TroyDoesAI/Codestral-21B-Pruned)

Codestral 21B Pruned Parameters and Internals

Model Type 
text generation
Use Cases 
Areas:
research, commercial applications
Applications:
story generation, code generation, knowledge retrieval, chatbots, personal assistants
Primary Use Cases:
text generation, retrieval augmented generation
Limitations:
intentionally unaligned behavior; no safety alignment applied
Considerations:
Consider memory usage optimization due to model size.
Additional Notes 
Every GB of saved memory counts when offloading to System RAM!
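As a rough illustration of CPU offloading with Hugging Face transformers and accelerate (not part of the original model card; the memory budgets below are assumptions to adjust for your hardware):

```python
# Minimal offloading sketch: device_map="auto" fills the GPU first and
# spills the remaining float16 weights to system RAM.
import torch
from transformers import AutoModelForCausalLM

model = AutoModelForCausalLM.from_pretrained(
    "TroyDoesAI/Codestral-21B-Pruned",
    torch_dtype=torch.float16,
    device_map="auto",
    # Assumed memory budgets for GPU 0 and CPU; adjust to your hardware.
    max_memory={0: "24GiB", "cpu": "48GiB"},
)
```

With device_map="auto", accelerate keeps as many layers on the GPU as the budget allows and offloads the rest to CPU RAM.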
Supported Languages 
en (proficient)
Input Output 
Input Format:
text
Accepted Modalities:
text
Output Format:
text
Performance Tips:
Consider using quantization for efficiency.
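One way to act on this tip is 4-bit loading via bitsandbytes; a minimal sketch (the quantization settings are illustrative, not from the model card):

```python
# Sketch: load the model in 4-bit with bitsandbytes to cut memory use.
import torch
from transformers import AutoModelForCausalLM, BitsAndBytesConfig

bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_compute_dtype=torch.float16,  # illustrative choice
)

model = AutoModelForCausalLM.from_pretrained(
    "TroyDoesAI/Codestral-21B-Pruned",
    quantization_config=bnb_config,
    device_map="auto",
)
```

Relative to the float16 checkpoint, 4-bit weights need roughly a quarter of the memory.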
Release Notes 
Version:
v0.1
Notes:
21.5B parameters, pruned from 22.2B by removing two layers.
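For context, removing decoder layers from a Mistral-architecture checkpoint can be done along these lines. This is a hypothetical sketch: the base repository name and the layer indices are assumptions, since the card does not say which two layers were dropped.

```python
# Hypothetical sketch of removing two decoder layers from a
# Mistral-architecture checkpoint. Base repo and indices are assumptions.
import torch
from transformers import AutoModelForCausalLM

base = AutoModelForCausalLM.from_pretrained(
    "mistralai/Codestral-22B-v0.1",  # assumed base model
    torch_dtype=torch.float16,
)

drop = {20, 21}  # assumed indices; the card does not document which layers
base.model.layers = torch.nn.ModuleList(
    [block for i, block in enumerate(base.model.layers) if i not in drop]
)
base.config.num_hidden_layers = len(base.model.layers)

base.save_pretrained("Codestral-21B-Pruned-local")
```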
LLM Name: Codestral 21B Pruned
Repository: 🤗 https://huggingface.co/TroyDoesAI/Codestral-21B-Pruned
Model Size: 21B
Required VRAM: 43.1 GB
Updated: 2025-02-05
Maintainer: TroyDoesAI
Model Type: mistral
Model Files: 9 shards (1-of-9: 4.9 GB, 2-of-9: 4.9 GB, 3-of-9: 5.0 GB, 4-of-9: 5.0 GB, 5-of-9: 4.9 GB, 6-of-9: 5.0 GB, 7-of-9: 5.0 GB, 8-of-9: 4.9 GB, 9-of-9: 3.5 GB)
Supported Languages: en
Model Architecture: MistralForCausalLM
License: MNPL-0.1
Context Length: 32768
Model Max Length: 32768
Transformers Version: 4.41.1
Tokenizer Class: LlamaTokenizer
Vocabulary Size: 32768
Torch Data Type: float16
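Given the listing above (float16 weights, 32768-token context), a minimal load-and-generate sketch; the prompt and generation settings are illustrative:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("TroyDoesAI/Codestral-21B-Pruned")
model = AutoModelForCausalLM.from_pretrained(
    "TroyDoesAI/Codestral-21B-Pruned",
    torch_dtype=torch.float16,
    device_map="auto",
)

prompt = "Write a Python function that checks whether a string is a palindrome."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=200)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```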

Best Alternatives to Codestral 21B Pruned

Best Alternatives          Context / RAM      Downloads  Likes
Moist Theia 21B            1000K / 40.8 GB    28         1
Theia 21B V2               1000K / 40.8 GB    54         27
Theia 21B V1               1000K / 40.8 GB    28         30
Blendy001                  1000K / 40.8 GB    5          0
NeMoist 21B V1a            1000K / 40.8 GB    48         2
Theia 21B V1 Pretrained    1000K / 40.8 GB    5          0
NeMoria 21B                1000K / 40.9 GB    32         12
NeMoria 21B                1000K / 40.9 GB    13         12
NeMoist 21B V0.5           1000K / 40.8 GB    5          0
Note: a green score (e.g. "73.2") means the model outperforms TroyDoesAI/Codestral-21B-Pruned.

Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241227