MagpieLM 8B Chat V0.1 by Magpie-Align


Arxiv:2406.08464 · Arxiv:2411.07133 · Alignment-handbook · Autotrain compatible · Base model (finetune): Magpie-Align/MagpieLM-8B-SFT-v0.1 · Conversational · Datasets: Magpie-Align/MagpieLM-SFT-Data-v0.1, Magpie-Align/MagpieLM-DPO-Data-v0.1 · DPO · Endpoints compatible · Generated from trainer · Llama · Region: us · Safetensors · Sharded · Tensorflow · TRL

MagpieLM 8B Chat V0.1 Benchmarks

nn.n% — How the model compares to the reference models: Anthropic Sonnet 3.5 ("so35"), GPT-4o ("gpt4o") or GPT-4 ("gpt4").
MagpieLM 8B Chat V0.1 (Magpie-Align/MagpieLM-8B-Chat-v0.1)

MagpieLM 8B Chat V0.1 Parameters and Internals

Model Type 
text generation, aligned SLM
Use Cases 
Areas:
research, commercial applications
Primary Use Cases:
instruction-following, general conversation tasks
Limitations:
Not designed for complex reasoning tasks; may generate unsafe or inappropriate content
Considerations:
Use to improve instruction-following.
Supported Languages 
English (primary)
Training Details 
Data Sources:
Magpie-Align/MagpieLM-SFT-Data-v0.1, Magpie-Align/MagpieLM-DPO-Data-v0.1
Methodology:
Supervised Fine-tuning (SFT) followed by Direct Preference Optimization (DPO); a sketch of the DPO step follows this section
Context Length:
2048
Hardware Used:
multi-GPU
Model Architecture:
LlamaForCausalLM
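
For orientation, here is a minimal sketch of the DPO stage using TRL's DPOTrainer. It is illustrative only: it assumes a recent trl release, a preference dataset exposing prompt/chosen/rejected-style fields, and placeholder hyperparameters rather than the authors' published recipe.

```python
# Illustrative DPO sketch (not the authors' exact training setup).
# Assumptions: a recent trl with DPOConfig/DPOTrainer, and a preference dataset
# whose records provide prompt/chosen/rejected-style fields.
from datasets import load_dataset
from transformers import AutoModelForCausalLM, AutoTokenizer
from trl import DPOConfig, DPOTrainer

sft_model_id = "Magpie-Align/MagpieLM-8B-SFT-v0.1"  # SFT checkpoint listed as the base model
model = AutoModelForCausalLM.from_pretrained(sft_model_id)
tokenizer = AutoTokenizer.from_pretrained(sft_model_id)

train_dataset = load_dataset("Magpie-Align/MagpieLM-DPO-Data-v0.1", split="train")

args = DPOConfig(
    output_dir="magpielm-8b-dpo",
    beta=0.1,                        # placeholder preference strength, not the published value
    max_length=2048,                 # matches the training context length listed above
    per_device_train_batch_size=1,   # placeholder; scale to the available GPUs
    gradient_accumulation_steps=16,
    bf16=True,
)

trainer = DPOTrainer(
    model=model,
    args=args,
    train_dataset=train_dataset,
    processing_class=tokenizer,      # named `tokenizer=` in older trl releases
)
trainer.train()
```

The DPO stage starts from the SFT checkpoint, which is why the sketch loads MagpieLM-8B-SFT-v0.1 rather than the chat model itself.
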
Responsible AI Considerations 
Accountability:
Developers are responsible for outputs.
Input Output 
Input Format:
text prompts structured in a sequence-to-sequence format
Accepted Modalities:
text
Output Format:
text
Performance Tips:
Use the Llama 3 chat template for best performance.
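
To follow that tip in practice, here is a minimal inference sketch with Hugging Face Transformers that applies the tokenizer's built-in Llama 3 chat template. The model ID and bfloat16 dtype come from this page; the prompt and sampling parameters are illustrative.

```python
# Minimal inference sketch for MagpieLM-8B-Chat-v0.1 using the Llama 3 chat template.
# Requires a recent transformers release and roughly 16.1 GB of VRAM in bfloat16.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Magpie-Align/MagpieLM-8B-Chat-v0.1"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # matches the bfloat16 weights listed below
    device_map="auto",
)

messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Explain instruction tuning in two sentences."},
]

# apply_chat_template renders the conversation with the chat template stored in
# the tokenizer config and appends the assistant header for generation.
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output_ids = model.generate(
    input_ids,
    max_new_tokens=256,   # illustrative generation settings
    do_sample=True,
    temperature=0.6,
    top_p=0.9,
)
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))
```

Because the chat template ships in the tokenizer config, no manual special-token formatting is needed.
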
LLM Name: MagpieLM 8B Chat V0.1
Repository 🤗: https://huggingface.co/Magpie-Align/MagpieLM-8B-Chat-v0.1
Base Model(s): Magpie-Align/MagpieLM-8B-SFT-v0.1
Model Size: 8b
Required VRAM: 16.1 GB
Updated: 2025-04-23
Maintainer: Magpie-Align
Model Type: llama
Model Files: 5.0 GB (1-of-4), 5.0 GB (2-of-4), 4.9 GB (3-of-4), 1.2 GB (4-of-4), 0.0 GB
Model Architecture: LlamaForCausalLM
License: llama3.1
Context Length: 131072
Model Max Length: 131072
Transformers Version: 4.44.2
Tokenizer Class: PreTrainedTokenizerFast
Padding Token: <|end_of_text|>
Vocabulary Size: 128256
Torch Data Type: bfloat16

Best Alternatives to MagpieLM 8B Chat V0.1

Best Alternatives | Context / RAM | Downloads | Likes
...otron 8B UltraLong 4M Instruct | 4192K / 32.1 GB | 1780 | 94
UltraLong Thinking | 4192K / 16.1 GB | 65 | 2
...a 3.1 8B UltraLong 4M Instruct | 4192K / 32.1 GB | 176 | 24
...otron 8B UltraLong 2M Instruct | 2096K / 32.1 GB | 1219 | 15
...a 3.1 8B UltraLong 2M Instruct | 2096K / 32.1 GB | 875 | 9
...otron 8B UltraLong 1M Instruct | 1048K / 32.1 GB | 2311 | 38
...a 3.1 8B UltraLong 1M Instruct | 1048K / 32.1 GB | 1387 | 29
....1 1million Ctx Dark Planet 8B | 1048K / 32.3 GB | 10 | 1
...a 3 8B Instruct Gradient 1048K | 1024K / 16.1 GB | 24982 | 679
A1 | 1024K / 16.1 GB | 853 | 0
Note: a green score (e.g. "73.2") means that the model is better than Magpie-Align/MagpieLM-8B-Chat-v0.1.

Rank the MagpieLM 8B Chat V0.1 Capabilities

🆘 Have you tried this model? Rate its performance. This feedback would greatly assist the ML community in identifying the most suitable model for their needs. Your contribution really does make a difference! 🌟

Instruction Following and Task Automation  
Factuality and Completeness of Knowledge  
Censorship and Alignment  
Data Analysis and Insight Generation  
Text Generation  
Text Summarization and Feature Extraction  
Code Generation  
Multi-Language Support and Translation  


Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241227