Llama 3 8B Magpie Air MT SFT V0.1 by Magpie-Align


Tags: Arxiv:2406.08464, Autotrain compatible, Axolotl, Base model: meta-llama/Meta-Llama-3-8B, Conversational, Endpoints compatible, Generated from trainer, License: llama3, Llama, PyTorch, Region: US, Safetensors, Sharded, TensorFlow

Rank the Llama 3 8B Magpie Air MT SFT V0.1 Capabilities

🆘 Have you tried this model? Rate its performance. This feedback would greatly assist the ML community in identifying the most suitable model for their needs. Your contribution really does make a difference! 🌟

Instruction Following and Task Automation  
Factuality and Completeness of Knowledge  
Censorship and Alignment  
Data Analysis and Insight Generation  
Text Generation  
Text Summarization and Feature Extraction  
Code Generation  
Multi-Language Support and Translation  
Llama 3 8B Magpie Air MT SFT V0.1 (Magpie-Align/Llama-3-8B-Magpie-Air-MT-SFT-v0.1)

Best Alternatives to Llama 3 8B Magpie Air MT SFT V0.1

Best Alternatives | HF Rank | Context / RAM | Downloads | Likes
...Antino 3 ANITA 8B Inst DPO ITA | 75.12 | 8K / 16.1 GB | 6597 | 19
Llama 3 Merged Linear | 73.93 | 8K / 16.1 GB | 8621 | 6
...ama 3 SauerkrautLM 8B Instruct | 73.74 | 8K / 16.1 GB | 29413 | 47
Llama3s Merged Linear | 73.66 | 8K / 16.1 GB | 1337 | 0
Llama 3 8B Ita | 73.65 | 8K / 16.1 GB | 21236 | 20
Llama 3 Gutenberg 8B | 73.18 | 8K / 16.1 GB | 1617 | 6
Llama 3 8B Instruct V0.8 | 73.17 | 8K / 16 GB | 2865 | 1
Llama 3 Wissenschaft 8B V2 | 73.06 | 8K / 16.1 GB | 1339 | 1
NeuralLLaMa 3 8B ORPO V0.3 | 72.66 | 8K / 16.1 GB | 5004 | 0
NeuralLLaMa 3 8B DT V0.1 | 72.52 | 8K / 16.1 GB | 5403 | 1
Note: a green score (e.g. "73.2") means the model outperforms Magpie-Align/Llama-3-8B-Magpie-Air-MT-SFT-v0.1.
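
The download and like counts above are a snapshot taken when this page was generated. For current figures, one option is the huggingface_hub client; a minimal sketch follows (the repo id is this page's model, and the same call works for any of the alternatives listed):

# A minimal sketch, assuming the huggingface_hub package is installed
# and the repository is publicly accessible.
from huggingface_hub import HfApi

api = HfApi()
info = api.model_info("Magpie-Align/Llama-3-8B-Magpie-Air-MT-SFT-v0.1")
print(info.downloads, info.likes)  # live counts, not the snapshot in the table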

Llama 3 8B Magpie Air MT SFT V0.1 Parameters and Internals

LLM Name: Llama 3 8B Magpie Air MT SFT V0.1
Repository: Magpie-Align/Llama-3-8B-Magpie-Air-MT-SFT-v0.1 (open on 🤗 Hugging Face)
Base Model(s): Meta Llama 3 8B (meta-llama/Meta-Llama-3-8B)
Model Size: 8B
Required VRAM: 16.1 GB
Updated: 2024-06-24
Maintainer: Magpie-Align
Model Type: llama
Model Files: 5.0 GB (1-of-4), 5.0 GB (2-of-4), 4.9 GB (3-of-4), 1.2 GB (4-of-4)
Model Architecture: LlamaForCausalLM
License: llama3
Context Length: 8192
Model Max Length: 8192
Transformers Version: 4.40.2
Tokenizer Class: PreTrainedTokenizerFast
Padding Token: <|end_of_text|>
Vocabulary Size: 128256
Initializer Range: 0.02
Torch Data Type: bfloat16
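
For reference, a minimal sketch of loading this checkpoint with 🤗 Transformers. The repo id, bfloat16 dtype, and 8192-token context come from the table above; the prompt and generation settings are illustrative assumptions, not part of the model card:

# A minimal sketch, assuming transformers (>= 4.40) and torch are installed
# and the checkpoint can be downloaded from the Hugging Face Hub.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "Magpie-Align/Llama-3-8B-Magpie-Air-MT-SFT-v0.1"

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(
    repo_id,
    torch_dtype=torch.bfloat16,  # matches the Torch Data Type listed above
    device_map="auto",           # requires the accelerate package
)

# The model is tagged "Conversational", so a chat template should be available.
messages = [{"role": "user", "content": "Briefly explain instruction tuning."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=256)  # total context caps at 8192 tokens
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))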

Original data from Hugging Face, OpenCompass, and various public Git repositories.
Release v2024042801