Lama3 Taide Sloth by zhe0


Tags: AutoTrain compatible   Base model (finetune): taide/llam...   Base model: taide/llama3-taide-...   Conversational   En   Endpoints compatible   Llama   LoRA   PyTorch   Region: US   Safetensors   SFT   Sharded   TRL   Unsloth
Model Card on HF 🤗: https://huggingface.co/zhe0/lama3_taide_sloth

Lama3 Taide Sloth Benchmarks

Benchmark scores (nn.n%) show how the model compares to the reference models: Anthropic Claude 3.5 Sonnet ("so35"), GPT-4o ("gpt4o"), and GPT-4 ("gpt4").
Lama3 Taide Sloth (zhe0/lama3_taide_sloth)

Lama3 Taide Sloth Parameters and Internals

Model Type 
text-generation-inference, transformers, unsloth, llama, trl, sft
Additional Notes 
This Llama model was trained 2x faster with Unsloth and Hugging Face's TRL library.
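
Below is a minimal sketch of what an Unsloth + TRL supervised fine-tuning setup of this kind typically looks like. It is illustrative only: the dataset, batch size, learning rate, and output directory are placeholders, and the exact SFTTrainer arguments vary between TRL versions; only the base model ID and the LoRA settings listed further down on this card (r=16, alpha=16, dropout=0, the named target modules) come from the card itself.

```python
# Illustrative Unsloth + TRL SFT sketch; the dataset and training
# hyperparameters are placeholders, not the values used for this model.
from datasets import load_dataset
from transformers import TrainingArguments
from trl import SFTTrainer
from unsloth import FastLanguageModel

# Load the TAIDE base model through Unsloth's fast loader.
model, tokenizer = FastLanguageModel.from_pretrained(
    model_name="taide/Llama3-TAIDE-LX-8B-Chat-Alpha1",
    max_seq_length=8192,   # matches the card's Model Max Length
    load_in_4bit=True,     # assumption: 4-bit loading to fit on a single GPU
)

# Attach LoRA adapters using the PEFT settings listed on the card.
model = FastLanguageModel.get_peft_model(
    model,
    r=16,
    lora_alpha=16,
    lora_dropout=0,
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj",
                    "gate_proj", "up_proj", "down_proj"],
)

# Placeholder dataset with a "text" column.
dataset = load_dataset("json", data_files="train.jsonl", split="train")

# Note: newer TRL releases move these options into SFTConfig.
trainer = SFTTrainer(
    model=model,
    tokenizer=tokenizer,
    train_dataset=dataset,
    dataset_text_field="text",
    max_seq_length=8192,
    args=TrainingArguments(
        output_dir="outputs",
        per_device_train_batch_size=2,
        gradient_accumulation_steps=4,
        num_train_epochs=1,
        learning_rate=2e-4,
    ),
)
trainer.train()
```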
LLM Name: Lama3 Taide Sloth
Repository 🤗: https://huggingface.co/zhe0/lama3_taide_sloth
Base Model(s): taide/Llama3-TAIDE-LX-8B-Chat-Alpha1
Model Size: 8B
Required VRAM: 16.1 GB
Updated: 2025-02-22
Maintainer: zhe0
Model Files: 0.2 GB; 5.0 GB (1-of-4); 5.0 GB (2-of-4); 4.9 GB (3-of-4); 1.2 GB (4-of-4)
Supported Languages: en
Model Architecture: AutoModelForCausalLM
License: apache-2.0
Model Max Length: 8192
Is Biased: none
Tokenizer Class: PreTrainedTokenizerFast
Padding Token: <|reserved_special_token_250|>
PEFT Type: LORA
LoRA Model: Yes
PEFT Target Modules: down_proj, o_proj, q_proj, gate_proj, k_proj, v_proj, up_proj
LoRA Alpha: 16
LoRA Dropout: 0
R Param: 16
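
Because the repository lists both a small 0.2 GB file and roughly 16 GB of sharded safetensors, one plausible way to use it is to load the LoRA adapter on top of the TAIDE base model with 🤗 Transformers and PEFT. The sketch below is a hedged example, not instructions from the model card: it assumes the adapter and tokenizer files in the repo are PEFT-compatible, and the dtype, device map, and prompt are placeholders.

```python
# Sketch: load the LoRA adapter from zhe0/lama3_taide_sloth onto the TAIDE
# base model. Repo IDs come from the card; everything else is an assumption.
import torch
from peft import PeftModel
from transformers import AutoModelForCausalLM, AutoTokenizer

base_id = "taide/Llama3-TAIDE-LX-8B-Chat-Alpha1"
adapter_id = "zhe0/lama3_taide_sloth"

# The card lists a PreTrainedTokenizerFast with a reserved padding token,
# so the tokenizer is assumed to ship with the adapter repository.
tokenizer = AutoTokenizer.from_pretrained(adapter_id)

base = AutoModelForCausalLM.from_pretrained(
    base_id,
    torch_dtype=torch.bfloat16,  # ~16 GB of weights, in line with the listed VRAM
    device_map="auto",
)
model = PeftModel.from_pretrained(base, adapter_id)

prompt = "Hello, please introduce yourself."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

If the sharded safetensors are already-merged full weights rather than a copy of the base model, loading zhe0/lama3_taide_sloth directly with AutoModelForCausalLM would also be an option; which path applies depends on the actual repository contents.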

Best Alternatives to Lama3 Taide Sloth

Best Alternatives                Context / RAM    Downloads  Likes
Trillama 8B                      8K / 16.1 GB     179        3
AutogenJune2000                  8K / 33.8 GB     7          0
Llama3 8B                        8K / 16.1 GB     5          0
Medllama3 V20                    0K / 16.1 GB     25631      69
Autotrain Pvqlj Odah2            0K / 0.2 GB      18         0
500tiao 100lun                   0K / 0.2 GB      7          0
Codelica                         0K / 16.1 GB     111        0
ModeliCo 8B                      0K / 16.1 GB     11         2
TalktoaiQT                       0K / 16.1 GB     33         2
MotherEarth Proverbs 1.0 8B      0K / 16.1 GB     20         0
Note: on the source page, a green score (e.g. "73.2") marks a model that outperforms zhe0/lama3_taide_sloth.



Original data from Hugging Face, OpenCompass, and various public Git repositories.
Release v20241227