AI Train Repo by SnehaPriyaaMP


Tags: Arxiv:1910.09700 · 4-bit · 4bit · Autotrain compatible · Base model: unsloth/llama-3-8b-... · Bitsandbytes · Endpoints compatible · Llama · LoRA · Quantized · Region: us · Safetensors · Sharded · Tensorflow

AI Train Repo Parameters and Internals

LLM Name: AI Train Repo
Repository: SnehaPriyaaMP/AI-Train-Repo (Hugging Face)
Base Model(s): Llama 3 8B Bnb 4bit (unsloth/llama-3-8b-bnb-4bit)
Model Size: 8B
Required VRAM: 5.8 GB
Model Files: 0.2 GB · 4.7 GB (1-of-2) · 1.1 GB (2-of-2)
Quantization Type: 4bit
Model Architecture: AutoModelForCausalLM
Is Biased: none
Tokenizer Class: PreTrainedTokenizerFast
Padding Token: <|end_of_text|>
LoRA Model: Yes
PEFT Target Modules: gate_proj, q_proj, v_proj, o_proj, down_proj, up_proj, k_proj
LoRA Alpha: 16
LoRA Dropout: 0
R Param: 16
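The listed hyperparameters (r = 16, seven target modules) determine how many trainable parameters the LoRA adapter adds. A minimal sketch of that arithmetic, assuming the standard Llama-3-8B geometry (hidden size 4096, MLP size 14336, 8 KV heads giving a 1024-wide k/v projection, 32 layers); the module shapes below are taken from that base-model config, not from this card:

```python
# Estimate trainable LoRA parameters for this adapter.
# Assumes standard Llama-3-8B dimensions (not stated on this card).
HIDDEN, MLP, KV, LAYERS, R = 4096, 14336, 1024, 32, 16

# (in_features, out_features) for each listed PEFT target module.
modules = {
    "q_proj":    (HIDDEN, HIDDEN),
    "k_proj":    (HIDDEN, KV),
    "v_proj":    (HIDDEN, KV),
    "o_proj":    (HIDDEN, HIDDEN),
    "gate_proj": (HIDDEN, MLP),
    "up_proj":   (HIDDEN, MLP),
    "down_proj": (MLP, HIDDEN),
}

# Each LoRA pair adds an A matrix (r x in) plus a B matrix (out x r).
per_layer = sum(R * (fin + fout) for fin, fout in modules.values())
total = per_layer * LAYERS
print(total)            # 41943040 trainable parameters (~41.9M)
print(total * 4 / 1e9)  # ~0.17 GB at fp32
```

At fp32 this comes to roughly 0.17 GB, which is consistent with the 0.2 GB adapter file listed under Model Files above.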

Best Alternatives to AI Train Repo

| Best Alternatives | Context / VRAM | HF Rank |
|---|---|---|
| Lora Model 1 | 0.28K / 5.8 GB | 50 |
| ...3 Project Management Assistant | 0.30K / 0.2 GB | 860 |
| ...ama 3 8B Instruct Bnb Telcom 3 | 0.30K / 0.2 GB | 300 |
| Model Name | 0.30K / 16.1 GB | 250 |
| LLAMA3 Vuln Detection | 0.30K / 0.2 GB | 170 |
| Llama3 8B Myfine | 0.20K / 16.1 GB | 190 |
| Educate Ai V2 | 0.20K / 0.2 GB | 110 |
| Llama30 | 0.20K / 16.1 GB | 180 |
| Llama26 | 0.20K / 16.1 GB | 100 |
| Llama24 | 0.20K / 16.1 GB | 100 |
Note: a green Score (e.g. "73.2") means that model outperforms SnehaPriyaaMP/AI-Train-Repo.

Rank the AI Train Repo Capabilities

🆘 Have you tried this model? Rate its performance. Your feedback helps the ML community identify the most suitable model for their needs. Your contribution really does make a difference! 🌟

Instruction Following and Task Automation  
Factuality and Completeness of Knowledge  
Censorship and Alignment  
Data Analysis and Insight Generation  
Text Generation  
Text Summarization and Feature Extraction  
Code Generation  
Multi-Language Support and Translation  

What open-source LLMs or SLMs are you searching for? 34,211 models are indexed in total.

Original data from HuggingFace, OpenCompass and various public git repos.
Release v2024071601