LLM Name | Llama 3.2 3B Mathdaily Chatbot |
Repository 🤗 | https://huggingface.co/Atharva26/llama-3.2-3b-mathdaily-chatbot |
Model Size | 3b |
Required VRAM | 6.5 GB |
Updated | 2025-02-22 |
Maintainer | Atharva26 |
Model Files | |
Model Architecture | AutoModelForCausalLM |
Model Max Length | 131072 |
Is Biased | none |
Tokenizer Class | PreTrainedTokenizerFast |
Padding Token | [PAD] |
PEFT Type | LORA |
LoRA Model | Yes |
PEFT Target Modules | gate_proj|q_proj|down_proj|o_proj|v_proj|k_proj|up_proj |
LoRA Alpha | 32 |
LoRA Dropout | 0.05 |
R Param | 16 |
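The PEFT fields above indicate the repository is a LoRA adapter (r=16, alpha=32, dropout=0.05) over the listed target modules. Below is a minimal loading and inference sketch, assuming the adapter sits on top of a Llama 3.2 3B base checkpoint; the base model ID `meta-llama/Llama-3.2-3B-Instruct` and the example prompt are assumptions for illustration, not details taken from this card.

```python
# Sketch: load the LoRA adapter on an assumed Llama 3.2 3B base and run one prompt.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

ADAPTER_ID = "Atharva26/llama-3.2-3b-mathdaily-chatbot"
BASE_ID = "meta-llama/Llama-3.2-3B-Instruct"  # assumed base; verify against the adapter config

# Tokenizer from the adapter repo (PreTrainedTokenizerFast, [PAD] padding token per the card)
tokenizer = AutoTokenizer.from_pretrained(ADAPTER_ID)

# Base model in bf16 (~6.5 GB VRAM per the table above), then attach the LoRA weights
base = AutoModelForCausalLM.from_pretrained(
    BASE_ID,
    torch_dtype=torch.bfloat16,
    device_map="auto",
)
model = PeftModel.from_pretrained(base, ADAPTER_ID)

prompt = "What is 15% of 240?"  # illustrative math prompt, not from the card
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```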
Best Alternatives | Context / RAM | Downloads | Likes
---|---|---|---
Granite 3B Mup | 4K / 14 GB | 374 | 0 |
Mamba GPT 3B V2 | 0K / 6.8 GB | 1735 | 16 |
MM Alpaca 3B Lora | 0K / 0.2 GB | 7 | 0 |
Qwen2.5 3b Lora Model | 0K / 0.1 GB | 11 | 0 |
...ma 3.2 3B It Ecommerce ChatBot | 0K / 6.5 GB | 190 | 4 |