Training Details

LLM Name | Artigenz-Coder-DS-6.7B_dataset_size_52_epochs_10_2024-06-11_06-26-45_3520976 |
Repository 🤗 | https://huggingface.co/vdavidr/Artigenz-Coder-DS-6.7B_dataset_size_52_epochs_10_2024-06-11_06-26-45_3520976 |
Model Size | 6.7B |
Required VRAM | 27.1 GB |
Updated | 2025-06-01 |
Maintainer | vdavidr |
Generates Code | Yes |
Model Architecture | AutoModelForCausalLM |
Model Max Length | 16384 |
Is Biased | none |
Tokenizer Class | LlamaTokenizer |
Padding Token | <|end▁of▁sentence|> |
PEFT Type | LORA |
LoRA Model | Yes |
PEFT Target Modules | q_proj, v_proj, up_proj, gate_proj, o_proj, down_proj, k_proj |
LoRA Alpha | 16 |
LoRA Dropout | 0.1 |
R Param | 16 |
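
The card describes a LoRA adapter (PEFT type LORA, r = 16, alpha = 16, dropout 0.1 over the attention and MLP projections) rather than a standalone checkpoint. Below is a minimal sketch of loading it for inference with `transformers` and `peft`; the base-model ID is an assumption inferred from the adapter name, and the tokenizer is assumed to be included in the adapter repository.

```python
# Hedged sketch: load the LoRA adapter on top of its (assumed) base model.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

# Assumption: the adapter was trained from the upstream Artigenz coder checkpoint.
BASE_MODEL = "Artigenz/Artigenz-Coder-DS-6.7B"
ADAPTER = "vdavidr/Artigenz-Coder-DS-6.7B_dataset_size_52_epochs_10_2024-06-11_06-26-45_3520976"

# Tokenizer (LlamaTokenizer, pad token <|end▁of▁sentence|>) is assumed to ship with the adapter repo.
tokenizer = AutoTokenizer.from_pretrained(ADAPTER)

base = AutoModelForCausalLM.from_pretrained(
    BASE_MODEL,
    torch_dtype=torch.bfloat16,  # illustrative precision choice
    device_map="auto",
)

# Attach the LoRA weights (r=16, alpha=16, dropout=0.1 on q/k/v/o/up/down/gate projections).
model = PeftModel.from_pretrained(base, ADAPTER)
model.eval()

prompt = "# Write a Python function that reverses a string\n"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
with torch.no_grad():
    outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```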
Best Alternatives | Context / RAM | Downloads | Likes
---|---|---|---
...odeLlama Text To Sql Finetuned | 0K / 13.5 GB | 11 | 0