Trained Quant 4bit by RyanJT


Tags: Arxiv:1910.09700 · 4-bit · 4bit · Adapter · Base model:adapter:ryanjt/quan... · Base model:ryanjt/quantized-gr... · Bitsandbytes · Finetuned · Instruct · Llama · Lora · Peft · Quantized · Region:us · Safetensors


Trained Quant 4bit Parameters and Internals

LLM Name: Trained Quant 4bit
Repository: 🤗 https://huggingface.co/RyanJT/trained-quant-4bit
Base Model(s): Granite 4bit 3B Code Instruct (RyanJT/quantized-granite-4bit-3b-code-instruct)
Model Size: 3B
Required VRAM: 0 GB
Updated: 2024-08-15
Maintainer: RyanJT
Instruction-Based: Yes
Model Files: 0.0 GB, 2.0 GB, 0.0 GB
Quantization Type: 4bit
Model Architecture: Adapter
Model Max Length: 9223372036854775807
Is Biased: none
Tokenizer Class: GPT2Tokenizer
Padding Token: <|endoftext|>
Vocabulary Size: 49152
PEFT Type: LORA
LoRA Model: Yes
PEFT Target Modules: q_proj|v_proj
LoRA Alpha: 32
LoRA Dropout: 0.1
R Param: 8

Best Alternatives to Trained Quant 4bit

Best Alternatives        Context / RAM    Downloads    Likes
Xenith 3B                0K / 7.6 GB      0            2

Note: a green score (e.g. "73.2") means that the model is better than RyanJT/trained-quant-4bit.

Rank the Trained Quant 4bit Capabilities

🆘 Have you tried this model? Rate its performance. This feedback helps the ML community identify the most suitable model for their needs. Your contribution really does make a difference! 🌟

Instruction Following and Task Automation  
Factuality and Completeness of Knowledge  
Censorship and Alignment  
Data Analysis and Insight Generation  
Text Generation  
Text Summarization and Feature Extraction  
Code Generation  
Multi-Language Support and Translation  


Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241217