Zephyr Quiklang 3B GPTQ by TheBloke


Tags: 4-bit · base model (quantized): walmart-the-bag/zep... · causal-lm · conversational · custom code · dataset: teknium/openhermes · dataset: unalignment/toxic-dpo-... · feature-extraction · GPTQ · quantized · region: us · safetensors · stablelm_epoch

Zephyr Quiklang 3B GPTQ Benchmarks

Benchmark scores (percent) compare Zephyr Quiklang 3B GPTQ (TheBloke/zephyr-quiklang-3b-GPTQ) to the reference models: Anthropic Sonnet 3.5 ("so35"), GPT-4o ("gpt4o"), or GPT-4 ("gpt4").

Zephyr Quiklang 3B GPTQ Parameters and Internals

Model Type 
stablelm_epoch
Additional Notes 
This model is a finetune of StableLM-Zephyr-3B on two datasets (openhermes and toxic-dpo).
Training Details 
Data Sources:
unalignment/toxic-dpo-v0.1, teknium/openhermes
Data Volume:
10000 samples
Context Length:
1024
Hardware Used:
1xA6000-48GB
Input Output 
Input Format:
<|system|>
{system_message}
<|user|>
{prompt}
<|assistant|>
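The prompt template above can be assembled with a small helper. A minimal sketch, assuming the role tags are separated by plain newlines (the `build_prompt` name is illustrative, not from the model card):

```python
def build_prompt(system_message: str, prompt: str) -> str:
    """Assemble a Zephyr-style prompt from the card's template.

    Assumes the tags are separated by newlines; the generation
    continues after the final <|assistant|> tag.
    """
    return (
        f"<|system|>\n{system_message}\n"
        f"<|user|>\n{prompt}\n"
        f"<|assistant|>\n"
    )

text = build_prompt("You are a helpful assistant.",
                    "Explain GPTQ in one sentence.")
print(text)
```

The same string can then be tokenized and passed to the model's `generate` call as with any causal LM.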
LLM Name: Zephyr Quiklang 3B GPTQ
Repository 🤗: https://huggingface.co/TheBloke/zephyr-quiklang-3b-GPTQ
Model Name: Zephyr Quiklang 3B
Model Creator: wbag
Base Model(s): Zephyr Quiklang 3B (Walmart-the-bag/zephyr-quiklang-3b)
Model Size: 3b
Required VRAM: 1.8 GB
Updated: 2024-12-23
Maintainer: TheBloke
Model Type: stablelm_epoch
Model Files: 1.8 GB
GPTQ Quantization: Yes
Quantization Type: gptq
Model Architecture: StableLMEpochForCausalLM
License: mit
Context Length: 4096
Model Max Length: 4096
Transformers Version: 4.37.0.dev0
Tokenizer Class: GPTNeoXTokenizer
Padding Token: <|endoftext|>
Vocabulary Size: 50304
Torch Data Type: float16
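The listed 1.8 GB of required VRAM is consistent with a 4-bit GPTQ checkpoint of a ~3B-parameter model. A rough back-of-envelope check (the exact parameter count and quantization overhead below are illustrative assumptions, not figures from the card):

```python
# Rough size estimate for a 4-bit GPTQ checkpoint of a ~3B-parameter model.
params = 2.8e9            # StableLM-3B-class parameter count (assumed, not from the card)
bits_per_weight = 4       # GPTQ 4-bit quantization
weight_bytes = params * bits_per_weight / 8

# GPTQ also stores per-group scales/zero-points, and embeddings/norms are
# typically kept in fp16; ~28% overhead is a plausible ballpark, not measured.
estimated_gb = weight_bytes * 1.28 / 1e9
print(f"{estimated_gb:.1f} GB")
```

The estimate lands in the neighborhood of the 1.8 GB file size listed above, which is the usual sanity check for a 4-bit quantization of a model this size.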

Best Alternatives to Zephyr Quiklang 3B GPTQ

Model                              Context / RAM   Downloads / Likes
Stable Code 3B GPTQ                16K / 1.8 GB    2112
Zephyr Quiklang 3B 4K GPTQ         4K / 1.8 GB     172
Dopeystableplats 3B V1 GPTQ        4K / 1.8 GB     890
Stablelm Zephyr 3B GPTQ            4K / 1.8 GB     3512
...ling Stable Lm 3B 4e1t V0 GPTQ  4K / 1.8 GB     181
Rocket 3B GPTQ                     4K / 1.8 GB     260
Nous Capybara 3B V1.9 GPTQ         4K / 1.8 GB     218
Marx 3B V3 GPTQ                    4K / 1.8 GB     242
Akins 3B GPTQ                      4K / 1.8 GB     262
Stable Code 3B Mlx                 16K / 5.6 GB    91
Note: green Score (e.g. "73.2") means that the model is better than TheBloke/zephyr-quiklang-3b-GPTQ.

Rank the Zephyr Quiklang 3B GPTQ Capabilities

🆘 Have you tried this model? Rate its performance. Your feedback helps the ML community identify the most suitable model for their needs. Your contribution really does make a difference! 🌟

Instruction Following and Task Automation  
Factuality and Completeness of Knowledge  
Censorship and Alignment  
Data Analysis and Insight Generation  
Text Generation  
Text Summarization and Feature Extraction  
Code Generation  
Multi-Language Support and Translation  


Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241217