| LLM Name | QwQ 32B Preview |
|---|---|
| Repository 🤗 | https://huggingface.co/Qwen/QwQ-32B-Preview |
| Base Model(s) | |
| Model Size | 32B |
| Required VRAM | 65.5 GB |
| Updated | 2025-04-19 |
| Maintainer | Qwen |
| Model Type | qwen2 |
| Instruction-Based | Yes |
| Model Files | |
| Supported Languages | en |
| Model Architecture | Qwen2ForCausalLM |
| License | apache-2.0 |
| Context Length | 32768 |
| Model Max Length | 32768 |
| Transformers Version | 4.43.1 |
| Tokenizer Class | Qwen2Tokenizer |
| Padding Token | <\|endoftext\|> |
| Vocabulary Size | 152064 |
| Torch Data Type | bfloat16 |
| Tokenizer Errors Policy | replace |
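A minimal loading sketch based on the details above (repository ID, Qwen2 tokenizer, bfloat16 dtype, instruction tuning); the prompt, `device_map`, and `max_new_tokens` values are illustrative assumptions, not values from this card:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Qwen/QwQ-32B-Preview"

# Resolves to Qwen2Tokenizer; pad token is <|endoftext|> per the card.
tokenizer = AutoTokenizer.from_pretrained(model_id)

# bfloat16 matches the card's torch dtype; at this precision the full
# weights require roughly 65.5 GB of VRAM.
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,
    device_map="auto",  # assumption: shard across available GPUs
)

messages = [{"role": "user", "content": "How many r's are in 'strawberry'?"}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

# max_new_tokens is an illustrative choice; the model's context length
# is 32768 tokens per the card.
outputs = model.generate(inputs, max_new_tokens=512)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```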
| Model | Likes | Downloads | VRAM |
|---|---|---|---|
| PathfinderAI | 0 | 12 | 65 GB |
| ...eview Gptqmodel 4bit Vortex V2 | 16 | 12 | 15 GB |
| QwQ 32B Preview 6bit | 4 | 26 | 26 GB |
| QwQ 32B Preview AWQ | 24 | 698 | 19 GB |
| QwQ 32B Preview GPTQ 4bit | 3 | 306 | 16 GB |
| ...eview Gptqmodel 4bit Vortex V1 | 51 | 17 | 16 GB |
| ...Q 32B Preview Unsloth Bnb 4bit | 20 | 86 | 23 GB |
| QwQ 32B Preview Bnb 4bit | 4 | 80 | 19 GB |
| QwQ 32B Preview 3bit | 5 | 51 | 14 GB |
| QwQ 32B Preview 4bit | 3 | 59 | 18 GB |
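If the ~65.5 GB bfloat16 footprint is out of reach, the 3- to 6-bit repos above bring VRAM down to roughly 14-26 GB. Alternatively, the base model can be quantized on the fly with bitsandbytes; the NF4 settings below are illustrative assumptions, not taken from any of the listed repos:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

model_id = "Qwen/QwQ-32B-Preview"

# Illustrative 4-bit NF4 config; the prebuilt Bnb/GPTQ/AWQ repos in the
# table above ship already-quantized weights and skip this step.
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.bfloat16,
)

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    quantization_config=bnb_config,
    device_map="auto",
)
```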
| Best Alternatives | Context / RAM | Downloads | Likes |
|---|---|---|---|
| ...y Qwen2.5coder 32B V24.1q 200K | 195K / 65.8 GB | 13 | 2 |
| ...wen2.5 32B Inst BaseMerge TIES | 128K / 65.8 GB | 51 | 14 |
| ...wen2.5 32B Inst BaseMerge TIES | 128K / 65.8 GB | 3 | 2 |
| Franqwenstein 35B | 128K / 69.8 GB | 10 | 8 |
| Hamanasu Magnum QwQ 32B | 128K / 65.8 GB | 295 | 9 |
| Hamanasu QwQ V2 RP | 128K / 65.8 GB | 161 | 5 |
| Qwen2.5 32B Gokgok Step3 | 128K / 65.7 GB | 20 | 0 |
| Qwen2.5 32B YOYO MIX | 128K / 65.7 GB | 24 | 2 |
| Qwen2.5 32B Dark Days Stage2 | 128K / 65.8 GB | 25 | 0 |
| QwQ Qwen2.5 Coder Instruct 32B | 128K / 65.8 GB | 43 | 0 |