QwQ R1 Distill 1.5B CoT by prithivMLmods


Tags: Autotrain compatible · Base model: deepseek-ai/deepsee... · Base model (finetune): deepseek-a... · Conversational · Dataset: ai-mo/numinamath-cot · Dataset: amphora/qwq-longcot-13... · Dataset: novasky-ai/sky-t1 data... · Dataset: prithivmlmods/deepthin... · Dataset: prithivmlmods/math-sol... · Deepseek · Distill · En · Endpoints compatible · Qwen2 · Qwen2.5 · Qwq · R1 · Region: us · Safetensors

QwQ R1 Distill 1.5B CoT Benchmarks

Benchmark scores (nn.n%) show how the model compares to the reference models: Anthropic Sonnet 3.5 ("so35"), GPT-4o ("gpt4o"), or GPT-4 ("gpt4").
QwQ R1 Distill 1.5B CoT (prithivMLmods/QwQ-R1-Distill-1.5B-CoT)

QwQ R1 Distill 1.5B CoT Parameters and Internals

LLM Name: QwQ R1 Distill 1.5B CoT
Repository: 🤗 https://huggingface.co/prithivMLmods/QwQ-R1-Distill-1.5B-CoT
Base Model(s): deepseek-ai/DeepSeek-R1-Distill-Qwen-1.5B
Model Size: 1.5B
Required VRAM: 3.5 GB
Updated: 2025-02-05
Maintainer: prithivMLmods
Model Type: qwen2
Model Files: 3.5 GB
Supported Languages: en
Model Architecture: Qwen2ForCausalLM
License: apache-2.0
Context Length: 131072
Model Max Length: 131072
Transformers Version: 4.47.1
Tokenizer Class: LlamaTokenizer
Padding Token: <|vision_pad|>
Vocabulary Size: 151936
Torch Data Type: bfloat16
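
The card above gives everything needed to load the model with the Transformers library. Below is a minimal sketch: the model id, dtype, and architecture come from the card, while the chat prompt and generation settings are illustrative assumptions, not part of the card.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Model id and dtype are taken from the card above.
model_id = "prithivMLmods/QwQ-R1-Distill-1.5B-CoT"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # matches the card's bfloat16 weights (~3.5 GB)
    device_map="auto",
)

# The model is tagged as conversational, so a chat template should apply.
# The prompt below is a hypothetical example.
messages = [{"role": "user", "content": "Solve step by step: what is 17 * 23?"}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=512)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```

At bfloat16 (2 bytes per parameter), the 1.5B parameters alone account for roughly 3 GB of weights, which lines up with the 3.5 GB "Required VRAM" figure listed above.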

Best Alternatives to QwQ R1 Distill 1.5B CoT

Best Alternatives | Context / RAM | Downloads | Likes
ReaderLM V2 | 500K / 3.5 GB | 23287 | 473
Reader Lm 1.5B | 250K / 3.1 GB | 8819 | 584
DeepSeek R1 Distill Qwen 1.5B | 128K / 3.5 GB | 386547 | 659
...Seek R1 Distill Qwen 1.5B ONNX | 128K / N/A | 39223 | 37
Qwen2.5 1.5B | 128K / 3.1 GB | 370941 | 61
...ek R1 ReDistill Qwen 1.5B V1.0 | 128K / 3.6 GB | 307 | 42
Stella En 1.5B V5 | 128K / 6.2 GB | 581890 | 211
DeepSeek R1 Distill Qwen 1.5B | 128K / 3.5 GB | 4858 | 6
AceInstruct 1.5B | 128K / 3.5 GB | 374 | 10
Bellatrix 1.5B XElite | 128K / 3.5 GB | 221 | 8
Note: a green score (e.g. "73.2") means that the model is better than prithivMLmods/QwQ-R1-Distill-1.5B-CoT.

Rank the QwQ R1 Distill 1.5B CoT Capabilities

🆘 Have you tried this model? Rate its performance. This feedback would greatly assist the ML community in identifying the most suitable model for their needs. Your contribution really does make a difference! 🌟

Instruction Following and Task Automation  
Factuality and Completeness of Knowledge  
Censorship and Alignment  
Data Analysis and Insight Generation  
Text Generation  
Text Summarization and Feature Extraction  
Code Generation  
Multi-Language Support and Translation  


Original data from Hugging Face, OpenCompass, and various public Git repos.
Release v20241227