SauerkrautLM 1.5B by VAGOsolutions


Tags: Autotrain compatible, Continuous pretraining, Conversational, De, Dpo, En, Endpoints compatible, Qwen2, Region:us, Safetensors, Sft, Spectrum

SauerkrautLM 1.5B Benchmarks

nn.n% indicates how the model compares to the reference models: Anthropic Sonnet 3.5 ("so35"), GPT-4o ("gpt4o"), or GPT-4 ("gpt4").
SauerkrautLM 1.5B (VAGOsolutions/SauerkrautLM-1.5b)

SauerkrautLM 1.5B Parameters and Internals

Model Type: demo, finetuned, multilingual
Use Cases (Areas): business applications, mobile deployment
Additional Notes: Spectrum CPT targets 25% of the layers, enabling significant resource and cost savings (see the sketch after these training details)
Supported Languages: German (improved proficiency via CPT), English (original proficiency preserved)
Training Details:
Data Sources: German data, Qwen2-1.5B dataset
Data Volume: 6.1 billion German tokens
Methodology: Continuous Pre-Training (CPT), Supervised Fine-Tuning (SFT), Direct Preference Optimization (DPO)
Training Time: CPT cost 52 GPU-Rent
Model Architecture: Spectrum with layer targeting at 25%
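
The Spectrum method referenced above trains only the most informative ~25% of transformer layers and freezes the rest, which is where the resource and cost savings come from. Below is a minimal sketch of that idea in plain PyTorch/transformers. It is illustrative only: Spectrum ranks layers by signal-to-noise ratio, and the hard-coded layer indices here are hypothetical, not the ones VAGOsolutions actually selected.

    # Minimal sketch of Spectrum-style layer targeting: freeze everything,
    # then unfreeze only a chosen ~25% of decoder layers before CPT.
    # Illustrative only: Spectrum selects layers by signal-to-noise ratio;
    # the indices below are hypothetical, not VAGOsolutions' actual picks.
    from transformers import AutoModelForCausalLM

    model = AutoModelForCausalLM.from_pretrained("Qwen/Qwen2-1.5B")

    # Qwen2-1.5B has 28 decoder layers; 7 of 28 = 25%.
    trainable_layers = {3, 7, 11, 15, 19, 23, 27}

    for param in model.parameters():
        param.requires_grad = False          # freeze the whole network

    for idx, layer in enumerate(model.model.layers):
        if idx in trainable_layers:
            for param in layer.parameters():
                param.requires_grad = True   # unfreeze the targeted 25%

    trainable = sum(p.numel() for p in model.parameters() if p.requires_grad)
    total = sum(p.numel() for p in model.parameters())
    print(f"Trainable: {trainable:,} / {total:,} ({trainable / total:.1%})")

Freezing roughly 75% of the decoder stack shrinks gradient and optimizer-state memory proportionally, which is consistent with the savings claimed in the notes above.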
LLM Name: SauerkrautLM 1.5B
Repository: 🤗 https://huggingface.co/VAGOsolutions/SauerkrautLM-1.5b
Model Size: 1.5b
Required VRAM: 3.1 GB
Updated: 2025-02-22
Maintainer: VAGOsolutions
Model Type: qwen2
Model Files: 3.1 GB
Supported Languages: de, en
Model Architecture: Qwen2ForCausalLM
License: apache-2.0
Context Length: 32768
Model Max Length: 32768
Transformers Version: 4.41.2
Tokenizer Class: Qwen2Tokenizer
Padding Token: <|endoftext|>
Vocabulary Size: 151936
Torch Data Type: bfloat16
Errors: replace
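
Given the specs above (Qwen2ForCausalLM architecture, bfloat16 weights, a 32768-token context window, Qwen2Tokenizer), loading the checkpoint follows the standard transformers pattern. A minimal sketch; the German prompt is only an example, and device_map="auto" assumes the accelerate package is installed:

    # Minimal sketch: load SauerkrautLM 1.5B in bfloat16 and generate.
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    repo = "VAGOsolutions/SauerkrautLM-1.5b"
    tokenizer = AutoTokenizer.from_pretrained(repo)   # Qwen2Tokenizer
    model = AutoModelForCausalLM.from_pretrained(
        repo,
        torch_dtype=torch.bfloat16,   # matches the listed weight dtype
        device_map="auto",            # requires the accelerate package
    )

    # Example German prompt, chosen to exercise the CPT-improved German.
    prompt = "Erkläre kurz, was kontinuierliches Pre-Training (CPT) ist."
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    outputs = model.generate(**inputs, max_new_tokens=128)
    print(tokenizer.decode(outputs[0], skip_special_tokens=True))

In bfloat16 the weights occupy about 3.1 GB, matching the Required VRAM figure listed above.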

Best Alternatives to SauerkrautLM 1.5B

Best Alternatives                    Context / RAM    Downloads   Likes
ReaderLM V2                          500K / 3.5 GB       26,440     512
Reader Lm 1.5B                       250K / 3.1 GB        1,295     589
DeepSeek R1 Distill Qwen 1.5B        128K / 3.5 GB    1,075,846     904
DeepScaleR 1.5B Preview              128K / 7.1 GB       14,543     462
...Seek R1 Distill Qwen 1.5B ONNX    128K /  GB          46,386      51
Qwen2.5 1.5B                         128K / 3.1 GB      505,611      78
AceInstruct 1.5B                     128K / 3.5 GB          592      17
DeepSeek R1 Distill Qwen 1.5B        128K / 3.5 GB       15,506      10
Qwen2 1.5B                           128K / 3.1 GB      104,122      88
Stella En 1.5B V5                    128K / 6.2 GB      581,890     211


Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241227