Stablelm 2 1.6B by stabilityai


Arxiv:1607.06450   Arxiv:1910.02054   Arxiv:1910.07467   Arxiv:2101.00027   Arxiv:2104.09864   Arxiv:2204.06745   Arxiv:2206.11147   Arxiv:2305.06161   Arxiv:2305.14201   Arxiv:2307.09288   Arxiv:2309.09400   Arxiv:2309.16609   Arxiv:2402.17834   Autotrain compatible   Dataset:bigcode/starcoderdata   Dataset:carperai/pilev2-dev   Dataset:dataprovenanceinitiative/commercially-verified-licenses   Dataset:tiiuae/falcon-refinedweb   Dataset:togethercomputer/redpajama-data-1t   Dataset:uonlp/culturax   De   En   Endpoints compatible   Es   Fr   It   Nl   Pt   Region:us   Safetensors   Stablelm

Stablelm 2 1.6B Benchmarks

Stablelm 2 1.6B (stabilityai/stablelm-2-1_6b)

Stablelm 2 1.6B Parameters and Internals

Model Type 
causal-lm
Use Cases 
Areas:
research, commercial applications
Applications:
foundational base model for application-specific fine-tuning
Limitations:
May exhibit unreliable, unsafe, or otherwise undesirable behaviors that must be corrected prior to deployment. Pre-training data may have contained offensive or inappropriate content.
Supported Languages 
en (intermediate), de (intermediate), es (intermediate), fr (intermediate), it (intermediate), nl (intermediate), pt (intermediate)
Training Details 
Data Sources:
tiiuae/falcon-refinedweb, togethercomputer/RedPajama-Data-1T, uonlp/CulturaX, CarperAI/pilev2-dev, bigcode/starcoderdata, DataProvenanceInitiative/Commercially-Verified-Licenses
Data Volume:
2 trillion tokens
Methodology:
Pre-trained on diverse multilingual and code datasets for two epochs
Context Length:
4096
Hardware Used:
512 NVIDIA A100 40GB GPUs (AWS P4d instances)
Model Architecture:
Decoder-only transformer similar to the LLaMA architecture with modifications
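The LLaMA-style modifications referred to above (for example, partial rotary position embeddings) are exposed in the checkpoint's Hugging Face config. A minimal inspection sketch, assuming transformers 4.38.0 or later (the version listed further down) and network access to the Hub; the commented values come from this page, everything else is illustrative:

```python
from transformers import AutoConfig

# Download only the config (no weights) for the published checkpoint.
config = AutoConfig.from_pretrained("stabilityai/stablelm-2-1_6b")

print(config.model_type)               # "stablelm"
print(config.num_hidden_layers,        # decoder depth
      config.hidden_size,              # embedding width
      config.num_attention_heads)      # attention heads per layer
print(config.partial_rotary_factor)    # fraction of each head's dims given RoPE
print(config.max_position_embeddings)  # 4096 context length, as listed below
print(config.vocab_size)               # 100352, as listed below
```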
Input Output 
Input Format:
prompts in tokenized form
Accepted Modalities:
text
Output Format:
generated text
Performance Tips:
Fine-tuning the base model is recommended for downstream tasks.
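A minimal generation sketch matching the input/output contract above (tokenized prompt in, generated text out). It assumes transformers 4.38.0+, torch, and accelerate are installed; the prompt and sampling settings are placeholders, not values from this card:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "stabilityai/stablelm-2-1_6b"

tokenizer = AutoTokenizer.from_pretrained(model_id)  # resolves to GPT2TokenizerFast
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # matches the published float16 weights (~3.3 GB)
    device_map="auto",          # requires accelerate; drop it to load on CPU
)

# Prompts are passed in tokenized form; the base model simply continues the text.
prompt = "The first step in fine-tuning a language model is"  # placeholder prompt
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64, do_sample=True, temperature=0.7)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Because this is a base model rather than an instruction-tuned one, raw completions like this are mainly a sanity check; per the tip above, fine-tune it for any downstream task.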
LLM Name: Stablelm 2 1 6b
Repository: https://huggingface.co/stabilityai/stablelm-2-1_6b
Model Size: 1.6b
Required VRAM: 3.3 GB
Updated: 2025-02-05
Maintainer: stabilityai
Model Type: stablelm
Model Files: 3.3 GB
Supported Languages: en de es fr it nl pt
Model Architecture: StableLmForCausalLM
License: other
Context Length: 4096
Model Max Length: 4096
Transformers Version: 4.38.0
Tokenizer Class: GPT2TokenizerFast
Vocabulary Size: 100352
Torch Data Type: float16
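The Required VRAM figure follows from the size and data type listed above. A back-of-the-envelope sketch, assuming roughly 1.6 billion parameters (an approximation taken from the "1.6b" size, not an exact count):

```python
# Weights-only memory estimate; activations and KV cache add more at runtime.
params = 1.6e9          # approximate parameter count ("1.6b" model size)
bytes_per_param = 2     # float16 = 2 bytes per parameter
print(f"~{params * bytes_per_param / 1e9:.1f} GB")  # ~3.2 GB, close to the 3.3 GB listed
```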

Best Alternatives to Stablelm 2 1.6B

Best Alternatives              Context / RAM   Downloads   Likes
Stablelm 2 1 6b Chat           4K / 6.6 GB     4388        32
Stablelm 2 1 6b Sft Full       4K / 3.3 GB     45          0
StableGPT4 Micro 1.6B          4K / 6.6 GB     122         1
Parrot 1 6B                    4K / 3.3 GB     89          1
Stablelm 2 Zephyr 1 6b         4K /  GB        9           1
Stablelm 2 1 6b                4K /  GB        5           2
Stablelm 2 Zephyr 1 6b         4K / 3.3 GB     5           1
Stablelm 2 Zephyr 1 6b Q4      4K /  GB        5           1
Stablelm 2 Zephyr 1 6b 4bit    4K / 1.2 GB     8           5
Stablelm 2 Zephyr 1 6b         4K / 1 GB       18828       183



Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241227