Stablelm 2 1.6B Chat by stabilityai


Tags: arxiv:2305.18290, autotrain compatible, conversational, en, endpoints compatible, instruct, region:us, safetensors, sharded, stablelm, tensorflow
Dataset tags: allenai/ultrafeedback_binarized_cleaned, hkust-nlp/deita-10k-v0, HuggingFaceH4/ultrachat_200k, Intel/orca_dpo_pairs, LDJnr/Capybara, meta-math/MetaMathQA, openchat/openchat_sharegpt4_dataset, teknium/OpenHermes-2.5, WizardLM/WizardLM_evol_instruct_V2_196k

Stablelm 2 1.6B Chat Benchmarks

Stablelm 2 1.6B Chat (stabilityai/stablelm-2-1_6b-chat)

Stablelm 2 1.6B Chat Parameters and Internals

Model Type 
Auto-regressive transformer decoder
Use Cases 
Primary Use Cases:
Chat-like applications
Limitations:
Limited adversarial input training; potential for misinformation and harmful outputs
Considerations:
Developers must evaluate the model's safety performance for their use case and use an input/output classifier to prevent harmful responses (see the sketch below).
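The recommended input/output classifier is not part of the model itself. Below is a minimal, hypothetical sketch of how such a filter could wrap generation using the Hugging Face pipeline API; unitary/toxic-bert and the 0.5 threshold are illustrative choices, not something specified by this listing.

```python
from transformers import pipeline

# Illustrative moderation model; any text classifier with a toxicity-style label works.
moderator = pipeline("text-classification", model="unitary/toxic-bert")

def is_flagged(text: str, threshold: float = 0.5) -> bool:
    # Flag text whose top label is "toxic" with a score above the threshold.
    result = moderator(text, truncation=True)[0]
    return result["label"] == "toxic" and result["score"] >= threshold

def moderated_reply(user_message: str, generate_reply) -> str:
    # generate_reply is any callable mapping a prompt to the chat model's response,
    # e.g. the generation sketch shown further down this page.
    refusal = "Sorry, I can't help with that."
    if is_flagged(user_message):
        return refusal
    reply = generate_reply(user_message)
    return refusal if is_flagged(reply) else reply
```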
Supported Languages 
English
Training Details 
Data Sources:
HuggingFaceH4/ultrachat_200k, meta-math/MetaMathQA, WizardLM/WizardLM_evol_instruct_V2_196k, Open-Orca/SlimOrca, openchat/openchat_sharegpt4_dataset, LDJnr/Capybara, hkust-nlp/deita-10k-v0, teknium/OpenHermes-2.5, allenai/ultrafeedback_binarized_cleaned, Intel/orca_dpo_pairs, argilla/dpo-mix-7k
Methodology:
Direct Preference Optimization (DPO)
Model Architecture:
Transformer decoder
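For reference, DPO (the method cited above, arXiv:2305.18290) fine-tunes the policy π_θ directly on preference triples (x, y_w, y_l), where y_w is the preferred and y_l the rejected response, against a frozen reference policy π_ref with strength β; this listing does not give the hyperparameters used for this model. The standard objective is:

```latex
\mathcal{L}_{\mathrm{DPO}}(\pi_\theta;\pi_{\mathrm{ref}})
  = -\,\mathbb{E}_{(x,\,y_w,\,y_l)\sim\mathcal{D}}\!\left[
      \log \sigma\!\left(
        \beta \log \frac{\pi_\theta(y_w \mid x)}{\pi_{\mathrm{ref}}(y_w \mid x)}
        - \beta \log \frac{\pi_\theta(y_l \mid x)}{\pi_{\mathrm{ref}}(y_l \mid x)}
      \right)
    \right]
```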
LLM Name: Stablelm 2 1.6B Chat
Repository: https://huggingface.co/stabilityai/stablelm-2-1_6b-chat
Model Size: 1.6b
Required VRAM: 6.6 GB
Updated: 2025-02-05
Maintainer: stabilityai
Model Type: stablelm
Instruction-Based: Yes
Model Files: 5.0 GB (1 of 2), 1.6 GB (2 of 2)
Supported Languages: en
Model Architecture: StableLmForCausalLM
License: other
Context Length: 4096
Model Max Length: 4096
Transformers Version: 4.39.3
Tokenizer Class: GPT2TokenizerFast
Padding Token: <|endoftext|>
Vocabulary Size: 100352
Torch Data Type: float32
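A minimal generation sketch consistent with the configuration above (StableLmForCausalLM, 4096-token context, transformers 4.39 or newer). The prompt and sampling settings are illustrative; float16 is used here to roughly halve the 6.6 GB float32 footprint.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "stabilityai/stablelm-2-1_6b-chat"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # roughly halves the float32 memory footprint
    device_map="auto",          # requires the accelerate package
)

messages = [{"role": "user", "content": "Explain what a transformer decoder is in two sentences."}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output_ids = model.generate(input_ids, max_new_tokens=256, do_sample=True, temperature=0.7)
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))
```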

Quantized Models of Stablelm 2 1.6B Chat

Model | Likes | Downloads | VRAM
Vega 1.6B | 0 | 5 | 6 GB
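If a suitable pre-quantized checkpoint is not available, one generic alternative is to quantize the base chat model at load time with bitsandbytes. The sketch below assumes a CUDA GPU and the bitsandbytes package; it is not specific to the checkpoint listed above.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

model_id = "stabilityai/stablelm-2-1_6b-chat"
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.float16,
)
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    quantization_config=bnb_config,  # weights are quantized to 4-bit on load
    device_map="auto",
)
```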

Best Alternatives to Stablelm 2 1.6B Chat

Best Alternatives | Context / RAM | Downloads | Likes
Stablelm 2 Zephyr 1.6B | 4K / 3.3 GB | 5 | 1
Stablelm 2 Zephyr 1.6B 4bit | 4K / 1.2 GB | 8 | 5
Stablelm 2 Zephyr 1.6B | 4K / 1 GB | 18828 | 183
Stablelm 2 Zephyr 1.6B GGUF | 4K / 0.7 GB | 1037 | 14
Note: green Score (e.g. "73.2") means that the model is better than stabilityai/stablelm-2-1_6b-chat.
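The GGUF alternative above targets llama.cpp-style runtimes rather than transformers. A minimal sketch with llama-cpp-python follows; the file name is a placeholder for whichever quantized GGUF file you actually download.

```python
from llama_cpp import Llama

llm = Llama(
    model_path="stablelm-2-zephyr-1_6b.Q4_K_M.gguf",  # placeholder file name
    n_ctx=4096,  # matches the 4K context length listed above
)
result = llm.create_chat_completion(
    messages=[{"role": "user", "content": "Give me one tip for writing clear prose."}],
    max_tokens=256,
)
print(result["choices"][0]["message"]["content"])
```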


Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241227