Stable Code 3B GPTQ by TheBloke


Tags: Arxiv:1910.02054, Arxiv:2104.09864, Arxiv:2204.06745, Arxiv:2305.06161, Arxiv:2307.09288, Arxiv:2309.12284, Arxiv:2310.10631, 4-bit, Autotrain compatible, Base model:quantized:stability..., Base model:stabilityai/stable-..., Code, Custom code, Dataset:bigcode/commitpackft, Dataset:bigcode/starcoderdata, Dataset:bigcode/the-stack-gith..., Dataset:eleutherai/proof-pile-..., Dataset:meta-math/metamathqa, Dataset:tiiuae/falcon-refinedw..., En, Gptq, Model-index, Quantized, Region:us, Safetensors, Stablelm epoch

Stable Code 3B GPTQ Benchmarks

nn.n% — how the model scores relative to the reference models: Anthropic Claude 3.5 Sonnet ("so35"), GPT-4o ("gpt4o"), or GPT-4 ("gpt4").
Stable Code 3B GPTQ (TheBloke/stable-code-3b-GPTQ)

Stable Code 3B GPTQ Parameters and Internals

Model Type 
stablelm_epoch
Additional Notes 
The model supports long contexts and is trained with Fill in the Middle (FIM) capability. It uses a modified version of the GPTNeoX tokenizer.
Supported Languages 
English and code; state-of-the-art performance on MultiPL-E metrics across multiple programming languages.
Training Details 
Data Sources:
tiiuae/falcon-refinedweb, bigcode/the-stack-github-issues, bigcode/commitpackft, bigcode/starcoderdata, EleutherAI/proof-pile-2, meta-math/MetaMathQA
Data Volume:
1.3 trillion tokens
Context Length:
16384
Hardware Used:
256 NVIDIA A100 40GB GPUs
Model Architecture:
Decoder-only transformer similar to LLaMA architecture
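The notes above mention Fill in the Middle (FIM) training. A minimal sketch of assembling a FIM prompt, assuming the StarCoder-style sentinel tokens (`<fim_prefix>`, `<fim_suffix>`, `<fim_middle>`) that the stable-code-3b model card documents — confirm them against the actual tokenizer before relying on this:

```python
def build_fim_prompt(prefix: str, suffix: str) -> str:
    """Arrange the code before and after the cursor into a FIM prompt.

    The sentinel tokens are an assumption based on the StarCoder-style FIM
    scheme described for stable-code-3b; check the model's
    tokenizer.special_tokens_map to verify them for this quantized variant.
    """
    return f"<fim_prefix>{prefix}<fim_suffix>{suffix}<fim_middle>"


# The model is then asked to generate the code that belongs between
# prefix and suffix, stopping at its end-of-text token.
prompt = build_fim_prompt("def fib(n):\n", "\n    return a\n")
print(prompt)
```

The model's completion for such a prompt is the middle segment only; the caller stitches it back between the prefix and suffix.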
LLM Name: Stable Code 3B GPTQ
Repository 🤗: https://huggingface.co/TheBloke/stable-code-3b-GPTQ
Model Name: Stable Code 3B
Model Creator: Stability AI
Base Model(s): stabilityai/stable-code-3b (Stable Code 3B)
Model Size: 3b
Required VRAM: 1.8 GB
Updated: 2024-12-22
Maintainer: TheBloke
Model Type: stablelm_epoch
Model Files: 1.8 GB
Supported Languages: en
GPTQ Quantization: Yes
Quantization Type: gptq
Model Architecture: StableLMEpochForCausalLM
License: other
Context Length: 16384
Model Max Length: 16384
Transformers Version: 4.37.0.dev0
Tokenizer Class: GPTNeoXTokenizer
Vocabulary Size: 50304
Torch Data Type: bfloat16
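Given the repository, architecture, and quantization details above, loading this checkpoint typically looks like the following — a sketch assuming the optimum/auto-gptq backend that transformers uses for GPTQ weights, and a CUDA GPU with roughly 2 GB of free VRAM (the table lists 1.8 GB required). `trust_remote_code=True` is needed because the stablelm_epoch architecture ships as custom code:

```python
# Sketch: loading TheBloke/stable-code-3b-GPTQ with transformers.
# Assumes `pip install transformers optimum auto-gptq` and a CUDA GPU.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "TheBloke/stable-code-3b-GPTQ"
tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    device_map="auto",          # place the quantized weights on the GPU
    trust_remote_code=True,     # StableLMEpochForCausalLM is custom code
)

inputs = tokenizer("def quicksort(arr):", return_tensors="pt").to(model.device)
out = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(out[0], skip_special_tokens=True))
```

GPTQ kernels dequantize the 4-bit weights on the fly at inference time, so generation runs on GPU only; CPU-only loading of this file is generally not supported.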

Best Alternatives to Stable Code 3B GPTQ

Best Alternatives                          Context / RAM    Downloads  Likes
Zephyr Quiklang 3B 4K GPTQ                 4K / 1.8 GB      18         2
Zephyr Quiklang 3B GPTQ                    4K / 1.8 GB      17         1
Dopeystableplats 3B V1 GPTQ                4K / 1.8 GB      90         0
Stablelm Zephyr 3B GPTQ                    4K / 1.8 GB      211        2
...ling Stable Lm 3B 4e1t V0 GPTQ          4K / 1.8 GB      18         1
Rocket 3B GPTQ                             4K / 1.8 GB      24         0
Nous Capybara 3B V1.9 GPTQ                 4K / 1.8 GB      19         8
Marx 3B V3 GPTQ                            4K / 1.8 GB      22         2
Akins 3B GPTQ                              4K / 1.8 GB      22         2
Stable Code 3B Mlx                         16K / 5.6 GB     9          1


Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241217