Replit V2 CodeInstruct 3B by teknium



Replit V2 CodeInstruct 3B Benchmarks

Scores (nn.n%) show how Replit V2 CodeInstruct 3B (teknium/Replit-v2-CodeInstruct-3B) compares to the reference models: Anthropic Sonnet 3.5 ("so35"), GPT-4o ("gpt4o") and GPT-4 ("gpt4").

Replit V2 CodeInstruct 3B Parameters and Internals

Model Type: code, instruct, self-instruct
Supported Languages (all proficient): Markdown, Java, JavaScript, Python, TypeScript, PHP, SQL, JSX, reStructuredText, Rust, C, CSS, Go, C++, HTML, Vue, Ruby, Jupyter Notebook, R, Shell
Training Details:
Data Sources: bigcode/the-stack-dedup, sahil2801/CodeAlpaca-20k, teknium/GPTeacher-CodeInstruct (see the data-formatting sketch after this list)
Data Volume: ~25,000 code instruction/response pairs
Methodology: fine-tuning
Context Length: 2000
Training Time: 1 hour
Hardware Used: 2x A100 80GB
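The preprocessing pipeline for these datasets is not published on this page, so the following is only a minimal sketch of pulling one of the listed sources with the Hugging Face datasets library and rendering records into the instruction template described in the Input Output section. The column names ("instruction", "input", "output") and the to_prompt helper are assumptions, not details confirmed by this card.

```python
from datasets import load_dataset

# Assumption: sahil2801/CodeAlpaca-20k exposes Alpaca-style columns
# ("instruction", "input", "output"); verify the schema before training.
ds = load_dataset("sahil2801/CodeAlpaca-20k", split="train")

def to_prompt(example):
    """Render one record into the '### Instruction / ### Input / ### Response' template."""
    if example.get("input"):
        text = (
            f"### Instruction:\n{example['instruction']}\n\n"
            f"### Input:\n{example['input']}\n\n"
            f"### Response:\n{example['output']}"
        )
    else:
        text = (
            f"### Instruction:\n{example['instruction']}\n\n"
            f"### Response:\n{example['output']}"
        )
    return {"text": text}

formatted = ds.map(to_prompt)
print(formatted[0]["text"][:400])
```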
Input Output:
Input Format: "### Instruction: ... ### Input: ... ### Response:" or, when no extra input is supplied, "### Instruction: ... ### Response:"
Performance Tips: Use the recommended sampler settings and tokenizer decode arguments for optimal performance (see the inference sketch below).
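The exact sampler values are not reproduced on this page, so the generation parameters below are placeholders. A minimal inference sketch with transformers, assuming the repository's custom code path (trust_remote_code=True) and the decode arguments usually recommended for the ReplitLM tokenizer:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo = "teknium/Replit-v2-CodeInstruct-3B"

# The repo ships custom MPT/ReplitLM code, so trust_remote_code is required.
tokenizer = AutoTokenizer.from_pretrained(repo, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    repo, torch_dtype=torch.float16, trust_remote_code=True
).to("cuda")

# Alpaca-style prompt from the Input Format above (no "### Input:" block needed here).
prompt = (
    "### Instruction:\n"
    "Write a Python function that checks whether a string is a palindrome.\n\n"
    "### Response:\n"
)
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

# Placeholder sampler settings; substitute the values recommended by the maintainer.
outputs = model.generate(
    **inputs,
    max_new_tokens=256,
    do_sample=True,
    temperature=0.2,
    top_p=0.9,
    pad_token_id=tokenizer.pad_token_id,
)

# Decode without special tokens and without cleaning up tokenization spaces,
# since whitespace is significant in generated code.
print(tokenizer.decode(
    outputs[0], skip_special_tokens=True, clean_up_tokenization_spaces=False
))
```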
LLM Name: Replit V2 CodeInstruct 3B
Repository: 🤗 https://huggingface.co/teknium/Replit-v2-CodeInstruct-3B
Model Size: 3B
Required VRAM: 10.4 GB
Updated: 2024-12-21
Maintainer: teknium
Model Type: mpt
Model Files: 10.0 GB (shard 1 of 2), 0.4 GB (shard 2 of 2)
Supported Languages: code
Model Architecture: MPTForCausalLM (see the sanity-check sketch after this list)
License: cc-by-sa-4.0
Model Max Length: 2000
Transformers Version: 4.29.2
Tokenizer Class: ReplitLMTokenizer
Padding Token: <|pad|>
Vocabulary Size: 32768
Torch Data Type: float16
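The architecture, tokenizer class, vocabulary size and padding token above can be sanity-checked without downloading the full 10.4 GB of weights by loading only the config and tokenizer; the expected values in the comments are taken from this list.

```python
from transformers import AutoConfig, AutoTokenizer

repo = "teknium/Replit-v2-CodeInstruct-3B"

# trust_remote_code is needed because the repo defines custom MPT/ReplitLM classes.
config = AutoConfig.from_pretrained(repo, trust_remote_code=True)
tokenizer = AutoTokenizer.from_pretrained(repo, trust_remote_code=True)

print(config.architectures)        # expected: ['MPTForCausalLM']
print(type(tokenizer).__name__)    # expected: ReplitLMTokenizer
print(tokenizer.vocab_size)        # expected: 32768
print(tokenizer.pad_token)         # expected: <|pad|>
print(tokenizer.model_max_length)  # expected: 2000
```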

Quantized Models of Replit V2 CodeInstruct 3B

Model | Likes / Downloads | VRAM
Replit V2 CodeInstruct 3B GGML | 3318 | 1 GB
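The GGML build listed above can run on CPU with a fraction of the memory needed by the float16 shards. A minimal sketch using the ctransformers library, which supports the replit model type; the repository id and generation parameters here are assumptions, not values taken from this page.

```python
from ctransformers import AutoModelForCausalLM

# Assumption: the quantized weights live in a repo named like
# "teknium/Replit-v2-CodeInstruct-3B-GGML"; pass model_file="..." explicitly
# if ctransformers cannot pick a .bin file on its own.
llm = AutoModelForCausalLM.from_pretrained(
    "teknium/Replit-v2-CodeInstruct-3B-GGML",
    model_type="replit",
)

prompt = (
    "### Instruction:\n"
    "Write a shell one-liner that counts lines in all *.py files.\n\n"
    "### Response:\n"
)
print(llm(prompt, max_new_tokens=128, temperature=0.2))
```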

Best Alternatives to Replit V2 CodeInstruct 3B

Best Alternatives | Context / RAM | Downloads / Likes
Replit Code V1.5 3B | 0K / 6.6 GB | 27008289
Replit Code V1 3B | 0K / 10.4 GB | 705724
Sea Lion 3B | 0K / 6.4 GB | 49516
Code Millenials 3B | 0K / 5.2 GB | 251
Mpt 3B 8K Instruct | 0K / 6.9 GB | 183
Glaive Function Calling V1 | 0K / 10.4 GB | 6167
...aive Function Calling V2 Small | 0K / 10.4 GB | 1613
Evol Replit V1 | 0K / 10.4 GB | 118
Replit CodeInstruct V3 | 0K / 10.4 GB | 92
Replit V1 CodeInstruct 3B | 0K / 10.4 GB | 1636



Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241217