Granite 20B Code Base 8K by ibm-granite


Tags: Arxiv:2405.04324, Autotrain compatible, Code, Codegen, Dataset:bigcode/starcoderdata, Dataset:codeparrot/github-code..., Dataset:math-ai/stackmathqa, Dataset:open-web-math/open-web..., Endpoints compatible, Ext 8k, Gpt bigcode, Granite, Model-index, Region:us, Safetensors, Sharded, Tensorflow

Granite 20B Code Base 8K Benchmarks

Benchmark scores ("nn.n%") compare the model against the reference models Claude 3.5 Sonnet ("so35"), GPT-4o ("gpt4o"), and GPT-4 ("gpt4").
Granite 20B Code Base 8K (ibm-granite/granite-20b-code-base-8k)

Granite 20B Code Base 8K Parameters and Internals

Model Type: code generation, decoder-only, text-generation
Use Cases:
Areas: enterprise use, software engineering productivity
Applications: code generation, code explanation, code fixing, generating unit tests, generating documentation, addressing technical debt issues, vulnerability detection, code translation (see the usage sketch below)
Limitations: risk of problematic outputs, no safety alignment, increased susceptibility to hallucination
Considerations: caution against complete reliance on outputs for crucial decisions
Supported Languages: 116 programming languages (comprehensive coverage)
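As a sketch of the code-generation use case listed above, the following minimal example runs the base model as a plain completion model through the Hugging Face transformers pipeline. It assumes transformers (with accelerate) is installed and that roughly 40 GB of GPU memory is available; the prompt is purely illustrative and not an official IBM example.

```python
# Minimal code-completion sketch (assumes transformers + accelerate and a GPU
# with roughly 40 GB of memory; illustrative only).
import torch
from transformers import pipeline

generator = pipeline(
    "text-generation",
    model="ibm-granite/granite-20b-code-base-8k",
    torch_dtype=torch.bfloat16,  # matches the checkpoint's bfloat16 weights
    device_map="auto",
)

# Base (non-instruct) model: feed it code to complete, not chat-style turns.
prompt = "def is_prime(n: int) -> bool:\n"
completion = generator(prompt, max_new_tokens=64, do_sample=False)
print(completion[0]["generated_text"])
```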
Training Details:
Data Sources: publicly available datasets, including GitHub Code Clean and StarCoderData
Data Volume: 3 trillion tokens (Phase 1), 500 billion tokens (Phase 2)
Methodology: two-phase training strategy (Phase 1 for comprehensive code understanding, Phase 2 for improved reasoning)
Hardware Used: IBM's Vela and Blue Vela supercomputing clusters with NVIDIA A100 and H100 GPUs
Model Architecture: decoder-only code model
Safety Evaluation:
Risk Categories: malicious utilization, unsafe code generation
Ethical Considerations: generated code is not guaranteed to work as intended; there is a risk of malicious use
Responsible AI Considerations:
Mitigation Strategies: hate/abuse/profanity (HAP) filtering, PII filtering, malware filtering
Release Notes:
Date: May 6, 2024
Notes: model released with a decoder-only architecture suited for code-generation tasks
LLM Name: Granite 20B Code Base 8K
Repository: https://huggingface.co/ibm-granite/granite-20b-code-base-8k
Model Size: 20B
Required VRAM: 40 GB
Updated: 2024-12-21
Maintainer: ibm-granite
Model Type: gpt_bigcode
Model Files: 9 safetensors shards (1-of-9: 5.0 GB; 2-of-9 through 8-of-9: 4.9 GB each; 9-of-9: 0.7 GB)
Context Length: 8K
Generates Code: Yes
Model Architecture: GPTBigCodeForCausalLM
License: apache-2.0
Model Max Length: 8192
Transformers Version: 4.41.2
Tokenizer Class: GPT2Tokenizer
Padding Token: <|endoftext|>
Vocabulary Size: 49152
Torch Data Type: bfloat16
Activation Function: gelu
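The settings above (bfloat16 weights, GPT2Tokenizer with a 49152-token vocabulary, an 8192-token maximum length, and roughly 40 GB of VRAM for the 9 sharded safetensors files) can be loaded and sanity-checked with a few lines of transformers code. A minimal sketch, assuming transformers >= 4.41 and accelerate are installed; the printed values are the ones expected from this card.

```python
# Sketch: load the checkpoint with the settings listed above and print the key
# config values for a quick sanity check. Assumes transformers >= 4.41 and
# accelerate; 20B params x 2 bytes/param in bfloat16 is ~40 GB of weights,
# which matches the "Required VRAM" figure.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "ibm-granite/granite-20b-code-base-8k"

tokenizer = AutoTokenizer.from_pretrained(model_id)          # GPT2Tokenizer
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,   # Torch Data Type from the card
    device_map="auto",            # spread the 9 shards across available devices
)

print(type(model).__name__)       # expected: GPTBigCodeForCausalLM
print(model.config.n_positions)   # expected: 8192 (the 8K context window)
print(tokenizer.vocab_size)       # expected: 49152
print(tokenizer.pad_token)        # expected: <|endoftext|>
```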

Best Alternatives to Granite 20B Code Base 8K

Best Alternatives | Context / RAM | Downloads | Likes
Granite 20B Code Instruct | 0K / 40 GB | 10209 | 30
Granite 20B Functioncalling | 0K / 40 GB | 635 | 27
Granite 20B Code Base R1.1 | 0K / 40 GB | 465 | 2
Granite 20B Code Base | 0K / 40 GB | 2250 | 12
Granite 20B Code Instruct 8K | 0K / 40 GB | 988 | 39
Granite 20B Code Instruct R1.1 | 0K / 40 GB | 80 | 1
Granite 20B Code Base FP8 | 0K / 20.4 GB | 14 | 0
Granite 20B Code Base GGUF | 0K / 12.8 GB | 8 | 0


Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241217