Granite 20B Code Base by ibm-granite


Tags: Arxiv:2405.04324, Autotrain compatible, Code, Codegen, Dataset:bigcode/starcoderdata, Dataset:codeparrot/github-code..., Dataset:math-ai/stackmathqa, Dataset:open-web-math/open-web..., Endpoints compatible, Gpt bigcode, Granite, Model-index, Region:us, Safetensors, Sharded, Tensorflow

Granite 20B Code Base Benchmarks

nn.n% — how the model compares to the reference models: Anthropic Claude 3.5 Sonnet ("so35"), GPT-4o ("gpt4o"), or GPT-4 ("gpt4").

Granite 20B Code Base Parameters and Internals

LLM Name: Granite 20B Code Base
Repository: 🤗 https://huggingface.co/ibm-granite/granite-20b-code-base
Model Size: 20b
Required VRAM: 40 GB
Updated: 2024-09-01
Maintainer: ibm-granite
Model Type: gpt_bigcode
Model Files: 1-of-9 (5.0 GB), 2-of-9 (4.9 GB), 3-of-9 (4.9 GB), 4-of-9 (4.9 GB), 5-of-9 (4.9 GB), 6-of-9 (4.9 GB), 7-of-9 (4.9 GB), 8-of-9 (4.9 GB), 9-of-9 (0.7 GB)
Generates Code: Yes
Model Architecture: GPTBigCodeForCausalLM
License: apache-2.0
Model Max Length: 9223372036854775807
Transformers Version: 4.38.1
Tokenizer Class: GPT2Tokenizer
Padding Token: <|endoftext|>
Vocabulary Size: 49152
Torch Data Type: bfloat16
Activation Function: gelu_pytorch_tanh
Granite 20B Code Base (ibm-granite/granite-20b-code-base)
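
The listing above maps directly onto a standard Transformers workflow. A minimal sketch of loading and prompting the model, assuming the transformers, torch, and accelerate packages are installed and that enough GPU memory is available for the roughly 40 GB of bfloat16 weights (about 20B parameters at 2 bytes each):

```python
# Minimal sketch: loading ibm-granite/granite-20b-code-base with Hugging Face Transformers.
# device_map="auto" (via Accelerate) spreads the ~40 GB of bfloat16 weights across available GPUs.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "ibm-granite/granite-20b-code-base"

tokenizer = AutoTokenizer.from_pretrained(model_id)      # GPT2Tokenizer, 49152-token vocabulary
model = AutoModelForCausalLM.from_pretrained(            # GPTBigCodeForCausalLM
    model_id,
    torch_dtype=torch.bfloat16,                          # matches the listed Torch Data Type
    device_map="auto",
)

prompt = "def fibonacci(n):"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64, pad_token_id=tokenizer.eos_token_id)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

The prompt and generation settings here are illustrative only; since this is a base (non-instruct) model, completion-style code prompts are the expected input.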

Quantized Models of the Granite 20B Code Base

Model | Likes | Downloads | VRAM
Granite 20B Code Base GGUF | 0 | 16 | 12 GB
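
The GGUF quantization listed above fits in roughly 12 GB, which makes local inference on a single consumer GPU practical. A minimal sketch using llama-cpp-python, assuming you have already downloaded some GGUF quantization of this model; the file path and quantization level below are placeholders, not names taken from this page:

```python
# Minimal sketch: running a GGUF quantization of Granite 20B Code Base with llama-cpp-python.
from llama_cpp import Llama

llm = Llama(
    model_path="./granite-20b-code-base.Q4_K_M.gguf",  # hypothetical local path to a downloaded quant
    n_ctx=4096,        # context window to allocate
    n_gpu_layers=-1,   # offload all layers to the GPU if VRAM allows
)

out = llm("def quicksort(arr):", max_tokens=128, temperature=0.2)
print(out["choices"][0]["text"])
```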

Best Alternatives to Granite 20B Code Base

Best Alternatives | Context / RAM | Downloads | Likes
Granite 20B Functioncalling | 0K / 40 GB | 825 | 22
Granite 20B Code Instruct | 0K / 40 GB | 10209 | 30
Granite 20B Code Base R1.1 | 0K / 40 GB | 802 | 2
Granite 20B Code Base 8K | 0K / 40 GB | 5016 | 13
Granite 20B Code Instruct 8K | 0K / 40 GB | 2264 | 37
Granite 20B Code Instruct R1.1 | 0K / 40 GB | 259 | 1
Granite 20B Code Base FP8 | 0K / 20.4 GB | 5 | 0
Granite 20B Code Base GGUF | 0K / 12.8 GB | 16 | 0
Note: a green score (e.g. "73.2") means that the model performs better than ibm-granite/granite-20b-code-base.

Rank the Granite 20B Code Base Capabilities

🆘 Have you tried this model? Rate its performance. This feedback would greatly assist the ML community in identifying the most suitable model for their needs. Your contribution really does make a difference! 🌟

Instruction Following and Task Automation  
Factuality and Completeness of Knowledge  
Censorship and Alignment  
Data Analysis and Insight Generation  
Text Generation  
Text Summarization and Feature Extraction  
Code Generation  
Multi-Language Support and Translation  


Original data from HuggingFace, OpenCompass and various public git repos.
Release v2024072803