Fine Tuned Codegen 2B Verilog by shailja


Tags: Arxiv:2212.11140 · Autotrain compatible · Code · Codegen · Dataset: shailja/verilog github · Endpoints compatible · Model-index · Pytorch · Region: us · Sharded

Fine Tuned Codegen 2B Verilog Benchmarks

Benchmark scores (shown as nn.n%) indicate how the model compares to the reference models: Claude 3.5 Sonnet ("so35"), GPT-4o ("gpt4o"), and GPT-4 ("gpt4").
Fine Tuned Codegen 2B Verilog (shailja/fine-tuned-codegen-2B-Verilog)

Fine Tuned Codegen 2B Verilog Parameters and Internals

Model Type 
text-generation
Use Cases 
Areas:
research, automation
Applications:
Verilog RTL code generation
Primary Use Cases:
Verilog teaching assistant
Limitations:
Generated code may not work as intended; it can be inefficient, contain bugs, or include security exploits.
Considerations:
The model can generate Verilog snippets when provided with some context.
Additional Notes 
The model is intended to assist with Verilog code generation from input prompts.
Supported Languages 
Verilog (proficient)
Training Details 
Data Sources:
Verilog Dataset
Data Volume:
~72B tokens
Methodology:
fine-tuning
Context Length:
2048
Training Time:
8 days
Hardware Used:
3 Tesla A100 GPUs
Model Architecture:
GPT-2 model with multi-query attention
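
The listed architecture is a GPT-2-style decoder with multi-query attention, in which all query heads share a single key/value head, shrinking the attention cache during generation. The sketch below is a generic PyTorch illustration of that mechanism, not the exact layer used in this checkpoint:

```python
# Illustrative multi-query attention (MQA): every query head attends over
# one shared key/value head. Generic sketch, not this model's actual layer.
import torch
import torch.nn as nn

class MultiQueryAttention(nn.Module):
    def __init__(self, d_model: int, n_heads: int):
        super().__init__()
        assert d_model % n_heads == 0
        self.n_heads = n_heads
        self.d_head = d_model // n_heads
        self.q_proj = nn.Linear(d_model, d_model)           # per-head queries
        self.kv_proj = nn.Linear(d_model, 2 * self.d_head)  # single shared K/V head
        self.out_proj = nn.Linear(d_model, d_model)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, t, _ = x.shape
        q = self.q_proj(x).view(b, t, self.n_heads, self.d_head).transpose(1, 2)
        k, v = self.kv_proj(x).split(self.d_head, dim=-1)
        k = k.unsqueeze(1)  # broadcast the one K/V head across all query heads
        v = v.unsqueeze(1)
        att = (q @ k.transpose(-2, -1)) / self.d_head ** 0.5
        mask = torch.triu(torch.ones(t, t, dtype=torch.bool, device=x.device), 1)
        att = att.masked_fill(mask, float("-inf")).softmax(dim=-1)  # causal softmax
        y = (att @ v).transpose(1, 2).reshape(b, t, -1)
        return self.out_proj(y)
```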
Input Output 
Input Format:
Module header or partial Verilog code.
Accepted Modalities:
text
Output Format:
Verilog code snippet
Performance Tips:
The model performs best with contextual input, such as a module header or a descriptive comment (see the usage sketch below).
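
A minimal generation sketch using the Hugging Face Transformers API; the half-adder prompt is illustrative, and the sampling settings are assumptions rather than recommended values:

```python
# Sketch: prompt the model with a Verilog module header and let it complete
# the body. max_length and temperature are illustrative assumptions.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "shailja/fine-tuned-codegen-2B-Verilog"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Partial Verilog as context: a comment plus a module header.
prompt = "//module half adder\nmodule half_adder(input a, b, output sum, carry);"

inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(
    **inputs,
    max_length=128,   # well under the 2048-token context length
    temperature=0.2,  # low temperature favors syntactically valid RTL
    do_sample=True,
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```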
LLM Name: Fine Tuned Codegen 2B Verilog
Repository: 🤗 https://huggingface.co/shailja/fine-tuned-codegen-2B-Verilog
Model Size: 2B
Required VRAM: 11.3 GB
Updated: 2025-02-22
Maintainer: shailja
Model Type: codegen
Model Files: 10.0 GB (part 1 of 2), 1.3 GB (part 2 of 2)
Generates Code: Yes
Model Architecture: CodeGenForCausalLM
License: bigcode-openrail-m
Transformers Version: 4.22.0.dev0
Tokenizer Class: GPT2Tokenizer
Vocabulary Size: 50295
Torch Data Type: float32
Activation Function: gelu_new
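
The checkpoint ships float32 weights, hence the 11.3 GB VRAM figure above. A hedged sketch, assuming the standard Transformers `torch_dtype` option, of loading in half precision to roughly halve that footprint:

```python
import torch
from transformers import AutoModelForCausalLM

# Assumption: casting the float32 checkpoint to float16 at load time,
# which roughly halves the listed 11.3 GB VRAM requirement.
model = AutoModelForCausalLM.from_pretrained(
    "shailja/fine-tuned-codegen-2B-Verilog",
    torch_dtype=torch.float16,
    device_map="auto",  # requires the accelerate package
)
```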

Best Alternatives to Fine Tuned Codegen 2B Verilog

| Best Alternatives | Context / RAM | Downloads | Likes |
|---|---|---|---|
| ...Python 18K Alpaca Full Dataset | 0K / 5.6 GB | 107 | 0 |
| Archgen 2B V1 | 0K / 5.6 GB | 8 | 0 |
| Salesforce Codegen 2B Multi Ov | 0K / 11.1 GB | 9 | 0 |
| Nsql 2B | 0K / 11.3 GB | 87 | 9 |
| Instruct Codegen 2B Multi | 0K / 11.3 GB | 13 | 1 |
| Diff Codegen 2B V2 | 0K / 5.7 GB | 44 | 6 |
| ...en 2B Mono Instruct Py Revised | 0K / 11.3 GB | 2 | 1 |
| Codegen 2B Mono Xlcost | 0K / 5.7 GB | 13 | 1 |
| Codegen 2B Multi Xlcost | 0K / 5.7 GB | 108 | 1 |
| Codegen 2B Multi | 0K / 5.7 GB | 2771 | 36 |



Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241227