Fine Tuned Codegen 6B Verilog by shailja

Tags: Arxiv:2212.11140 · Autotrain compatible · Code · Codegen · Dataset: shailja/Verilog_GitHub · Endpoints compatible · Model-index · PyTorch · Region: us · Sharded

Fine Tuned Codegen 6B Verilog Benchmarks

Benchmark scores (percentages) indicate how the model compares to the reference models: Anthropic Claude 3.5 Sonnet ("so35"), GPT-4o ("gpt4o"), and GPT-4 ("gpt4").
Fine Tuned Codegen 6B Verilog (shailja/fine-tuned-codegen-6B-Verilog)

Fine Tuned Codegen 6B Verilog Parameters and Internals

Model Type 
text-generation
Use Cases 
Primary Use Cases:
Teaching Assistant for Verilog HDL, Generating Verilog code snippets
Limitations:
Not an instruction-tuned model; generated code is not guaranteed to work as intended
Considerations:
The pretraining dataset was not filtered for permissive licenses
Additional Notes 
The model generates Verilog code snippets; generated content may require attribution under the licenses of the original training data.
Supported Languages 
Verilog (High)
Training Details 
Data Sources:
shailja/Verilog_GitHub (see the loading sketch after this list)
Data Volume:
~72B
Methodology:
Fine-tuning
Training Time:
15 days
Hardware Used:
4 Tesla A100 GPUs
Model Architecture:
GPT-2 model with multi-query attention
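A minimal sketch for peeking at the fine-tuning corpus, assuming the Hugging Face datasets library is installed; the split name "train" and the record schema are assumptions, not confirmed by this page:

import itertools
from datasets import load_dataset

# Stream the corpus so the full dump is not downloaded up front.
ds = load_dataset("shailja/Verilog_GitHub", split="train", streaming=True)
for sample in itertools.islice(ds, 3):
    print(sample)  # field names depend on the dataset schema (assumption)
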
Input Output 
Input Format:
Code prompt in Verilog
Accepted Modalities:
text
Output Format:
Generated Verilog code
Performance Tips:
Seeding the prompt with a partial module header such as "module mux" improves output quality (see the example below)
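
A minimal sketch of prompting the model through the transformers library, following the performance tip above; the generation settings (max_length, temperature) are illustrative assumptions:

from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "shailja/fine-tuned-codegen-6B-Verilog"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)  # float32 weights, ~28 GB

# Seed the prompt with a comment and a partial module header, per the tip above.
prompt = "//A 2:1 multiplexer\nmodule mux"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(
    **inputs,
    max_length=128,
    temperature=0.4,
    do_sample=True,
    pad_token_id=tokenizer.eos_token_id,  # silences the GPT-2 padding warning
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
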
LLM Name: Fine Tuned Codegen 6B Verilog
Repository: https://huggingface.co/shailja/fine-tuned-codegen-6B-Verilog
Model Size: 6B
Required VRAM: 28.4 GB (float32; see the half-precision loading sketch below)
Updated: 2025-03-12
Maintainer: shailja
Model Type: codegen
Model Files: 9.9 GB (1 of 3), 9.8 GB (2 of 3), 8.7 GB (3 of 3)
Generates Code: Yes
Model Architecture: CodeGenForCausalLM
License: bigcode-openrail-m
Transformers Version: 4.22.0.dev0
Tokenizer Class: GPT2Tokenizer
Vocabulary Size: 50295
Torch Data Type: float32
Activation Function: gelu_new
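
The float32 checkpoint needs about 28.4 GB; a minimal sketch for loading in half precision to roughly halve that, assuming torch and accelerate are installed (device_map="auto" requires accelerate):

import torch
from transformers import AutoModelForCausalLM

model = AutoModelForCausalLM.from_pretrained(
    "shailja/fine-tuned-codegen-6B-Verilog",
    torch_dtype=torch.float16,  # ~14 GB of weights instead of ~28 GB (approximate)
    device_map="auto",          # place layers across available GPUs/CPU automatically
)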

Best Alternatives to Fine Tuned Codegen 6B Verilog

Best Alternatives                   Context / RAM   Downloads   Likes
Nsql 6B                             0K / 28.4 GB    1725        2
...egen 6B Nl Lora Adapter Merged   0K / 14.3 GB    19          1
...en 6B Mono Lora Adapter Merged   0K / 14.3 GB    13          0
Diff Codegen 6B V2                  0K / 14.3 GB    140         36
Codegen 6B Multi                    0K / 14.3 GB    2501        20
...en 6B Mono Instruct Py Revised   0K / 28.4 GB    16          2
...n 6B Mono Instruct Py Critique   0K / 28.4 GB    14          2
...gen 6B Nl Instruct Py Critique   0K / 28.4 GB    15          1
CodeGen RE                          0K / 28.8 GB    22          2
Codegen 6B Nl                       0K / 14.3 GB    2048        4
Note: a green score (e.g., "73.2") means that the model is better than shailja/fine-tuned-codegen-6B-Verilog.


Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241227