Llm4decompile 1.3B V1.5 by LLM4Binary


Tags: Autotrain compatible, Binary, Codegen, Decompile, Endpoints compatible, License: mit, Llama, Region: us, Safetensors

Llm4decompile 1.3B V1.5 (LLM4Binary/llm4decompile-1.3b-v1.5)

Best Alternatives to Llm4decompile 1.3B V1.5

Best Alternatives | Context / RAM | Downloads | Likes
....3B Instruct Pruned50 Quant Ds | 16K / GB | 7 | 0
Deepseek Coder 1.3B Instruct | 16K / GB | 6 | 0
...th Deepseek FULL ArithHard 30K | 16K / 0 GB | 46 | 0
...ath Deepseek FULL ArithHardC11 | 16K / 0 GB | 36 | 0
...epseek FULL ArithHard MixedMWP | 16K / 0 GB | 30 | 0
...h Deepseek Baseline FTMWP FULL | 16K / 0 GB | 29 | 0
...th Deepseek FULL ArithHard 50K | 16K / 0 GB | 17 | 0
...eek FULL ArithHard Low Lr 100K | 16K / 0 GB | 10 | 0
Deepseek Full Simplearithmetic | 16K / 0 GB | 10 | 0
...h Deepseek FULL ArithHard 100K | 16K / 0 GB | 9 | 0

Llm4decompile 1.3B V1.5 Parameters and Internals

LLM Name: Llm4decompile 1.3B V1.5
Repository: LLM4Binary/llm4decompile-1.3b-v1.5 (open on 🤗)
Model Size: 1.3b
Required VRAM: 2.7 GB
Updated: 2024-05-20
Maintainer: LLM4Binary
Model Type: llama
Model Files: 2.7 GB
Generates Code: Yes
Model Architecture: LlamaForCausalLM
License: mit
Context Length: 16384
Model Max Length: 16384
Transformers Version: 4.34.1
Tokenizer Class: LlamaTokenizer
Padding Token: <|end▁of▁sentence|>
Vocabulary Size: 32256
Initializer Range: 0.02
Torch Data Type: bfloat16
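
The details above (LlamaForCausalLM architecture, bfloat16 weights, 16384-token context, repository LLM4Binary/llm4decompile-1.3b-v1.5) are enough to load the model with the standard Hugging Face transformers API. The snippet below is a minimal sketch under those assumptions; the input text is a placeholder, since the exact decompilation prompt format (assembly in, C source out) is defined on the model card rather than in this listing.

```python
# Minimal sketch: load llm4decompile-1.3b-v1.5 with Hugging Face transformers.
# Settings mirror the listing above (LlamaForCausalLM, bfloat16, 16K context).
# The input below is a placeholder; see the model card for the exact
# assembly-to-C prompt format the model was trained on.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "LLM4Binary/llm4decompile-1.3b-v1.5"

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForCausalLM.from_pretrained(
    MODEL_ID,
    torch_dtype=torch.bfloat16,  # matches the listed Torch data type
    device_map="auto",           # requires accelerate; use .to("cuda") otherwise
)

# Placeholder input: disassembled text of a single function to decompile.
asm_text = "..."

inputs = tokenizer(asm_text, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=512)

# Print only the newly generated tokens (the reconstructed source).
generated = outputs[0][inputs["input_ids"].shape[1]:]
print(tokenizer.decode(generated, skip_special_tokens=True))
```

With 1.3B parameters and roughly 2.7 GB of bfloat16 weights, the model fits on a single consumer GPU, and the 16,384-token context leaves room for longer disassembled functions.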

Original data from HuggingFace, OpenCompass and various public git repos.