Exaone 3.0 7.8B It by Bingsu


Tags: Autotrain compatible, Conversational, En, Endpoints compatible, Exaone, Gguf, Ko, Lg-ai, Llama, Q4, Quantized, Region:us, Safetensors, Sharded, Tensorflow

Exaone 3.0 7.8B It Benchmarks

nn.n% — the score shows how the model compares to the reference models: Anthropic Claude 3.5 Sonnet ("so35"), GPT-4o ("gpt4o"), or GPT-4 ("gpt4").
Exaone 3.0 7.8B It (Bingsu/exaone-3.0-7.8b-it)

Exaone 3.0 7.8B It Parameters and Internals

Model Type 
text generation, multimodal
Use Cases 
Areas:
research, commercial applications
Applications:
customer support chatbot, legal document summarization, medical information provision
Additional Notes 
The model is tuned for high proficiency and accuracy in multilingual tasks, especially English and Korean.
Supported Languages 
en (proficient), ko (proficient)
Training Details 
Data Sources:
Bingsu
Methodology:
instruction tuning
Context Length:
4096
Hardware Used:
NVIDIA L4
Model Architecture:
LLaMA
Input Output 
Input Format:
array of messages with 'role' and 'content' keys
Accepted Modalities:
text
Output Format:
chat-completion outputs with token counts
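
As a quick illustration of the message format above, here is a minimal inference sketch using transformers. It assumes the repository id from this card (Bingsu/exaone-3.0-7.8b-it) and that the repo ships a chat template with its tokenizer; treat it as a sketch, not the maintainer's official usage example.

```python
# Minimal chat-completion sketch (assumptions: repo id from this card,
# a chat template bundled with the tokenizer, and enough VRAM for bfloat16).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Bingsu/exaone-3.0-7.8b-it"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,   # matches the card's "Torch Data Type: bfloat16"
    device_map="auto",
)

# Input format: an array of messages with 'role' and 'content' keys.
messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Summarize EXAONE 3.0 7.8B in one sentence."},
]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output_ids = model.generate(input_ids, max_new_tokens=128)
# Decode only the newly generated tokens (the chat-completion output).
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))
```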
LLM Name: Exaone 3.0 7.8B It
Repository: 🤗 https://huggingface.co/Bingsu/exaone-3.0-7.8b-it
Model Size: 7.8b
Required VRAM: 15.6 GB
Updated: 2025-01-22
Maintainer: Bingsu
Model Type: llama
Model Files: 15.6 GB; 31.3 GB; 4.8 GB; 5.6 GB; 8.3 GB; 5.0 GB (1-of-4); 4.9 GB (2-of-4); 4.9 GB (3-of-4); 0.8 GB (4-of-4)
Supported Languages: en, ko
GGUF Quantization: Yes
Quantization Type: q4|gguf|q4_k|q5_k
Model Architecture: LlamaForCausalLM
License: other
Context Length: 4096
Model Max Length: 4096
Transformers Version: 4.44.0
Tokenizer Class: GPT2Tokenizer
Padding Token: [PAD]
Vocabulary Size: 102400
Torch Data Type: bfloat16
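
Since GGUF quantizations (q4_k / q5_k) are listed, the model can also be run without transformers. Below is a hedged sketch using llama-cpp-python; the exact .gguf filename inside the repository is an assumption here, so check the repo's file list before running.

```python
# Sketch of loading one of the GGUF quantizations with llama-cpp-python.
# Assumption: the repo contains a q4_k GGUF file; the filename pattern below is hypothetical.
from llama_cpp import Llama

llm = Llama.from_pretrained(
    repo_id="Bingsu/exaone-3.0-7.8b-it",
    filename="*q4_k*.gguf",  # hypothetical pattern; replace with the actual quant file name
    n_ctx=4096,              # matches the card's 4096-token context length
)

messages = [{"role": "user", "content": "Say hello in Korean."}]
result = llm.create_chat_completion(messages=messages, max_tokens=64)
print(result["choices"][0]["message"]["content"])
```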

Best Alternatives to Exaone 3.0 7.8B It

Best Alternatives | Context / RAM | Downloads | Likes
SG Raccoon Yi 200K 2.0 GPTQ | 195K / 29.2 GB | 14 | 0
Fundacional 100percent | 8K / 15.6 GB | 117 | 0
...3.0 7.8B Instruct Llamafied 8K | 8K / 31.2 GB | 5 | 6

Rank the Exaone 3.0 7.8B It Capabilities

🆘 Have you tried this model? Rate its performance. This feedback would greatly assist the ML community in identifying the most suitable model for their needs. Your contribution really does make a difference! 🌟

Instruction Following and Task Automation  
Factuality and Completeness of Knowledge  
Censorship and Alignment  
Data Analysis and Insight Generation  
Text Generation  
Text Summarization and Feature Extraction  
Code Generation  
Multi-Language Support and Translation  


Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241227