SeaLLMs V3.1.5B Chat by SeaLLMs


Tags: Arxiv:2306.05179, Arxiv:2407.19672, Autotrain compatible, Conversational, En, Endpoints compatible, Id, Jv, Ms, Multilingual, Qwen2, Region:us, Safetensors, Sea, Ta, Th, Tl, Vi, Zh

SeaLLMs V3.1.5B Chat Benchmarks

Benchmark scores (nn.n%) indicate how the model compares to the reference models: Anthropic Sonnet 3.5 ("so35"), GPT-4o ("gpt4o"), or GPT-4 ("gpt4").
SeaLLMs V3.1.5B Chat (SeaLLMs/SeaLLMs-v3-1.5B-Chat)

SeaLLMs V3.1.5B Chat Parameters and Internals

Model Type: multilingual, causal, instruction-following
Use Cases:
  Areas: Research, Commercial applications
  Applications: Translation, World knowledge, Mathematical reasoning, Instruction following
  Primary Use Cases: Handling languages in the SEA region; following human instructions effectively for task completion
  Considerations: Ensure local governance and regulations are followed.
Additional Notes: Significantly enhanced instruction-following capability, especially in multi-turn settings.
Supported Languages: en (high), zh (high), id (high), vi (high), th (high), ms (high), tl (high), ta (high), jv (high)
Safety Evaluation:
  Findings: Reduced instances of hallucination
  Risk Categories: Inaccurate generation, misleading generation, potentially harmful generation
  Ethical Considerations: Developers and stakeholders should perform their own red teaming and provide related security measures before deployment, and they must abide by and comply with local governance and regulations.
Responsible AI Considerations:
  Mitigation Strategies: Enhanced safety measures for trustworthy outputs.
Input Output:
  Input Format: Multi-turn chat
  Accepted Modalities: Text
  Output Format: Text
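The input/output spec above corresponds to the standard multi-turn chat workflow in Hugging Face transformers. The snippet below is a minimal sketch, assuming the chat template bundled with the repository; the example message is illustrative only and not taken from the model card.

```python
# Minimal multi-turn chat sketch for SeaLLMs/SeaLLMs-v3-1.5B-Chat.
# Assumes the standard transformers chat-template workflow for Qwen2-based models.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "SeaLLMs/SeaLLMs-v3-1.5B-Chat"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # matches the bfloat16 weights listed in the table below
    device_map="auto",
)

# Multi-turn chat: append prior user/assistant turns to this list as the conversation grows.
messages = [
    {"role": "user", "content": "Terjemahkan ke bahasa Inggris: Selamat pagi, apa kabar?"},
]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output_ids = model.generate(input_ids, max_new_tokens=128)
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))
```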
LLM Name: SeaLLMs V3.1.5B Chat
Repository: https://huggingface.co/SeaLLMs/SeaLLMs-v3-1.5B-Chat
Model Size: 1.5b
Required VRAM: 3.1 GB
Updated: 2025-01-23
Maintainer: SeaLLMs
Model Type: qwen2
Model Files: 3.1 GB
Supported Languages: en zh id vi th ms tl ta jv
Model Architecture: Qwen2ForCausalLM
License: other
Context Length: 131072
Model Max Length: 131072
Transformers Version: 4.41.2
Tokenizer Class: Qwen2Tokenizer
Padding Token: <|endoftext|>
Vocabulary Size: 151936
Torch Data Type: bfloat16
Errors: replace
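The values above can be cross-checked programmatically from the repository's configuration and tokenizer files. Below is a small sketch, assuming network access to the Hugging Face Hub and the standard Qwen2 config schema:

```python
# Sketch: read back the model metadata listed above from the Hub.
# Assumes network access to huggingface.co; field names follow the Qwen2 config schema.
from transformers import AutoConfig, AutoTokenizer

model_id = "SeaLLMs/SeaLLMs-v3-1.5B-Chat"
config = AutoConfig.from_pretrained(model_id)
tokenizer = AutoTokenizer.from_pretrained(model_id)

print(config.architectures)            # expected: ['Qwen2ForCausalLM']
print(config.max_position_embeddings)  # context length, 131072 per the table
print(config.torch_dtype)              # bfloat16
print(config.vocab_size)               # 151936
print(type(tokenizer).__name__)        # Qwen2Tokenizer (or the fast variant)
print(tokenizer.pad_token)             # <|endoftext|>
```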

Best Alternatives to SeaLLMs V3.1.5B Chat

Best Alternatives | Context / RAM | Downloads / Likes
ReaderLM V2 | 500K / 3.5 GB | 14065368
Reader Lm 1.5B | 250K / 3.1 GB | 18504582
DeepSeek R1 Distill Qwen 1.5B | 128K / 3.5 GB | 25673270
...Seek R1 Distill Qwen 1.5B ONNX | 128K / n/a | 227511
Qwen2.5 1.5B | 128K / 3.1 GB | 7104856
Stella En 1.5B V5 | 128K / 6.2 GB | 581890211
NxMobileLM 1.5B SFT | 128K / 3.1 GB | 512
DeepSeek R1 Distill Qwen 1.5B | 128K / 3.5 GB | 2613
Qwen2 1.5B | 128K / 3.1 GB | 2948584
Gte Qwen2 1.5B Instruct | 128K / 7.1 GB | 216037155

Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241227