Danube2 Singlish Finetuned by h2oai


  Autotrain compatible   Conversational   En   Finetuned   Gpt   H2o-llmstudio   Mistral   Region:us   Safetensors

Danube2 Singlish Finetuned Benchmarks

nn.n% — How the model compares to the reference models: Anthropic Claude 3.5 Sonnet ("so35"), GPT-4o ("gpt4o"), or GPT-4 ("gpt4").
Danube2 Singlish Finetuned (h2oai/danube2-singlish-finetuned)

Danube2 Singlish Finetuned Parameters and Internals

Model Type:
large language model, GPT
Use Cases:
Areas: research, commercial applications
Primary use cases: Singapore-specific text generation; cultural and linguistic understanding
Limitations: potential biases; not suitable for all ethical applications
Additional Notes:
H2O LLM Studio was used to tune the model, specifically to enhance its cultural relevance for Singapore.
Supported Languages:
en (highly proficient), sg (localized to Singapore English, i.e. Singlish)
Training Details:
Data sources: H2O LLM Studio
Input Output:
Input format: <|prompt|>{prompt_text}~~<|answer|>
Accepted modalities: text
Output format: returns generated text based on the input prompt's context and the model's training.
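The input format above can be wrapped in a small helper. This is a minimal sketch: the `<|prompt|>`/`<|answer|>` markers and the `~~` separator come from the stated input format, and the model id comes from the repository link; the `build_prompt` helper name and the example generation call are illustrative assumptions, not part of the model card.

```python
def build_prompt(prompt_text: str) -> str:
    """Wrap user text in the template the model was finetuned on."""
    return f"<|prompt|>{prompt_text}~~<|answer|>"

# Example generation (commented out to avoid the ~3.7 GB model download):
# from transformers import AutoModelForCausalLM, AutoTokenizer
# tok = AutoTokenizer.from_pretrained("h2oai/danube2-singlish-finetuned")
# model = AutoModelForCausalLM.from_pretrained(
#     "h2oai/danube2-singlish-finetuned", torch_dtype="auto")
# inputs = tok(build_prompt("Eh, how to go Changi ah?"), return_tensors="pt")
# out = model.generate(**inputs, max_new_tokens=128)
# print(tok.decode(out[0], skip_special_tokens=True))

print(build_prompt("How is the weather today?"))
# prints "<|prompt|>How is the weather today?~~<|answer|>"
```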
LLM Name: Danube2 Singlish Finetuned
Repository: https://huggingface.co/h2oai/danube2-singlish-finetuned
Model Size: 1.8B
Required VRAM: 3.7 GB
Updated: 2025-02-05
Maintainer: h2oai
Model Type: mistral
Model Files: 3.7 GB
Supported Languages: en
Model Architecture: MistralForCausalLM
Context Length: 8192
Model Max Length: 8192
Transformers Version: 4.38.2
Tokenizer Class: LlamaTokenizer
Padding Token: <unk>
Vocabulary Size: 32000
Torch Data Type: float16
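The Required VRAM figure follows from the parameter count and the torch data type: a back-of-the-envelope sketch, assuming weights alone at float16 (2 bytes per parameter). "1.8b" is the rounded size from the listing; the exact parameter count is slightly higher, which is consistent with files totaling ~3.7 GB rather than exactly 3.6 GB.

```python
# Rough VRAM estimate for the weights alone: params x bytes per parameter.
n_params = 1.8e9          # "Model Size: 1.8B" (rounded)
bytes_per_param = 2       # "Torch Data Type: float16"
weights_gb = n_params * bytes_per_param / 1e9
print(f"~{weights_gb:.1f} GB for weights")
# prints "~3.6 GB for weights"
```

Note this covers only the weights; activations and KV cache at the full 8192-token context need additional memory on top.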

Best Alternatives to Danube2 Singlish Finetuned

Model | Context / RAM | Downloads / Likes
H2o Danube 1.8B Base | 16K / 3.7 GB | 39542
H2o Danube 1.8B Chat | 16K / 3.7 GB | 53654
H2o Danube 1.8B Sft | 16K / 3.7 GB | 46311
Cypher Mini 1.8B | 16K / 3.7 GB | 592
Cypher CoT 1.8B | 16K / 3.7 GB | 1061
PixieZehirNano | 16K / 3.7 GB | 100
...1.8B Chat Sft Merge Fourier V1 | 16K / 7.3 GB | 51
H2o Danube2 1.8B Chat | 8K / 3.7 GB | 275661
H2o Danube2 1.8B Base | 8K / 3.7 GB | 69346
H2o Danube2 1.8B Sft | 8K / 3.7 GB | 4596
Note: green Score (e.g. "73.2") means that the model is better than h2oai/danube2-singlish-finetuned.

Rank the Danube2 Singlish Finetuned Capabilities

Have you tried this model? Rate its performance. This feedback would greatly assist the ML community in identifying the most suitable model for their needs. Your contribution really does make a difference!

Instruction Following and Task Automation  
Factuality and Completeness of Knowledge  
Censorship and Alignment  
Data Analysis and Insight Generation  
Text Generation  
Text Summarization and Feature Extraction  
Code Generation  
Multi-Language Support and Translation  


Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241227