H2o Danube 1.8B Sft by h2oai


Tags: Arxiv:2401.16818, Autotrain compatible, Conversational, Dataset:huggingfaceh4/ultracha..., Dataset:meta-math/metamathqa, Dataset:open-orca/openorca, Dataset:openassistant/oasst2, En, Endpoints compatible, Gpt, H2o-llmstudio, License:apache-2.0, Mistral, Region:us, Safetensors

H2o Danube 1.8B Sft (h2oai/h2o-danube-1.8b-sft)
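The repository ID above can be loaded directly with Hugging Face transformers. Below is a minimal, unofficial sketch: it assumes the SFT checkpoint bundles a chat template with its tokenizer (typical for h2o-llmstudio exports), and the prompt text is purely illustrative.

```python
import torch
from transformers import pipeline

# Minimal sketch: load the SFT checkpoint via the transformers text-generation pipeline.
# Assumes the tokenizer ships a chat template; the bfloat16 dtype matches the
# "Torch Data Type" listed in the internals section further down.
pipe = pipeline(
    "text-generation",
    model="h2oai/h2o-danube-1.8b-sft",
    torch_dtype=torch.bfloat16,
    device_map="auto",
)

messages = [{"role": "user", "content": "Summarize what H2O-Danube-1.8B is in two sentences."}]
prompt = pipe.tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)

out = pipe(prompt, max_new_tokens=128, do_sample=False, return_full_text=False)
print(out[0]["generated_text"])
```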

Best Alternatives to H2o Danube 1.8B Sft

Best Alternatives               Context / VRAM   Downloads   Likes
H2o Danube 1.8B Chat            16K / 3.7 GB     11393       50
H2o Danube 1.8B Base            16K / 3.7 GB     10750       41
Cypher Mini 1.8B                16K / 3.7 GB     257         12
Cypher CoT 1.8B                 16K / 3.7 GB     27          1
PixieZehirNano                  16K / 3.7 GB     43          0
Finetuned Danube                16K / 3.7 GB     21          0
H2o Danube Oasst Chat           16K / 3.7 GB     17          0
Cypher Mini Laser 1.8B          16K / 3.7 GB     15          0
Cypher CoT Laser 1.8B           16K / 3.7 GB     14          0
H2oai H2o Danube 1.8B Chat      16K / 3.7 GB     8           0
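
The download and like counts above come from the Hugging Face Hub and change over time. Current figures can be pulled with the huggingface_hub client; in this sketch the repo IDs for the Chat and Base alternatives are assumptions inferred from the model names in the table.

```python
from huggingface_hub import HfApi

# Sketch: fetch live download/like counts for this model and two listed alternatives.
# Repo IDs other than h2oai/h2o-danube-1.8b-sft are assumed from the table above.
api = HfApi()
for repo_id in [
    "h2oai/h2o-danube-1.8b-sft",
    "h2oai/h2o-danube-1.8b-chat",   # "H2o Danube 1.8B Chat"
    "h2oai/h2o-danube-1.8b-base",   # "H2o Danube 1.8B Base"
]:
    info = api.model_info(repo_id)
    print(f"{repo_id}: {info.downloads} downloads, {info.likes} likes")
```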

H2o Danube 1.8B Sft Parameters and Internals

LLM Name: H2o Danube 1.8B Sft
Repository: h2oai/h2o-danube-1.8b-sft (open on 🤗 Hugging Face)
Model Size: 1.8b
Required VRAM: 3.7 GB
Model Type: mistral
Model Files: 3.7 GB
Supported Languages: en
Model Architecture: MistralForCausalLM
Context Length: 16384
Model Max Length: 16384
Transformers Version: 4.36.1
Tokenizer Class: LlamaTokenizer
Padding Token: <unk>
Vocabulary Size: 32000
Initializer Range: 0.02
Torch Data Type: bfloat16
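
These internals can be cross-checked against the checkpoint's published config.json and tokenizer files. The sketch below uses only standard transformers config fields for a Mistral-architecture model; no custom code is assumed.

```python
from transformers import AutoConfig, AutoTokenizer

repo = "h2oai/h2o-danube-1.8b-sft"

# Sketch: verify the listed internals against the published config and tokenizer.
cfg = AutoConfig.from_pretrained(repo)
print(cfg.model_type)                 # expected: "mistral"
print(cfg.architectures)              # expected: ["MistralForCausalLM"]
print(cfg.max_position_embeddings)    # expected: 16384 (context length)
print(cfg.vocab_size)                 # expected: 32000
print(cfg.torch_dtype)                # expected: bfloat16
print(cfg.initializer_range)          # expected: 0.02

tok = AutoTokenizer.from_pretrained(repo)
print(type(tok).__name__)             # LlamaTokenizer (or its fast variant)
print(tok.pad_token)                  # expected: "<unk>"
```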


Original data from HuggingFace, OpenCompass and various public git repos.
Release v20240042001