Hkcode Solar Youtube Merged by hyokwan

Tags: Autotrain compatible, Conversational, Dataset:hyokwan/llama3data hkc..., Endpoints compatible, Hkcode, Hyokwan, Instruct, Ko, Llama, Llama2, Merge, Moe, Region:us, Safetensors, Sharded, Solar, Tensorflow

Hkcode Solar Youtube Merged Benchmarks

nn.n%: how the model compares to the reference models Anthropic Claude 3.5 Sonnet ("so35"), GPT-4o ("gpt4o"), and GPT-4 ("gpt4").
Hkcode Solar Youtube Merged (hyokwan/hkcode-solar-youtube-merged)

Hkcode Solar Youtube Merged Parameters and Internals

Model Type: text generation
Additional Notes: hkcode-solar-youtube-merged is a continually pretrained language model targeted at a specific university department.
Supported Languages: ko (Korean)
Safety Evaluation
Methodologies: Meta Llama Guard 2, Code Shield
Ethical Considerations: The core values of Llama 3 are openness, inclusivity, and helpfulness. The model is intended to respect the dignity and autonomy of all users, especially the values of free thought and expression. Testing conducted to date has been in English.
Responsible AI Considerations
Mitigation Strategies: Developers are encouraged to tune and deploy safeguards tailored to their needs, including Meta Llama Guard 2 and Code Shield.
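
As a rough illustration of the mitigation strategy above, the sketch below screens user prompts with Meta Llama Guard 2 before they reach this model. It is a minimal sketch, assuming the standard Transformers usage of the gated meta-llama/Meta-Llama-Guard-2-8B checkpoint; the moderate() helper and the example prompt are illustrative assumptions, not part of this model card.

# Hypothetical safeguard sketch: classify a prompt with Meta Llama Guard 2
# before forwarding it to hkcode-solar-youtube-merged.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

guard_id = "meta-llama/Meta-Llama-Guard-2-8B"  # gated repository; access must be requested
guard_tokenizer = AutoTokenizer.from_pretrained(guard_id)
guard_model = AutoModelForCausalLM.from_pretrained(
    guard_id, torch_dtype=torch.bfloat16, device_map="auto"
)

def moderate(chat):
    # Returns Llama Guard 2's verdict, e.g. "safe" or "unsafe" plus a category code.
    input_ids = guard_tokenizer.apply_chat_template(chat, return_tensors="pt").to(guard_model.device)
    output = guard_model.generate(input_ids=input_ids, max_new_tokens=32, pad_token_id=0)
    return guard_tokenizer.decode(output[0][input_ids.shape[-1]:], skip_special_tokens=True)

verdict = moderate([{"role": "user", "content": "Summarize today's solar-engineering lecture."}])
if verdict.strip().startswith("safe"):
    pass  # prompt is safe to forward to the main model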
LLM Name: Hkcode Solar Youtube Merged
Repository: https://huggingface.co/hyokwan/hkcode-solar-youtube-merged
Model Size: 10.7B
Required VRAM: 21.4 GB
Updated: 2025-02-22
Maintainer: hyokwan
Model Type: llama
Instruction-Based: Yes
Model Files: 4.9 GB (1-of-5), 5.0 GB (2-of-5), 4.9 GB (3-of-5), 4.9 GB (4-of-5), 1.7 GB (5-of-5)
Supported Languages: ko
Model Architecture: LlamaForCausalLM
License: mit
Context Length: 4096
Model Max Length: 4096
Transformers Version: 4.41.2
Tokenizer Class: LlamaTokenizer
Padding Token: </s>
Vocabulary Size: 32000
Torch Data Type: float16
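
The specification table above translates directly into a standard Transformers loading recipe. Below is a minimal sketch, assuming the checkpoint loads like any other LlamaForCausalLM model; the Korean prompt and generation settings are illustrative assumptions.

# Minimal loading sketch based on the spec table above (float16 weights,
# LlamaTokenizer, 4096-token context); generation settings are illustrative.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "hyokwan/hkcode-solar-youtube-merged"
tokenizer = AutoTokenizer.from_pretrained(repo_id)   # LlamaTokenizer, vocabulary size 32000
model = AutoModelForCausalLM.from_pretrained(
    repo_id,
    torch_dtype=torch.float16,   # matches the stated torch dtype (~21.4 GB of weights)
    device_map="auto",
)

prompt = "한국의 태양광 산업 동향을 간단히 설명해 주세요."  # example Korean prompt (assumption)
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=256)  # keep prompt + output within the 4096-token context
print(tokenizer.decode(outputs[0], skip_special_tokens=True))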

Best Alternatives to Hkcode Solar Youtube Merged

Best Alternatives | Context / RAM | Downloads | Likes
SOLAR 10.7B Instruct V1.0 128K | 128K / 21.4 GB | 319 | 4
SOLAR 10.7B V1.0 Instruct 16K | 16K / 21.4 GB | 63 | 2
SauerkrautLM SOLAR Instruct | 8K / 21.4 GB | 2009 | 46
MetaModelv2 | 8K / 21.4 GB | 2022 | 0
Kazemi 1.2 Solar | 8K / 21.4 GB | 0 | 0
SOLAR 10.7B Instruct V1.0 | 4K / 21.4 GB | 102273 | 621
Solarmer3 | 4K / 21.4 GB | 23 | 0
Somer2 | 4K / 21.4 GB | 40 | 0
Somer | 4K / 21.4 GB | 13 | 0
ConfigurableSOLAR 10.7B | 4K / 21.4 GB | 3845 | 2
Note: a green score (e.g. "73.2") indicates that the model is better than hyokwan/hkcode-solar-youtube-merged.

Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241227