TinyKoWiki V1 by blueapple8259


Tags: Autotrain compatible · Dataset: eaglewatch/korean wiki... · Endpoints compatible · Ko · Llama · Region: us · Safetensors

TinyKoWiki V1 Benchmarks

Benchmark scores are reported as a percentage (nn.n%) indicating how the model compares to the reference models: Anthropic Sonnet 3.5 ("so35"), GPT-4o ("gpt4o"), or GPT-4 ("gpt4"). No scores are shown for TinyKoWiki V1 on this listing.

TinyKoWiki V1 Parameters and Internals

LLM Name: TinyKoWiki V1
Repository: https://huggingface.co/blueapple8259/TinyKoWiki-v1
Model Size: 38.9m
Required VRAM: 0.2 GB
Updated: 2024-10-18
Maintainer: blueapple8259
Model Type: llama
Model Files: 0.2 GB
Supported Languages: ko
Model Architecture: LlamaForCausalLM
License: mit
Context Length: 16384
Model Max Length: 16384
Transformers Version: 4.35.0
Tokenizer Class: GPT2Tokenizer
Padding Token: <|endoftext|>
Vocabulary Size: 52000
Torch Data Type: float32
TinyKoWiki V1 (blueapple8259/TinyKoWiki-v1)
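
The fields above (LlamaForCausalLM architecture, GPT2Tokenizer, float32 weights) are enough to load the checkpoint with the Hugging Face transformers library. The following is a minimal sketch, assuming the standard AutoModel/AutoTokenizer path and a recent transformers release; the Korean prompt is only an illustrative placeholder, not from the model card.

    # Minimal loading sketch (assumed usage, not taken from the model card itself).
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    repo_id = "blueapple8259/TinyKoWiki-v1"

    tokenizer = AutoTokenizer.from_pretrained(repo_id)
    model = AutoModelForCausalLM.from_pretrained(repo_id, torch_dtype=torch.float32)

    # Sanity check against the table: 38.9M parameters * 4 bytes (float32)
    # is roughly 0.16 GB, consistent with the listed ~0.2 GB VRAM figure.
    prompt = "한국의 수도는"  # illustrative Korean prompt: "The capital of Korea is"
    inputs = tokenizer(prompt, return_tensors="pt")
    outputs = model.generate(**inputs, max_new_tokens=50)
    print(tokenizer.decode(outputs[0], skip_special_tokens=True))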

Best Alternatives to TinyKoWiki V1

Best Alternatives | Context / RAM | Downloads | Likes
TinyKo V5 A | 16K / 0.2 GB | 4043 | 0
TinyKo V5 C | 16K / 0.2 GB | 1911 | 0
TinyKo V5 B | 16K / 0.2 GB | 1898 | 0
TinyKo V4 | 16K / 0.2 GB | 1909 | 1
TinyKo V3 | 16K / 0.2 GB | 1919 | 3
TinyKo V2 | 16K / 0.2 GB | 2016 | 0
Note: a green score (e.g. "73.2") in the original listing means that the model is better than blueapple8259/TinyKoWiki-v1.
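
The download and like counts above are point-in-time snapshots from the listing. Below is a hedged sketch for fetching current figures through the Hugging Face Hub API; it assumes the huggingface_hub package is installed, and the same call works for any of the alternative models once their repo IDs are known.

    # Fetch live download/like counts for the model via the Hub API.
    from huggingface_hub import HfApi

    api = HfApi()
    info = api.model_info("blueapple8259/TinyKoWiki-v1")
    print(f"downloads={info.downloads}, likes={info.likes}")
    # Append the alternatives' repo IDs and loop over them to compare in the same way.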



Original data from HuggingFace, OpenCompass and various public git repos.
Release v2024072803