Polka 1.1B by eryk-mazus


Tags: arXiv:2309.04662 · Autotrain compatible · Base model (finetune): eryk-mazus/tinyllama-with-custom-tokenizer · Datasets: allenai/madlad-400, eryk-mazus/polka-pretrain-en-pl-v1 · Languages: pl, en · Endpoints compatible · Llama · PyTorch · Safetensors · Region: us
Model Card on HF 🤗: https://huggingface.co/eryk-mazus/polka-1.1b

Polka 1.1B Benchmarks

Scores ("nn.n%") indicate how Polka 1.1B (eryk-mazus/polka-1.1b) compares to the reference models: Anthropic Claude 3.5 Sonnet ("so35"), GPT-4o ("gpt4o"), and GPT-4 ("gpt4").

Polka 1.1B Parameters and Internals

Model Type 
text-generation
Use Cases 
Areas:
text generation
Primary Use Cases:
Polish text generation
Limitations:
Potential for hallucination due to model size
Considerations:
Serves as a base model for instruction tuning; derived models include polka-1.1b-chat
Additional Notes 
Uses an extended tokenizer vocabulary for improved efficiency on Polish text. Capable of generating coherent Polish text, but may hallucinate due to its small size.
Supported Languages 
pl (proficient), en (proficient)
Training Details 
Data Sources:
allenai/MADLAD-400, eryk-mazus/polka-pretrain-en-pl-v1
Data Volume:
5.7 billion Polish tokens
Methodology:
Continued pretraining at a roughly 10:1 Polish-to-English token ratio (see the data-mixing sketch after this section)
Context Length:
2048
Training Time:
680 GPU hours
Hardware Used:
8 x RTX 4090
Model Architecture:
Enhanced TinyLlama-1.1B with additional training and extended tokenizer vocabulary
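The 10:1 mix can be approximated with the datasets library's interleave_datasets. This is a minimal sketch, not the author's actual pipeline; the MADLAD-400 config/split names, the "text" column, and document-level sampling (which only approximates a token-level ratio) are assumptions:

```python
from datasets import load_dataset, interleave_datasets

# Stream Polish and English documents from MADLAD-400.
# Config and split names here are illustrative assumptions.
pl = load_dataset("allenai/madlad-400", "pl", split="clean", streaming=True)
en = load_dataset("allenai/madlad-400", "en", split="clean", streaming=True)

# Sample documents at ~10:1 Polish-to-English. Sampling whole documents
# only approximates the token-level ratio described above.
mixed = interleave_datasets([pl, en], probabilities=[10 / 11, 1 / 11], seed=42)

for i, example in enumerate(mixed.take(3)):
    print(i, example["text"][:80])
```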
Input Output 
Performance Tips:
Supports a context size of up to 2048 tokens; tune sampling parameters for better output quality (see the generation sketch below).
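A minimal generation sketch with transformers; the sampling values below are illustrative starting points, not tuned recommendations from the author:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "eryk-mazus/polka-1.1b"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype=torch.bfloat16)

# "The most popular Polish dish is..."
prompt = "Najpopularniejszym polskim daniem jest"
inputs = tokenizer(prompt, return_tensors="pt")

# Keep prompt + completion within the 2048-token context window.
outputs = model.generate(
    **inputs,
    max_new_tokens=128,
    do_sample=True,
    temperature=0.7,
    top_p=0.9,
    repetition_penalty=1.1,
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```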
LLM Name: Polka 1.1B
Repository 🤗: https://huggingface.co/eryk-mazus/polka-1.1b
Base Model(s): eryk-mazus/tinyllama-with-custom-tokenizer
Model Size: 1.1b
Required VRAM: 2.3 GB
Updated: 2025-02-16
Maintainer: eryk-mazus
Model Type: llama
Model Files: 2.3 GB, 2.3 GB
Supported Languages: pl, en
Model Architecture: LlamaForCausalLM
License: apache-2.0
Context Length: 2048
Model Max Length: 2048
Transformers Version: 4.36.2
Tokenizer Class: LlamaTokenizer
Padding Token: </s>
Vocabulary Size: 43904
Torch Data Type: bfloat16
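The 43,904-entry vocabulary extends TinyLlama's original 32,000 tokens, which is what improves tokenization efficiency on Polish. A quick sketch to verify this, assuming Hub access; the TinyLlama checkpoint id used for comparison is an assumption:

```python
from transformers import AutoTokenizer

polka = AutoTokenizer.from_pretrained("eryk-mazus/polka-1.1b")
base = AutoTokenizer.from_pretrained(
    "TinyLlama/TinyLlama-1.1B-intermediate-step-1431k-3T"  # assumed base checkpoint
)

text = "Wczoraj wieczorem spotkaliśmy się na rynku w Krakowie."  # sample Polish sentence

print("Polka vocabulary size:", len(polka))                # expected: 43904
print("Polka token count:    ", len(polka.tokenize(text)))
print("TinyLlama token count:", len(base.tokenize(text)))  # typically noticeably higher
```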

Best Alternatives to Polka 1.1B

Best Alternatives                      Context / RAM    Downloads  Likes
Coven Tiny 1.1b 32k Orpo Alpha         32K / 2.2 GB     150        2
Test Mix 01                            32K / 2.2 GB     156        0
Palmer Merge Test 5                    32K / 2.2 GB     146        0
...llama 1.1B 16K Instructions V4      32K / 2.2 GB     11         0
TinyLlama 1.1B 32K Instruct            32K / 2.2 GB     300        12
Palmer 002 32K                         32K / 2.2 GB     159        0
Tinyllama History Chat V1.1            32K / 2.2 GB     110        0
TinyLlama 1.1B 32K                     32K / 2.2 GB     121        28
TinyJ.O.S.I.E. 1.1B 32K Base           32K / 2.2 GB     30         1
Tinyllama 32k                          32K / 2.2 GB     89         0
Note: a green score (e.g. "73.2") means that the model is better than eryk-mazus/polka-1.1b.

Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241227