Polka 1.1B Chat by eryk-mazus


Tags: arXiv:2305.18290, Autotrain compatible, Conversational, Dataset: eryk-mazus/polka-dpo-v..., Generated from trainer, Llama, pl, Polish, Region: us, Safetensors

Polka 1.1B Chat Benchmarks

Scores (nn.n%) show how the model compares to the reference models: Anthropic Claude 3.5 Sonnet ("so35"), GPT-4o ("gpt4o"), or GPT-4 ("gpt4").
Polka 1.1B Chat (eryk-mazus/polka-1.1b-chat)

Polka 1.1B Chat Parameters and Internals

Model Type 
text generation, conversational AI
Use Cases 
Areas:
text generation, conversational applications
Applications:
chatbots, customer service
Primary Use Cases:
acting as a conversational assistant
Additional Notes 
The model uses a custom tokenizer extended for efficient Polish text generation.
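
As a rough illustration of that extended tokenizer, the sketch below (an illustrative example, not part of the model card) loads it with Hugging Face transformers and checks the vocabulary size, which this page lists as 43,904:

```python
from transformers import AutoTokenizer

# Load the extended Polish tokenizer from the repository listed on this page.
tok = AutoTokenizer.from_pretrained("eryk-mazus/polka-1.1b-chat")
print(len(tok))  # expected to be about 43904, per "Vocabulary Size" below

# A Polish-extended vocabulary should need comparatively few tokens for Polish text.
pl = "Jak mogę Ci dzisiaj pomóc?"   # "How can I help you today?"
en = "How can I help you today?"
print(len(tok(pl)["input_ids"]), len(tok(en)["input_ids"]))
```
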
Supported Languages 
Polish (proficient)
Training Details 
Data Sources:
eryk-mazus/polka-pretrain-en-pl-v1, eryk-mazus/polka-dpo-v1
Data Volume:
5.7 billion tokens
Methodology:
Direct Preference Optimization (DPO)
Context Length:
4096
Model Architecture:
Custom, extended tokenizer for Polish text generation, fine-tuned with DPO.
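
Direct Preference Optimization trains directly on preference pairs (here, presumably the eryk-mazus/polka-dpo-v1 pairs) without a separate reward model. The snippet below is a minimal sketch of the DPO objective from the cited paper (arXiv:2305.18290), not the actual training code used for this model; the beta value is illustrative.

```python
import torch.nn.functional as F

def dpo_loss(policy_chosen_logp, policy_rejected_logp,
             ref_chosen_logp, ref_rejected_logp, beta=0.1):
    """DPO loss over summed log-probs of chosen/rejected completions (arXiv:2305.18290)."""
    chosen_margin = policy_chosen_logp - ref_chosen_logp        # log pi(y_w|x) - log pi_ref(y_w|x)
    rejected_margin = policy_rejected_logp - ref_rejected_logp  # log pi(y_l|x) - log pi_ref(y_l|x)
    # Push the chosen completion's margin above the rejected one's, scaled by beta.
    return -F.logsigmoid(beta * (chosen_margin - rejected_margin)).mean()
```
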
Input Output 
Input Format:
ChatML format
Accepted Modalities:
text
Output Format:
text
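
Since the card specifies ChatML input, a minimal usage sketch might look like the following. It assumes the repository ships a ChatML chat template usable via transformers' apply_chat_template; the prompt and sampling settings are illustrative only.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

repo = "eryk-mazus/polka-1.1b-chat"
tok = AutoTokenizer.from_pretrained(repo)
model = AutoModelForCausalLM.from_pretrained(repo)  # float32 weights, roughly 4.6 GB

# ChatML-style conversation; "Napisz krótki wiersz o jesieni." = "Write a short poem about autumn."
messages = [{"role": "user", "content": "Napisz krótki wiersz o jesieni."}]
inputs = tok.apply_chat_template(messages, add_generation_prompt=True, return_tensors="pt")

out = model.generate(inputs, max_new_tokens=256, do_sample=True, temperature=0.7)
print(tok.decode(out[0][inputs.shape[-1]:], skip_special_tokens=True))
```
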
LLM Name: Polka 1.1B Chat
Repository: https://huggingface.co/eryk-mazus/polka-1.1b-chat
Model Size: 1.1b
Required VRAM: 4.6 GB
Updated: 2025-02-18
Maintainer: eryk-mazus
Model Type: llama
Model Files: 4.6 GB
Supported Languages: pl
Model Architecture: LlamaForCausalLM
License: mit
Context Length: 4096
Model Max Length: 4096
Transformers Version: 4.35.2
Tokenizer Class: LlamaTokenizer
Padding Token: <|im_end|>
Vocabulary Size: 43904
Torch Data Type: float32
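
The listed file size is consistent with roughly 1.1 billion parameters stored in float32; a quick back-of-the-envelope check (the parameter count is approximated from the 1.1b model size above):

```python
params = 1.1e9        # approximate parameter count implied by the 1.1b model size
bytes_per_param = 4   # float32, per the Torch Data Type above
print(params * bytes_per_param / 1e9)  # ~4.4 GB of weights, close to the 4.6 GB listed
```
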

Quantized Models of the Polka 1.1B Chat

Model | Likes | Downloads | VRAM
Polka 1.1B Chat 8bpw EXL2 | 1 | 6 | 1 GB
Polka 1.1B Chat Gguf | 4 | 109 | 0 GB
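
As a sanity check on the EXL2 entry, weight size can be estimated from bits per weight; the calculation below uses the same approximate 1.1B parameter count (GGUF repositories typically bundle several quantization levels, so a single size is less meaningful there):

```python
params = 1.1e9
bits_per_weight = 8.0  # the "8bpw" EXL2 quantization listed above
print(params * bits_per_weight / 8 / 1e9)  # ~1.1 GB of quantized weights
```
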

Best Alternatives to Polka 1.1B Chat

Best Alternatives | Context / RAM | Downloads | Likes
Coven Tiny 1.1b 32k Orpo Alpha | 32K / 2.2 GB | 153 | 2
Test Mix 01 | 32K / 2.2 GB | 158 | 0
Palmer Merge Test 5 | 32K / 2.2 GB | 149 | 0
...llama 1.1B 16K Instructions V4 | 32K / 2.2 GB | 95 | 0
Palmer 002 32K | 32K / 2.2 GB | 162 | 0
TinyLlama 1.1B 32K Instruct | 32K / 2.2 GB | 206 | 12
Tinyllama History Chat V1.1 | 32K / 2.2 GB | 113 | 0
TinyLlama 1.1B 32K | 32K / 2.2 GB | 126 | 28
TinyJ.O.S.I.E. 1.1B 32K Base | 32K / 2.2 GB | 30 | 1
Tinyllama 32k | 32K / 2.2 GB | 85 | 0
Note: a green score (e.g. "73.2") means the model is better than eryk-mazus/polka-1.1b-chat.

Original data from HuggingFace, OpenCompass and various public git repos.