Viking 7B by LumiOpen


Tags: Autotrain compatible · Endpoints compatible · Llama · Safetensors · Sharded · Tensorflow · Region: us
Datasets: bigcode/starcoderdata · cerebras/SlimPajama-627B · mc4
Languages: da · en · fi · is · nn · no · sv
Model Card on HF 🤗: https://huggingface.co/LumiOpen/Viking-7B


Viking 7B Parameters and Internals

Model Type 
decoder-only transformer
Use Cases 
Areas:
research, commercial applications
Limitations:
no meaningful proficiency in languages outside of English, Finnish, Swedish, Norwegian, Danish, Icelandic, and code
Considerations:
The model is partially trained, and ethical considerations should guide its application.
Supported Languages 
fi (fluent), en (fluent), da (fluent), sv (fluent), no (fluent), nn (fluent), is (fluent)
Training Details 
Data Sources:
cerebras/SlimPajama-627B, bigcode/starcoderdata, mc4
Data Volume:
2 trillion tokens
Methodology:
uses a LLaMA-like GPT architecture with rotary positional embeddings and flash attention (see the config check after this list)
Context Length:
4096
Hardware Used:
256 AMD MI250X GPUs
Model Architecture:
decoder-only transformer
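
The architecture details above can be checked without downloading any weights. A minimal sketch using the standard Transformers AutoConfig API; the expected values in the comments come from this page, not from a live run:

```python
from transformers import AutoConfig

# Fetch only the model configuration (a small JSON file, no weights).
config = AutoConfig.from_pretrained("LumiOpen/Viking-7B")

print(config.model_type)               # "llama" (LLaMA-like decoder-only)
print(config.max_position_embeddings)  # 4096 (context length)
print(config.vocab_size)               # 131072
```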
Safety Evaluation 
Ethical Considerations:
Viking may produce outputs that can be considered inaccurate, prejudiced, or controversial. Users should exercise discretion.
Responsible AI Considerations 
Mitigation Strategies:
Special care should be taken when using outputs.
Input Output 
Accepted Modalities:
text
Release Notes 
Notes:
Training checkpoints are available as branches in the repository (e.g., 100B, 200B, and so on up to 2000B tokens); see the loading sketch below.
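
Because each checkpoint lives on its own branch, a specific mid-training snapshot can be selected with the revision argument. A minimal sketch; "1000B" is a hypothetical branch name following the pattern above, so the actual set of branches should be confirmed in the repository:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Branch names follow the token counts described above (e.g. "1000B");
# check the repository branches for the checkpoints that actually exist.
branch = "1000B"

tokenizer = AutoTokenizer.from_pretrained("LumiOpen/Viking-7B")
model = AutoModelForCausalLM.from_pretrained(
    "LumiOpen/Viking-7B",
    revision=branch,             # selects the mid-training checkpoint
    torch_dtype=torch.bfloat16,  # matches the published weights
)
```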
LLM Name: Viking 7B
Repository 🤗: https://huggingface.co/LumiOpen/Viking-7B
Model Size: 7B
Required VRAM: 15.1 GB
Updated: 2025-05-14
Maintainer: LumiOpen
Model Type: llama
Model Files: 4.9 GB (1 of 4), 5.0 GB (2 of 4), 4.1 GB (3 of 4), 1.1 GB (4 of 4)
Supported Languages: fi, en, da, sv, no, nn, is
Model Architecture: LlamaForCausalLM
License: apache-2.0
Context Length: 4096
Model Max Length: 4096
Transformers Version: 4.40.0
Tokenizer Class: BloomTokenizer
Padding Token: <pad>
Vocabulary Size: 131072
Torch Data Type: bfloat16
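
Putting these details together, the model loads as a standard LlamaForCausalLM checkpoint in bfloat16, which is where the 15.1 GB VRAM figure comes from. A minimal generation sketch; Viking 7B is a base model, so it continues text rather than following instructions, and the Finnish prompt is only illustrative:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("LumiOpen/Viking-7B")
model = AutoModelForCausalLM.from_pretrained(
    "LumiOpen/Viking-7B",
    torch_dtype=torch.bfloat16,  # ~2 bytes/param, hence ~15.1 GB of weights
    device_map="auto",
)

# Base model: plain text continuation, no chat template.
inputs = tokenizer("Suomen pääkaupunki on", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=30)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```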

Quantized Models of the Viking 7B

Model | Likes | Downloads | VRAM
...sh Alpaca Tiny V2 7B EXL2 4bpw | 0 | 7 | 4 GB
Viking 7B EXL2 4bpw | 1 | 5 | 4 GB
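
The ~4 GB figures are consistent with simple back-of-the-envelope arithmetic: at 4 bits per weight, a 7B-class model needs roughly a quarter of its bfloat16 footprint. A rough sketch; the 7.55B parameter count is an assumption inferred from the 15.1 GB bfloat16 figure above, and real usage adds KV-cache and runtime overhead on top of the weights:

```python
# Assumed parameter count, inferred from 15.1 GB at 2 bytes per parameter.
params = 7.55e9

bf16_gb = params * 2 / 1e9       # bfloat16:  2 bytes/param   -> ~15.1 GB
exl2_gb = params * 4 / 8 / 1e9   # EXL2 4bpw: 0.5 bytes/param -> ~3.8 GB

print(f"{bf16_gb:.1f} GB bf16, {exl2_gb:.1f} GB at 4bpw")
```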

Best Alternatives to Viking 7B

Best Alternatives | Context / RAM | Downloads | Likes
A6 L | 1024K / 16.1 GB | 201 | 0
M | 1024K / 16.1 GB | 127 | 0
157 | 1024K / 16.1 GB | 101 | 0
124 | 1024K / 16.1 GB | 93 | 0
A3.4 | 1024K / 16.1 GB | 13 | 0
A5.4 | 1024K / 16.1 GB | 12 | 0
A2.4 | 1024K / 16.1 GB | 12 | 0
2 Very Sci Fi | 1024K / 16.1 GB | 317 | 0
162 | 1024K / 16.1 GB | 60 | 0
118 | 1024K / 16.1 GB | 15 | 0



Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241227