Falcon2 5.5B Norwegian by ssmits

Tags: Merged Model · Autotrain compatible · Base model (finetune): tiiuae/fal... · Base model: tiiuae/falcon-11b · Conversational · Custom code · Endpoints compatible · Falcon · no · Region: us · Safetensors · Sharded · Tensorflow · tiiuae/falcon-11b

Falcon2 5.5B Norwegian Benchmarks

Benchmark scores are reported as nn.n% — how the model compares to the reference models Anthropic Sonnet 3.5 ("so35"), GPT-4o ("gpt4o"), or GPT-4 ("gpt4"). No scores are listed here for Falcon2 5.5B Norwegian (ssmits/Falcon2-5.5B-Norwegian).

Falcon2 5.5B Norwegian Parameters and Internals

Model Type: text generation
Use Cases
  Areas: research, foundation for finetuning
  Applications: text generation, summarization, chatbots
  Limitations: limited generalization to non-English languages; biases inherited from web data
  Considerations: finetuning and risk mitigation are recommended for production use
Additional Notes: Roughly 50% of the base model's layers were pruned; the base model was trained mostly on English and some other European languages.
Supported Languages: Norwegian (primary); no additional languages listed
Training Details
  Data Sources: wikimedia/wikipedia, Norwegian (no) subset
  Methodology: passthrough merge method, with pruning based on layer similarity (a short sketch follows this section)
Input Output
  Input Format: text
  Accepted Modalities: text
  Output Format: text
  Performance Tips: for fast inference, use Text Generation Inference; a minimal loading example with plain transformers follows the specification list below
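The layer-similarity pruning named above can be illustrated with a small sketch: score how much each decoder layer actually changes its input (cosine similarity between the hidden states entering and leaving the layer) on a short Norwegian sample, and treat near-identical layers as pruning candidates before the passthrough merge. This is a minimal sketch under assumptions — the base-model ID and sample text below are placeholders, and it is not claimed to be the exact procedure used to build this model.

```python
# Sketch: rank decoder layers by how little they change the representation.
# Layers whose output is nearly identical to their input are the natural
# candidates to drop before a passthrough merge. Model ID and sample text
# are illustrative assumptions; loading falcon-11B needs substantial memory.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "tiiuae/falcon-11B"  # assumed base model
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.bfloat16, device_map="auto"
)

text = "Norge er et land i Nord-Europa med lang kystlinje."  # placeholder sample
inputs = tokenizer(text, return_tensors="pt").to(model.device)

with torch.no_grad():
    out = model(**inputs, output_hidden_states=True)

# hidden_states holds the embedding output plus one entry per decoder layer.
hs = out.hidden_states
for i in range(1, len(hs)):
    sim = torch.nn.functional.cosine_similarity(
        hs[i - 1].float().flatten(1), hs[i].float().flatten(1), dim=-1
    ).mean().item()
    print(f"layer {i - 1:2d}: input/output cosine similarity = {sim:.4f}")
```

A passthrough merge would then be declared in a merge-tool configuration that simply lists the layer ranges to keep; the sketch above only surfaces which ranges change the representation the least.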
LLM Name: Falcon2 5.5B Norwegian
Repository: https://huggingface.co/ssmits/Falcon2-5.5B-Norwegian
Base Model(s): Falcon 11B (tiiuae/falcon-11B)
Merged Model: Yes
Model Size: 11b
Required VRAM: 10.9 GB
Updated: 2024-12-22
Maintainer: ssmits
Model Type: falcon
Model Files: 0.9 GB (1-of-12), 1.0 GB (2-of-12), 0.9 GB (3-of-12), 0.9 GB (4-of-12), 1.0 GB (5-of-12), 0.9 GB (6-of-12), 0.9 GB (7-of-12), 1.0 GB (8-of-12), 0.9 GB (9-of-12), 0.9 GB (10-of-12), 1.0 GB (11-of-12), 0.6 GB (12-of-12)
Supported Languages: no
Model Architecture: FalconForCausalLM
License: apache-2.0
Context Length: 8192
Model Max Length: 8192
Transformers Version: 4.40.2
Is Biased: 0
Tokenizer Class: PreTrainedTokenizerFast
Padding Token: <|endoftext|>
Vocabulary Size: 65024
Torch Data Type: bfloat16
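As a reference for the specification above (bfloat16 weights, custom modeling code, 8192-token context, roughly 10.9 GB of sharded files), here is a minimal loading and generation sketch with plain transformers; the prompt and sampling settings are illustrative assumptions, and Text Generation Inference remains the recommended path for fast serving, as noted earlier.

```python
# Sketch: load ssmits/Falcon2-5.5B-Norwegian as described in the listing and
# generate a short continuation. Prompt and sampling settings are assumptions.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo = "ssmits/Falcon2-5.5B-Norwegian"
tokenizer = AutoTokenizer.from_pretrained(repo)
model = AutoModelForCausalLM.from_pretrained(
    repo,
    torch_dtype=torch.bfloat16,  # matches the listed Torch Data Type
    device_map="auto",           # ~10.9 GB of bf16 weights per the listing
    trust_remote_code=True,      # the card is tagged "Custom code"
)

prompt = "Norge er kjent for"  # placeholder Norwegian prompt
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
with torch.no_grad():
    output = model.generate(
        **inputs, max_new_tokens=64, do_sample=True, temperature=0.7
    )
print(tokenizer.decode(output[0], skip_special_tokens=True))
```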

Best Alternatives to Falcon2 5.5B Norwegian

Best Alternatives              Context / RAM     Downloads   Likes
Falcon 11B                     8K / 22.1 GB      21564       213
Falcon2 5.5B Multilingual      8K / 10.9 GB      412         4
Falcon2 5.5B Polish            8K / 10.9 GB      2196        1
Falcon2 5.5B German            8K / 10.9 GB      398         0
Falcon2 11B                    8K / 6.6 GB       18          0
Enron Falcon 11B               8K / 7.6 GB       14          1
Falcon2 5.5B Czech             8K / 10.9 GB      42          0
Falcon2 5.5B Portuguese        8K / 10.9 GB      27          0
Falcon2 5.5B Spanish           8K / 10.9 GB      22          0
Falcon2 5.5B Dutch             8K / 10.9 GB      29          1

Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241217