Falcon2 5.5B Romanian by ssmits


Tags: Merged Model, Autotrain compatible, Base model (finetune): tiiuae/falcon-11B, Conversational, Custom code, Endpoints compatible, Falcon, Region: us, ro, Safetensors, Sharded, TensorFlow

Falcon2 5.5B Romanian Benchmarks

Scores (nn.n%) indicate how the model compares to the reference models: Anthropic Claude 3.5 Sonnet ("so35"), GPT-4o ("gpt4o"), or GPT-4 ("gpt4").
Falcon2 5.5B Romanian (ssmits/Falcon2-5.5B-Romanian)

Falcon2 5.5B Romanian Parameters and Internals

Model Type 
text-generation
Use Cases 
Primary Use Cases:
Research on large language models; a foundation for further specialization and fine-tuning for specific use cases (e.g., summarization, text generation, chatbots). A minimal fine-tuning sketch follows this section.
Limitations:
Production use without an adequate assessment of risks and mitigation, and any use case that may be considered irresponsible or harmful.
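
As an illustration of the "foundation for further specialization" use case, here is a minimal parameter-efficient fine-tuning sketch using Hugging Face transformers and peft. The LoRA hyperparameters and the target module name are assumptions for FalconForCausalLM, not settings taken from the model card.

```python
# Minimal LoRA fine-tuning sketch for further specialization of the model.
# Hyperparameters and target module names are illustrative assumptions.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import LoraConfig, get_peft_model

model_id = "ssmits/Falcon2-5.5B-Romanian"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,   # matches the published torch dtype
    device_map="auto",
    trust_remote_code=True,       # the repo carries custom Falcon code
)

# Attach LoRA adapters to Falcon's fused attention projection so only a small
# fraction of parameters is trained during specialization.
lora_config = LoraConfig(
    task_type="CAUSAL_LM",
    r=16,
    lora_alpha=32,
    lora_dropout=0.05,
    target_modules=["query_key_value"],  # assumed module name for FalconForCausalLM
)
model = get_peft_model(model, lora_config)
model.print_trainable_parameters()

# From here, pass `model` to transformers.Trainer or trl's SFTTrainer together
# with a Romanian summarization, generation, or chat dataset of your choice.
```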
Additional Notes 
The model was pruned from Falcon-11B and adapted for Romanian.
Supported Languages 
Romanian (primary) and English; the base model also supports German, Spanish, French, Italian, Portuguese, Polish, Dutch, Czech, and Swedish.
Training Details 
Data Sources:
wikimedia/wikipedia Romanian (ro) subset
Methodology:
The model was pruned from Falcon-11B using the passthrough merge method, with PruneMe used to investigate layer similarity over 2,000 samples. The layer ranges to remove were chosen from this analysis so as to reduce model size while maintaining performance, as sketched below.
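
The layer-similarity analysis behind this kind of pruning can be approximated in a few lines: run text through the base model, compare the hidden states entering and leaving each candidate block of layers, and drop the block whose input and output representations are most similar. The sample text, block width, and similarity metric below are assumptions for illustration, not the exact PruneMe procedure.

```python
# Sketch of layer-similarity analysis for depth pruning; the sample text,
# block width, and metric are illustrative assumptions.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

base_id = "tiiuae/falcon-11B"
tokenizer = AutoTokenizer.from_pretrained(base_id)
model = AutoModelForCausalLM.from_pretrained(
    base_id, torch_dtype=torch.bfloat16, device_map="auto", trust_remote_code=True
)

text = "Exemplu de text în limba română pentru analiza similarității straturilor."
inputs = tokenizer(text, return_tensors="pt").to(model.device)

with torch.no_grad():
    out = model(**inputs, output_hidden_states=True)

hidden = out.hidden_states  # embeddings output plus one tensor per decoder layer
block = 24                  # width of the candidate block to remove (assumption)

# For each candidate start layer, measure how similar the representations are
# before and after the block; the most similar block is the cheapest to drop.
for start in range(len(hidden) - block):
    a = hidden[start].float().mean(dim=1)          # average over token positions
    b = hidden[start + block].float().mean(dim=1)
    sim = torch.nn.functional.cosine_similarity(a, b).item()
    print(f"layers {start}..{start + block}: cosine similarity {sim:.4f}")
```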
Input Output 
Accepted Modalities:
text
Output Format:
text
Performance Tips:
For fast inference with Falcon, check out Text Generation Inference (TGI), described on the Hugging Face blog.
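
For a quick local test with the transformers library (as opposed to a TGI deployment), a minimal generation sketch is shown below; the prompt and sampling settings are arbitrary examples rather than recommendations from the model card.

```python
# Minimal local inference sketch with transformers; prompt and sampling
# parameters are arbitrary examples.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "ssmits/Falcon2-5.5B-Romanian"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,   # published dtype; roughly 10.9 GB of weights
    device_map="auto",
    trust_remote_code=True,
)

prompt = "București este capitala României și"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=100, do_sample=True, top_p=0.9)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```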
LLM Name: Falcon2 5.5B Romanian
Repository: 🤗 https://huggingface.co/ssmits/Falcon2-5.5B-Romanian
Base Model(s): Falcon 11B (tiiuae/falcon-11B)
Merged Model: Yes
Model Size: 11b
Required VRAM: 10.9 GB
Updated: 2025-02-22
Maintainer: ssmits
Model Type: falcon
Model Files: 12 safetensors shards: 0.9 GB (1-of-12), 1.0 GB (2-of-12), 0.9 GB (3-of-12), 0.9 GB (4-of-12), 1.0 GB (5-of-12), 0.9 GB (6-of-12), 0.9 GB (7-of-12), 1.0 GB (8-of-12), 0.9 GB (9-of-12), 0.9 GB (10-of-12), 1.0 GB (11-of-12), 0.6 GB (12-of-12)
Supported Languages: ro
Model Architecture: FalconForCausalLM
License: apache-2.0
Context Length: 8192
Model Max Length: 8192
Transformers Version: 4.40.2
Is Biased: 0
Tokenizer Class: PreTrainedTokenizerFast
Padding Token: <|endoftext|>
Vocabulary Size: 65024
Torch Data Type: bfloat16
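
As a rough sanity check on the figures above, the 10.9 GB VRAM requirement lines up with both the sum of the bfloat16 shard files and a two-bytes-per-parameter estimate; the short calculation below reproduces that arithmetic (the ~5.5B parameter count is inferred from the model name, not stated in the table).

```python
# Sanity-check arithmetic for the "Required VRAM" figure.
shard_sizes_gb = [0.9, 1.0, 0.9, 0.9, 1.0, 0.9, 0.9, 1.0, 0.9, 0.9, 1.0, 0.6]
print(round(sum(shard_sizes_gb), 1))   # 10.9 GB of safetensors shards

params = 5.5e9        # approximate parameter count, inferred from the model name
bytes_per_param = 2   # bfloat16
print(params * bytes_per_param / 1e9)  # ~11 GB, in line with the listed 10.9 GB
```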

Best Alternatives to Falcon2 5.5B Romanian

Best Alternatives          | Context / RAM | Downloads | Likes
Falcon 11B                 | 8K / 22.1 GB  | 31645     | 212
Falcon2 5.5B Multilingual  | 8K / 10.9 GB  | 214       | 4
Falcon2 5.5B Polish        | 8K / 10.9 GB  | 1494      | 1
Falcon2 5.5B Portuguese    | 8K / 10.9 GB  | 204       | 0
Falcon2 11B                | 8K / 6.6 GB   | 52        | 0
Enron Falcon 11B           | 8K / 7.6 GB   | 14        | 1
Falcon2 5.5B Dutch         | 8K / 10.9 GB  | 79        | 1
Falcon2 5.5B Italian       | 8K / 10.9 GB  | 64        | 0
Falcon2 5.5B German        | 8K / 10.9 GB  | 35        | 0
Falcon2 5.5B Czech         | 8K / 10.9 GB  | 28        | 0
Note: a green score (e.g. "73.2") means that the model is better than ssmits/Falcon2-5.5B-Romanian.

Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241227