EuroLLM 1.7B Instruct by utter-project


arXiv: 2409.16235

EuroLLM 1.7B Instruct Benchmarks

Scores shown as nn.n% indicate how the model compares to the reference models: Anthropic Sonnet 3.5 ("so35"), GPT-4o ("gpt4o"), or GPT-4 ("gpt4").
EuroLLM 1.7B Instruct (utter-project/EuroLLM-1.7B-Instruct)

EuroLLM 1.7B Instruct Parameters and Internals

Model Type 
Instruction-tuned multilingual transformer LLM
Additional Notes 
This is the instruction-tuned version of EuroLLM-1.7B, focused on machine translation and general instruction following.
Supported Languages 
Bulgarian, Croatian, Czech, Danish, Dutch, English, Estonian, Finnish, French, German, Greek, Hungarian, Irish, Italian, Latvian, Lithuanian, Maltese, Polish, Portuguese, Romanian, Slovak, Slovenian, Spanish, Swedish, Arabic, Catalan, Chinese, Galician, Hindi, Japanese, Korean, Norwegian, Russian, Turkish, Ukrainian
Training Details 
Data Sources:
Web data, parallel data, high-quality datasets
Data Volume:
4 trillion tokens
Methodology:
Instruction tuning on EuroBlocks focused on general instruction-following and machine translation
Hardware Used:
256 Nvidia H100 GPUs
Model Architecture:
Standard dense Transformer with grouped query attention, pre-layer normalization, RMSNorm, SwiGLU activation, and RoPE positional embeddings
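The architectural components named above (RMSNorm, SwiGLU, RoPE) can be illustrated with a minimal NumPy sketch. This is an illustrative reimplementation of the standard formulas only, not the model's actual code; shapes and naming are assumptions for the example.

```python
import numpy as np

def rms_norm(x, weight, eps=1e-6):
    # RMSNorm: scale by the root-mean-square of the activations;
    # unlike LayerNorm there is no mean-centering and no bias.
    rms = np.sqrt(np.mean(x * x, axis=-1, keepdims=True) + eps)
    return (x / rms) * weight

def swiglu(x, w_gate, w_up, w_down):
    # SwiGLU feed-forward block: a SiLU-gated linear unit
    # followed by a down-projection.
    gate = x @ w_gate
    silu = gate / (1.0 + np.exp(-gate))  # SiLU(z) = z * sigmoid(z)
    return (silu * (x @ w_up)) @ w_down

def rope(x, positions, base=10000.0):
    # Rotary positional embeddings: rotate channel pairs by
    # position-dependent angles, encoding relative position.
    d = x.shape[-1]
    inv_freq = base ** (-np.arange(0, d, 2) / d)
    angles = np.outer(positions, inv_freq)          # (seq, d/2)
    cos, sin = np.cos(angles), np.sin(angles)
    x1, x2 = x[..., 0::2], x[..., 1::2]
    out = np.empty_like(x)
    out[..., 0::2] = x1 * cos - x2 * sin
    out[..., 1::2] = x1 * sin + x2 * cos
    return out
```

Note that at position 0 the RoPE rotation angles are all zero, so the embedding is unchanged; later positions rotate each frequency band progressively faster.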
LLM Name: EuroLLM 1.7B Instruct
Repository: https://huggingface.co/utter-project/EuroLLM-1.7B-Instruct
Base Model(s): utter-project/EuroLLM-1.7B
Model Size: 1.7b
Required VRAM: 3.3 GB
Updated: 2025-05-12
Maintainer: utter-project
Model Type: llama
Instruction-Based: Yes
Model Files: 3.3 GB
Supported Languages: en de es fr it pt pl nl tr sv cs el hu ro fi uk sl sk da lt lv et bg no ca hr ga mt gl zh ru ko ja ar hi
Model Architecture: LlamaForCausalLM
License: apache-2.0
Context Length: 4096
Model Max Length: 4096
Transformers Version: 4.44.2
Tokenizer Class: LlamaTokenizer
Padding Token: </s>
Vocabulary Size: 128000
Torch Data Type: bfloat16
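The listed file size follows from the parameter count and data type: with bfloat16 weights, each parameter takes 2 bytes. A rough back-of-the-envelope check (weights only; the ~1.7B count is taken from the model name and the exact figure may differ, and inference additionally needs KV cache and activation memory):

```python
# Rough weight-memory estimate for a bfloat16 model.
params = 1.7e9          # ~1.7B parameters (assumed from the model name)
bytes_per_param = 2     # bfloat16 = 2 bytes per parameter
weight_bytes = params * bytes_per_param
gib = weight_bytes / 2**30
print(f"~{gib:.2f} GiB of weights")
```

This lands around 3.2 GiB, consistent with the 3.3 GB of model files and required VRAM listed above.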

Best Alternatives to EuroLLM 1.7B Instruct

| Best Alternatives | Context / RAM | Downloads / Likes |
|---|---|---|
| Llama 3.2 1B Instruct | 128K / 2.5 GB | 2693529926 |
| Llama 3.2 1B Instruct | 128K / 2.5 GB | 12321970 |
| Nano Imp 1B | 128K / 3 GB | 568 |
| ...enchmaxx Llama 3.2 1B Instruct | 128K / 0 GB | 713 |
| BunderMaxx 1010 | 128K / 2.5 GB | 141 |
| Lancer 1 1B Instruct | 128K / 2.5 GB | 1821 |
| ...lpaca 3.0 HarmfulLLMLat Sauce2 | 128K / 2.5 GB | 470 |
| Llama 3.2 1b OrcaSun V1 | 128K / 3 GB | 50 |
| ... 6 1B Instruct Abliterated LPL | 128K / 2.5 GB | 70 |
| Llama 3.2 1b SunOrca V1 | 128K / 3 GB | 120 |

Note: a green score (e.g. "73.2") means that the model is better than utter-project/EuroLLM-1.7B-Instruct.

Rank the EuroLLM 1.7B Instruct Capabilities

🆘 Have you tried this model? Rate its performance. This feedback helps the ML community identify the most suitable models for their needs. Your contribution really does make a difference! 🌟

Instruction Following and Task Automation  
Factuality and Completeness of Knowledge  
Censorship and Alignment  
Data Analysis and Insight Generation  
Text Generation  
Text Summarization and Feature Extraction  
Code Generation  
Multi-Language Support and Translation  


Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241227