Lynx Micro by four-two-labs


Tags: Autotrain compatible, Conversational, En, Endpoints compatible, Gemma, Region: us, Safetensors, Sharded, Sv, Tensorflow

Lynx Micro Benchmarks

Scores (nn.n%) show how the model compares to the reference models: Anthropic Claude 3.5 Sonnet ("so35"), GPT-4o ("gpt4o"), or GPT-4 ("gpt4").
Lynx Micro (four-two-labs/lynx-micro)

Lynx Micro Parameters and Internals

Model Type 
Auto-regressive transformer
Additional Notes 
The model is small and has not memorized as much information as larger models typically do.
Supported Languages 
Swedish (proficient), English (proficient)
Training Details 
Data Sources:
Proprietary dataset comprising high-quality Swedish instruct data, single-turn and multi-turn conversations, and high-quality Swedish-English translations.
Data Volume:
~1.35M examples
Methodology:
Trained with Hugging Face Accelerate and TRL (a minimal sketch follows after this list).
Context Length:
8000
Hardware Used:
8xH100
Model Architecture:
Auto-regressive transformer
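
The training stack listed above (Hugging Face Accelerate plus TRL) corresponds to a standard supervised fine-tuning setup. Below is a minimal sketch of such a setup using TRL's SFTTrainer; the base checkpoint, dataset file, and every hyperparameter are illustrative assumptions, not the maintainer's actual configuration.

```python
# Minimal SFT sketch mirroring the stack listed above (Accelerate + TRL).
# The base checkpoint, dataset file, and all hyperparameters here are
# illustrative assumptions, not the maintainer's actual configuration.
from datasets import load_dataset
from trl import SFTConfig, SFTTrainer

# Hypothetical chat-format instruct dataset (single- and multi-turn conversations).
dataset = load_dataset("json", data_files="swedish_instruct.jsonl", split="train")

config = SFTConfig(
    output_dir="lynx-micro-sft",        # illustrative output directory
    max_seq_length=8192,                # matches the model max length listed below
    per_device_train_batch_size=2,      # illustrative
    gradient_accumulation_steps=8,      # illustrative
    num_train_epochs=1,                 # illustrative
    bf16=True,                          # matches the bfloat16 torch data type
)

trainer = SFTTrainer(
    model="google/gemma-2b",            # assumed Gemma 2B base model; not confirmed by the card
    train_dataset=dataset,
    args=config,
)
trainer.train()

# Multi-GPU runs (e.g. 8xH100) are typically launched via Accelerate:
#   accelerate launch train_sft.py
```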
LLM Name: Lynx Micro
Repository: https://huggingface.co/four-two-labs/lynx-micro
Model Size: 2B
Required VRAM: 5 GB
Updated: 2025-02-22
Maintainer: four-two-labs
Model Type: gemma
Model Files: 2.1 GB (1-of-3), 2.1 GB (2-of-3), 0.8 GB (3-of-3)
Supported Languages: sv, en
Model Architecture: GemmaForCausalLM
Context Length: 8192
Model Max Length: 8192
Transformers Version: 4.40.2
Tokenizer Class: GemmaTokenizer
Padding Token: <eos>
Vocabulary Size: 256000
Torch Data Type: bfloat16
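
Given the specs above (GemmaForCausalLM architecture, bfloat16 weights, roughly 5 GB of VRAM), the model loads through the standard transformers API. The sketch below is an assumed usage example; the Swedish prompt is illustrative and not taken from the model card.

```python
# Minimal loading/inference sketch based on the specs above
# (GemmaForCausalLM, bfloat16, ~5 GB VRAM). The Swedish prompt is
# an illustrative assumption, not an example from the model card.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "four-two-labs/lynx-micro"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,   # matches the listed torch data type
    device_map="auto",            # the sharded weights fit in ~5 GB of VRAM
)

# Gemma-style chat prompt; the content is an illustrative Swedish question.
messages = [{"role": "user", "content": "Sammanfatta vad en transformer-modell är i två meningar."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```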

Best Alternatives to Lynx Micro

Best Alternatives | Context / RAM | Downloads / Likes
Gemma 1.1 2B It | 8K / 5.1 GB | 107608154
Codegemma 2B | 8K / 5.1 GB | 480578
Gemma Ko 1.1 2B It | 8K / 5.1 GB | 21821
EMO 2B | 8K / 5.1 GB | 40952
Octopus V2 | 8K / 5.1 GB | 1229880
LION Gemma 2B Sft V1.0 | 8K / 5.1 GB | 1490
Gemma2b Lungcancerqa | 8K / 3.1 GB | 812
... 2B Finetuned Sft Navarasa 2.0 | 8K / 10 GB | 24821
2B Or Not 2B | 8K / 5.1 GB | 7627
Gemma 2B Orpo | 8K / 5.1 GB | 11528
Note: a green score (e.g. "73.2") means that the model is better than four-two-labs/lynx-micro.



Original data from Hugging Face, OpenCompass, and various public Git repos.
Release v20241227