ECE 1B Merge PRYMMAL by LilRg


Merged Model · Instruct · Base models: Qwen/Qwen2.5-1.5B, Qwen/Qwen2.5-1.5B-Instruct · Qwen2 · Region: us · Safetensors · Sharded · Tensorflow

ECE 1B Merge PRYMMAL Benchmarks

nn.n% — how the model compares to the reference models: Anthropic Claude 3.5 Sonnet ("so35"), GPT-4o ("gpt4o"), or GPT-4 ("gpt4").
ECE 1B Merge PRYMMAL (LilRg/ECE-1B-merge-PRYMMAL)

ECE 1B Merge PRYMMAL Parameters and Internals

Model Type 
text-generation
Additional Notes 
The model is a merge using LazyMergekit and specializes in text generation.
Training Details 
Methodology:
Merge of Qwen/Qwen2.5-1.5B-Instruct and Qwen/Qwen2.5-1.5B using LazyMergekit
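The merge described above boils down to parameter-wise blending of two checkpoints with matching architectures. The snippet below is a toy sketch of that idea: short Python lists stand in for real weight tensors, and `linear_merge` is an illustrative helper, not part of LazyMergekit's actual API.

```python
# Toy sketch of the parameter-wise interpolation behind a linear model
# merge. LazyMergekit operates on real safetensors checkpoints; here,
# short Python lists stand in for weight tensors, and `linear_merge`
# is an illustrative helper, not a mergekit function.

def linear_merge(state_a, state_b, weight_a=0.5):
    """Blend two state dicts: weight_a * a + (1 - weight_a) * b, per parameter."""
    assert state_a.keys() == state_b.keys(), "architectures must match to merge"
    weight_b = 1.0 - weight_a
    return {
        name: [weight_a * a + weight_b * b
               for a, b in zip(state_a[name], state_b[name])]
        for name in state_a
    }

# Stand-ins for Qwen/Qwen2.5-1.5B (base) and Qwen/Qwen2.5-1.5B-Instruct.
base_weights = {"layer.0.weight": [0.0, 1.0]}
instruct_weights = {"layer.0.weight": [1.0, 0.0]}

merged = linear_merge(base_weights, instruct_weights, weight_a=0.5)
print(merged)  # {'layer.0.weight': [0.5, 0.5]}
```

A real merge also requires identical tokenizers and layer shapes, which holds here since both parents are Qwen2.5-1.5B variants.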
Input Output 
Accepted Modalities:
text
LLM Name: ECE 1B Merge PRYMMAL
Repository: 🤗 https://huggingface.co/LilRg/ECE-1B-merge-PRYMMAL
Base Model(s): Qwen/Qwen2.5-1.5B-Instruct, Qwen/Qwen2.5-1.5B
Merged Model: Yes
Model Size: 1b
Required VRAM: 3.6 GB
Updated: 2025-02-05
Maintainer: LilRg
Model Type: qwen2
Instruction-Based: Yes
Model Files: 1.0 GB (1-of-4), 1.0 GB (2-of-4), 1.0 GB (3-of-4), 0.6 GB (4-of-4)
Model Architecture: Qwen2ForCausalLM
License: apache-2.0
Context Length: 32768
Model Max Length: 32768
Transformers Version: 4.44.2
Tokenizer Class: Qwen2Tokenizer
Padding Token: <|endoftext|>
Vocabulary Size: 151936
Torch Data Type: bfloat16
Errors: replace
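The size figures listed above can be sanity-checked with quick arithmetic (all numbers taken from the table; the 1.5B parameter count is from the base models' names):

```python
# Quick arithmetic check on the figures listed above.
shard_sizes_gb = [1.0, 1.0, 1.0, 0.6]   # the four safetensors shards
total_gb = sum(shard_sizes_gb)
print(total_gb)  # 3.6 — matches the "Required VRAM: 3.6 GB" entry

# bfloat16 stores 2 bytes per parameter, so ~1.5B parameters need
# about 1.5e9 * 2 / 1e9 = 3.0 GB for the raw weights; the remaining
# ~0.6 GB of shard size is embeddings and other per-file overhead.
n_params = 1.5e9
weights_gb = n_params * 2 / 1e9
print(weights_gb)  # 3.0
```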

Best Alternatives to ECE 1B Merge PRYMMAL

Best Alternatives | Context / RAM | Downloads | Likes
Qwen2.5 1B Instruct | 32K / 2 GB | 87 | 0
PRYMMAL ECE 1B SLERP V1 | 32K / 9.7 GB | 8 | 0
ECE PRYMMAL1B FT V1 | 32K / 6.2 GB | 85 | 0
Sailor2 1B | 32K / 2 GB | 247 | 6
Sailor2 1B Chat | 32K / 2 GB | 116 | 14
NanoLM 1B Instruct V2 | 4K / 2.1 GB | 22 | 0
NanoLM 1B Instruct V1.1 | 4K / 2.1 GB | 5 | 0
Note: a green score (e.g. "73.2") means the model outperforms LilRg/ECE-1B-merge-PRYMMAL.

Rank the ECE 1B Merge PRYMMAL Capabilities

🆘 Have you tried this model? Rate its performance. This feedback helps the ML community identify the most suitable model for their needs. Your contribution really does make a difference! 🌟

Instruction Following and Task Automation  
Factuality and Completeness of Knowledge  
Censorship and Alignment  
Data Analysis and Insight Generation  
Text Generation  
Text Summarization and Feature Extraction  
Code Generation  
Multi-Language Support and Translation  


Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241227