Metharme 1.3B by PygmalionAI

Tags: autotrain-compatible, en, endpoints-compatible, gpt_neox, pytorch, region:us, safetensors

Metharme 1.3B Benchmarks

Metharme 1.3B (PygmalionAI/metharme-1.3b)

Metharme 1.3B Parameters and Internals

LLM Name: Metharme 1.3B
Repository: 🤗 https://huggingface.co/PygmalionAI/metharme-1.3b
Model Size: 1.3B
Required VRAM: 2.9 GB
Updated: 2025-02-22
Maintainer: PygmalionAI
Model Type: gpt_neox
Model Files: 2.9 GB, 2.9 GB
Supported Languages: en
Model Architecture: GPTNeoXForCausalLM
License: apache-2.0
Context Length: 2048
Model Max Length: 2048
Transformers Version: 4.30.0.dev0
Tokenizer Class: GPTNeoXTokenizer
Vocabulary Size: 50304
Torch Data Type: bfloat16
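
Given the internals above (gpt_neox architecture, bfloat16 weights, a 2048-token context), loading the checkpoint is plain Hugging Face transformers usage. A minimal sketch, assuming a recent transformers install; the repo id comes from the table, and nothing here is code published by PygmalionAI:

```python
# Minimal loading sketch based on the spec table above.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "PygmalionAI/metharme-1.3b"  # Repository listed above

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(
    repo_id,
    torch_dtype=torch.bfloat16,  # matches the Torch Data Type row; ~2.9 GB of weights
)

# Context Length and Model Max Length are both 2048, so cap inputs accordingly.
inputs = tokenizer("Hello there.", return_tensors="pt", truncation=True, max_length=2048)
```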

Best Alternatives to Metharme 1.3B

Best Alternatives                    Context / RAM   Downloads   Likes
...Ko Empathy Message Friend 1.3B    2K / 5.4 GB     69          0
Pgl Mtm1b 3                          2K / 1 GB       76          0
Pgl Mtm1b                            2K / 1.1 GB     76          0
...olyglot Ko 1.3B Pretrained Asd    2K / 5.4 GB     25          0
SGPT 1.3B Insurance Epoch10          2K / 5.4 GB     1871        1
KIT 1.3B                             2K / 5.4 GB     106         2
...glot Ko 1.3B Ao Instruct V0.91    2K / 5.4 GB     159         0
Pygmalion Free                       2K / 2.9 GB     67          0
My Consulting Ai Model               2K / 5.4 GB     71          0
Koquality Polyglot 1.3B              2K / 5.3 GB     2034        0
Note: a score shown in green (e.g. "73.2") indicates that the alternative scores better than PygmalionAI/metharme-1.3b.

Rank the Metharme 1.3B Capabilities

🆘 Have you tried this model? Rate its performance. Your feedback helps the ML community identify the most suitable model for their needs, and your contribution really does make a difference! 🌟 If you want a quick way to exercise the model before rating it, see the sketch after the list below.

Instruction Following and Task Automation  
Factuality and Completeness of Knowledge  
Censorship and Alignment  
Data Analysis and Insight Generation  
Text Generation  
Text Summarization and Feature Extraction  
Code Generation  
Multi-Language Support and Translation  
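
A hedged smoke test for the text-generation and instruction-following rows above. The <|system|>/<|user|>/<|model|> control tokens follow the prompt format documented for the Metharme series, but treat the exact strings and sampling settings here as assumptions and confirm them against the model card:

```python
# Smoke test: generate a short completion to sanity-check text generation
# and instruction following. The <|system|>/<|user|>/<|model|> tokens are
# assumed from the Metharme-series prompt format; verify on the model card.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "PygmalionAI/metharme-1.3b"
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(repo_id, torch_dtype=torch.bfloat16)

prompt = (
    "<|system|>You are a concise, helpful assistant."
    "<|user|>Explain in one sentence what a context window is."
    "<|model|>"
)
inputs = tokenizer(prompt, return_tensors="pt")
output = model.generate(
    **inputs,
    max_new_tokens=64,
    do_sample=True,
    temperature=0.7,
    pad_token_id=tokenizer.eos_token_id,  # eos doubles as pad to avoid a warning
)
# Print only the newly generated tokens, not the echoed prompt.
print(tokenizer.decode(output[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))
```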

Original data from HuggingFace, OpenCompass, and various public Git repos.
Release v20241227