LLM Explorer: A Curated Large Language Model Directory and Analytics

Medorca 7B Slerp by Technoculture

Which open-source LLMs or SLMs are you looking for? 18,732 models are listed in total.


Tags: Merged Model · Autotrain compatible · Endpoints compatible · Epfl-llm/meditron-7b · License: apache-2.0 · Llama · Microsoft/orca-2-7b · Region: us · Safetensors · Sharded · Tensorflow
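The "Slerp" in the model name refers to spherical linear interpolation, a common way to merge the weights of two parent models (here, per the tags above, Epfl-llm/meditron-7b and Microsoft/orca-2-7b). The exact merge recipe is not given on this page; the sketch below only illustrates what per-tensor SLERP interpolation looks like, with the function and parameter names (`slerp`, `t`, `eps`) chosen purely for illustration:

```python
import torch

def slerp(t: float, a: torch.Tensor, b: torch.Tensor, eps: float = 1e-8) -> torch.Tensor:
    """Spherical linear interpolation between two weight tensors.

    t=0 returns `a`, t=1 returns `b`; intermediate values follow the
    great-circle arc between the normalized, flattened tensors.
    """
    a_flat, b_flat = a.flatten().float(), b.flatten().float()
    a_dir = a_flat / (a_flat.norm() + eps)
    b_dir = b_flat / (b_flat.norm() + eps)
    # Angle between the two direction vectors.
    dot = torch.clamp(torch.dot(a_dir, b_dir), -1.0, 1.0)
    theta = torch.acos(dot)
    if theta.abs() < eps:
        # Nearly parallel tensors: fall back to plain linear interpolation.
        return (1 - t) * a + t * b
    sin_theta = torch.sin(theta)
    w_a = torch.sin((1 - t) * theta) / sin_theta
    w_b = torch.sin(t * theta) / sin_theta
    return (w_a * a_flat + w_b * b_flat).reshape(a.shape).to(a.dtype)
```

In practice, merge tooling such as mergekit applies this kind of interpolation tensor by tensor, often with different interpolation factors for different layer groups; the sketch above is only the underlying formula, not this model's actual configuration.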

Rank the Medorca 7B Slerp Capabilities

🆘 Have you tried this model? Rate its performance. Your feedback helps the ML community identify the most suitable models for their needs. Your contribution really does make a difference! 🌟

Instruction Following and Task Automation  
Factuality and Completeness of Knowledge  
Censorship and Alignment  
Data Analysis and Insight Generation  
Text Generation  
Text Summarization and Feature Extraction  
Code Generation  
Multi-Language Support and Translation  
Medorca 7B Slerp (Technoculture/Medorca-7B-Slerp)

Best Alternatives to Medorca 7B Slerp

Best Alternatives | HF Rank | Context / RAM | Downloads | Likes
Bagel DPO 7B V0.1 | 67.95 | 32K / 14.4 GB | 2259 | 39
Internlm2 7B Llama | 66.94 | 32K / 15.5 GB | 1599 | 5
Llama2 Init Mistral | 60.98 | 4K / 14.4 GB | 255 | 10
A I 0xtom 7B Slerp | 60.46 | 32K / 14.4 GB | 258 | 0
AIRIC The Mistral | 59.95 | 32K / 14.4 GB | 194 | 13
Synatra RP Orca 2 7B V0.1 | 59.55 | 4K / 13.5 GB | 3057 | 6
Deepseek Llm 7B Chat | 59.27 | 4K / 13.9 GB | 7137 | 58
UltraQwen 7B | 59.17 | 32K / 15.4 GB | 177 | 12
...rnlm2 20B Llama 4.0bpw H6 EXL2 | 58.5 | 32K / 11 GB | 5 | 1
Mistral 7B Guanaco1k Ep2 | 58.13 | 32K / 29 GB | 364 | 23
Note: a green HF Rank score (e.g. "73.2") indicates a model that performs better than Technoculture/Medorca-7B-Slerp.

Medorca 7B Slerp Parameters and Internals

LLM Name: Medorca 7B Slerp
Repository: Technoculture/Medorca-7B-Slerp (open on 🤗 Hugging Face)
Merged Model: Yes
Model Size: 7b
Required VRAM: 13.5 GB
Updated: 2024-02-21
Maintainer: Technoculture
Model Type: llama
Model Files: 10.0 GB (1-of-2), 3.5 GB (2-of-2)
Model Architecture: LlamaForCausalLM
License: apache-2.0
Context Length: 2048
Model Max Length: 2048
Transformers Version: 4.35.2
Tokenizer Class: LlamaTokenizer
Padding Token: <PAD>
Vocabulary Size: 32017
Initializer Range: 0.02
Torch Data Type: float16
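Given the internals listed above (LlamaForCausalLM architecture, LlamaTokenizer, float16 weights in two sharded safetensors files, 2048-token context), a minimal sketch of loading the model with the Hugging Face transformers library might look like the following. The prompt and generation settings are illustrative assumptions, not recommendations from the maintainer:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "Technoculture/Medorca-7B-Slerp"  # repository listed above

# LlamaTokenizer / LlamaForCausalLM per the internals table; float16 matches the stored dtype.
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(
    repo_id,
    torch_dtype=torch.float16,
    device_map="auto",  # requires `accelerate`; ~13.5 GB of VRAM per the table above
)

# Example prompt chosen only for illustration of a medically flavored query.
prompt = "What are the first-line treatments for hypertension?"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

# Context length is 2048 tokens, so keep prompt plus generated tokens within that budget.
output = model.generate(**inputs, max_new_tokens=256)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```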
Original data from HuggingFace, OpenCompass and various public git repos.
Release v2024022003