MonarchCoder 7B by abideen


Tags: Merged Model · Autotrain compatible · Base model: mlabonne/AlphaMonarch-7B · Base model: Syed-Hasan-8503/Tess-Coder-7B-Mistral-v1.0 · Codegen · Conversational · En · Endpoints compatible · Mistral · Model-index · Region: us · Safetensors · Sharded · Tensorflow

MonarchCoder 7B Benchmarks

MonarchCoder 7B Parameters and Internals

Model Type: text-generation
Use Cases
  Areas: research, commercial applications
  Applications: reasoning, conversation, coding
Additional Notes: MonarchCoder-7B is optimized for reasoning, conversation, and coding tasks, building on AlphaMonarch-7B's strengths.
Supported Languages: en (native)
Training Details
  Methodology: SLERP merge of the two base models using LazyMergekit
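The exact merge recipe is not reproduced on this page, but a LazyMergekit-style SLERP merge of the two base models can be sketched as follows. The layer ranges and interpolation weights below are illustrative placeholders, not the published MonarchCoder-7B configuration; the snippet assumes mergekit is installed and uses its mergekit-yaml command to write the merged checkpoint.

# Sketch: LazyMergekit-style SLERP merge of the two base models.
# The t-values and layer ranges are placeholders, not the actual recipe.
import subprocess
import textwrap

config = textwrap.dedent("""\
    slices:
      - sources:
          - model: mlabonne/AlphaMonarch-7B
            layer_range: [0, 32]
          - model: Syed-Hasan-8503/Tess-Coder-7B-Mistral-v1.0
            layer_range: [0, 32]
    merge_method: slerp
    base_model: mlabonne/AlphaMonarch-7B
    parameters:
      t:
        - filter: self_attn
          value: [0, 0.5, 0.3, 0.7, 1]
        - filter: mlp
          value: [1, 0.5, 0.7, 0.3, 0]
        - value: 0.5
    dtype: bfloat16
    """)

with open("slerp-config.yaml", "w") as f:
    f.write(config)

# mergekit reads the YAML and writes the merged model to ./merged
subprocess.run(["mergekit-yaml", "slerp-config.yaml", "merged"], check=True)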
Input Output
  Input Format: a prompt formatted with the model's chat template
  Accepted Modalities: text
  Output Format: generated text
  Performance Tips: recommended sampling settings are temperature 0.7, top_k 50, top_p 0.95
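A minimal generation sketch with these settings, assuming the Hugging Face transformers library, a tokenizer that ships a chat template, and a GPU with enough memory for the bfloat16 weights (about 14.4 GB, per the details below). The prompt text is only an example.

# Sketch: chat-template prompting with the recommended sampling settings
# (temperature 0.7, top_k 50, top_p 0.95).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "abideen/MonarchCoder-7B"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.bfloat16, device_map="auto"
)

messages = [
    {"role": "user", "content": "Write a Python function that checks whether a string is a palindrome."}
]
# Apply the model's chat template and move the token ids to the model's device.
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(
    inputs,
    max_new_tokens=256,
    do_sample=True,
    temperature=0.7,
    top_k=50,
    top_p=0.95,
)
# Decode only the newly generated tokens.
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))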
LLM Name: MonarchCoder 7B
Repository: https://huggingface.co/abideen/MonarchCoder-7B
Base Model(s): mlabonne/AlphaMonarch-7B, Syed-Hasan-8503/Tess-Coder-7B-Mistral-v1.0
Merged Model: Yes
Model Size: 7B
Required VRAM: 14.4 GB
Updated: 2024-11-22
Maintainer: abideen
Model Type: mistral
Model Files: 2.0 GB (1-of-8), 1.9 GB (2-of-8), 2.0 GB (3-of-8), 2.0 GB (4-of-8), 1.9 GB (5-of-8), 1.9 GB (6-of-8), 1.9 GB (7-of-8), 0.8 GB (8-of-8)
Supported Languages: en
Generates Code: Yes
Model Architecture: MistralForCausalLM
License: apache-2.0
Context Length: 32768
Model Max Length: 32768
Transformers Version: 4.37.2
Tokenizer Class: LlamaTokenizer
Padding Token: </s>
Vocabulary Size: 32000
Torch Data Type: bfloat16
MonarchCoder 7B (abideen/MonarchCoder-7B)
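The figures above (context length, vocabulary size, tokenizer class, padding token, data type) can be checked without downloading the full weights; a small sketch using transformers' AutoConfig and AutoTokenizer, with the values listed on this page shown as comments:

# Sketch: verifying the table values without downloading the 14.4 GB of weights.
from transformers import AutoConfig, AutoTokenizer

model_id = "abideen/MonarchCoder-7B"
config = AutoConfig.from_pretrained(model_id)
tokenizer = AutoTokenizer.from_pretrained(model_id)

print(config.model_type)               # "mistral"
print(config.max_position_embeddings)  # 32768
print(config.vocab_size)               # 32000
print(config.torch_dtype)              # torch.bfloat16
print(type(tokenizer).__name__)        # LlamaTokenizer (or its fast variant)
print(tokenizer.pad_token)             # "</s>"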

Best Alternatives to MonarchCoder 7B

Best Alternatives | Context / RAM | Downloads | Likes
SpydazWeb AI HumanAI RP | 512K / 14.4 GB | 96 | 1
SpydazWeb AI HumanAI 002 | 512K / 14.4 GB | 17 | 1
Threebird 7B | 32K / 14.5 GB | 23 | 0
Jett W26 Abliterated | 32K / 14.4 GB | 10 | 1
Jett W26 | 32K / 14.5 GB | 10 | 1
Mistral 7B Merged | 32K / 14.5 GB | 13 | 0
Ultra Llm Merged | 32K / 14.5 GB | 17 | 0
RP Coder SM3 | 32K / 14.5 GB | 92 | 0
The Trinity Coder 7B | 32K / 14.4 GB | 79 | 4
...Mistral 7B Instruct V0.2 Slerp | 32K / 14.5 GB | 9 | 1

Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241110