Chikuma 10.7B by sethuiyer


Tags: autotrain-compatible · conversational · en · endpoints-compatible · merge · mistral · model-index · region:us · safetensors · sharded · tensorflow
Model Card on HF 🤗: https://huggingface.co/sethuiyer/Chikuma_10.7B

Chikuma 10.7B Benchmarks

[Benchmark chart: Chikuma 10.7B (sethuiyer/Chikuma_10.7B)]

Chikuma 10.7B Parameters and Internals

Model Type: text-generation
Additional Notes: Chikuma is a merged model created with LazyMergekit from sethuiyer/SynthIQ-7b and openchat/openchat-3.5-0106. It is named after the Chikuma River as a metaphor for its character. Recommended prompt templates and usage settings are given on the model card; a basic inference sketch follows the table below.
LLM Name: Chikuma 10.7B
Repository 🤗: https://huggingface.co/sethuiyer/Chikuma_10.7B
Base Model(s): sethuiyer/SynthIQ-7b, openchat/openchat-3.5-0106
Model Size: 10.7B
Required VRAM: 21.4 GB
Updated: 2024-12-22
Maintainer: sethuiyer
Model Type: mistral
Model Files: 9.9 GB (1 of 3), 9.9 GB (2 of 3), 1.6 GB (3 of 3)
Supported Languages: en
Model Architecture: MistralForCausalLM
License: apache-2.0
Context Length: 32768
Model Max Length: 32768
Transformers Version: 4.36.2
Tokenizer Class: LlamaTokenizer
Padding Token: <|end_of_turn|>
Vocabulary Size: 32002
Torch Data Type: bfloat16
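
The table above contains everything needed for a basic transformers setup. Below is a minimal inference sketch, assuming transformers ≥ 4.36 and enough GPU memory for the ~21.4 GB bfloat16 checkpoint; the model id comes from the card, while the prompt and sampling settings are illustrative placeholders, not the card's recommended configuration.

```python
# Minimal inference sketch for sethuiyer/Chikuma_10.7B.
# Assumptions: transformers >= 4.36, a GPU with room for ~21.4 GB of bf16
# weights; sampling settings below are illustrative, not the card's defaults.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "sethuiyer/Chikuma_10.7B"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # matches the checkpoint's torch dtype
    device_map="auto",           # spread layers across available devices
)

# Prefer apply_chat_template over hand-rolling the prompt: it uses whatever
# template ships with the tokenizer (the card recommends a specific one).
messages = [{"role": "user", "content": "Explain the Chikuma River in one sentence."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(
    inputs,
    max_new_tokens=256,
    do_sample=True,
    temperature=0.7,
    pad_token_id=tokenizer.pad_token_id,  # <|end_of_turn|>, per the table above
)
# Decode only the newly generated tokens, not the echoed prompt.
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```

Deferring to the tokenizer's bundled chat template is safer than hard-coding the OpenChat-style turn markers that the <|end_of_turn|> padding token hints at.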

Best Alternatives to Chikuma 10.7B

Best Alternatives                  Context / RAM   Downloads  Likes
PiVoT 10.7B Mistral V0.2           32K / 21.4 GB       5154      5
Barely Regal 10.7B                 32K / 21.5 GB         19      0
Quintellect 10.7B                  32K / 21.4 GB         90      3
Multi Verse Model 10.7B            32K / 21.5 GB         17      1
PrimaSumika 10.7B 128K             32K / 21.5 GB         26      1
Longcat 10.7B                      32K / 21.5 GB         23      3
PiVoT 10.7B Mistral V0.2 RP        32K / 21.4 GB        720      7
... Loyal Mistral Maid 32K V0.2 A  32K / 21.5 GB         29      1
Chikuma 10.7B V2                   32K / 21.4 GB        222      1
Mark1 Revision 10.7B               32K / 42.9 GB         18      1
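
The Downloads and Likes figures above are point-in-time snapshots. A short sketch for pulling current numbers from the Hugging Face Hub follows, assuming the huggingface_hub package is installed; the second repo id is one of Chikuma's base models and is just an example — look up the alternatives' exact repo ids on the Hub.

```python
# Sketch: refresh the Downloads / Likes snapshot via the Hugging Face Hub API.
# Assumes `pip install huggingface_hub`; repo ids beyond Chikuma's own are
# illustrative and should be verified on the Hub.
from huggingface_hub import HfApi

api = HfApi()
for repo_id in ["sethuiyer/Chikuma_10.7B", "sethuiyer/SynthIQ-7b"]:
    info = api.model_info(repo_id)  # returns a ModelInfo object
    print(f"{repo_id}: {info.downloads} downloads, {info.likes} likes")
```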



Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241217