Lumosia MoE 4x10.7 GGUF by TheBloke


Tags: base_model:quantized:steelstor..., base_model:steelstorage/lumosi..., conversational, DopeorNope/SOLARC-M-10.7B, GGUF, jeonsworld/CarbonVillain-en-10..., kyujinpy/Sakura-SOLAR-Instruct, LazyMergekit, maywell/PiVoT-10.7B-Mistral-v0..., merge, mergekit, mixtral, MoE, quantized, region:us

Lumosia MoE 4x10.7 GGUF Benchmarks

nn.n% — how the model compares to the reference models: Anthropic Claude 3.5 Sonnet ("so35"), GPT-4o ("gpt4o"), or GPT-4 ("gpt4").
Lumosia MoE 4x10.7 GGUF (TheBloke/Lumosia-MoE-4x10.7-GGUF)

Lumosia MoE 4x10.7 GGUF Parameters and Internals

Model Type 
mixture of experts, text generation
Use Cases 
Areas:
research
Additional Notes 
This is a very experimental model built with the Mixture of Experts (MoE) technique.
Training Details 
Data Sources:
DopeorNope/SOLARC-M-10.7B, maywell/PiVoT-10.7B-Mistral-v0.2-RP, kyujinpy/Sakura-SOLAR-Instruct, jeonsworld/CarbonVillain-en-10.7B-v1
Methodology:
A Mixture of Experts ensemble built from multiple SOLAR models (a minimal routing sketch follows these details)
Context Length:
16000
Model Architecture:
Mixture of Experts (MoE)
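
The methodology above routes each token between four SOLAR-based experts. Below is a minimal NumPy sketch of Mixtral-style top-2 gating; the toy dimensions, random weights, and linear "experts" are illustrative assumptions, not the model's actual layers or parameters.

```python
# Minimal sketch of Mixtral-style top-2 expert routing (illustrative only;
# dimensions and expert functions are assumptions, not the model's real code).
import numpy as np

rng = np.random.default_rng(0)

n_experts, d_model = 4, 8          # Lumosia routes over 4 SOLAR-based experts
x = rng.normal(size=d_model)       # one token's hidden state (toy size)

# Each "expert" is stood in for by a random linear map.
experts = [rng.normal(size=(d_model, d_model)) for _ in range(n_experts)]
router = rng.normal(size=(n_experts, d_model))   # gating / router weights

logits = router @ x                              # one score per expert
top2 = np.argsort(logits)[-2:]                   # keep the 2 best-scoring experts
weights = np.exp(logits[top2]) / np.exp(logits[top2]).sum()  # renormalized softmax

# The layer output is the weighted sum of only the selected experts' outputs.
y = sum(w * (experts[i] @ x) for w, i in zip(weights, top2))
print(y.shape)   # (8,) -- same dimensionality as the input hidden state
```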
Input Output 
Input Format (a formatting sketch in Python follows this section):
### System:
### USER:{prompt}
### Assistant:
Accepted Modalities:
text
Output Format:
text generation
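
A small Python helper for filling the Input Format template above; the exact whitespace between sections is an assumption reconstructed from the card as shown here, so verify against the upstream model card before relying on it.

```python
# Sketch: build a prompt in the card's "### System / ### USER / ### Assistant"
# layout. The newline placement is an assumption, not verified against the
# original model card.
def build_prompt(user_message: str, system_message: str = "") -> str:
    return (
        f"### System:\n{system_message}\n"
        f"### USER:{user_message}\n"
        "### Assistant:"
    )

print(build_prompt("Summarize the idea behind Mixture of Experts in one sentence."))
```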
LLM Name: Lumosia MoE 4x10.7 GGUF
Repository: 🤗 https://huggingface.co/TheBloke/Lumosia-MoE-4x10.7-GGUF
Model Name: Lumosia MoE 4x10.7
Model Creator: Steel
Base Model(s): Steelskull/Lumosia-MoE-4x10.7
Model Size: 10.7b
Required VRAM: 12 GB
Updated: 2024-12-22
Maintainer: TheBloke
Model Type: mixtral
Model Files: 12.0 GB, 15.8 GB, 15.7 GB, 15.6 GB, 20.3 GB, 20.4 GB, 20.3 GB, 24.8 GB, 24.9 GB, 24.8 GB, 29.6 GB, 38.4 GB
GGUF Quantization: Yes
Quantization Type: gguf
Model Architecture: AutoModel
License: apache-2.0
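
Given the repository and GGUF quantization listed above, one common way to run a quant locally is huggingface_hub plus llama-cpp-python, sketched below. The filename follows TheBloke's usual naming convention but is an assumption; check the repository's file list for the real names, and pick a quant whose size fits your hardware (the files above range from 12.0 GB to 38.4 GB).

```python
# Sketch: download one quantized file from the repo and run it with
# llama-cpp-python. The filename is an assumption based on TheBloke's usual
# naming convention -- check the repository's file listing for the real names.
from huggingface_hub import hf_hub_download
from llama_cpp import Llama

model_path = hf_hub_download(
    repo_id="TheBloke/Lumosia-MoE-4x10.7-GGUF",
    filename="lumosia-moe-4x10.7.Q4_K_M.gguf",   # assumed name; verify in the repo
)

llm = Llama(
    model_path=model_path,
    n_ctx=16000,        # context length reported on this card
    n_gpu_layers=-1,    # offload all layers if they fit; lower this on small GPUs
)

prompt = "### System:\n\n### USER:What is a Mixture of Experts model?\n### Assistant:"
out = llm(prompt, max_tokens=256, stop=["### USER:"])
print(out["choices"][0]["text"])
```

If the model does not fit in GPU memory, reduce n_gpu_layers or choose a smaller quantization file from the list above.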

Best Alternatives to Lumosia MoE 4x10.7 GGUF

Best Alternatives | Context / RAM | Downloads / Likes
Nous Hermes 2 SOLAR 10.7B GGUF | 0K / 4.5 GB | 2471110
SOLAR 10.7B Instruct V1.0 GGUF | 0K / 4.5 GB | 56482
Tess 10.7B V1.5B GGUF | 0K / 4 GB | 1426
OPEN SOLAR KO 10.7B GGUF | 0K / 4.1 GB | 920
Sensualize Solar 10.7B GGUF | 0K / 4.5 GB | 24410
SOLAR 10.7B V1.0 GGUF | 0K / 4.5 GB | 37712
PiVoT 10.7B Mistral V0.2 GGUF | 0K / 4.5 GB | 2775
...VoT 10.7B Mistral V0.2 RP GGUF | 0K / 4.5 GB | 2169
Solar 10.7B SLERP GGUF | 0K / 4.5 GB | 18914
CarbonVillain En 10.7B V4 GGUF | 0K / 4.5 GB | 466
Note: a green score (e.g. "73.2") means that the model performs better than TheBloke/Lumosia-MoE-4x10.7-GGUF.

Rank the Lumosia MoE 4x10.7 GGUF Capabilities

🆘 Have you tried this model? Rate its performance. This feedback would greatly assist the ML community in identifying the most suitable model for their needs. Your contribution really does make a difference! 🌟

Instruction Following and Task Automation  
Factuality and Completeness of Knowledge  
Censorship and Alignment  
Data Analysis and Insight Generation  
Text Generation  
Text Summarization and Feature Extraction  
Code Generation  
Multi-Language Support and Translation  


Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241217