Lumosia MoE 4x10.7 GGUF by TheBloke


Tags: base_model:quantized:steelstor..., base_model:steelstorage/lumosi..., conversational, dopeornope/solarc-m-10.7b, gguf, jeonsworld/carbonvillain-en-10..., kyujinpy/sakura-solar-instruct, lazymergekit, maywell/pivot-10.7b-mistral-v0..., merge, mergekit, mixtral, moe, quantized, region:us

Lumosia MoE 4x10.7 GGUF Benchmarks

nn.n% — how the model scores relative to the reference models: Anthropic Claude 3.5 Sonnet ("so35"), GPT-4o ("gpt4o"), or GPT-4 ("gpt4").
Lumosia MoE 4x10.7 GGUF (TheBloke/Lumosia-MoE-4x10.7-GGUF)

Lumosia MoE 4x10.7 GGUF Parameters and Internals

Model Type: mixture of experts, text generation

Use Cases
Areas: research

Additional Notes
This is a highly experimental model built with the Mixture of Experts (MoE) technique.

Training Details
Data Sources: DopeorNope/SOLARC-M-10.7B, maywell/PiVoT-10.7B-Mistral-v0.2-RP, kyujinpy/Sakura-SOLAR-Instruct, jeonsworld/CarbonVillain-en-10.7B-v1
Methodology: Mixture of Experts ensemble built from multiple SOLAR models with mergekit (a hypothetical recipe is sketched below)
Context Length: 16000
Model Architecture: Mixture of Experts (MoE)
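The actual merge recipe is not published on this page, but the tags (merge, mergekit, lazymergekit, moe) and the expert list above point to a standard mergekit-moe build. Below is a minimal, hypothetical sketch of such a configuration, written in Python for illustration; the base model, gate mode, dtype, and positive prompts are assumptions, not the creator's actual settings.

```python
# Hypothetical reconstruction of a mergekit-moe recipe for a 4x10.7B SOLAR MoE.
# The expert list comes from the "Data Sources" field above; every other value
# (base model, gate mode, prompts, dtype) is an illustrative assumption.
import subprocess
import yaml

moe_config = {
    "base_model": "kyujinpy/Sakura-SOLAR-Instruct",   # assumed base; not confirmed
    "gate_mode": "hidden",                            # hidden-state routing, a common mergekit-moe choice
    "dtype": "bfloat16",
    "experts": [
        {"source_model": "DopeorNope/SOLARC-M-10.7B",
         "positive_prompts": ["reasoning", "general assistance"]},
        {"source_model": "maywell/PiVoT-10.7B-Mistral-v0.2-RP",
         "positive_prompts": ["roleplay", "creative writing"]},
        {"source_model": "kyujinpy/Sakura-SOLAR-Instruct",
         "positive_prompts": ["instruction following"]},
        {"source_model": "jeonsworld/CarbonVillain-en-10.7B-v1",
         "positive_prompts": ["conversation", "general knowledge"]},
    ],
}

with open("lumosia-moe.yml", "w") as f:
    yaml.safe_dump(moe_config, f, sort_keys=False)

# mergekit-moe <config> <output-dir> assembles the Mixtral-style MoE checkpoint,
# which can then be converted to GGUF with llama.cpp's conversion script.
subprocess.run(["mergekit-moe", "lumosia-moe.yml", "./Lumosia-MoE-4x10.7"], check=True)
```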
Input Output
Input Format: ### System: ### USER:{prompt} ### Assistant: (usage sketched below)
Accepted Modalities: text
Output Format: text generation
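Because the quantized files are standard GGUF, they can be run with any llama.cpp-compatible runtime. A minimal sketch using llama-cpp-python follows, assuming one of the quant files has already been downloaded locally; the file name, line breaks in the prompt template, and sampling settings are illustrative assumptions.

```python
# Minimal llama-cpp-python sketch for running a GGUF quant of this model.
# The local filename and generation settings are assumptions for illustration.
from llama_cpp import Llama

llm = Llama(
    model_path="./lumosia-moe-4x10.7.Q4_K_M.gguf",  # assumed local path/filename
    n_ctx=16000,        # matches the 16000-token context length listed above
    n_gpu_layers=-1,    # offload all layers if a GPU build of llama.cpp is available
)

# Build the prompt from the "Input Format" template above; the exact line
# breaks are not specified on this page, so this layout is an assumption.
system = "You are a helpful assistant."
user = "Explain what a Mixture of Experts model is in two sentences."
prompt = f"### System:\n{system}\n\n### USER:{user}\n\n### Assistant:"

out = llm(prompt, max_tokens=256, temperature=0.7, stop=["### USER:"])
print(out["choices"][0]["text"].strip())
```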
LLM Name: Lumosia MoE 4x10.7 GGUF
Repository 🤗: https://huggingface.co/TheBloke/Lumosia-MoE-4x10.7-GGUF
Model Name: Lumosia MoE 4X10.7
Model Creator: Steel
Base Model(s): Steelskull/Lumosia-MoE-4x10.7
Model Size: 10.7b
Required VRAM: 12 GB
Updated: 2025-02-05
Maintainer: TheBloke
Model Type: mixtral
Model Files: 12.0 GB, 15.8 GB, 15.7 GB, 15.6 GB, 20.3 GB, 20.4 GB, 20.3 GB, 24.8 GB, 24.9 GB, 24.8 GB, 29.6 GB, 38.4 GB (per-file download sketched below)
GGUF Quantization: Yes
Quantization Type: gguf
Model Architecture: AutoModel
License: apache-2.0
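Individual quant files can be pulled from the repository above with the huggingface_hub client instead of cloning every variant. A short sketch follows; the exact filename is an assumption based on TheBloke's usual naming scheme, so check the repository's file list for the real names and sizes.

```python
# Download a single GGUF quant from the repository listed above.
# The filename below follows TheBloke's usual naming convention and is an
# assumption; consult the repo's file listing for the actual names.
from huggingface_hub import hf_hub_download

path = hf_hub_download(
    repo_id="TheBloke/Lumosia-MoE-4x10.7-GGUF",
    filename="lumosia-moe-4x10.7.Q4_K_M.gguf",  # assumed; pick the quant that fits your hardware
)
print("GGUF file saved to:", path)
```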

Best Alternatives to Lumosia MoE 4x10.7 GGUF

Best Alternatives | Context / RAM | Downloads / Likes
Nous Hermes 2 SOLAR 10.7B GGUF | 0K / 4.5 GB | 1593111
Tess 10.7B V1.5B GGUF | 0K / 4 GB | 4307
SOLAR 10.7B Instruct V1.0 GGUF | 0K / 4.5 GB | 65382
Sensualize Solar 10.7B GGUF | 0K / 4.5 GB | 30810
OPEN SOLAR KO 10.7B GGUF | 0K / 4.1 GB | 780
SOLAR 10.7B V1.0 GGUF | 0K / 4.5 GB | 30812
CarbonVillain En 10.7B V4 GGUF | 0K / 4.5 GB | 726
Solar 10.7B SLERP GGUF | 0K / 4.5 GB | 14914
Frostwind 10.7B V1 GGUF | 0K / 4.5 GB | 1264
PiVoT 10.7B Mistral V0.2 GGUF | 0K / 4.5 GB | 1465
Note: a green score (e.g. "73.2") means the alternative performs better than TheBloke/Lumosia-MoE-4x10.7-GGUF.

Rank the Lumosia MoE 4x10.7 GGUF Capabilities

🆘 Have you tried this model? Rate its performance. This feedback helps the ML community identify the most suitable model for their needs. Your contribution really does make a difference! 🌟

Instruction Following and Task Automation  
Factuality and Completeness of Knowledge  
Censorship and Alignment  
Data Analysis and Insight Generation  
Text Generation  
Text Summarization and Feature Extraction  
Code Generation  
Multi-Language Support and Translation  

Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241227