Lumimaid V0.2 123B by NeverSleep


Tags: Autotrain compatible, Conversational, Endpoints compatible, Instruct, Mistral, Not-for-all-audiences, NSFW, PyTorch, Region: us, Sharded

Lumimaid V0.2 123B Benchmarks

nn.n%: how the model compares to the reference models Anthropic Claude 3.5 Sonnet ("so35"), GPT-4o ("gpt4o"), and GPT-4 ("gpt4").
Lumimaid V0.2 123B (NeverSleep/Lumimaid-v0.2-123B)

Lumimaid V0.2 123B Parameters and Internals

Additional Notes 
This model is flagged as not suitable for all audiences and contains NSFW (Not Safe For Work) content. It is intended only for audiences and use cases where such material is appropriate.
Training Details 
Data Sources:
Epiculous/Gnosis, ChaoticNeutrals/Luminous_Opus, ChaoticNeutrals/Synthetic-Dark-RP, ChaoticNeutrals/Synthetic-RP, Gryphe/Sonnet3.5-SlimOrcaDedupCleaned, Gryphe/Opus-WritingPrompts, meseca/writing-opus-6k, meseca/opus-instruct-9k, PJMixers/grimulkan_theory-of-mind-ShareGPT, NobodyExistsOnTheInternet/ToxicQAFinal, Undi95/toxic-dpo-v0.1-sharegpt, cgato/SlimOrcaDedupCleaned, kalomaze/Opus_Instruct_25k, Doctor-Shotgun/no-robots-sharegpt, Norquinal/claude_multiround_chat_30k, nothingiisreal/Claude-3-Opus-Instruct-15K
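
To peek at any of these training sources locally, the datasets library can stream them directly from the Hub. A minimal sketch, assuming the chosen repo's files are readable by the default dataset builder and that a "train" split exists (neither is documented on this page):

```python
# Minimal sketch: inspecting one of the listed training datasets.
# Assumptions: the repo is loadable by the datasets auto-builder and
# exposes a "train" split; field names vary per dataset.
from datasets import load_dataset

ds = load_dataset("Gryphe/Sonnet3.5-SlimOrcaDedupCleaned", split="train", streaming=True)
for i, row in enumerate(ds):
    print(row)  # print the first few records to see the schema
    if i >= 2:
        break
```
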
LLM Name: Lumimaid V0.2 123B
Repository: 🤗 https://huggingface.co/NeverSleep/Lumimaid-v0.2-123B
Model Size: 123B
Required VRAM: 217.2 GB
Updated: 2025-01-17
Maintainer: NeverSleep
Model Type: mistral
Instruction-Based: Yes
Model Files: 4.9 GB: 1-of-51, 4.8 GB: 2-of-51, 4.9 GB: 3-of-51, 4.8 GB: 4-of-51, 4.8 GB: 5-of-51, 4.8 GB: 6-of-51, 4.9 GB: 7-of-51, 4.8 GB: 8-of-51, 4.8 GB: 9-of-51, 4.8 GB: 10-of-51, 4.9 GB: 11-of-51, 4.8 GB: 12-of-51, 4.8 GB: 13-of-51, 4.8 GB: 14-of-51, 4.9 GB: 15-of-51, 4.8 GB: 16-of-51, 4.8 GB: 17-of-51, 4.8 GB: 18-of-51, 4.9 GB: 19-of-51, 4.8 GB: 20-of-51, 4.8 GB: 21-of-51, 4.8 GB: 22-of-51, 4.9 GB: 23-of-51, 4.8 GB: 24-of-51, 4.8 GB: 25-of-51, 4.8 GB: 26-of-51, 4.9 GB: 27-of-51, 4.8 GB: 28-of-51, 4.8 GB: 29-of-51, 4.8 GB: 30-of-51, 4.9 GB: 31-of-51, 4.8 GB: 32-of-51, 4.8 GB: 33-of-51, 4.8 GB: 34-of-51, 4.9 GB: 35-of-51, 4.8 GB: 36-of-51, 4.8 GB: 37-of-51, 4.8 GB: 38-of-51, 4.9 GB: 39-of-51, 4.8 GB: 40-of-51, 4.8 GB: 41-of-51, 4.8 GB: 42-of-51, 4.9 GB: 43-of-51, 4.8 GB: 44-of-51, 4.8 GB: 45-of-51
Model Architecture: MistralForCausalLM
License: cc-by-nc-4.0
Context Length: 131072
Model Max Length: 131072
Transformers Version: 4.44.0.dev0
Vocabulary Size: 32769
Torch Data Type: bfloat16
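
For reference, a minimal loading sketch based on the details above (MistralForCausalLM, bfloat16, sharded checkpoint, 131072-token context). It assumes a recent transformers release (the card lists 4.44.0.dev0) plus enough combined GPU/CPU memory for roughly 217 GB of weights; the prompt and sampling settings below are illustrative, not taken from the model card:

```python
# Minimal sketch: loading the sharded bf16 checkpoint with transformers.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "NeverSleep/Lumimaid-v0.2-123B"

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(
    repo_id,
    torch_dtype=torch.bfloat16,  # matches the card's Torch Data Type
    device_map="auto",           # spread shards across GPUs / offload to CPU
)

# The repo is tagged "Instruct"; if it ships a chat template, the tokenizer
# can build the Mistral-style prompt from a message list.
messages = [{"role": "user", "content": "Write a short scene set on a night train."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output = model.generate(inputs, max_new_tokens=256, do_sample=True, temperature=0.8)
print(tokenizer.decode(output[0][inputs.shape[-1]:], skip_special_tokens=True))
```

At full bfloat16 precision the checkpoint will not fit on a single consumer GPU, so in practice it is either quantized (e.g. to 4-bit) or split across several large accelerators.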

Best Alternatives to Lumimaid V0.2 123B

Best Alternatives | Context / RAM | Downloads / Likes
Gigaberg Mistral Large 123B | 128K / 222 GB | 351
Cakrawala 123B | 128K / 222 GB | 883
Magnum V4 123B | 128K / 222 GB | 35523
Magnum V2 123B | 128K / 207.6 GB | 32256
Behemoth 123B V1 | 128K / 221.6 GB | 4531
Behemoth 123B V2 | 128K / 221.6 GB | 126
Lumikabra 123B V0.4 | 128K / 216.7 GB | 10411
Tess 3 Mistral Large 2 123B | 128K / 217.2 GB | 5719
Magstral 123B | 128K / 221.6 GB | 70
ML MS Etheris 123B | 32K / 226.4 GB | 153
Note: a green score (e.g. "73.2") indicates that the model outperforms NeverSleep/Lumimaid-v0.2-123B.
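
The download and like figures above are a snapshot; current numbers can be pulled from the Hugging Face Hub API. A minimal sketch, assuming the huggingface_hub package is installed; only the Lumimaid repo ID appears on this page, so any alternative repo IDs you add are your own lookup:

```python
# Minimal sketch: fetching live download/like counts from the Hugging Face Hub.
from huggingface_hub import HfApi

api = HfApi()
for repo_id in ["NeverSleep/Lumimaid-v0.2-123B"]:  # extend with the alternatives' repo IDs
    info = api.model_info(repo_id)
    print(f"{repo_id}: downloads={info.downloads}, likes={info.likes}")
```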

Rank the Lumimaid V0.2 123B Capabilities

🆘 Have you tried this model? Rate its performance. This feedback would greatly assist the ML community in identifying the most suitable model for their needs. Your contribution really does make a difference! 🌟

Instruction Following and Task Automation  
Factuality and Completeness of Knowledge  
Censorship and Alignment  
Data Analysis and Insight Generation  
Text Generation  
Text Summarization and Feature Extraction  
Code Generation  
Multi-Language Support and Translation  


Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241227