NeuralStar AlphaWriter 4x7b by OmnicromsBrain



NeuralStar AlphaWriter 4x7b Parameters and Internals

LLM Name: NeuralStar AlphaWriter 4x7b
Repository: Open on 🤗 Hugging Face
Base Model(s): mlabonne/AlphaMonarch-7B, FPHam/Karen_TheEditor_V2_STRICT_Mistral_7B, SanjiWatsuki/Kunoichi-DPO-v2-7B, OmnicromsBrain/NeuralStar-7b-Lazy
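A merge of four 7B base models into a single 4-expert Mixtral-style MoE ("frankenmoe") is typically defined with a mergekit-moe YAML file. The sketch below is an illustrative reconstruction, not the author's actual configuration: the choice of base model, the gate mode, and the positive_prompts strings are all assumptions.

```yaml
# Hypothetical mergekit-moe config for a 4x7b frankenmoe like this one.
# The base_model choice and all prompt strings are illustrative guesses.
base_model: mlabonne/AlphaMonarch-7B
gate_mode: hidden          # route tokens using hidden-state similarity
dtype: float16
experts:
  - source_model: mlabonne/AlphaMonarch-7B
    positive_prompts: ["chat", "reasoning", "general knowledge"]
  - source_model: FPHam/Karen_TheEditor_V2_STRICT_Mistral_7B
    positive_prompts: ["edit", "grammar", "proofread"]
  - source_model: SanjiWatsuki/Kunoichi-DPO-v2-7B
    positive_prompts: ["roleplay", "dialogue"]
  - source_model: OmnicromsBrain/NeuralStar-7b-Lazy
    positive_prompts: ["story", "creative writing"]
```

With a config like this, `mergekit-moe config.yaml ./output` assembles the experts; the shared attention weights come from the base model, while each expert contributes its MLP weights.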
Model Size: 24.2B parameters
Required VRAM: 48.3 GB
Updated: 2024-07-27
Maintainer: OmnicromsBrain
Model Type: mixtral
Model Files: 9.9 GB (1-of-5), 10.0 GB (2-of-5), 10.0 GB (3-of-5), 10.0 GB (4-of-5), 8.4 GB (5-of-5)
Model Architecture: MixtralForCausalLM
License: apache-2.0
Context Length: 32768
Model Max Length: 32768
Transformers Version: 4.39.3
Tokenizer Class: LlamaTokenizer
Padding Token: <s>
Vocabulary Size: 32000
Torch Data Type: float16
Hugging Face Repository: OmnicromsBrain/NeuralStar_AlphaWriter_4x7b
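The listed size and VRAM figures are mutually consistent and can be sanity-checked by hand. In a Mixtral-style MoE, only the MLP block is duplicated per expert; attention, embeddings, and norms are shared, which is why 4×7B lands at ~24.2B rather than 28B. The sketch below assumes standard Mistral-7B dimensions (hidden 4096, 32 layers, intermediate 14336, vocab 32000, 8 KV heads of dim 128) rather than reading them from the repo.

```python
# Estimate parameter count for a 4-expert Mixtral-style MoE built from
# Mistral-7B-class experts. Dimensions are the standard Mistral-7B config,
# assumed here rather than taken from the model's actual config.json.
hidden, layers, inter, vocab, experts = 4096, 32, 14336, 32000, 4
kv_dim = 8 * 128  # 8 KV heads x head_dim 128 (grouped-query attention)

attn = hidden * hidden * 2 + hidden * kv_dim * 2  # q/o plus k/v projections
mlp = 3 * hidden * inter                          # gate/up/down projections
gate = hidden * experts                           # MoE router per layer
per_layer = attn + experts * mlp + gate
total = layers * per_layer + 2 * vocab * hidden   # plus embed + lm_head

print(f"params:    {total / 1e9:.1f}B")       # ~24.2B, matching Model Size
print(f"fp16 VRAM: {total * 2 / 1e9:.1f} GB")  # ~48.3 GB, matching Required VRAM
```

The float16 VRAM figure is simply two bytes per parameter, which also matches the summed shard sizes (9.9 + 10.0 + 10.0 + 10.0 + 8.4 = 48.3 GB).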

Best Alternatives to NeuralStar AlphaWriter 4x7b

Best Alternatives                  | HF Rank | Context / RAM | Downloads | Likes
Beyonder 4x7B V3                   | 0.3     | 32K / 48.3 GB | 3859      | 56
NeuralStar FusionWriter 4x7b       | 0.3     | 32K / 48.3 GB | 7         | 0
Dzakwan MoE 4x7b Beta              | 0.3     | 32K / 48.4 GB | 2059      | 0
Mera Mix 4x7B                      | 0.3     | 32K / 48.3 GB | 4434      | 17
Mixtral 4x7b Slerp                 | 0.3     | 32K / 96.8 GB | 237       | 1
CognitiveFusion2 4x7B BF16         | 0.3     | 32K / 48.3 GB | 4490      | 3
...erges MoE 4x7b V10 Mixtralv0.3  | 0.3     | 32K / 48.3 GB | 242       | 0
NeuralMona MoE 4x7B                | 0.3     | 32K / 48.3 GB | 3917      | 0
Calme 4x7B MoE V0.1                | 0.3     | 32K / 48.3 GB | 3422      | 2
Calme 4x7B MoE V0.2                | 0.3     | 32K / 48.3 GB | 3310      | 2


Original data from HuggingFace, OpenCompass and various public git repos.
Release v2024072501