Mixtral Dna Conserved V0.1 by RaphaelMourad


Tags: Autotrain compatible · Endpoints compatible · License: apache-2.0 · Mixtral · MoE · Pretrained · Region: US · Safetensors

Mixtral Dna Conserved V0.1 Parameters and Internals

LLM Name: Mixtral Dna Conserved V0.1
Repository: https://huggingface.co/RaphaelMourad/mixtral-dna-conserved-v0.1
Model Size: 415.5M parameters
Required VRAM: 0 GB
Updated: 2024-07-23
Maintainer: RaphaelMourad
Model Type: mixtral
Model Files: 1.7 GB, 0.0 GB, 0.0 GB
Model Architecture: MixtralForCausalLM
License: apache-2.0
Context Length: 512
Model Max Length: 512
Transformers Version: 4.37.2
Tokenizer Class: PreTrainedTokenizerFast
Padding Token: [PAD]
Vocabulary Size: 4096
Torch Data Type: float32
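
The listing above contains everything needed to load the checkpoint with stock transformers, which has shipped Mixtral support since well before the listed version 4.37.2. The sketch below is illustrative rather than taken from the model card: it assumes the repository's config resolves to MixtralForCausalLM through the Auto classes, and the DNA sequence is a made-up example, not official usage.

```python
# Minimal sketch of loading the model, assuming the repo's config maps
# cleanly onto stock transformers Mixtral support (version >= 4.37.2).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo = "RaphaelMourad/mixtral-dna-conserved-v0.1"

# Tokenizer Class: PreTrainedTokenizerFast, Padding Token: [PAD]
tokenizer = AutoTokenizer.from_pretrained(repo)

# Torch Data Type: float32, per the listing above
model = AutoModelForCausalLM.from_pretrained(repo, torch_dtype=torch.float32)
model.eval()

# Hypothetical DNA input; sequences must fit the 512-token context window.
sequence = "ATCGGTACCTGA"
inputs = tokenizer(sequence, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

# Logits over the 4096-entry vocabulary: (batch, tokens, 4096)
print(outputs.logits.shape)
```

For a DNA language model like this one, the last hidden states (via output_hidden_states=True) are typically more useful than generation, e.g. as sequence embeddings for downstream classifiers.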

Original data from Hugging Face, OpenCompass, and various public Git repositories.
Release v20241227