Mixtral AI Cyber 2.0 by LeroyDyer


Tags: Arxiv:2203.05482, 128k context, Autotrain compatible, Base model:finetune:leroydyer/..., Base model:leroydyer/mixtral b..., Biology, Chemistry, Code, Custom code, Cyber-series, En, Endpoints compatible, Medical, Merge, Mergekit, Mistral, Music, Region:us, Safetensors, Sharded, Tensorflow

Mixtral AI Cyber 2.0 Benchmarks

Mixtral AI Cyber 2.0 (LeroyDyer/Mixtral_AI_Cyber_2.0)

Mixtral AI Cyber 2.0 Parameters and Internals

Model Type 
text-generation
Additional Notes 
This model is a merge of pre-trained language models built with the linear merge method (arXiv:2203.05482), combining the maintainer's earlier fine-tunes. It supports long contexts of up to 128k tokens. A minimal sketch of such a linear weight average is shown just below.
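As an illustration only (not the maintainer's actual mergekit configuration), the snippet below sketches what a linear merge in the style of arXiv:2203.05482 does at the tensor level. The donor repositories are the base models listed further down on this card; the equal 0.5/0.5 weights and the output path are assumptions for the example.

```python
import torch
from transformers import AutoModelForCausalLM

# Illustrative only: a linear merge averages the parameters of the donor
# models. The actual recipe and weights used for Mixtral_AI_Cyber_2.0 are
# not published on this card, so equal weights are assumed here.
donors = {
    "LeroyDyer/Mixtral_AI_128K_B": 0.5,
    "LeroyDyer/Mixtral_BioMedical": 0.5,
}

merged = None
for repo_id, weight in donors.items():
    model = AutoModelForCausalLM.from_pretrained(repo_id, torch_dtype=torch.float16)
    state = model.state_dict()
    if merged is None:
        merged = {name: weight * tensor.float() for name, tensor in state.items()}
    else:
        for name, tensor in state.items():
            merged[name] += weight * tensor.float()

# Write the averaged weights back into the last-loaded architecture and save.
model.load_state_dict({name: tensor.half() for name, tensor in merged.items()})
model.save_pretrained("Mixtral_AI_Cyber_2.0-linear-merge")  # hypothetical output path
```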
Supported Languages 
en (full)
Training Details 
Data Volume:
1500 steps
Methodology:
YaRN extension method (a simplified sketch follows these details)
Context Length:
128000
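YaRN extends a RoPE model's context window by interpolating the rotary frequencies non-uniformly: dimensions that already rotate quickly within the original context are left untouched, slowly rotating dimensions are compressed by the full scaling factor, a ramp blends the two regimes, and attention logits get a mild rescaling. The sketch below is a simplified illustration of that idea, not the training code used for this model; the head dimension, RoPE base, and original context length are assumed values rather than figures read from the model config.

```python
import math
import torch

def yarn_inv_freq(dim=128, base=10000.0, orig_ctx=32768, target_ctx=128000,
                  beta_fast=32.0, beta_slow=1.0):
    """Simplified YaRN-style interpolation of RoPE inverse frequencies."""
    scale = target_ctx / orig_ctx
    # Standard RoPE inverse frequencies for one attention head.
    inv_freq = 1.0 / (base ** (torch.arange(0, dim, 2).float() / dim))

    # Number of full rotations each dimension completes over the original context.
    rotations = orig_ctx * inv_freq / (2 * math.pi)

    # 0 -> keep the original frequency (fast-rotating dims),
    # 1 -> divide by the scaling factor (slow-rotating dims),
    # linear ramp in between.
    ramp = ((beta_fast - rotations) / (beta_fast - beta_slow)).clamp(0.0, 1.0)
    return inv_freq * (1.0 - ramp) + (inv_freq / scale) * ramp

def yarn_attention_scale(scale):
    """YaRN also multiplies attention logits by a mild log-based factor."""
    return 0.1 * math.log(scale) + 1.0 if scale > 1.0 else 1.0

print(yarn_inv_freq()[:4], yarn_attention_scale(128000 / 32768))
```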
LLM Name: Mixtral AI Cyber 2.0
Repository 🤗: https://huggingface.co/LeroyDyer/Mixtral_AI_Cyber_2.0
Base Model(s): LeroyDyer/Mixtral_AI_128K_B, LeroyDyer/Mixtral_BioMedical
Model Size: 7.2b
Required VRAM: 14.4 GB
Updated: 2025-02-22
Maintainer: LeroyDyer
Model Type: mistral
Model Files: 1.9 GB (1-of-8), 1.9 GB (2-of-8), 1.9 GB (3-of-8), 1.9 GB (4-of-8), 1.9 GB (5-of-8), 2.0 GB (6-of-8), 2.0 GB (7-of-8), 0.9 GB (8-of-8)
Supported Languages: en
Model Architecture: MistralForCausalLM
License: apache-2.0
Context Length: 32768
Model Max Length: 32768
Transformers Version: 4.38.2
Tokenizer Class: LlamaTokenizer
Vocabulary Size: 32000
Torch Data Type: float16
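For reference, the snippet below shows one straightforward way to load a sharded float16 Mistral checkpoint like this one with the Transformers library. The prompt and device settings are illustrative; `trust_remote_code=True` is included only because the repository carries the "Custom code" tag, and `device_map="auto"` additionally requires the accelerate package.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "LeroyDyer/Mixtral_AI_Cyber_2.0"

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(
    repo_id,
    torch_dtype=torch.float16,   # matches the card's float16 weights (~14.4 GB VRAM)
    device_map="auto",           # spreads the 8 safetensors shards across available devices
    trust_remote_code=True,      # repository is tagged with custom code
)

prompt = "Explain the difference between symmetric and asymmetric encryption."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```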

Best Alternatives to Mixtral AI Cyber 2.0

Best Alternatives | Context / RAM | Downloads | Likes
...Web AI HumanAI 012 INSTRUCT XA | 512K / 14.4 GB | 9 | 0
...14 | 512K / 14.5 GB | 319 | 0
...Web AI HumanAI 012 INSTRUCT IA | 512K / 14.4 GB | 8 | 0
...Web AI HumanAI 011 INSTRUCT ML | 512K / 14.4 GB | 7 | 0
...dazWeb AI HumanAI 011 INSTRUCT | 512K / 14.4 GB | 14 | 1
SpydazWeb HumanAI M2 | 512K / n/a | 38 | 0
SpydazWeb AI HumanAI 009 CHAT | 512K / 14.4 GB | 6 | 0
SpydazWeb AI HumanAI 007 | 512K / 14.4 GB | 5 | 0
...Web AI HumanAI 012 INSTRUCT MX | 512K / 14.4 GB | 8 | 0
SpydazWeb AI HumanAI 010 CHAT | 512K / 14.4 GB | 7 | 0



Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241227