Mixtral AI CyberCoder 7B by LeroyDyer


Tags: Merged Model · arXiv:2306.01708 · Art · AutoTrain compatible · Base model (finetune): mistralai/Mistral-7B-Instruct-v0.2 · Code · Conversational · Cyber-series · Datasets: CyberNative/Code_Vulnerability_Security_DPO, WhiteRabbitNeo/WRN-Chapter-1, WhiteRabbitNeo/WRN-Chapter-2 · Endpoints compatible · Instruct · Mistral · Region: US · Safetensors · Sharded · TensorFlow

Mixtral AI CyberCoder 7B Benchmarks

No benchmark scores are recorded for LeroyDyer/Mixtral_AI_CyberCoder_7b. Where present, scores on this page express how a model compares to the reference models Anthropic Sonnet 3.5 ("so35"), GPT-4o ("gpt4o"), and GPT-4 ("gpt4").

Mixtral AI CyberCoder 7B Parameters and Internals

Model Type: text generation, code generation

Use Cases
Areas: research, commercial applications
Applications: code generation, UML diagrams, object-oriented planning

Additional Notes: This model is under active development and is continually retuned and updated for producing code, functions, and applications. It has been fine-tuned on datasets dedicated to coding problems and other code-related tasks.
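Since the base model is Mistral-7B-Instruct-v0.2, the model presumably follows the Mistral [INST] chat format. Below is a minimal generation sketch using Hugging Face transformers; the prompt and sampling settings are illustrative, not taken from the model card:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo = "LeroyDyer/Mixtral_AI_CyberCoder_7b"
tokenizer = AutoTokenizer.from_pretrained(repo)
model = AutoModelForCausalLM.from_pretrained(
    repo,
    torch_dtype=torch.bfloat16,  # matches the torch dtype listed below
    device_map="auto",
)

# Mistral-instruct style prompt (assumed from the Mistral-7B-Instruct-v0.2 base).
prompt = "[INST] Write a Python function that validates an IPv4 address. [/INST]"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=256, do_sample=False)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```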
Training Details
Data Sources: CyberNative/Code_Vulnerability_Security_DPO, WhiteRabbitNeo/WRN-Chapter-1, WhiteRabbitNeo/WRN-Chapter-2
Methodology: merged using the TIES merge method (arXiv:2306.01708)
Model Architecture: a merge of pre-trained language models created with mergekit
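For reference, a TIES merge like this is normally driven by a mergekit YAML config passed to the mergekit-yaml CLI. The sketch below writes a hypothetical config and invokes that CLI; the base model and merge candidates come from the base-model list further down this card, but the density/weight values are purely illustrative, since the actual merge config is not published here:

```python
import subprocess
from pathlib import Path

# Hypothetical mergekit config for a TIES merge (arXiv:2306.01708).
# Models are from this card; density/weight values are illustrative assumptions.
config = """\
merge_method: ties
base_model: mistralai/Mistral-7B-Instruct-v0.2
models:
  - model: LeroyDyer/Mixtral_AI_Cyber_3.0
    parameters:
      density: 0.5
      weight: 0.5
  - model: LeroyDyer/Mixtral_AI_MultiToken
    parameters:
      density: 0.5
      weight: 0.3
dtype: bfloat16
"""

Path("ties_config.yml").write_text(config)
# mergekit's standard CLI entry point: mergekit-yaml <config> <output_dir>
subprocess.run(["mergekit-yaml", "ties_config.yml", "merged-model"], check=True)
```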
LLM Name: Mixtral AI CyberCoder 7b
Repository: https://huggingface.co/LeroyDyer/Mixtral_AI_CyberCoder_7b
Base Model(s): mistralai/Mistral-7B-Instruct-v0.2, LeroyDyer/Mixtral_AI_Cyber_3.0, LeroyDyer/Mixtral_AI_MultiToken, LeroyDyer/Mixtral_AI_Multi_TEST
Merged Model: Yes
Model Size: 7B
Required VRAM: 14.3 GB
Updated: 2025-02-22
Maintainer: LeroyDyer
Model Type: mistral
Instruction-Based: Yes
Model Files: 1.9 GB (1-of-8), 1.9 GB (2-of-8), 2.0 GB (3-of-8), 1.9 GB (4-of-8), 1.9 GB (5-of-8), 1.9 GB (6-of-8), 1.9 GB (7-of-8), 0.9 GB (8-of-8)
Model Architecture: MistralForCausalLM
License: apache-2.0
Context Length: 32768
Model Max Length: 32768
Transformers Version: 4.36.0
Tokenizer Class: LlamaTokenizer
Vocabulary Size: 32000
Torch Data Type: bfloat16
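The 14.3 GB VRAM figure is consistent with storing the weights in bfloat16 at two bytes per parameter. A quick back-of-the-envelope check (the 7.24B parameter count is the commonly cited figure for Mistral-7B, assumed here; activations and KV cache are not included):

```python
# Rough weight-memory estimate: parameters x bytes per parameter.
params = 7.24e9          # approximate Mistral-7B parameter count (assumption)
bytes_per_param = 2      # bfloat16
print(f"{params * bytes_per_param / 1e9:.1f} GB")  # ~14.5 GB, near the listed 14.3 GB
```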

Best Alternatives to Mixtral AI CyberCoder 7B

Best Alternatives | Context / RAM | Downloads | Likes
...Nemo Instruct 2407 Abliterated | 1000K / 24.5 GB | 4620 | 11
SpydazWeb AI HumanAI RP | 512K / 14.4 GB | 12 | 1
SpydazWeb AI HumanAI 002 | 512K / 14.4 GB | 18 | 1
...daz Web AI ChatML 512K Project | 512K / 14.5 GB | 12 | 0
... Summarize 64K QLoRANET Merged | 128K / 4.1 GB | 12 | 0
...1 Summarize 64K LoRANET Merged | 128K / 14.4 GB | 11 | 0
Mistral 7B Instruct V0.2 | 32K / 14.4 GB | 3316095 | 2630
Mistral 7B Instruct V0.1 | 32K / 14.4 GB | 191630 | 1573
...ity Instruct 7M Gen Mistral 7B | 32K / 14.4 GB | 3750 | 5
...ty Instruct 3M 0625 Mistral 7B | 32K / 14.4 GB | 3706 | 3
Note: a green score (e.g. "73.2") means that the model is better than LeroyDyer/Mixtral_AI_CyberCoder_7b.


Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241227