Vikhr Nemo 12B Instruct R 21 09 24 by Vikhrmodels


Tags: Arxiv:2405.13929, Autotrain compatible, Base model:finetune:mistralai/..., Base model:mistralai/mistral-n..., Conversational, Dataset:vikhrmodels/grandmaste..., Dataset:vikhrmodels/grounded-r..., En, Endpoints compatible, Instruct, Mistral, Region:us, Ru, Safetensors, Sharded, Tensorflow

Vikhr Nemo 12B Instruct R 21 09 24 Benchmarks

nn.n% — how the model compares to the reference models: Anthropic Claude 3.5 Sonnet ("so35"), GPT-4o ("gpt4o"), or GPT-4 ("gpt4").
Vikhr Nemo 12B Instruct R 21 09 24 (Vikhrmodels/Vikhr-Nemo-12B-Instruct-R-21-09-24)

Vikhr Nemo 12B Instruct R 21 09 24 Parameters and Internals

Model Type 
large language model, text generation, instruction following, multilingual, RAG-enabled
Use Cases 
Areas:
Research, Multilingual text generation, Reasoning, Summarization, Code generation, Dialogue management
Applications:
Multilingual applications requiring English and Russian support
Primary Use Cases:
RAG systems that dynamically search and retrieve information
Limitations:
Low default level of response safety/censorship
Considerations:
Use with low temperature settings
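The grounded-RAG primary use case above can be sketched as a chat payload. The message schema here (a separate "documents" role whose content is a JSON list of {"doc_id", "title", "content"} records) follows the convention documented on the Vikhr model card, but the helper function and field layout should be treated as an illustrative assumption, not the definitive API:

```python
import json

def build_rag_messages(question, documents):
    """Build a grounded-RAG chat payload.

    Assumes the Vikhr convention: retrieved documents go in a separate
    message with role "documents", serialized as a JSON list of
    {"doc_id", "title", "content"} records, followed by the user turn.
    """
    docs_payload = [
        {"doc_id": i, "title": d["title"], "content": d["content"]}
        for i, d in enumerate(documents)
    ]
    return [
        {"role": "documents",
         "content": json.dumps(docs_payload, ensure_ascii=False)},
        {"role": "user", "content": question},
    ]

messages = build_rag_messages(
    "When was the model released?",
    [{"title": "Release notes",
      "content": "Vikhr Nemo 12B Instruct R was released on 21-09-24."}],
)
```

A retriever would fill `documents` dynamically per query; the model is then expected to cite the relevant `doc_id`s in its grounded answer.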
Additional Notes 
High proficiency in both Russian and English NLP tasks. Safeguards should be employed due to default low safety settings.
Supported Languages 
en (high), ru (high), others (some)
Training Details 
Data Sources:
Vikhrmodels/GrandMaster-PRO-MAX, Vikhrmodels/Grounded-RAG-RU-v2
Methodology:
SFT, SMPO, Rejection Sampling
Context Length:
128000
Input Output 
Input Format:
JSON and API format
Accepted Modalities:
text
Output Format:
JSON and Text
Performance Tips:
Use low temperature settings for best performance
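The input/output notes above (JSON-style API input, low temperature recommended) can be illustrated with a request body for an OpenAI-compatible chat endpoint. Serving the model behind such an endpoint (e.g. via an inference server) is an assumption for illustration; the field names follow the standard chat-completions schema:

```python
import json

# Hypothetical request body for an OpenAI-compatible chat endpoint
# serving this model; the endpoint itself is an assumption.
request_body = {
    "model": "Vikhrmodels/Vikhr-Nemo-12B-Instruct-R-21-09-24",
    "messages": [
        {"role": "system",
         "content": "You are a helpful assistant. Answer in Russian or English."},
        {"role": "user",
         "content": "Summarize the benefits of RAG in two sentences."},
    ],
    # The card recommends low temperature settings for best performance.
    "temperature": 0.1,
    "max_tokens": 512,
}
payload = json.dumps(request_body, ensure_ascii=False)
```

The same low-temperature setting applies when generating locally (e.g. via `generate()` in transformers): keep sampling near-deterministic for this model.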
LLM Name: Vikhr Nemo 12B Instruct R 21 09 24
Repository 🤗: https://huggingface.co/Vikhrmodels/Vikhr-Nemo-12B-Instruct-R-21-09-24
Base Model(s): Mistral Nemo Instruct 2407 (mistralai/Mistral-Nemo-Instruct-2407)
Model Size: 12B
Required VRAM: 24.5 GB
Updated: 2025-02-05
Maintainer: Vikhrmodels
Model Type: mistral
Instruction-Based: Yes
Model Files: 4.9 GB: 1-of-5, 4.9 GB: 2-of-5, 4.9 GB: 3-of-5, 4.9 GB: 4-of-5, 4.9 GB: 5-of-5
Supported Languages: en, ru
Model Architecture: MistralForCausalLM
License: apache-2.0
Context Length: 1024000
Model Max Length: 1024000
Transformers Version: 4.44.2
Tokenizer Class: PreTrainedTokenizerFast
Padding Token: <pad>
Vocabulary Size: 131074
Torch Data Type: bfloat16

Best Alternatives to Vikhr Nemo 12B Instruct R 21 09 24

Model | Context / RAM | Downloads / Likes
SauerkrautLM Nemo 12B Instruct | 1000K / 24.5 GB | 1952722
Mistral Nemo Wissenschaft 12B | 1000K / 24.5 GB | 52167
MN Slush | 1000K / 24.5 GB | 30719
Magnum V4 12B | 1000K / 24.5 GB | 84439
ChatWaifu V1.4 | 1000K / 24.5 GB | 9719
ChatWaifu 12B V2.0 | 1000K / 24.5 GB | 5718
...tral Nemo Gutenberg Doppel 12B | 1000K / 24.5 GB | 615
SAINEMO ReMIX | 1000K / 24.5 GB | 99424
GodSlayer 12B ABYSS | 1000K / 24.5 GB | 845
Mistral Nemo Bophades 12B | 1000K / 24.5 GB | 488


Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241227