Nxcode CQ 7B Orpo by NTQAI


Tags: Arxiv:2403.07691 · Autotrain compatible · Code · Conversational · Endpoints compatible · License:mit · Qwen2 · Region:us · Safetensors · Sharded · Tensorflow

Nxcode CQ 7B Orpo (NTQAI/Nxcode-CQ-7B-orpo)

Best Alternatives to Nxcode CQ 7B Orpo

Best Alternatives                  Context / RAM     Downloads / Likes
SuperCode                          64K / 14.4 GB     80
Qwen Theia Workshop                64K / 14.5 GB     3550
Svelte                             64K / 14.5 GB     470
CodeQwen Text To Rule3 Merged      64K / 14.5 GB     120
CodeQwen1.5 7B Chat                64K / 14.6 GB     20263178
CodeQwen1.5 7B                     64K / 14.6 GB     752457
Biomistral 7B Instruct TIES        32K / 1.2 GB      230
Qwen 1.5 7B Layer Mix Bpw 2.2      32K / 4.7 GB      100
Qwen 1.5 7B Layer Mix Bpw 2.5      32K / 4.8 GB      80
Sailor 7B                          32K / 15.4 GB     307027

Nxcode CQ 7B Orpo Parameters and Internals

LLM Name: Nxcode CQ 7B Orpo
Repository: NTQAI/Nxcode-CQ-7B-orpo (Hugging Face)
Model Size: 7b
Required VRAM: 14.5 GB
Updated: 2024-05-22
Maintainer: NTQAI
Model Type: qwen2
Model Files: 8.0 GB (part 1 of 2), 6.5 GB (part 2 of 2)
Model Architecture: Qwen2ForCausalLM
License: mit
Context Length: 65536
Model Max Length: 65536
Transformers Version: 4.39.3
Tokenizer Class: PreTrainedTokenizerFast
Padding Token: <fim_pad>
Vocabulary Size: 92416
Initializer Range: 0.02
Torch Data Type: bfloat16
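
The parameters above are enough to load the model with standard Hugging Face Transformers. Below is a minimal sketch, not taken from the model card: the repository id comes from this page and the bfloat16 dtype matches the listed Torch Data Type, while the prompt, generation settings, and the assumption that the tokenizer ships a chat template (typical for Qwen2-style chat models) are illustrative.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "NTQAI/Nxcode-CQ-7B-orpo"  # repository listed on this page

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # matches the listed Torch Data Type
    device_map="auto",           # requires accelerate; ~14.5 GB VRAM per the table above
)

# Assumes the tokenizer provides a chat template, as Qwen2-style chat models usually do.
messages = [{"role": "user", "content": "Write a Python function that reverses a string."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=256)
# Strip the prompt tokens before decoding so only the completion is printed.
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```

Note that the two safetensors shards (8.0 GB + 6.5 GB) add up to roughly the 14.5 GB of VRAM listed as required for bfloat16 inference.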

Original data from Hugging Face, OpenCompass, and various public Git repositories.
Release v2024042801