ArrowSmartPlus 3.6B Instant Sft by DataPilot


Tags: Autotrain compatible · Finetuned · GPT-NeoX · ja · Region: us · Safetensors · Sharded · TensorFlow

ArrowSmartPlus 3.6B Instant Sft Benchmarks

nn.n% — how the model compares to the reference models: Claude 3.5 Sonnet ("so35"), GPT-4o ("gpt4o"), or GPT-4 ("gpt4").
ArrowSmartPlus 3.6B Instant Sft (DataPilot/ArrowSmartPlus_3.6B_instant_sft)

ArrowSmartPlus 3.6B Instant Sft Parameters and Internals

Model Type: causal language model
Use Cases:
  Areas: education
  Primary Use Cases: junior high and high school education
Supported Languages: ja (high)
Training Details:
  Data Sources: Wikibooks
  Methodology: fine-tuning
Input Output:
  Input Format: raw Japanese sentences
  Accepted Modalities: text
  Output Format: text
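
Since the card says the model takes raw Japanese sentences in and produces text out, a minimal inference sketch with the Hugging Face transformers library is shown below. The prompt and generation settings are illustrative assumptions, not taken from the model card; the slow tokenizer is requested because the listed tokenizer class is T5Tokenizer.

```python
# Minimal inference sketch (assumption: standard transformers usage for a
# GPT-NeoX causal LM; prompt and sampling settings are illustrative).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo = "DataPilot/ArrowSmartPlus_3.6B_instant_sft"

# The tokenizer class is T5Tokenizer, so the slow (SentencePiece) tokenizer is requested.
tokenizer = AutoTokenizer.from_pretrained(repo, use_fast=False)
model = AutoModelForCausalLM.from_pretrained(repo)

# Input format: a raw Japanese sentence.
prompt = "日本の中学校で学ぶ理科について説明してください。"
inputs = tokenizer(prompt, return_tensors="pt")

with torch.no_grad():
    output_ids = model.generate(
        **inputs,
        max_new_tokens=128,
        do_sample=True,
        temperature=0.7,
        pad_token_id=tokenizer.pad_token_id,
    )

print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```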
LLM Name: ArrowSmartPlus 3.6B Instant Sft
Repository: 🤗 https://huggingface.co/DataPilot/ArrowSmartPlus_3.6B_instant_sft
Model Size: 3.6b
Required VRAM: 14.3 GB
Updated: 2025-02-22
Maintainer: DataPilot
Model Type: gpt_neox
Model Files: 4.9 GB (1-of-3), 5.0 GB (2-of-3), 4.4 GB (3-of-3)
Supported Languages: ja
Model Architecture: GPTNeoXForCausalLM
License: apache-2.0
Context Length: 2048
Model Max Length: 2048
Transformers Version: 4.38.2
Tokenizer Class: T5Tokenizer
Padding Token: </s>
Vocabulary Size: 51200
Torch Data Type: float32
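
The weights ship as float32 and need roughly 14.3 GB of VRAM, so a common workaround on smaller GPUs is to load in half precision. The sketch below is an assumption-laden example (the quality impact of float16 on this model is not verified here, and device_map="auto" requires the accelerate package); it also checks that the configured context window matches the 2048 listed above.

```python
# Assumption-based sketch: load in float16 to roughly halve the ~14.3 GB
# float32 weight footprint; actual memory use depends on hardware and runtime.
import torch
from transformers import AutoConfig, AutoModelForCausalLM

repo = "DataPilot/ArrowSmartPlus_3.6B_instant_sft"

config = AutoConfig.from_pretrained(repo)
print(config.max_position_embeddings)  # expected: 2048, matching the context length above

model = AutoModelForCausalLM.from_pretrained(
    repo,
    torch_dtype=torch.float16,   # card lists float32; half precision is an optional trade-off
    device_map="auto",           # requires the `accelerate` package to be installed
)
print(f"{sum(p.numel() for p in model.parameters()) / 1e9:.1f}B parameters loaded")
```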

Best Alternatives to ArrowSmartPlus 3.6B Instant Sft

Best Alternatives | Context / RAM | Downloads | Likes
Japanese GPT Neox 3.6B | 2K / 7.4 GB | 3805 | 98
...y Jimba Japanese Large Lm 3.6B | 2K / 7.1 GB | 64 | 0
...rrowSmartPlus 3.6B Instruction | 2K / 14.3 GB | 5 | 1
...rtPlus 3.6B Instant Sft JHSVer | 2K / 14.3 GB | 9 | 1
...T Neox 3.6B Instruction Sft V2 | 2K / 7.4 GB | 54638 | 26
... Large Lm 3.6B Instruction Sft | 2K / 7.2 GB | 890 | 25
Japanese Large Lm 3.6B | 2K / 7.2 GB | 713 | 74
... GPT Neox 3.6B Instruction Ppo | 2K / 7.4 GB | 2587 | 70
... GPT Neox 3.6B Instruction Sft | 2K / 7.4 GB | 900 | 101
...tion Sft 8bit 1g Actorder True | 2K / 2.8 GB | 84 | 3
Note: a green score (e.g. "73.2") means that the model is better than DataPilot/ArrowSmartPlus_3.6B_instant_sft.

Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241227