Data Efficient Training Of LLMs V1 by ChiyuSONG


Tags: arxiv:2310.19651 · autotrain compatible · baichuan · custom code · dataset:chiyusong/dynamics-of-... · endpoints compatible · instruct · pytorch · region:us · zh

Data Efficient Training Of LLMs V1 Benchmarks

Benchmark scores are given as a percentage indicating how the model compares to the reference models: Anthropic Sonnet 3.5 ("so35"), GPT-4o ("gpt4o"), or GPT-4 ("gpt4").
Data Efficient Training Of LLMs V1 (ChiyuSONG/data-efficient-training-of-LLMs-v1)

Data Efficient Training Of LLMs V1 Parameters and Internals

Model Type 
instruction tuning, large language model
Additional Notes 
The project explores efficient data usage for training large language models, focusing on the dynamics of instruction tuning to improve diverse capabilities.
Supported Languages 
zh (Proficient)
Training Details 
Data Sources:
ChiyuSONG/dynamics-of-instruction-tuning
Methodology:
Uses a progressive instruction-tuning approach, adjusting the training data for each ability according to how different factors influence learning dynamics.
Release Notes 
Version:
1
Notes:
Focuses on the instruction fine-tuning process, covering multiple capability categories such as creative writing, code generation, and logical reasoning.
LLM Name: Data Efficient Training Of LLMs V1
Repository 🤗: https://huggingface.co/ChiyuSONG/data-efficient-training-of-LLMs-v1
Model Size: 13b
Required VRAM: 29.1 GB
Updated: 2024-12-26
Maintainer: ChiyuSONG
Model Type: baichuan
Instruction-Based: Yes
Model Files: 29.1 GB
Supported Languages: zh
Model Architecture: BaichuanForCausalLM
License: mit
Model Max Length: 4096
Transformers Version: 4.30.2
Tokenizer Class: BaichuanTokenizer
Beginning of Sentence Token: <s>
End of Sentence Token: </s>
Unk Token: <unk>
Vocabulary Size: 125696
Torch Data Type: bfloat16
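The details above (repository id, bfloat16 dtype, 4096 max length, and the "Custom code" tag) suggest the checkpoint loads through the standard Hugging Face `transformers` API with `trust_remote_code=True`. Below is a minimal loading sketch under those assumptions; the sampling settings are illustrative defaults, not values from the model card.

```python
# Minimal loading sketch, assuming the standard Hugging Face transformers API.
# Repo id, dtype, and max length come from the listing above; the generation
# settings are illustrative assumptions.

REPO_ID = "ChiyuSONG/data-efficient-training-of-LLMs-v1"
MODEL_MAX_LENGTH = 4096  # "Model Max Length" from the listing


def generation_kwargs(max_new_tokens: int = 512) -> dict:
    """Illustrative sampling settings, capped by the model's max length."""
    return {
        "max_new_tokens": min(max_new_tokens, MODEL_MAX_LENGTH),
        "do_sample": True,
        "top_p": 0.9,
    }


def load(repo_id: str = REPO_ID):
    """Load tokenizer and model.

    trust_remote_code=True is required because the repo ships custom
    Baichuan modeling code (the "Custom code" tag above).
    """
    # Deferred imports: torch and transformers are only needed at load time.
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(repo_id, trust_remote_code=True)
    model = AutoModelForCausalLM.from_pretrained(
        repo_id,
        torch_dtype=torch.bfloat16,  # matches the listed torch dtype
        device_map="auto",           # ~29 GB of weights; expect a large GPU or offloading
        trust_remote_code=True,
    )
    return tokenizer, model
```

Once loaded, `tok(prompt, return_tensors="pt")` followed by `model.generate(**inputs, **generation_kwargs())` produces a completion; prompts should be in Chinese, the only listed supported language.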

Best Alternatives to Data Efficient Training Of LLMs V1

Best Alternatives                Context / RAM    Downloads  Likes
Blossom V2 Baichuan 13B          0K / 26.5 GB     13         1
Baichuan 13B Instruction         0K / 26.5 GB     30         6
Baichuan 13B Instruction GPTQ    0K / 7.9 GB      15         4

Rank the Data Efficient Training Of LLMs V1 Capabilities

🆘 Have you tried this model? Rate its performance. Your feedback helps the ML community identify the most suitable models for their needs. Your contribution really does make a difference! 🌟

Instruction Following and Task Automation  
Factuality and Completeness of Knowledge  
Censorship and Alignment  
Data Analysis and Insight Generation  
Text Generation  
Text Summarization and Feature Extraction  
Code Generation  
Multi-Language Support and Translation  


Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241227