TinyLlama 1.1B 1.5T OpenOrca Alpha by jeff31415



TinyLlama 1.1B 1.5T OpenOrca Alpha (jeff31415/TinyLlama-1.1B-1.5T-OpenOrca-Alpha)

TinyLlama 1.1B 1.5T OpenOrca Alpha Parameters and Internals

Additional Notes 
This fine-tune was done on the "early" version of tinyllama-1.5T, which suffers from a dataset-processing bug (see https://github.com/jzhang38/TinyLlama/issues/67). However, its performance does not appear to be harmed and still shows improvement.
Training Details 
Data Sources:
Open-Orca/OpenOrca, bigcode/starcoderdata, cerebras/SlimPajama-627B
Methodology:
Fine-tuned on the OpenOrca GPT-4 subset for 1 epoch, using the ChatML format
Training Time:
~16 hours to complete 1 epoch
Hardware Used:
1× RTX A5000 GPU, rented from autodl.com
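For reference, ChatML wraps each turn in `<|im_start|>`/`<|im_end|>` markers. The sketch below shows the generic template; the actual system prompt used during this fine-tune is not documented on this page, so the strings passed in are illustrative assumptions:

```python
# Minimal sketch of the ChatML prompt format (generic template;
# the system prompt used in training is an assumption, not documented here).
def chatml_prompt(system: str, user: str) -> str:
    return (
        f"<|im_start|>system\n{system}<|im_end|>\n"
        f"<|im_start|>user\n{user}<|im_end|>\n"
        f"<|im_start|>assistant\n"
    )

# Example: the model is expected to continue after the final assistant marker.
prompt = chatml_prompt("You are a helpful assistant.", "What is 2 + 2?")
```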
LLM Name: TinyLlama 1.1B 1.5T OpenOrca Alpha
Repository 🤗: https://huggingface.co/jeff31415/TinyLlama-1.1B-1.5T-OpenOrca-Alpha
Model Size: 1.1b
Required VRAM: 2.2 GB
Updated: 2025-01-14
Maintainer: jeff31415
Model Type: llama
Model Files: 2.2 GB
Supported Languages: en
Model Architecture: LlamaForCausalLM
License: apache-2.0
Context Length: 2048
Model Max Length: 2048
Transformers Version: 4.31.0.dev0
Tokenizer Class: LlamaTokenizer
Beginning of Sentence Token: <s>
End of Sentence Token: </s>
Unk Token: <unk>
Vocabulary Size: 32000
Torch Data Type: float32
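A quick back-of-the-envelope check on the listed sizes: the 2.2 GB figure is consistent with 16-bit weights for a 1.1B-parameter model, while float32 weights (the listed torch dtype) would occupy roughly twice that. The parameter count below is taken from the model name and is approximate:

```python
# Rough weight-storage estimate: parameters × bytes per parameter.
def weight_size_gb(n_params: float, bytes_per_param: int) -> float:
    return n_params * bytes_per_param / 1e9

n_params = 1.1e9  # ~1.1B parameters, from the model name (approximate)
fp16_gb = weight_size_gb(n_params, 2)  # ~2.2 GB, matches the listed file size
fp32_gb = weight_size_gb(n_params, 4)  # ~4.4 GB, what full float32 would need
```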

Best Alternatives to TinyLlama 1.1B 1.5T OpenOrca Alpha

Best Alternatives                 | Context / RAM | Downloads | Likes
TinyLlama V1.1                    | 2K / 4.4 GB   | 52786     | 80
MicroLlama                        | 2K / 1.2 GB   | 1546      | 42
CroissantLLMBase                  | 2K / 5.4 GB   | 648       | 31
TinyLlama V1.1 Math Code          | 2K / 4.4 GB   | 1509      | 10
TinyLlama 1.1B 1T OpenOrca        | 2K / 2.2 GB   | 257       | 7
TinyLlama V1.1 Chinese            | 2K / 4.4 GB   | 163       | 7
...inyLlama 1.1B 1T OpenOrca GPTQ | 2K / 0.8 GB   | 17        | 2
TinyLlama 1.1B 1T OpenOrca AWQ    | 2K / 0.8 GB   | 16        | 2


Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241227