Tinyllama 3T 64K JSONExtractor by muzammil-eds

Tags: Autotrain compatible · Endpoints compatible · Llama · PyTorch · Region: US

Tinyllama 3T 64K JSONExtractor (muzammil-eds/tinyllama-3T-64k-JSONExtractor)

Best Alternatives to Tinyllama 3T 64K JSONExtractor

Best Alternatives | Context / RAM | Downloads | Likes
...yllama 3T 64K JSONExtractor V4 | 64K / 2.2 GB | 7 | 0
...Llama 1.1B 32K Instruct Bpw2.5 | 32K / 0.5 GB | 10 | 0
...nyLlama 1.1B 32K Instruct Bpw3 | 32K / 0.5 GB | 8 | 0
...Llama 1.1B 32K Instruct Bpw3.7 | 32K / 0.6 GB | 11 | 0
...Llama 1.1B 32K Instruct Bpw3.5 | 32K / 0.6 GB | 11 | 0
...nyLlama 1.1B 32K Instruct Bpw4 | 32K / 0.7 GB | 12 | 0
...Llama 1.1B 32K Instruct Bpw4.6 | 32K / 0.7 GB | 10 | 0
...Llama 1.1B 32K Instruct Bpw4.4 | 32K / 0.7 GB | 9 | 0
...Llama 1.1B 32K Instruct Bpw4.2 | 32K / 0.7 GB | 8 | 0
...nyLlama 1.1B 32K Instruct Bpw5 | 32K / 0.8 GB | 12 | 0
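As a rough sanity check on the sizes above, memory footprint scales with parameter count times bits per weight: a 1.1B-parameter model in bfloat16 (16 bits per weight) comes to about 2.2 GB, matching the Required VRAM listed below, while the quantized "Bpw" alternatives land well under 1 GB. The short sketch below is an illustration, not from the source page; the listed files run somewhat larger than the raw estimate because of embeddings and quantization-format overhead.

```python
# Back-of-the-envelope size check (approximate, decimal GB).
# Assumes ~1.1B parameters; "Bpw" quantized variants store most weights at
# the stated bits per weight, so actual files are a bit larger than this.
params = 1.1e9

def approx_gb(bits_per_weight: float) -> float:
    return params * bits_per_weight / 8 / 1e9

print(f"bfloat16 (16 bpw): {approx_gb(16):.1f} GB")  # ~2.2 GB, matches Required VRAM
for bpw in (2.5, 3, 4, 5):
    print(f"{bpw} bpw: {approx_gb(bpw):.2f} GB")      # ~0.34-0.69 GB vs listed 0.5-0.8 GB
```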

Tinyllama 3T 64K JSONExtractor Parameters and Internals

LLM Name: Tinyllama 3T 64K JSONExtractor
Repository: Open on 🤗 Hugging Face
Model Size: 1.1b
Required VRAM: 2.2 GB
Updated: 2024-04-13
Maintainer: muzammil-eds
Model Type: llama
Model Files: 2.2 GB
Model Architecture: LlamaForCausalLM
Context Length: 65536
Model Max Length: 65536
Transformers Version: 4.34.0.dev0
Tokenizer Class: LlamaTokenizer
Vocabulary Size: 32000
Initializer Range: 0.02
Torch Data Type: bfloat16
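Given the listed internals (LlamaForCausalLM architecture, LlamaTokenizer, bfloat16 weights, 64K context), the model should load through the standard Hugging Face Transformers API. The sketch below is a minimal example under those assumptions; the JSON-extraction prompt format is an assumption, since the listing above does not document a prompt template, so consult the maintainer's model card for the intended usage.

```python
# Minimal loading sketch based on the listed internals (LlamaForCausalLM,
# LlamaTokenizer, bfloat16, 65536-token context). The extraction prompt below
# is hypothetical; the model card may define a different template.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "muzammil-eds/tinyllama-3T-64k-JSONExtractor"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # matches the listed Torch Data Type
)
model.eval()

# Hypothetical JSON-extraction prompt: turn free text into structured JSON.
prompt = (
    "Extract the following fields as JSON (name, role, company):\n"
    "Jane Doe is a staff engineer at Acme Corp.\n"
    "JSON:"
)

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
with torch.no_grad():
    output_ids = model.generate(**inputs, max_new_tokens=128, do_sample=False)

# Decode only the newly generated tokens.
print(tokenizer.decode(output_ids[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))
```

Because the listed context length is 65536 tokens, long documents can in principle be passed in a single prompt, though activation memory grows with sequence length, so VRAM needs will exceed the 2.2 GB required for the weights alone.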


Original data from HuggingFace, OpenCompass and various public git repos.
Release v2024040901