Pythia 160M Storytelling by jtatman


Tags: autotrain-compatible · axolotl · base model: eleutherai/pythia-1… · base model (finetune): eleutherai… · conversational · dataset: jtatman/storywriting c… · endpoints-compatible · generated-from-trainer · gpt_neox · instruct · pytorch · region: us · relora

Pythia 160M Storytelling Benchmarks

nn.n% — how the model scores relative to the reference models: Anthropic Sonnet 3.5 ("so35"), GPT-4o ("gpt4o"), and GPT-4 ("gpt4").
Pythia 160M Storytelling (jtatman/pythia-160m-storytelling)

Pythia 160M Storytelling Parameters and Internals

Training Details
Data Sources: jtatman/storywriting_combined_instruct
Context Length: 2048
LLM Name: Pythia 160M Storytelling
Repository 🤗: https://huggingface.co/jtatman/pythia-160m-storytelling
Base Model(s): Pythia 160M Deduped (EleutherAI/pythia-160m-deduped)
Model Size: 160M
Required VRAM: 0.3 GB
Updated: 2025-02-05
Maintainer: jtatman
Model Type: gpt_neox
Instruction-Based: Yes
Model Files: 0.3 GB, 0.0 GB
Model Architecture: GPTNeoXForCausalLM
License: apache-2.0
Context Length: 2048
Model Max Length: 2048
Transformers Version: 4.41.2
Tokenizer Class: GPTNeoXTokenizer
Padding Token: [PAD]
Vocabulary Size: 50304
Torch Data Type: bfloat16

Rank the Pythia 160M Storytelling Capabilities

🆘 Have you tried this model? Rate its performance. This feedback helps the ML community identify the most suitable model for their needs. Your contribution really does make a difference! 🌟

Instruction Following and Task Automation  
Factuality and Completeness of Knowledge  
Censorship and Alignment  
Data Analysis and Insight Generation  
Text Generation  
Text Summarization and Feature Extraction  
Code Generation  
Multi-Language Support and Translation  


Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241227