Lovelace Medium Alpha1 by Avelina


Tags: arxiv:2405.20053 · autotrain-compatible · dataset:eleutherai/pile · en · endpoints-compatible · lsw-transformer · region:us · safetensors

Lovelace Medium Alpha1 Benchmarks

Scores ("nn.n%") show how the model compares to the reference models: Anthropic Sonnet 3.5 ("so35"), GPT-4o ("gpt4o"), or GPT-4 ("gpt4").
Lovelace Medium Alpha1 (Avelina/lovelace-medium-alpha1)

Lovelace Medium Alpha1 Parameters and Internals

Model Type: Transformer-XL
Supported Languages: en (native)
Training Details:
  Data Sources: EleutherAI/pile
  Data Volume: 100B tokens
  Methodology: Direct Preference Heads (a generic sketch follows this list)
  Context Length: 2048
Model Architecture: Transformer-XL-style model with 18 layers and SwiGLU FFN activations (a minimal SwiGLU sketch also follows)
Release Notes:
  Version: alpha1
  Notes: Model trained on 100B tokens of The Pile, with 551M parameters.
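The methodology listed above, Direct Preference Heads (arXiv:2405.20053), attaches an auxiliary reward head so the model can learn human preference signals without directly altering the language-modeling head's output distribution. The sketch below is only a loose, generic illustration of that idea, not the paper's implementation; the `PreferenceHead` class and `pairwise_loss` function are hypothetical names introduced here for clarity.

```python
import torch
import torch.nn as nn

class PreferenceHead(nn.Module):
    """Generic auxiliary reward head: maps a hidden state to a scalar score.

    NOTE: a hypothetical sketch of the general idea only — NOT the
    implementation from arXiv:2405.20053.
    """
    def __init__(self, d_model: int):
        super().__init__()
        self.score = nn.Linear(d_model, 1)

    def forward(self, hidden: torch.Tensor) -> torch.Tensor:
        # hidden: (batch, d_model), e.g. the final hidden state of a
        # designated token; returns one scalar reward per sequence.
        return self.score(hidden).squeeze(-1)

def pairwise_loss(r_chosen: torch.Tensor, r_rejected: torch.Tensor) -> torch.Tensor:
    # Push the chosen completion's reward above the rejected one
    # with a pairwise logistic (Bradley–Terry style) loss.
    return -nn.functional.logsigmoid(r_chosen - r_rejected).mean()
```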
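The architecture line mentions SwiGLU feed-forward blocks. Below is a minimal PyTorch sketch of a SwiGLU FFN; the dimensions and module names are placeholders, since the card lists only the layer count (18) and not the model's hidden sizes.

```python
import torch
import torch.nn as nn

class SwiGLUFFN(nn.Module):
    """Feed-forward block with SwiGLU activation (Shazeer, 2020).

    d_model and d_ffn are illustrative; the actual Lovelace
    hyper-parameters are not listed on this card.
    """
    def __init__(self, d_model: int, d_ffn: int):
        super().__init__()
        self.w_gate = nn.Linear(d_model, d_ffn, bias=False)  # gating projection
        self.w_up = nn.Linear(d_model, d_ffn, bias=False)    # value projection
        self.w_down = nn.Linear(d_ffn, d_model, bias=False)  # output projection

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # SwiGLU: SiLU(x @ W_gate) elementwise-multiplied with x @ W_up,
        # then projected back down to the model dimension.
        return self.w_down(nn.functional.silu(self.w_gate(x)) * self.w_up(x))
```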
LLM Name: Lovelace Medium Alpha1
Repository: https://huggingface.co/Avelina/lovelace-medium-alpha1
Model Size: 551M
Required VRAM: 1.1 GB
Updated: 2025-02-22
Maintainer: Avelina
Model Type: lsw_transformer
Model Files: 1.1 GB
Supported Languages: en
Model Architecture: LSWTForCausalLM
License: bsd-3-clause
Transformers Version: 4.37.2
Vocabulary Size: 50272
Torch Data Type: float16
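Given the metadata above (custom LSWTForCausalLM architecture, Transformers 4.37.2, float16 weights), loading the model will most likely go through the standard `AutoModelForCausalLM` path with `trust_remote_code`. This is a sketch under that assumption — the card does not document a loading procedure, and the need for `trust_remote_code` is inferred from the custom architecture name.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo = "Avelina/lovelace-medium-alpha1"

# LSWTForCausalLM is a custom architecture, so trust_remote_code is
# assumed to be required; float16 matches the card's torch dtype.
tokenizer = AutoTokenizer.from_pretrained(repo, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    repo,
    torch_dtype=torch.float16,
    trust_remote_code=True,
)

inputs = tokenizer("The Pile is", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```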


Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241227