GPT Sw3 6.7B V2 Instruct by AI-Sweden-Models


GPT Sw3 6.7B V2 Instruct (AI-Sweden-Models/gpt-sw3-6.7b-v2-instruct)

GPT Sw3 6.7B V2 Instruct Parameters and Internals

Model Type: language model, text generation

Use Cases:
- Areas: the Nordic NLP ecosystem
- Applications: pre-release for research and evaluation of the capabilities of Large Language Models for the Nordic languages
- Primary Use Cases: GPT-SW3 can generate coherent text in multiple languages and perform text tasks by casting them as text-generation problems (a minimal generation sketch follows below)
- Limitations: bias, safety, limited generation diversity, hallucination, over- or under-representation of certain viewpoints, stereotypes; may generate inappropriate content

Supported Languages: da, sv, no, en, is (proficiency levels not specified)
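
As a rough illustration of "casting tasks as text generation", the sketch below runs the model through the Hugging Face transformers text-generation pipeline on a Swedish prompt. This is a minimal sketch: the prompt, sampling settings, and float16 cast are illustrative assumptions, not values taken from this listing.

```python
# Minimal sketch, assuming the standard transformers text-generation pipeline.
# Prompt and sampling settings are illustrative, not prescribed by the card.
import torch
from transformers import pipeline

generator = pipeline(
    "text-generation",
    model="AI-Sweden-Models/gpt-sw3-6.7b-v2-instruct",
    torch_dtype=torch.float16,  # the float32 checkpoint needs ~28 GB; fp16 halves that
    device_map="auto",          # requires the accelerate package
)

# A summarization task cast as plain text generation:
prompt = "Sammanfatta i en mening: Stockholm är Sveriges huvudstad och största stad."
result = generator(prompt, max_new_tokens=60, do_sample=True, temperature=0.7)
print(result[0]["generated_text"])
```
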
Training Details:
- Data Sources: laion/OIG, databricks/databricks-dolly-15k, OpenAssistant/oasst1
- Data Volume: 320B tokens
- Methodology: trained with the NeMo Megatron GPT implementation
- Model Architecture: decoder-only transformer language model

Input/Output:
- Input Format: raw text, or instruction data in chat format (see the prompt sketch below)
- Accepted Modalities: text
- Output Format: generated text
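
For the instruct variant specifically, a prompt is typically wrapped in the chat format before generation. The sketch below assumes the User/Bot turn layout shown in the example on the model's Hugging Face page; the exact markers are reproduced from that example rather than defined in this listing, so verify them against the repository.

```python
# Sketch of the chat-style prompt layout for the instruct model.
# The <|endoftext|>/<s>/User:/Bot: markers follow the example on the
# Hugging Face model card; verify against the repository before relying on them.
from transformers import pipeline

generator = pipeline(
    "text-generation",
    model="AI-Sweden-Models/gpt-sw3-6.7b-v2-instruct",
    device_map="auto",
)

prompt = (
    "<|endoftext|><s>\n"
    "User:\n"
    "Skriv en kort dikt om hösten.\n"  # "Write a short poem about autumn."
    "<s>\n"
    "Bot:\n"
)

out = generator(prompt, max_new_tokens=100, do_sample=True, top_p=0.9,
                return_full_text=False)
print(out[0]["generated_text"])
```
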
LLM Name: GPT Sw3 6.7B V2 Instruct
Repository: https://huggingface.co/AI-Sweden-Models/gpt-sw3-6.7b-v2-instruct
Base Model(s): AI-Sweden-Models/gpt-sw3-6.7b-v2
Model Size: 6.7b
Required VRAM: 28.1 GB (float32; see the loading note below)
Updated: 2025-02-05
Maintainer: AI-Sweden-Models
Model Type: gpt2
Instruction-Based: Yes
Model Files: 10.0 GB (1-of-3), 10.0 GB (2-of-3), 8.1 GB (3-of-3)
Supported Languages: da, sv, no, en, is
Model Architecture: GPT2LMHeadModel
License: other
Model Max Length: 2048
Transformers Version: 4.22.1
Tokenizer Class: GPTSw3Tokenizer
Vocabulary Size: 64000
Torch Data Type: float32
Activation Function: gelu
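
The VRAM figure lines up with the shard sizes: the three float32 files total 28.1 GB, which at 4 bytes per parameter corresponds to roughly 7 billion stored weights (somewhat above the nominal 6.7B, plausibly because the 64,000-entry embedding table is counted). A minimal loading sketch, assuming the standard transformers API; casting to float16 roughly halves the footprint:

```python
# Loading sketch. The checkpoint is stored in float32 (~28.1 GB across three
# shards, i.e. ~7e9 parameters at 4 bytes each); torch_dtype=torch.float16
# cuts memory use roughly in half at some cost in numerical precision.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "AI-Sweden-Models/gpt-sw3-6.7b-v2-instruct"
tokenizer = AutoTokenizer.from_pretrained(model_id)  # resolves to GPTSw3Tokenizer; needs sentencepiece
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,
    device_map="auto",  # requires the accelerate package
)
```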

Original data from HuggingFace, OpenCompass and various public git repos.
Release v20241227