
# 🤗 KoSOLAR-v0.2-gugutypus-10.7B ☀️

License: CC BY-NC 4.0 | DOI: 10.57967/hf/1735


## Model Details

### Model Developer

- DongGeon Lee (oneonlee)

### Model Architecture

- KoSOLAR-v0.2-gugutypus-10.7B is an instruction-fine-tuned, auto-regressive language model based on the SOLAR transformer architecture.

### Base Model

### Training Dataset


## Model Benchmark

- Ko-LLM leaderboard (2024/03/01) [link]

| Model | Average | Ko-ARC | Ko-HellaSwag | Ko-MMLU | Ko-TruthfulQA | Ko-CommonGen V2 |
| --- | --- | --- | --- | --- | --- | --- |
| oneonlee/KoSOLAR-v0.2-gugutypus-10.7B | 51.17 | 47.78 | 58.29 | 47.27 | 48.31 | 54.19 |
| oneonlee/LDCC-SOLAR-gugutypus-10.7B | 49.45 | 45.9 | 55.46 | 47.96 | 48.93 | 49 |

- (KOR) AI-Harness evaluation [link]

| Tasks | Version | Filter | n-shot | Metric | Value | Stderr |
| --- | --- | --- | --- | --- | --- | --- |
| KMMLU | N/A | none | 0 | acc | 0.3335 | ± 0.0475 |
| KMMLU | N/A | none | 5 | acc | 0.3938 | ± 0.0823 |
| KoBEST-HellaSwag | 0 | none | 0 | acc | 0.4360 | ± 0.0222 |
| KoBEST-HellaSwag | 0 | none | 5 | acc | 0.4420 | ± 0.0222 |
| KoBEST-BoolQ | 0 | none | 0 | acc | 0.5064 | ± 0.0133 |
| KoBEST-BoolQ | 0 | none | 5 | acc | 0.8583 | ± 0.0093 |
| KoBEST-COPA | 0 | none | 0 | acc | 0.6040 | ± 0.0155 |
| KoBEST-COPA | 0 | none | 5 | acc | 0.7610 | ± 0.0135 |
| KoBEST-SentiNeg | 0 | none | 0 | acc | 0.5844 | ± 0.0248 |
| KoBEST-SentiNeg | 0 | none | 5 | acc | 0.9471 | ± 0.0112 |

- (ENG) AI-Harness evaluation [link]

| Tasks | Version | Filter | n-shot | Metric | Value | Stderr |
| --- | --- | --- | --- | --- | --- | --- |
| MMLU | N/A | none | 0 | acc | 0.5826 | ± 0.1432 |
| MMLU | N/A | none | 5 | acc | 0.5885 | ± 0.1285 |
| HellaSwag | 1 | none | 0 | acc | 0.6075 | ± 0.0049 |
| HellaSwag | 1 | none | 5 | acc | 0.6098 | ± 0.0049 |
| BoolQ | 2 | none | 0 | acc | 0.8737 | ± 0.0058 |
| BoolQ | 2 | none | 5 | acc | 0.8826 | ± 0.0056 |
| COPA | 1 | none | 0 | acc | 0.8300 | ± 0.0378 |
| COPA | 1 | none | 5 | acc | 0.9100 | ± 0.0288 |
| truthfulqa | N/A | none | 0 | acc | 0.4249 | ± 0.0023 |
| truthfulqa | N/A | none | 5 | acc | - | ± - |
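Scores like these can be reproduced with EleutherAI's lm-evaluation-harness. The snippet below is a hypothetical sketch using the v0.4 Python API; the task identifiers (e.g., `kmmlu`, `kobest_hellaswag`) are assumptions and may not match the exact task configs used for the tables above.

```python
# Hypothetical sketch: re-running the harness evaluation with
# lm-evaluation-harness (v0.4 Python API). Task names are assumed,
# not confirmed to be the configs used for the reported numbers.
import lm_eval

results = lm_eval.simple_evaluate(
    model="hf",
    model_args="pretrained=oneonlee/KoSOLAR-v0.2-gugutypus-10.7B,dtype=float16",
    tasks=["kmmlu", "kobest_hellaswag", "kobest_boolq"],
    num_fewshot=5,
)

# Per-task metrics (acc and stderr) live under the "results" key
print(results["results"])
```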

## How to Use

```python
### KoSOLAR-gugutypus
from transformers import AutoModelForCausalLM, AutoTokenizer
import torch

repo = "oneonlee/KoSOLAR-v0.2-gugutypus-10.7B"

# Load the model in half precision and shard it across available devices
model = AutoModelForCausalLM.from_pretrained(
    repo,
    return_dict=True,
    torch_dtype=torch.float16,
    device_map="auto",
)
tokenizer = AutoTokenizer.from_pretrained(repo)
```
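Once loaded, text can be generated with the standard `transformers` `generate` API. The snippet below is a minimal sketch; the Korean prompt and sampling settings are illustrative, not a prescribed template for this model.

```python
# Minimal generation sketch; the prompt and sampling settings are illustrative.
prompt = "대한민국의 수도는 어디인가요?"  # "What is the capital of South Korea?"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

with torch.no_grad():
    output_ids = model.generate(
        **inputs,
        max_new_tokens=128,
        do_sample=True,
        temperature=0.7,
        top_p=0.9,
    )

print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```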

## Citation

```bibtex
@misc{donggeon_lee_2024,
	author       = { {DongGeon Lee} },
	title        = { KoSOLAR-v0.2-gugutypus-10.7B (Revision 56841d5) },
	year         = 2024,
	url          = { https://huggingface.co/oneonlee/KoSOLAR-v0.2-gugutypus-10.7B },
	doi          = { 10.57967/hf/1735 },
	publisher    = { Hugging Face }
}
```
