Typhoon 2 Text Collection

Typhoon 2 Text: a Thai LLM release by SCB 10X.
Typhoon2-Qwen2.5-7B: Thai Large Language Model (Base)
Typhoon2-Qwen2.5-7B is a pretrained-only Thai 🇹🇭 large language model with 7 billion parameters, based on Qwen2.5-7B.
For the technical report, please see our arXiv paper.
Performance on Thai evaluation benchmarks:
| Model | ThaiExam | ONET | IC | A-Level | TGAT | TPAT | M3Exam | Math | Science | Social | Thai | 
|---|---|---|---|---|---|---|---|---|---|---|---|
| Typhoon2 Qwen2.5 7B Base | 58.86% | 58.64% | 65.26% | 55.11% | 66.15% | 49.13% | 59.90% | 42.98% | 59.42% | 75.62% | 61.59% | 
| Qwen2.5 7B | 55.74% | 51.23% | 60.00% | 41.73% | 72.30% | 53.44% | 55.65% | 46.15% | 54.10% | 66.54% | 55.82% | 
| Typhoon1.5 Llama3 8B Base | 48.82% | 41.35% | 41.05% | 40.94% | 70.76% | 50.00% | 43.88% | 22.62% | 43.47% | 62.81% | 46.63% | 
This model is a pretrained base model; it may not follow human instructions without one-/few-shot prompting or instruction fine-tuning. The model has no moderation mechanisms and may generate harmful or inappropriate responses.
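Because this is a base model, getting instruction-like behavior typically means priming it with worked examples. A minimal sketch of assembling such a few-shot prompt is shown below; the Q/A template and the demonstration pairs are illustrative assumptions, not part of the Typhoon 2 release.

```python
# Minimal sketch: few-shot prompt assembly for a base (pretrained-only) model.
# A base model continues text rather than following instructions, so we prime
# it with demonstration pairs before the real query. The Q/A format here is an
# illustrative choice, not a template defined by Typhoon 2.

def build_few_shot_prompt(examples, query):
    """Join (question, answer) demonstration pairs, then append the query."""
    shots = "\n\n".join(f"Q: {q}\nA: {a}" for q, a in examples)
    return f"{shots}\n\nQ: {query}\nA:"

examples = [
    ("What is the capital of Thailand?", "Bangkok"),
    ("What is 2 + 2?", "4"),
]
prompt = build_few_shot_prompt(examples, "What is the largest planet?")
print(prompt)
```

The resulting string would then be passed to the model's generation call (for example via Hugging Face `transformers`); greedy or low-temperature decoding with a stop sequence such as `"\n\n"` helps keep a base model from drifting into further invented Q/A pairs.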
Follow us: https://twitter.com/opentyphoon
If you use Typhoon 2, please cite:

```bibtex
@misc{typhoon2,
      title={Typhoon 2: A Family of Open Text and Multimodal Thai Large Language Models},
      author={Kunat Pipatanakul and Potsawee Manakul and Natapong Nitarach and Warit Sirichotedumrong and Surapon Nonesung and Teetouch Jaknamon and Parinthapat Pengpun and Pittawat Taveekitworachai and Adisai Na-Thalang and Sittipong Sripaisarnmongkol and Krisanapong Jirayoot and Kasima Tharnpipitchai},
      year={2024},
      eprint={2412.13702},
      archivePrefix={arXiv},
      primaryClass={cs.CL},
      url={https://arxiv.org/abs/2412.13702},
}
```