ThaiSafetyBench is a benchmark of 1,954 malicious Thai prompts designed to evaluate large language model (LLM) safety in the Thai language.
ThaiSafetyBench Leaderboard