Qwen2.5-7B Fine-tuned for Arabic Culture QA

This model is a fine-tuned version of Qwen/Qwen2.5-7B-Instruct using LoRA (Low-Rank Adaptation) for Arabic culture and Islamic studies question answering.

Model Details

  • Base Model: Qwen/Qwen2.5-7B-Instruct
  • Fine-tuning Method: LoRA (Low-Rank Adaptation)
  • Training Dataset: UBC-NLP/palmx_2025_subtask1_culture
  • Task: Multiple-choice question answering about Arabic culture
  • Languages: Arabic, English
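
The training data can be inspected directly with the Hugging Face datasets library. The minimal sketch below is for orientation only; the split names and example fields are assumptions and may differ from the actual dataset layout.

from datasets import load_dataset

# Hedged sketch: load the cultural QA subtask data and inspect its structure.
# Split names ("train", ...) and example fields are assumptions, not documented here.
dataset = load_dataset("UBC-NLP/palmx_2025_subtask1_culture")
print(dataset)              # show available splits and sizes
print(dataset["train"][0])  # show the fields of one example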

Performance

  • Validation Accuracy: XX.X% (to be reported)
  • Baseline (few-shot prompting): 69.70%
  • Improvement over baseline: +X.X%

Usage

from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

# Load base model and tokenizer
base_model = AutoModelForCausalLM.from_pretrained("Qwen/Qwen2.5-7B-Instruct")
tokenizer = AutoTokenizer.from_pretrained("Qwen/Qwen2.5-7B-Instruct")

# Load fine-tuned model
model = PeftModel.from_pretrained(base_model, "rafiulbiswas/qwen2.5-7b-arabic-culture-qa")

# Generate answer
prompt = '''<|im_start|>system
You are an expert in Arabic culture and Islamic studies. Answer the multiple-choice question by providing only the letter of the correct option (A, B, C, or D).<|im_end|>
<|im_start|>user
Question: What is the traditional Arabic greeting meaning "peace be upon you"?

A. Marhaba
B. As-salamu alaikum  
C. Ahlan wa sahlan
D. Habibi

Answer:<|im_end|>
<|im_start|>assistant
'''

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=5)
answer = tokenizer.decode(outputs[0][inputs.input_ids.shape[1]:], skip_special_tokens=True)
print(answer)  # Expected: "B"
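
Because the generated text may contain whitespace or stray tokens around the option letter, a small post-processing step can make evaluation more robust. The snippet below is an illustrative assumption, not part of any released evaluation code; it also notes how the adapter can optionally be merged into the base weights for faster inference.

import re

# Illustrative assumption: keep only the first option letter found in the output.
match = re.search(r"[ABCD]", answer)
predicted = match.group(0) if match else None
print(predicted)

# Optional: merge the LoRA adapter into the base weights for faster inference.
# merged_model = model.merge_and_unload()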

Training Details

  • Training Data: 2000 samples
  • Validation Data: 500 samples
  • Training Epochs: 3
  • Learning Rate: 2e-4
  • LoRA Rank: 16
  • LoRA Alpha: 32
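
As a rough illustration, the hyperparameters listed above map onto a PEFT LoraConfig along the following lines. The target modules and dropout value are assumptions and may differ from the actual training setup.

from transformers import AutoModelForCausalLM
from peft import LoraConfig, get_peft_model

# Hedged sketch of the LoRA setup implied by the hyperparameters above.
# target_modules and lora_dropout are assumptions, not confirmed by this card.
base = AutoModelForCausalLM.from_pretrained("Qwen/Qwen2.5-7B-Instruct")
lora_config = LoraConfig(
    r=16,                                                     # LoRA rank (from Training Details)
    lora_alpha=32,                                            # LoRA alpha (from Training Details)
    lora_dropout=0.05,                                        # assumption
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],  # assumption
    task_type="CAUSAL_LM",
)
model = get_peft_model(base, lora_config)
model.print_trainable_parameters()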

Intended Use

This model is designed for:

  • Arabic culture and Islamic studies question answering
  • Educational applications
  • Cultural knowledge assessment
  • Research in multilingual QA systems

Limitations

  • Specialized for Arabic culture domain
  • May not generalize well to other domains
  • Requires careful prompt formatting for best results (a chat-template sketch follows below)
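
One way to reduce prompt-formatting errors is to build the prompt with the tokenizer's chat template rather than hard-coding the <|im_start|> markers. The sketch below assumes the standard Qwen chat template shipped with the tokenizer loaded in the Usage section; the question text is a placeholder.

# Sketch: construct the prompt via the tokenizer's chat template.
# The system message mirrors the Usage example above.
messages = [
    {"role": "system", "content": "You are an expert in Arabic culture and Islamic studies. "
                                  "Answer the multiple-choice question by providing only the "
                                  "letter of the correct option (A, B, C, or D)."},
    {"role": "user", "content": "Question: ...\n\nA. ...\nB. ...\nC. ...\nD. ...\n\nAnswer:"},
]
prompt = tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)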

Citation

If you use this model, please cite:

@misc{qwen25-arabic-culture-qa,
  title={Qwen2.5-7B Fine-tuned for Arabic Culture QA},
  author={Md. Rafiul Biswas and Kais Attia and Shimaa Ibrahim and Mabrouka Bessghaier and Firoj Alam and Wajdi Zaghouani},
  year={2025},
  howpublished={\url{https://huggingface.co/rafiulbiswas/qwen2.5-7b-arabic-culture-qa}}
}