FactCG: Enhancing Fact Checkers with Graph-Based Multi-Hop Data
Paper: arXiv 2501.17144
This is a fact-checking model from our work:
📃 FactCG: Enhancing Fact Checkers with Graph-Based Multi-Hop Data (NAACL 2025, GitHub Repo)
You can load our model with the following example code:
```python
from transformers import AutoTokenizer, AutoConfig, AutoModelForSequenceClassification

# Load the configuration for binary (single-label) classification.
config = AutoConfig.from_pretrained(
    "yaxili96/FactCG-DeBERTa-v3-Large",
    num_labels=2,
    finetuning_task="text-classification",
    revision="main",
    token=None,
    cache_dir="./cache",
)
config.problem_type = "single_label_classification"

tokenizer = AutoTokenizer.from_pretrained(
    "yaxili96/FactCG-DeBERTa-v3-Large",
    use_fast=True,
    revision="main",
    token=None,
    cache_dir="./cache",
)
model = AutoModelForSequenceClassification.from_pretrained(
    "yaxili96/FactCG-DeBERTa-v3-Large",
    config=config,
    revision="main",
    token=None,
    ignore_mismatched_sizes=False,
    cache_dir="./cache",
)
```
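Once loaded, the model scores a document/claim pair as consistent or not. The sketch below shows one way to wrap this; the input order (document first, claim second) and the label mapping (index 1 = consistent) are assumptions here, so check the paper and repository before relying on them:

```python
import torch

def check_fact(model, tokenizer, document: str, claim: str) -> float:
    """Return the model's probability that `claim` is supported by `document`."""
    # Encode the pair; truncate to the model's maximum input length.
    inputs = tokenizer(document, claim, return_tensors="pt", truncation=True)
    with torch.no_grad():
        logits = model(**inputs).logits
    # Assumed label mapping: index 1 = consistent, index 0 = inconsistent.
    probs = torch.softmax(logits, dim=-1)
    return probs[0, 1].item()
```

A score close to 1.0 would indicate the claim is consistent with the document under this assumed mapping.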
If you find this repository or FactCG helpful, please cite the following paper:
```bibtex
@inproceedings{lei2025factcg,
  title={FactCG: Enhancing Fact Checkers with Graph-Based Multi-Hop Data},
  author={Lei, Deren and Li, Yaxi and Li, Siyao and Hu, Mengya and Xu, Rui and Archer, Ken and Wang, Mingyu and Ching, Emily and Deng, Alex},
  booktitle={NAACL},
  year={2025}
}
```
Base model: microsoft/deberta-v3-large