# Intent Classification Model

## Model Description
This repository contains a fine-tuned Transformer model for intent classification.
The model is built using Hugging Face transformers and stored in safetensors format, enabling efficient and safe loading.
It predicts an intent label from input text for tasks such as chatbot understanding, ticket routing, and text categorization.
## How to Use

### Install dependencies

```bash
pip install transformers==4.57.6 torch
```
### Load model

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_path = "cngchis/phi4-mini-intent"

tokenizer = AutoTokenizer.from_pretrained(model_path)
model = AutoModelForSequenceClassification.from_pretrained(model_path)
model.eval()  # set inference mode

text = "I cannot log into my account"
inputs = tokenizer(text, return_tensors="pt", truncation=True, padding=True)

# Disable gradient tracking for inference
with torch.no_grad():
    outputs = model(**inputs)

logits = outputs.logits
predicted_class = torch.argmax(logits, dim=-1).item()
print(predicted_class)
```
## Input Format (Recommended)

```
"I want to reset my password"
```
## Output Format

The model outputs a class index, which can be mapped to intent labels, for example:

- 3 → password_reset
- 1 → login_issue
- 5 → payment_problem

(You should define the label mapping in your application.)
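A minimal sketch of such a mapping, using only the three example indices from this card; the full label set depends on how the model was fine-tuned, so treat these entries as placeholders:

```python
# Hypothetical mapping: only the example indices from this card are listed.
# Replace with the complete schema used during fine-tuning.
id2label = {
    1: "login_issue",
    3: "password_reset",
    5: "payment_problem",
}

def intent_for(class_index: int) -> str:
    """Translate a predicted class index into a human-readable intent label."""
    return id2label.get(class_index, "unknown_intent")

print(intent_for(3))  # → password_reset
print(intent_for(9))  # → unknown_intent
```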
## Model Details
- Architecture: Transformer-based classification model
- Task: Intent classification
- Format: PyTorch (safetensors)
- Library: Hugging Face Transformers
- Input: Natural language text
- Output: Single intent class
## Notes

- Best performance when the input format matches the training data
- Requires a label mapping to interpret predictions
- Runs on CPU or GPU
- Supports batch inference via Transformers
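For batch inference, pass a list of texts to the tokenizer with `padding=True`; the model then returns one row of logits per input. Below is a sketch of the post-processing step only, using stand-in logits so it runs without downloading the model:

```python
import torch

# In practice the logits come from the model, e.g.:
#   texts = ["I want to reset my password", "My card was charged twice"]
#   inputs = tokenizer(texts, return_tensors="pt", truncation=True, padding=True)
#   logits = model(**inputs).logits          # shape: [len(texts), num_labels]

# Stand-in logits (2 inputs, 3 classes) so this snippet is self-contained.
logits = torch.tensor([[0.1, 2.5, -1.0],
                       [1.2, 0.3, 0.8]])

probs = torch.softmax(logits, dim=-1)        # per-class confidence scores
predicted = torch.argmax(probs, dim=-1)      # one intent index per input
print(predicted.tolist())  # → [1, 0]
```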
## Limitations

- Not suitable for generative tasks
- Sensitive to domain shift (out-of-distribution text)
- Requires a consistent intent label schema
## Acknowledgements

Built using:

- Hugging Face Transformers
- PyTorch
- Safetensors format
## Model Tree

- Base model: microsoft/Phi-4-mini-instruct
- Fine-tuned from: unsloth/Phi-4-mini-instruct