# Summarization Model (T5): Ready for Hugging Face
This repo contains:

- `data/train.csv`: your training data (two columns: `text` and `label`)
- `train.py`: a compact training script for fine-tuning FLAN-T5 for summarization
- `requirements.txt`: minimal dependencies
## Quick Start (Locally or on a Hugging Face Space/Notebook)
```bash
pip install -r requirements.txt
python train.py --model_name google/flan-t5-small --epochs 3 --batch_size 8 --output_dir checkpoint
```
Tip: If you have a GPU, add `--fp16` for faster training.
## Customizing
- Change `--input_col` and `--target_col` if your CSV headers differ.
- Adjust `--max_input_len` and `--max_target_len` based on your data.
## Export the Model
The trained model is saved under `checkpoint/`. Upload that folder to a Hugging Face Model repo.
Set your model's Inference widget to Text2Text and use:
- Pipeline tag: `text2text-generation`
- Library: `transformers`
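On the Hub, the widget and library tag are read from YAML front matter at the top of the model repo's README.md (`pipeline_tag` and `library_name` are the actual metadata keys). A minimal header matching the settings above would be:

```yaml
---
pipeline_tag: text2text-generation
library_name: transformers
---
```

Without this front matter, the Hub shows a "missing yaml metadata" warning on the model card.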
## Inference (Command Line)
After training:
```bash
python checkpoint/inference.py checkpoint < input.txt
```
## Data Schema

- Input column: `text`
- Target (summary) column: `label`

If these are incorrect, update the `train.py` flags accordingly.