Grogros/phi2-Instruct-reg2-3

Tags: Text Generation · Transformers · Safetensors · phi · Generated from Trainer · conversational · text-generation-inference
License: mit
phi2-Instruct-reg2-3 · 11.1 GB · 1 contributor · History: 6 commits
Latest commit by Grogros: "Upload finetuning_config.yaml with huggingface_hub" (e32fe55, verified, 2 days ago)
File                              Size       Last commit message                                 Updated
checkpoint-2000/                  -          Training in progress, step 2000, checkpoint         2 days ago
.gitattributes                    1.52 kB    initial commit                                      2 days ago
README.md                         1.15 kB    Model save                                          2 days ago
added_tokens.json                 1.16 kB    Upload tokenizer                                    2 days ago
chat_template.jinja               295 Bytes  Upload tokenizer                                    2 days ago
config.json                       692 Bytes  Training in progress, step 2000                     2 days ago
finetuning_config.yaml            2.08 kB    Upload finetuning_config.yaml with huggingface_hub  2 days ago
generation_config.json            119 Bytes  Training in progress, step 2000                     2 days ago
merges.txt                        456 kB     Upload tokenizer                                    2 days ago
model-00001-of-00002.safetensors  5 GB       Training in progress, step 2000                     2 days ago
model-00002-of-00002.safetensors  564 MB     Training in progress, step 2000                     2 days ago
model.safetensors.index.json      35.8 kB    Training in progress, step 2000                     2 days ago
special_tokens_map.json           1.07 kB    Upload tokenizer                                    2 days ago
tokenizer.json                    3.57 MB    Upload tokenizer                                    2 days ago
tokenizer_config.json             8.25 kB    Upload tokenizer                                    2 days ago
training_args.bin                 5.78 kB    Training in progress, step 2000                     2 days ago
vocab.json                        798 kB     Upload tokenizer                                    2 days ago

Note: training_args.bin is a pickle file. Detected Pickle imports (10):
transformers.training_args.TrainingArguments, accelerate.state.PartialState,
transformers.trainer_utils.SaveStrategy, transformers.trainer_utils.IntervalStrategy,
transformers.trainer_utils.HubStrategy, accelerate.utils.dataclasses.DistributedType,
transformers.training_args.OptimizerNames, torch.device,
transformers.trainer_utils.SchedulerType, transformers.trainer_pt_utils.AcceleratorConfig
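The pickle warning on training_args.bin exists because unpickling executes the constructors of every global it references, so the file list names each import up front. The same kind of inventory can be taken locally with the standard library's pickletools, which scans opcodes without ever loading the pickle. This is a minimal sketch, not the Hub's own scanner; the OrderedDict example is just an illustration.

```python
import pickle
import pickletools
from collections import OrderedDict


def list_pickle_imports(data: bytes) -> list[str]:
    """List the module.name globals a pickle would import, without unpickling it."""
    imports = []
    strings = []  # recently pushed unicode strings, consumed by STACK_GLOBAL
    for opcode, arg, _pos in pickletools.genops(data):
        if opcode.name == "GLOBAL":
            # protocols <= 3 spell the import as "module name" in a single arg
            module, name = arg.split(" ", 1)
            imports.append(f"{module}.{name}")
        elif opcode.name == "STACK_GLOBAL" and len(strings) >= 2:
            # protocols >= 4 push module and name as two separate strings
            imports.append(f"{strings[-2]}.{strings[-1]}")
        elif opcode.name in ("SHORT_BINUNICODE", "BINUNICODE", "UNICODE"):
            strings.append(arg)
    return imports


# Example: a pickled OrderedDict references collections.OrderedDict
data = pickle.dumps(OrderedDict(a=1))
print(list_pickle_imports(data))
```

Inspecting the opcode stream this way is safe on untrusted files; actually calling pickle.load on them is not, which is why Trainer checkpoints are increasingly distributed as safetensors instead.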
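The two model-0000N-of-00002.safetensors shards are tied together by model.safetensors.index.json, which (in the standard sharded-checkpoint layout) maps each tensor name to the shard file holding it under a "weight_map" key. A minimal sketch of reading that mapping, using a hypothetical index with made-up tensor names rather than this repository's actual contents:

```python
from collections import defaultdict


def tensors_per_shard(index: dict) -> dict:
    """Group tensor names by the shard file listed in a
    model.safetensors.index.json-style weight_map."""
    shards = defaultdict(list)
    for tensor_name, shard_file in index["weight_map"].items():
        shards[shard_file].append(tensor_name)
    return dict(shards)


# Hypothetical index in the same shape as the repo's (names are illustrative)
hypothetical_index = {
    "metadata": {"total_size": 5564000000},
    "weight_map": {
        "model.embed_tokens.weight": "model-00001-of-00002.safetensors",
        "model.layers.0.mlp.fc1.weight": "model-00001-of-00002.safetensors",
        "lm_head.weight": "model-00002-of-00002.safetensors",
    },
}
print(tensors_per_shard(hypothetical_index))
```

Loaders such as transformers use this index to fetch only the shards that contain the tensors they need.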