
```python
from trl import SFTConfig

training_arguments = SFTConfig(
    output_dir=output_dir,
    per_device_train_batch_size=16,
    gradient_accumulation_steps=1,
    optim="adamw_torch",
    save_steps=100,
    logging_steps=1,
    learning_rate=1e-4,
    bf16=True,
    # max_grad_norm=0.3,
    num_train_epochs=3,
    save_strategy="epoch",
    warmup_ratio=0.05,
    group_by_length=True,
    # lr_scheduler_type="cosine",
    gradient_checkpointing=True,
    gradient_checkpointing_kwargs={"use_reentrant": True},
    dataset_text_field="text",
    max_seq_length=1024,
    packing=False,
    # report_to="wandb",
)
```
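
For context, a minimal sketch of how this config would typically be passed to TRL's `SFTTrainer`. The names `model` and `train_dataset` are placeholders that are not defined in this card; the dataset is assumed to expose the `text` field referenced above.

```python
from trl import SFTTrainer

# Hypothetical usage of the config above; `model` and `train_dataset`
# are placeholders for the starting checkpoint and the SFT dataset.
trainer = SFTTrainer(
    model=model,
    args=training_arguments,
    train_dataset=train_dataset,
)
trainer.train()
trainer.save_model(training_arguments.output_dir)
```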

```python
baseline_model = 'zisisbatzos/2SFTs_llama3.2-3B'
```
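
The baseline checkpoint can be loaded with the standard `transformers` API. The snippet below is a sketch, not an instruction from the card; the dtype and device placement are assumptions.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Load the baseline checkpoint referenced above (sketch; dtype/device are assumptions).
tokenizer = AutoTokenizer.from_pretrained(baseline_model)
model = AutoModelForCausalLM.from_pretrained(
    baseline_model,
    torch_dtype=torch.bfloat16,
    device_map="auto",
)
```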

Metrics:
- emotion_accuracy = 0.36554621848739494
- correct_format = 0.7058823529411765
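
How these two metrics are computed is not documented here. The sketch below shows one plausible interpretation, where `correct_format` is the fraction of generations matching an assumed `emotion: <label>` output pattern and `emotion_accuracy` compares the extracted label against the reference. The pattern and the `score` function are hypothetical, not the author's evaluation code.

```python
import re

EMOTION_PATTERN = re.compile(r"emotion:\s*(\w+)", re.IGNORECASE)  # assumed output format

def score(predictions, references):
    """Hypothetical scoring for correct_format and emotion_accuracy."""
    formatted = correct = 0
    for pred, ref in zip(predictions, references):
        match = EMOTION_PATTERN.search(pred)
        if match:
            formatted += 1
            correct += match.group(1).lower() == ref.lower()
    n = len(predictions)
    return {"correct_format": formatted / n, "emotion_accuracy": correct / n}
```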
