---
license: apache-2.0
base_model: unsloth/Meta-Llama-3.1-8B-Instruct-bnb-4bit
tags:
- text-generation
- job-shop-scheduling
- optimization
- llama
- unsloth
- jssp
datasets:
- ACCORD
language:
- en
pipeline_tag: text-generation
library_name: unsloth
---

# JSSP LLaMA 8B Fine-tuned Model

## Model Description

A LLaMA 3.1 8B model fine-tuned for Job Shop Scheduling Problem (JSSP) optimization.

## Training Details

- Base Model: unsloth/Meta-Llama-3.1-8B-Instruct-bnb-4bit
- LoRA Rank: 64
- Epochs: 4
- Max Sequence Length: 40,000
- Dataset: ACCORD

## Usage

```python
import torch
from unsloth import FastLanguageModel

# Load the fine-tuned model in 4-bit with the long context used during training.
model, tokenizer = FastLanguageModel.from_pretrained(
    model_name="HYUNJINI/pfsp_test_1",
    max_seq_length=40000,
    load_in_4bit=True,
    dtype=torch.bfloat16,
)
FastLanguageModel.for_inference(model)
```
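Once the model is loaded, it can be prompted with a scheduling instance. The following is a minimal generation sketch, assuming the standard Llama 3.1 chat template via `tokenizer.apply_chat_template`; the instance prompt shown is a hypothetical example, not the exact format used in the ACCORD training data.

```python
# Hypothetical JSSP instance prompt; adjust to match the format the model was trained on.
messages = [
    {"role": "user", "content": (
        "Solve this job shop scheduling problem and return a schedule:\n"
        "Job 1: (Machine 0, 3), (Machine 1, 2)\n"
        "Job 2: (Machine 1, 4), (Machine 0, 1)"
    )},
]

# Build the chat-formatted input and move it to the model's device.
inputs = tokenizer.apply_chat_template(
    messages,
    tokenize=True,
    add_generation_prompt=True,
    return_tensors="pt",
).to(model.device)

# Low temperature keeps the generated schedule closer to deterministic.
outputs = model.generate(
    input_ids=inputs,
    max_new_tokens=1024,
    temperature=0.1,
    do_sample=True,
)

# Decode only the newly generated tokens.
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```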