STM32 Code Generator - Turkish Fine-tuned StarCoder2-3B

Model Summary

This model is a fine-tune of bigcode/starcoder2-3b on 55,000 STM32 code examples. It generates embedded-systems code for the STM32F4 series using the HAL library.

Performance

  • Test result: 17/20 (85%) success rate
  • V1 model: 6/20 (30%) → V2 model: 17/20 (85%)
  • Improvement: +55 percentage points

Test Scenarios:

  • ✅ ADC reading and processing
  • ✅ PWM motor control
  • ✅ UART communication
  • ✅ I2C/SPI protocols
  • ✅ Timer interrupts
  • ✅ DMA usage
  • ✅ Watchdog timer
  • ✅ Multi-sensor systems

Usage

from transformers import AutoTokenizer, AutoModelForCausalLM
from peft import PeftModel
import torch

print("🔄 Loading model...")

base_model = AutoModelForCausalLM.from_pretrained(
    "bigcode/starcoder2-3b",
    torch_dtype=torch.float16,  
    device_map="auto"
)

model = PeftModel.from_pretrained(base_model, "MuratKomurcu/starcoder2-stm32-turkish")
tokenizer = AutoTokenizer.from_pretrained("MuratKomurcu/starcoder2-stm32-turkish")

print("✅ Model ready!")

# Quick test with the Alpaca-style prompt format the model was trained on
prompt = '''### Instruction:
Write complete STM32F401RE LED blink code with main function

### Input:
LED on PA5, blink every 1000ms, include SystemClock_Config

### Response:
'''

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
# Decoding settings are illustrative; tune max_new_tokens/temperature as needed
outputs = model.generate(**inputs, max_new_tokens=512, temperature=0.2, do_sample=True)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
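The model expects the Alpaca-style template shown in the test prompt. A small helper can keep that format consistent across queries (the `build_prompt` name is ours, not part of the released code):

```python
def build_prompt(instruction: str, context: str = "") -> str:
    """Build an Alpaca-style prompt matching the model's training format."""
    prompt = f"### Instruction:\n{instruction}\n\n"
    if context:
        # The "### Input:" section is optional; omit it when there is no context
        prompt += f"### Input:\n{context}\n\n"
    prompt += "### Response:\n"
    return prompt

print(build_prompt("Write STM32F401RE UART echo code", "USART2 at 115200 baud"))
```

The resulting string can be passed directly to the tokenizer in place of the hand-written prompt above.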

Training Details

  • Base Model: bigcode/starcoder2-3b
  • Dataset Size: 55,000 STM32 examples
  • Method: LoRA fine-tuning (QLoRA 4-bit)
  • LoRA Rank: 64
  • Training Epochs: 5
  • Final Validation Loss: 0.0297
  • Hardware: NVIDIA A100 (40GB)
  • Training Time: ~36 hours

Hyperparameters:

  • Learning rate: 2e-4
  • Batch size: 8 (effective: 32 with gradient accumulation)
  • Weight decay: 0.01
  • LR Scheduler: Cosine
  • Warmup ratio: 0.1
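The setup above can be sketched as a peft/transformers configuration. This is a reconstruction from the listed hyperparameters only; `target_modules`, `lora_alpha`, and the output directory are not stated in this card and are assumptions:

```python
import torch
from transformers import BitsAndBytesConfig, TrainingArguments
from peft import LoraConfig

# QLoRA: load the base model in 4-bit, as stated in the training details
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_compute_dtype=torch.float16,
)

# LoRA rank 64 from the card; alpha and target modules are assumptions
lora_config = LoraConfig(
    r=64,
    lora_alpha=128,
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],
    task_type="CAUSAL_LM",
)

# Values below mirror the listed hyperparameters
training_args = TrainingArguments(
    output_dir="starcoder2-stm32",          # assumption
    num_train_epochs=5,
    per_device_train_batch_size=8,
    gradient_accumulation_steps=4,          # effective batch size 32
    learning_rate=2e-4,
    weight_decay=0.01,
    lr_scheduler_type="cosine",
    warmup_ratio=0.1,
    fp16=True,
)
```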

Limitations

  • Focused on the STM32F4 series
  • Uses the HAL library (not LL or register-level)
  • Less successful on some complex scenarios (RGB multi-channel PWM, dual button)
  • Accepts Turkish descriptions, but English prompts give better results

License

BigCode OpenRAIL-M

Citation

@misc{starcoder2-stm32-turkish,
  author = {Murat},
  title = {STM32 Code Generator - Turkish Fine-tuned},
  year = {2024},
  publisher = {HuggingFace},
  howpublished = {\url{https://huggingface.co/MuratKomurcu/starcoder2-stm32-turkish}}
}