## How to use RazinAleks/llama-7b-hf-LoRa-GUI_class-fp16 with PEFT

```python
from peft import PeftModel
from transformers import AutoModelForCausalLM

# Load the base model, then wrap it with the LoRA adapter weights
base_model = AutoModelForCausalLM.from_pretrained("decapoda-research/llama-7b-hf")
model = PeftModel.from_pretrained(base_model, "RazinAleks/llama-7b-hf-LoRa-GUI_class-fp16")
```