Update README.md
README.md
CHANGED
@@ -22,26 +22,6 @@ text = "<s>[INST] What is your favourite condiment? [/INST]"
 "[INST] Do you have mayonnaise recipes? [/INST]"
 ```
 
-This format is available as a [chat template](https://huggingface.co/docs/transformers/main/chat_templating) via the `apply_chat_template()` method:
-
-```python
-from transformers import AutoModelForCausalLM, AutoTokenizer
-device = "cuda"  # the device to load the model onto
-model = AutoModelForCausalLM.from_pretrained("mistralai/Mistral-7B-Instruct-v0.1")
-tokenizer = AutoTokenizer.from_pretrained("mistralai/Mistral-7B-Instruct-v0.1")
-messages = [
-    {"role": "user", "content": "What is your favourite condiment?"},
-    {"role": "assistant", "content": "Well, I'm quite partial to a good squeeze of fresh lemon juice. It adds just the right amount of zesty flavour to whatever I'm cooking up in the kitchen!"},
-    {"role": "user", "content": "Do you have mayonnaise recipes?"}
-]
-encodeds = tokenizer.apply_chat_template(messages, return_tensors="pt")
-model_inputs = encodeds.to(device)
-model.to(device)
-generated_ids = model.generate(model_inputs, max_new_tokens=1000, do_sample=True)
-decoded = tokenizer.batch_decode(generated_ids)
-print(decoded[0])
-```
-
 ## Model Architecture
 This instruction model is based on Mistral-7B-v0.1, a transformer model with the following architecture choices:
 - Grouped-Query Attention
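The raw `[INST] ... [/INST]` prompt format kept in the context lines above can also be driven without `apply_chat_template()`. A minimal sketch, assuming the same `mistralai/Mistral-7B-Instruct-v0.1` checkpoint and with illustrative generation settings that are not taken from the original README:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

device = "cuda"  # assumes a CUDA-capable GPU is available

model = AutoModelForCausalLM.from_pretrained("mistralai/Mistral-7B-Instruct-v0.1").to(device)
tokenizer = AutoTokenizer.from_pretrained("mistralai/Mistral-7B-Instruct-v0.1")

# Wrap the user turn in [INST] ... [/INST]; the tokenizer adds the leading <s> token by default.
prompt = "[INST] Do you have mayonnaise recipes? [/INST]"
inputs = tokenizer(prompt, return_tensors="pt").to(device)

generated_ids = model.generate(**inputs, max_new_tokens=256, do_sample=True)
print(tokenizer.batch_decode(generated_ids)[0])
```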
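As background for the Grouped-Query Attention bullet in the architecture list, the sketch below shows the core idea: several query heads share each key/value head, which shrinks the KV cache. The shapes are illustrative rather than Mistral-7B's actual head counts, and causal masking is omitted for brevity.

```python
import torch

def grouped_query_attention(q, k, v):
    # q: (batch, n_q_heads, seq, head_dim); k, v: (batch, n_kv_heads, seq, head_dim)
    n_q_heads, n_kv_heads = q.shape[1], k.shape[1]
    group = n_q_heads // n_kv_heads  # query heads sharing each key/value head
    # Repeat the key/value heads so each group of query heads attends to the same KV pair.
    k = k.repeat_interleave(group, dim=1)
    v = v.repeat_interleave(group, dim=1)
    scores = (q @ k.transpose(-2, -1)) / q.shape[-1] ** 0.5
    return torch.softmax(scores, dim=-1) @ v

# Illustrative shapes: 8 query heads sharing 2 key/value heads (group size 4).
q = torch.randn(1, 8, 16, 64)
k = torch.randn(1, 2, 16, 64)
v = torch.randn(1, 2, 16, 64)
print(grouped_query_attention(q, k, v).shape)  # torch.Size([1, 8, 16, 64])
```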