🍓 Ice-Clem 🍓


Welcome to the documentation for the AI Ice-Clem. This very simple little model was created and trained in five minutes, in response to a flash of inspiration that struck while I was, I admit, rather tired 🤣.

The goal of this model is to generate wacky combinations of foods that have nothing to do with each other, to make you imagine disgusting dishes and make you smile (or, I hope, laugh).

The model was trained on a set of keywords (the ingredients). Starting from an initial ingredient, it generates the rest of the ingredients for your wacky dish completely at random (please don't actually make it).

The ingredient names are written in English.

🌸 Ingredient list 🌸

Here is the list of starting ingredients you can use:

pizza sushi pasta soup curry steak salad burger tacos noodles rice bread cake cookies pie chocolate vanilla strawberry spicy sour

The model will generate the rest, randomly but intelligently. ;) 🔥
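For reference, the word_to_index and index_to_word files used in the example below simply map each of these ingredient words to an integer id and back. The exact ids depend on the training run, so the values here are only illustrative:

# Illustrative shape of the vocabulary mappings (actual ids depend on the training run)
word_to_index = {"pizza": 0, "sushi": 1, "pasta": 2, "soup": 3}  # ... one entry per ingredient
index_to_word = {0: "pizza", 1: "sushi", 2: "pasta", 3: "soup"}  # inverse mapping
# In the JSON file, the index_to_word keys are strings ("0", "1", ...),
# which is why the loading code below converts them back to int.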

🩵 Usage example 🩵

from huggingface_hub import hf_hub_download
import torch
import json
import os

# Define your Hugging Face repository name and the filenames
repo_name = "Clemylia/Ice-Clem" # Make sure this matches the repository name you used
model_filename = "pytorch_model.bin"
word_to_index_filename = "word_to_index.json"
index_to_word_filename = "index_to_word.json"

# Download the files from the Hugging Face Hub
try:
    model_path = hf_hub_download(repo_id=repo_name, filename=model_filename)
    word_to_index_path = hf_hub_download(repo_id=repo_name, filename=word_to_index_filename)
    index_to_word_path = hf_hub_download(repo_id=repo_name, filename=index_to_word_filename)

    print(f"Downloaded model to: {model_path}")
    print(f"Downloaded word_to_index to: {word_to_index_path}")
    print(f"Downloaded index_to_word to: {index_to_word_path}")

    # Load the word_to_index and index_to_word mappings
    with open(word_to_index_path, 'r') as f:
        word_to_index = json.load(f)

    with open(index_to_word_path, 'r') as f:
        index_to_word = json.load(f)
        # Convert keys back to integers if they were saved as strings
        index_to_word = {int(k): v for k, v in index_to_word.items()}


    # Define the model architecture (must match the architecture used for training)
    # You'll need the same hyperparameters as before
    vocab_size = len(word_to_index)
    embedding_dim = 100 # Make sure this matches your training parameter
    hidden_dim = 256 # Make sure this matches your training parameter
    output_dim = vocab_size

    class FoodCombinerModel(torch.nn.Module):
        def __init__(self, vocab_size, embedding_dim, hidden_dim, output_dim):
            super(FoodCombinerModel, self).__init__()
            self.embedding = torch.nn.Embedding(vocab_size, embedding_dim)
            self.lstm = torch.nn.LSTM(embedding_dim, hidden_dim, batch_first=True)
            self.fc = torch.nn.Linear(hidden_dim, output_dim)

        def forward(self, x):
            embedded = self.embedding(x)
            lstm_out, _ = self.lstm(embedded)
            output = self.fc(lstm_out[:, -1, :])
            return output

    # Instantiate the model
    loaded_model = FoodCombinerModel(vocab_size, embedding_dim, hidden_dim, output_dim)

    # Load the saved state dictionary
    # map_location lets the weights load even on a CPU-only machine
    loaded_model.load_state_dict(torch.load(model_path, map_location=torch.device("cpu")))

    # Set the model to evaluation mode
    loaded_model.eval()

    print("Model loaded successfully!")

    # Now you can use the loaded_model for generation
    # Make sure a generate_combination function is defined and accessible before
    # running this snippet (a possible sketch is shown after this example)

    # Generate a combination using the loaded model
    starting_phrase_loaded = "sushi" # You can use any word from your vocabulary
    generated_combination_loaded = generate_combination(loaded_model, starting_phrase_loaded, word_to_index, index_to_word)
    print(f"\nGenerated combination using loaded model starting with '{starting_phrase_loaded}': {generated_combination_loaded}")

    starting_phrase_loaded_2 = "pizza"
    generated_combination_loaded_2 = generate_combination(loaded_model, starting_phrase_loaded_2, word_to_index, index_to_word)
    print(f"Generated combination using loaded model starting with '{starting_phrase_loaded_2}': {generated_combination_loaded_2}")

except Exception as e:
    print(f"An error occurred: {e}")
    print("Please ensure the repository name is correct and the files exist on Hugging Face Hub.")
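The snippet above calls a generate_combination helper that is not shown on this card. Below is a minimal sketch of what such a helper might look like, assuming the next ingredient is sampled from the softmax over the model's output logits and that five extra words are generated; the sequence length and sampling strategy are assumptions, not documented values. Define it before running the loading snippet so the final print statements work.

import torch

def generate_combination(model, starting_phrase, word_to_index, index_to_word, num_words=5):
    """Sample a chain of ingredients from the model, starting from one word.

    Illustrative sketch only: the original training code may use a different
    sequence length or sampling strategy.
    """
    model.eval()
    words = starting_phrase.lower().split()
    # Encode the starting word(s) as a batch of size 1
    input_ids = torch.tensor([[word_to_index[w] for w in words]], dtype=torch.long)

    with torch.no_grad():
        for _ in range(num_words):
            logits = model(input_ids)                           # shape: (1, vocab_size)
            probs = torch.softmax(logits, dim=-1)
            next_id = torch.multinomial(probs, num_samples=1)   # random, weighted by the model
            words.append(index_to_word[next_id.item()])
            # Feed the growing sequence back into the model
            input_ids = torch.cat([input_ids, next_id], dim=1)

    return " ".join(words)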