After you have fine-tuned your model, you can download the checkpoints from the studio. Once downloaded, unzip the checkpoints and load them with one of the following inference engines.
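If you prefer to script the unpacking step, Python's standard `zipfile` module works well. A minimal sketch (the archive and target paths are placeholders; substitute the file you actually downloaded):

```python
import tempfile
import zipfile
from pathlib import Path

def extract_checkpoint(archive: Path, target: Path) -> list[str]:
    """Unzip a downloaded checkpoint archive into `target` and return its file list."""
    with zipfile.ZipFile(archive) as zf:
        zf.extractall(target)
        return zf.namelist()

# Demo on a throwaway archive standing in for the real download:
with tempfile.TemporaryDirectory() as tmp:
    archive = Path(tmp) / "checkpoint.zip"
    with zipfile.ZipFile(archive, "w") as zf:
        zf.writestr("model.safetensors", b"...")  # placeholder content
    print(extract_checkpoint(archive, Path(tmp) / "model"))
```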
You can use the Ollama library to load the checkpoints and run inference. To create a model, you first need to create a Modelfile:
FROM path/to/finetuned/model/checkpoint
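Beyond the required FROM line, a Modelfile can also bake defaults into the model. A sketch (the SYSTEM text and temperature value below are illustrative choices, not requirements):

```
FROM path/to/finetuned/model/checkpoint

# Optional: set a default system prompt and sampling temperature
SYSTEM You are a helpful recipe assistant.
PARAMETER temperature 0.2
```

Defaults set here apply whenever the model is run, so you can omit the system message at inference time if it never changes.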
Then run the following command to create the model:
ollama create my-model -f Modelfile
Then you can use the model for inference:
from ollama import Client

client = Client(host='http://localhost:11434')

SYSTEM_PROMPT = """You are a helpful recipe assistant. You are to extract the generic ingredients from each of the recipes provided."""

USER_PROMPT = """Title: Lemon Drizzle Cake

Ingredients: ["200g unsalted butter", "200g caster sugar", "4 eggs", "200g self-raising flour", "1 tsp baking powder", "zest of 1 lemon", "100ml lemon juice", "150g icing sugar"]

Generic ingredients:"""

conversation = [
    {"role": "system", "content": SYSTEM_PROMPT},
    {"role": "user", "content": USER_PROMPT},
]

res = client.chat(model='my-model', messages=conversation)
print(res.message.content)
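The reply comes back as free text, so for a task like this you may want to post-process it into a structured list. A minimal sketch, assuming the model returns the generic ingredients separated by commas or newlines (an assumption about the output format, not a guarantee):

```python
def parse_ingredients(reply: str) -> list[str]:
    """Split a comma- or newline-separated reply into a clean list of ingredients."""
    parts = reply.replace("\n", ",").split(",")
    return [p.strip().lower() for p in parts if p.strip()]

# Example with a plausible (hypothetical) model reply:
print(parse_ingredients("Butter, Sugar, Eggs, Flour\nBaking powder, Lemon"))
```

If you need stricter structure, you can also instruct the model in the prompt to answer in a fixed format (for example, one ingredient per line) and parse accordingly.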