This is a quickstart guide for using the Chat Completions functionality.

Chat completions let you choose a model (e.g., llama3.2-3b or the name of a model you fine-tuned) and generate a response from a text-based prompt.

Keep in mind that Prem is designed to make creating custom models easy and efficient. Most of the functionality, such as fine-tuning, evaluations, stats, and the playground, is available on the Prem platform. Once you’ve configured your settings, you can use your custom models through the Prem AI API or the OpenAI SDKs.

If you want to learn about Datasets, Autonomous Fine-Tuning, Evaluations, Stats and Playground, please refer to those guides.

The quickest way to get started with Prem is to use the PremAI SDK or the OpenAI SDKs for chat completions, as shown below.

You can also generate chat completions by calling the Prem AI API directly. Because it is a plain HTTP API, you can use it from any programming language that has an HTTP client.

For API usage documentation, refer to the API Reference.
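
Since the API follows the same chat-completions request shape as the SDK examples below, a raw HTTP call looks roughly like the sketch here. The base URL is an assumption for illustration only, so confirm the exact endpoint in the API Reference.

// Minimal sketch of a raw HTTP request to the chat completions API.
// NOTE: the base URL below is an assumption; see the API Reference for the real endpoint.
const response = await fetch('https://studio.premai.io/api/v1/chat/completions', {
    method: 'POST',
    headers: {
        'Authorization': `Bearer ${process.env.PREMAI_API_KEY}`,
        'Content-Type': 'application/json',
    },
    body: JSON.stringify({
        model: 'llama3.2-3b',
        messages: [{ role: 'user', content: 'Write a one-sentence bedtime story about a unicorn.' }],
    }),
});

const data = await response.json();
console.log(data.choices[0].message.content);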

Create an API Key 🔑

Click the API Key button on the sidebar. Then click the + Create API Key button.

Afterwards, copy the API key and save it in a secure location.

Install the PremAI SDK

npm install premai

Chat Completions

import PremAI from 'premai';

const client = new PremAI({
  apiKey: process.env['PREMAI_API_KEY'], // This is the default and can be omitted
});

const response = await client.chat.completions({
    messages: [{
        role: 'user',
        content: 'Write a one-sentence bedtime story about a unicorn.'
    }],
    model: 'llama3.2-3b'
});

console.log(response.choices[0].message.content);
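
The messages array can also carry a full conversation, including a system prompt to steer the model. The sketch below reuses the same call pattern as the example above and assumes the standard chat-completions message roles.

// Sketch: passing a system prompt alongside the user message (standard chat-completions roles assumed).
const storyResponse = await client.chat.completions({
    messages: [
        { role: 'system', content: 'You are a concise storyteller.' },
        { role: 'user', content: 'Write a one-sentence bedtime story about a unicorn.' }
    ],
    model: 'llama3.2-3b' // Or the name of a model you fine-tuned
});

console.log(storyResponse.choices[0].message.content);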

Chat Completion with Streaming

import PremAI from "premai";

const client = new PremAI({
    apiKey: process.env.PREMAI_API_KEY,
});

// Create a streaming chat completion
const response = await client.chat.completions.create({
    model: "llama3.2-3b", // Or any other model you want to use
    messages: [{
        role: "user",
        content: "Write a one-sentence bedtime story about a unicorn."
    }],
    stream: true,
});

// Print each streamed token as it arrives; some chunks may carry no content.
for await (const chunk of response) {
    process.stdout.write(chunk.choices[0]?.delta?.content ?? "");
}
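
Chat Completions with the OpenAI SDK

As mentioned above, you can also use the OpenAI SDKs with Prem. The sketch below assumes an OpenAI-compatible endpoint; the baseURL shown is an assumption for illustration, so confirm the exact value in the Client SDKs guide or the API Reference.

import OpenAI from "openai";

// Sketch of pointing the OpenAI SDK at Prem's OpenAI-compatible endpoint.
const openaiClient = new OpenAI({
    apiKey: process.env.PREMAI_API_KEY,
    baseURL: "https://studio.premai.io/api/v1", // NOTE: assumed base URL; check the API Reference
});

const completion = await openaiClient.chat.completions.create({
    model: "llama3.2-3b",
    messages: [{
        role: "user",
        content: "Write a one-sentence bedtime story about a unicorn."
    }],
});

console.log(completion.choices[0].message.content);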

Next Steps

  • Explore the Client SDKs to learn how to use the PremAI SDK and OpenAI SDK.
  • Add your own Dataset as a first step to create your own custom models.
  • Next, Autonomously Fine-Tune your models using a dataset.
  • Then, use Evaluations to track your model’s performance.
  • Afterwards, keep up with your model’s performance with Stats.
  • Finally, test your fine-tuned models and pre-trained models in the Playground.
  • Bring your models to production, or repeat any of these steps to improve your models or create new ones.

Read the Guides

Support