This Laravel package provides an easy-to-use interface for integrating OpenRouter into your Laravel applications. OpenRouter is a unified interface for Large Language Models (LLMs) that allows you to interact with various AI models through a single API.
You can install the package via Composer:

```bash
composer require moe-mizrak/laravel-openrouter
```
You can publish the config file with:
```bash
php artisan vendor:publish --tag=laravel-openrouter
```
This is the contents of the published config file:
```php
return [
    'api_endpoint' => env('OPENROUTER_API_ENDPOINT', 'https://openrouter.ai/api/v1/'),
    'api_key' => env('OPENROUTER_API_KEY'),
];
```
After publishing the package configuration file, you'll need to add the following environment variables to your .env file:
```env
OPENROUTER_API_ENDPOINT=https://openrouter.ai/api/v1/
OPENROUTER_API_KEY=your_api_key
```
This package provides two ways to interact with the OpenRouter API:

- the `LaravelOpenRouter` facade
- the `OpenRouterRequest` class, used directly

Both methods utilize the `ChatData` DTO class to structure the data sent to the API.

The `ChatData` class is used to encapsulate the data required for making chat requests to the OpenRouter API. Here's a breakdown of the key properties:

- `messages`: An array of `MessageData` objects representing the chat messages. This field is XOR-gated with the `prompt` field.
- `prompt`: A prompt string for the chat request. This field is XOR-gated with the `messages` field.
- `model`: The name of the model to use for the chat request. This field is XOR-gated with the `models` field.
- `response_format`: An instance of the `ResponseFormatData` class representing the desired format for the response.
- LLM parameters (`max_tokens`, `temperature`, `top_p`, `top_k`, `frequency_penalty`, `presence_penalty`, `repetition_penalty`, `seed`): These properties control various aspects of the generated response (more info).
- `tools`: An array of `ToolCallData` objects for function calling. Only natively supported by OpenAI models. For others, we submit a YAML-formatted string with these tools at the end of the prompt.
- `models`: A list of fallback models. This field is XOR-gated with the `model` field.
- `route`: The routing strategy (e.g. `RouteType::FALLBACK`).
- `provider`: A `ProviderPreferencesData` DTO object for configuring provider preferences.

This is a sample chat data instance:
```php
$chatData = new ChatData([
    'messages' => [
        new MessageData([
            'role' => RoleType::USER,
            'content' => [
                new TextContentData([
                    'type' => TextContentData::ALLOWED_TYPE,
                    'text' => 'This is a sample text content.',
                ]),
                new ImageContentPartData([
                    'type' => ImageContentPartData::ALLOWED_TYPE,
                    'image_url' => new ImageUrlData([
                        'url' => 'https://example.com/image.jpg',
                        'detail' => 'Sample image',
                    ]),
                ]),
            ],
        ]),
    ],
    'response_format' => new ResponseFormatData([
        'type' => 'json_object',
    ]),
    'stop' => ['stop_token'],
    'stream' => true,
    'max_tokens' => 1024,
    'temperature' => 0.7,
    'top_p' => 0.9,
    'top_k' => 50,
    'frequency_penalty' => 0.5,
    'presence_penalty' => 0.2,
    'repetition_penalty' => 1.2,
    'seed' => 42,
    'tool_choice' => 'auto',
    'tools' => [
        // ToolCallData instances
    ],
    'logit_bias' => [
        '50256' => -100,
    ],
    'transforms' => ['middle-out'],
    'models' => ['model1', 'model2'],
    'route' => RouteType::FALLBACK,
    'provider' => new ProviderPreferencesData([
        'allow_fallbacks' => true,
        'require_parameters' => true,
        'data_collection' => DataCollectionType::ALLOW,
    ]),
]);
```
The `LaravelOpenRouter` facade offers a convenient way to make OpenRouter API requests.

To send a chat request, create an instance of `ChatData` and pass it to the `chatRequest` method:
```php
$content = 'Tell me a story about a rogue AI that falls in love with its creator.'; // Your desired prompt or content
$model = 'mistralai/mistral-7b-instruct:free'; // The OpenRouter model you want to use (https://openrouter.ai/docs#models)

$messageData = new MessageData([
    'content' => $content,
    'role' => RoleType::USER,
]);

$chatData = new ChatData([
    'messages' => [
        $messageData,
    ],
    'model' => $model,
    'max_tokens' => 100, // Adjust this value as needed
]);

$chatResponse = LaravelOpenRouter::chatRequest($chatData);
```
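As a sketch of reading the result, assuming the response shape shown in the examples later in this README (`choices[0]['message']['content']`), the generated text can be pulled out of the returned `ResponseData` DTO with Laravel's `Arr` helper:

```php
use Illuminate\Support\Arr;

// Sketch: extract the generated text from the ResponseData DTO.
// The 'message.content' path is taken from the response examples in this README.
$content = Arr::get($chatResponse->choices[0], 'message.content');

logger()->info('OpenRouter reply', ['content' => $content]);
```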
Streaming chat requests are also supported via the `chatStreamRequest` method:
```php
$content = 'Tell me a story about a rogue AI that falls in love with its creator.'; // Your desired prompt or content
$model = 'mistralai/mistral-7b-instruct:free'; // The OpenRouter model you want to use (https://openrouter.ai/docs#models)

$messageData = new MessageData([
    'content' => $content,
    'role' => RoleType::USER,
]);

$chatData = new ChatData([
    'messages' => [
        $messageData,
    ],
    'model' => $model,
    'max_tokens' => 100, // Adjust this value as needed
]);

$promise = LaravelOpenRouter::chatStreamRequest($chatData); // Returns a promise

// Waits until the promise completes if possible.
$stream = $promise->wait(); // $stream is of type GuzzleHttp\Psr7\Stream

// Retrieve the streamed raw response as it becomes available:
while (! $stream->eof()) {
    $rawResponse = $stream->read(1024); // The byte size can be set as desired; for better performance, 4096 bytes (4 kB) can be used.

    /*
     * Optionally you can use filterStreamingResponse to filter the raw streamed response,
     * and map it into an array of ResponseData DTOs, same as the chatRequest response format.
     */
    $response = LaravelOpenRouter::filterStreamingResponse($rawResponse);
}
```
You do **not** need to specify `'stream' => true` in `ChatData`, since `chatStreamRequest` sets it for you.
This is the expected sample `$rawResponse` (raw response returned from an OpenRouter stream chunk):
```php
"""
: OPENROUTER PROCESSING\n
\n
data: {"id":"gen-eWgGaEbIzFq4ziGGIsIjyRtLda54","model":"mistralai/mistral-7b-instruct:free","object":"chat.completion.chunk","created":1718885921,"choices":[{"index":0,"delta":{"role":"assistant","content":"Title"},"finish_reason":null}]}\n
\n
data: {"id":"gen-eWgGaEbIzFq4ziGGIsIjyRtLda54","model":"mistralai/mistral-7b-instruct:free","object":"chat.completion.chunk","created":1718885921,"choices":[{"index":0,"delta":{"role":"assistant","content":": Quant"},"finish_reason":null}]}\n
\n
data: {"id":"gen-eWgGaEbIzFq4ziGGIsIjyRtLda54","model":"mistralai/mistral-7b-instruct:free","object":"chat.completion.chunk","created":1718885921,"choices":[{"index":0,"delta":{"role":"assistant","content":"um Echo"},"finish_reason":null}]}\n
\n
data: {"id":"gen-eWgGaEbIzFq4ziGGIsIjyRtLda54","model":"mistralai/mistral-7b-instruct:free","object":"chat.completion.chunk","created":1718885921,"choices":[{"index":0,"delta":{"role":"assistant","content":": A Sym"},"finish_reason":null}]}\n
\n
data: {"id":"gen-eWgGaEbIzFq4ziGG
"""
"""
IsIjyRtLda54","model":"mistralai/mistral-7b-instruct:free","object":"chat.completion.chunk","created":1718885921,"choices":[{"index":0,"delta":{"role":"assistant","content":"phony of Code"},"finish_reason":null}]}\n
\n
data: {"id":"gen-eWgGaEbIzFq4ziGGIsIjyRtLda54","model":"mistralai/mistral-7b-instruct:free","object":"chat.completion.chunk","created":1718885921,"choices":[{"index":0,"delta":{"role":"assistant","content":"\n\nIn"},"finish_reason":null}]}\n
\n
data: {"id":"gen-eWgGaEbIzFq4ziGGIsIjyRtLda54","model":"mistralai/mistral-7b-instruct:free","object":"chat.completion.chunk","created":1718885921,"choices":[{"index":0,"delta":{"role":"assistant","content":" the heart of"},"finish_reason":null}]}\n
\n
data: {"id":"gen-eWgGaEbIzFq4ziGGIsIjyRtLda54","model":"mistralai/mistral-7b-instruct:free","object":"chat.completion.chunk","created":1718885921,"choices":[{"index":0,"delta":{"role":"assistant","content":" the bustling"},"finish_reason":null}]}\n
\n
data: {"id":"gen-eWgGaEbIzFq4ziGGIsIjyRtLda54","model":"mistralai/mistra
"""
"""
l-7b-instruct:free","object":"chat.completion.chunk","created":1718885921,"choices":[{"index":0,"delta":{"role":"assistant","content":" city of Ne"},"finish_reason":null}]}\n
\n
data: {"id":"gen-eWgGaEbIzFq4ziGGIsIjyRtLda54","model":"mistralai/mistral-7b-instruct:free","object":"chat.completion.chunk","created":1718885921,"choices":[{"index":0,"delta":{"role":"assistant","content":"o-Tok"},"finish_reason":null}]}\n
\n
data: {"id":"gen-eWgGaEbIzFq4ziGGIsIjyRtLda54","model":"mistralai/mistral-7b-instruct:free","object":"chat.completion.chunk","created":1718885921,"choices":[{"index":0,"delta":{"role":"assistant","content":"yo, a"},"finish_reason":null}]}\n
\n
data: {"id":"gen-eWgGaEbIzFq4ziGGIsIjyRtLda54","model":"mistralai/mistral-7b-instruct:free","object":"chat.completion.chunk","created":1718885921,"choices":[{"index":0,"delta":{"role":"assistant","content":" brilliant young research"},"finish_reason":null}]}\n
\n
data: {"id":"gen-eWgGaEbIzFq4ziGGIsIjyRtLda54","model":"mistralai/mistral-7b-instruct:free","object":"chat.com
"""
...
: OPENROUTER PROCESSING\n
\n
data: {"id":"gen-C6Xym94jZcvJv2vVpxYSyw2tV1fR","model":"mistralai/mistral-7b-instruct:free","object":"chat.completion.chunk","created":1718887189,"choices":[{"index":0,"delta":{"role":"assistant","content":""},"finish_reason":null}],"usage":{"prompt_tokens":23,"completion_tokens":100,"total_tokens":123}}\n
\n
data: [DONE]\n
```
The last `data:` chunk carries the usage information of the stream. `data: [DONE]\n` is returned by the OpenRouter server when streaming is over.
This is the sample response after `filterStreamingResponse`:
```php
[
    new ResponseData([
        'id' => 'gen-QcWgjEtiEDNHgomV2jjoQpCZlkRZ',
        'model' => 'mistralai/mistral-7b-instruct:free',
        'object' => 'chat.completion.chunk',
        'created' => 1718888436,
        'choices' => [
            [
                'index' => 0,
                'delta' => [
                    'role' => 'assistant',
                    'content' => 'Title',
                ],
                'finish_reason' => null,
            ],
        ],
        'usage' => null,
    ]),
    new ResponseData([
        'id' => 'gen-QcWgjEtiEDNHgomV2jjoQpCZlkRZ',
        'model' => 'mistralai/mistral-7b-instruct:free',
        'object' => 'chat.completion.chunk',
        'created' => 1718888436,
        'choices' => [
            [
                'index' => 0,
                'delta' => [
                    'role' => 'assistant',
                    'content' => 'Quant',
                ],
                'finish_reason' => null,
            ],
        ],
        'usage' => null,
    ]),
    ...
    new ResponseData([
        'id' => 'gen-QcWgjEtiEDNHgomV2jjoQpCZlkRZ',
        'model' => 'mistralai/mistral-7b-instruct:free',
        'object' => 'chat.completion.chunk',
        'created' => 1718888436,
        'choices' => [
            [
                'index' => 0,
                'delta' => [
                    'role' => 'assistant',
                    'content' => '',
                ],
                'finish_reason' => null,
            ],
        ],
        'usage' => new UsageData([
            'prompt_tokens' => 23,
            'completion_tokens' => 100,
            'total_tokens' => 123,
        ]),
    ]),
]
```
If you want to maintain conversation continuity, meaning that the historical chat is remembered and considered in your new chat request, you need to send the historical messages along with the new message:
```php
$model = 'mistralai/mistral-7b-instruct:free';

$firstMessage = new MessageData([
    'role' => RoleType::USER,
    'content' => 'My name is Moe, the AI necromancer.',
]);

$chatData = new ChatData([
    'messages' => [
        $firstMessage,
    ],
    'model' => $model,
]);

// This is the chat which you want the LLM to remember
$oldResponse = LaravelOpenRouter::chatRequest($chatData);

// Here adding the historical response to the new message
$historicalMessage = new MessageData([
    'role' => RoleType::ASSISTANT, // set as assistant since it is a historical message retrieved previously
    'content' => Arr::get($oldResponse->choices[0], 'message.content'), // historical response content retrieved from the previous chat request
]);

// This is your new message
$newMessage = new MessageData([
    'role' => RoleType::USER,
    'content' => 'Who am I?',
]);

$chatData = new ChatData([
    'messages' => [
        $historicalMessage,
        $newMessage,
    ],
    'model' => $model,
]);

$response = LaravelOpenRouter::chatRequest($chatData);
```
Expected response:
```php
$content = Arr::get($response->choices[0], 'message.content');
// content = "You are Moe, a fictional character and AI Necromancer, as per the context of the conversation we've established. In reality, you are the user interacting with me, an assistant designed to help answer questions and engage in friendly conversation."
```
To retrieve the cost of a generation, first make a chat request and obtain the `generationId`. Then, pass the `generationId` to the `costRequest` method:
```php
$content = 'Tell me a story about a rogue AI that falls in love with its creator.'; // Your desired prompt or content
$model = 'mistralai/mistral-7b-instruct:free'; // The OpenRouter model you want to use (https://openrouter.ai/docs#models)

$messageData = new MessageData([
    'content' => $content,
    'role' => RoleType::USER,
]);

$chatData = new ChatData([
    'messages' => [
        $messageData,
    ],
    'model' => $model,
    'max_tokens' => 100, // Adjust this value as needed
]);

$chatResponse = LaravelOpenRouter::chatRequest($chatData);
$generationId = $chatResponse->id; // generation id which will be passed to costRequest

$costResponse = LaravelOpenRouter::costRequest($generationId);
```
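The shape of `$costResponse` is not shown here; as a hypothetical sketch, assuming the cost DTO mirrors the `total_cost` field of OpenRouter's generation stats, you might read it like this (check the package's cost response DTO for the actual property names):

```php
// Hypothetical: 'total_cost' is assumed to mirror OpenRouter's generation-stats field;
// verify the actual property name on the package's cost response DTO.
$totalCost = $costResponse->total_cost ?? null;

logger()->info('Generation cost', ['id' => $generationId, 'cost' => $totalCost]);
```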
To retrieve rate limit and credits left on the API key:
```php
$limitResponse = LaravelOpenRouter::limitRequest();
```
You can also inject the `OpenRouterRequest` class in the constructor of your class and use its methods directly.
```php
public function __construct(protected OpenRouterRequest $openRouterRequest) {}
```
Similarly, to send a chat request, create an instance of `ChatData` and pass it to the `chatRequest` method:
```php
$content = 'Tell me a story about a rogue AI that falls in love with its creator.'; // Your desired prompt or content
$model = 'mistralai/mistral-7b-instruct:free'; // The OpenRouter model you want to use (https://openrouter.ai/docs#models)

$messageData = new MessageData([
    'content' => $content,
    'role' => RoleType::USER,
]);

$chatData = new ChatData([
    'messages' => [
        $messageData,
    ],
    'model' => $model,
    'max_tokens' => 100, // Adjust this value as needed
]);

$response = $this->openRouterRequest->chatRequest($chatData);
```
Similarly, to retrieve the cost of a generation, create a chat request to obtain the `generationId`, then pass the `generationId` to the `costRequest` method:
```php
$content = 'Tell me a story about a rogue AI that falls in love with its creator.';
$model = 'mistralai/mistral-7b-instruct:free'; // The OpenRouter model you want to use (https://openrouter.ai/docs#models)

$messageData = new MessageData([
    'content' => $content,
    'role' => RoleType::USER,
]);

$chatData = new ChatData([
    'messages' => [
        $messageData,
    ],
    'model' => $model,
    'max_tokens' => 100, // Adjust this value as needed
]);

$chatResponse = $this->openRouterRequest->chatRequest($chatData);
$generationId = $chatResponse->id; // generation id which will be passed to costRequest

$costResponse = $this->openRouterRequest->costRequest($generationId);
```
Similarly, to retrieve rate limit and credits left on the API key:
```php
$limitResponse = $this->openRouterRequest->limitRequest();
```
We welcome contributions! If you'd like to improve this package, simply create a pull request with your changes. Your efforts help enhance its functionality and documentation.
Laravel OpenRouter is an open-sourced software licensed under the MIT license.