The OpenRouter package for Laravel helps you integrate OpenRouter into your Laravel applications. OpenRouter is a unified interface for Large Language Models (LLMs) that allows you to interact with various AI models through a single API.
use MoeMizrak\LaravelOpenrouter\DTO\ChatData;
use MoeMizrak\LaravelOpenrouter\DTO\MessageData;
use MoeMizrak\LaravelOpenrouter\Facades\LaravelOpenRouter;
use MoeMizrak\LaravelOpenrouter\Types\RoleType;

// Your desired prompt or content
$content = 'Tell me a story about a rogue AI that falls in love with its creator.';

// The OpenRouter model you want to use (https://openrouter.ai/docs#models)
$model = 'mistralai/mistral-7b-instruct:free';

$chatData = new ChatData(
    messages: [
        new MessageData(
            content: $content,
            role: RoleType::USER,
        ),
    ],
    model: $model,
    max_tokens: 100, // Adjust this value as needed
);

$chatResponse = LaravelOpenRouter::chatRequest($chatData);
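The call returns a response DTO whose shape mirrors OpenRouter's chat completion payload. As a rough sketch (the property names here are assumptions, so verify them against the package's ResponseData DTO), pulling the generated text out of the response looks something like this:

// Assumption: the response exposes a choices array shaped like OpenRouter's
// chat completion payload. Check the package's ResponseData DTO for the
// authoritative property names before relying on this.
$choices = $chatResponse->choices;
$generatedText = $choices[0]['message']['content'] ?? null;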
The above example illustrates using the LaravelOpenRouter facade, but the package also supports streaming chat requests. Coupled with Laravel's Streamed Responses, you can build frontends on top of any LLM that OpenRouter provides.
$promise = LaravelOpenRouter::chatStreamRequest($chatData);
$stream = $promise->wait(); // $stream is an instance of GuzzleHttp\Psr7\Stream

while (! $stream->eof()) {
    $rawResponse = $stream->read(1024);

    /*
     * Optionally, you can use filterStreamingResponse to filter the raw streamed
     * response and map it into an array of ResponseData DTOs, in the same format
     * as the chatRequest response.
     */
    $response = LaravelOpenRouter::filterStreamingResponse($rawResponse);
}
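To tie this back to the Streamed Responses idea mentioned above, here is a minimal sketch of a controller action that forwards the OpenRouter stream to the browser as it arrives. The controller and its route are hypothetical; it assumes the same ChatData and chatStreamRequest API shown above together with Laravel's response()->stream() helper:

use Illuminate\Http\Request;
use MoeMizrak\LaravelOpenrouter\DTO\ChatData;
use MoeMizrak\LaravelOpenrouter\DTO\MessageData;
use MoeMizrak\LaravelOpenrouter\Facades\LaravelOpenRouter;
use MoeMizrak\LaravelOpenrouter\Types\RoleType;
use Symfony\Component\HttpFoundation\StreamedResponse;

// Hypothetical controller: streams OpenRouter output to the client chunk by chunk.
class ChatStreamController
{
    public function __invoke(Request $request): StreamedResponse
    {
        $chatData = new ChatData(
            messages: [
                new MessageData(
                    content: (string) $request->input('prompt'),
                    role: RoleType::USER,
                ),
            ],
            model: 'mistralai/mistral-7b-instruct:free',
            max_tokens: 100,
        );

        return response()->stream(function () use ($chatData) {
            $stream = LaravelOpenRouter::chatStreamRequest($chatData)->wait();

            while (! $stream->eof()) {
                // Push each raw chunk to the browser as soon as it is read.
                echo $stream->read(1024);

                if (ob_get_level() > 0) {
                    ob_flush();
                }
                flush();
            }
        }, 200, [
            'Content-Type' => 'text/plain; charset=utf-8',
            'Cache-Control' => 'no-cache',
            'X-Accel-Buffering' => 'no', // Ask nginx not to buffer the streamed response.
        ]);
    }
}

On the frontend, the chunks can be consumed with fetch() and a ReadableStream reader, or you could switch the content type to text/event-stream and emit SSE-formatted lines instead.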
You can learn more about this package, get full installation instructions, and view the source code on GitHub. To get started with OpenRouter, check out the OpenRouter Quickstart Guide.
To install this package, run the following commands:
composer require moe-mizrak/laravel-openrouter
php artisan vendor:publish --tag=laravel-openrouter
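After publishing the configuration, you also need to provide your OpenRouter credentials. The environment variable names below are how the published config typically reads them, but treat them as an assumption and confirm them against config/laravel-openrouter.php:

# .env — confirm the exact keys against the published config file
OPENROUTER_API_ENDPOINT=https://openrouter.ai/api/v1/
OPENROUTER_API_KEY=your-api-key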