How to Seamlessly Connect Chinese LLMs to Laravel AI SDK
This guide shows how to connect domestic large language models such as Qwen, Hunyuan, and DeepSeek to Laravel AI SDK by configuring custom base URLs and API keys. It covers step-by-step installation, model configuration, sample code, and a unified service class for backend applications.
Laravel AI SDK supports custom Base URL configuration, allowing developers to replace the default OpenAI endpoints with compatible domestic models that are more stable and cost‑effective in China.
1. Core Principle: Custom Base URL
The SDK lets you override base_url and api_key for any provider, enabling seamless use of Qwen, Hunyuan, or DeepSeek without changing application logic.
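In practice, pointing the SDK at another vendor comes down to two settings per provider entry. The sketch below uses a placeholder provider key and env variable names to show the shape; the real entries for Qwen, Hunyuan, and DeepSeek follow in the sections below.

```php
// config/ai.php — a generic provider entry (names here are placeholders).
// Any OpenAI-compatible endpoint can reuse the 'openai' driver; only the
// API key and base URL change.
'providers' => [
    'my-provider' => [
        'driver' => 'openai',
        'key'    => env('MY_PROVIDER_API_KEY'),
        'url'    => env('MY_PROVIDER_BASE_URL'),
    ],
],
```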
2. Install Laravel AI SDK
# Install the SDK (use a proxy if needed)
composer require laravel/ai
# Publish the configuration file
php artisan vendor:publish --provider="Laravel\Ai\AiServiceProvider"
# Run migrations (if any)
php artisan migrate

3. Integrate Alibaba Qwen
Obtain an API key from the Alibaba Cloud Bailian console.
Add the following entries to .env:
# Qwen configuration
QWEN_API_KEY=sk-xxxxxxxxxxxxxxxxxxxxxxxx
QWEN_BASE_URL=https://dashscope.aliyuncs.com/compatible-mode/v1

Update config/ai.php:
'providers' => [
'qwen' => [
'driver' => 'openai',
'key' => env('QWEN_API_KEY'),
'url' => env('QWEN_BASE_URL'),
],
],

Available Qwen models:
qwen-max: strongest reasoning, suitable for complex tasks and code generation.
qwen-plus: balanced performance, good for everyday conversation and content creation.
qwen-turbo: fastest and cheapest, ideal for high-concurrency simple Q&A.
qwen-long: ultra-long context (1M tokens), perfect for long-document analysis.
qwen-coder-plus: code-focused model for completion and review.
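If your application handles several task types, you can centralize the model choice in a small helper. The mapping below is our own convention derived from the model descriptions above, not something the SDK provides:

```php
<?php
// Illustrative helper: map an application-level task type to a Qwen model.
// The task names and the mapping are assumptions for this example.
function pickQwenModel(string $task): string
{
    return match ($task) {
        'complex', 'codegen' => 'qwen-max',        // strongest reasoning
        'chat', 'writing'    => 'qwen-plus',       // balanced default
        'simple-qa'          => 'qwen-turbo',      // cheapest, fastest
        'long-document'      => 'qwen-long',       // 1M-token context window
        'code-review'        => 'qwen-coder-plus', // code-focused
        default              => 'qwen-plus',
    };
}
```

The returned name would then be passed to `->model()` in the controller below.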
Sample controller for basic chat and code review:
<?php
namespace App\Http\Controllers;
use Illuminate\Http\Request;
use Laravel\Ai\Facades\AI;
class QwenController extends Controller {
// Basic conversation
public function chat(Request $request) {
$request->validate(['message' => 'required|string|max:2000']);
$response = AI::provider('qwen')
->model('qwen-plus')
->chat()
->send([
['role' => 'system', 'content' => 'You are a professional PHP consultant experienced with Laravel.'],
['role' => 'user', 'content' => $request->message],
]);
return response()->json([
'reply' => $response->content,
'model' => 'qwen-plus',
'tokens' => $response->usage->totalTokens ?? null,
]);
}
// Code review using the coder model
public function reviewCode(Request $request) {
$request->validate(['code' => 'required|string']);
$response = AI::provider('qwen')
->model('qwen-coder-plus')
->chat()
->send([
['role' => 'system', 'content' => 'You are a senior PHP code reviewer. Analyze security, performance, and maintainability.'],
['role' => 'user', 'content' => "Please review the following PHP code:
```php
{$request->code}
```"],
]);
return response()->json(['review' => $response->content]);
}
// Streaming response example
public function stream(Request $request) {
$message = $request->input('message', 'Introduce Laravel 11 new features');
return response()->stream(function () use ($message) {
$stream = AI::provider('qwen')
->model('qwen-plus')
->chat()
->stream([
['role' => 'user', 'content' => $message],
]);
foreach ($stream as $chunk) {
echo "data: " . json_encode(['content' => $chunk]) . "\n\n";
ob_flush();
flush();
}
echo "data: [DONE]\n\n";
}, 200, [
'Content-Type' => 'text/event-stream',
'Cache-Control' => 'no-cache',
'X-Accel-Buffering' => 'no',
]);
}
}

4. Integrate Tencent Hunyuan (Yuanbao)
Log in to the Tencent Cloud console and enable the Hunyuan service.
Create an API key in the "API Key Management" section.
Add to .env:
# Hunyuan configuration
HUNYUAN_API_KEY=xxxxxxxxxxxxxxxxxxxxxxxx
HUNYUAN_BASE_URL=https://api.hunyuan.cloud.tencent.com/v1

Update config/ai.php:
'providers' => [
'hunyuan' => [
'driver' => 'openai',
'key' => env('HUNYUAN_API_KEY'),
'url' => env('HUNYUAN_BASE_URL'),
],
],

Key Hunyuan models:
hunyuan-turbos-latest: flagship model, 256K context.
hunyuan-large: large-parameter model, 256K context.
hunyuan-standard: cost-effective standard model, 256K context.
hunyuan-lite: lightweight and fast, 256K context.
hunyuan-code: code-focused, 8K context.
Sample controller for chat, summarization, and multi‑turn conversation follows the same pattern as the Qwen example, simply swapping the provider name and model.
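For instance, only the message payload and the provider/model strings differ. A small pure helper can assemble the summarization messages; its result would then be sent via AI::provider('hunyuan')->model('hunyuan-turbos-latest')->chat()->send(...) exactly as in the Qwen controller. The helper name and prompt wording here are our own:

```php
<?php
// Illustrative helper: assemble the message array for a summarization call.
// Prior turns (for multi-turn conversation) are spliced in between the
// system prompt and the new user message.
function buildSummaryMessages(string $text, array $history = []): array
{
    return array_merge(
        [['role' => 'system', 'content' => 'Summarize the user text in three concise bullet points.']],
        $history,
        [['role' => 'user', 'content' => $text]]
    );
}
```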
5. Integrate DeepSeek
Register on the DeepSeek Open Platform, complete real‑name verification, and create an API key.
Add to .env:
# DeepSeek configuration
DEEPSEEK_API_KEY=sk-xxxxxxxxxxxxxxxxxxxxxxxx
DEEPSEEK_BASE_URL=https://api.deepseek.com

Update config/ai.php with either the native driver or OpenAI-compatible mode:
'providers' => [
// Native DeepSeek driver (recommended)
'deepseek' => [
'driver' => 'deepseek',
'key' => env('DEEPSEEK_API_KEY'),
],
// OpenAI‑compatible wrapper
'deepseek-compat' => [
'driver' => 'openai',
'key' => env('DEEPSEEK_API_KEY'),
'url' => env('DEEPSEEK_BASE_URL'),
],
],

Key DeepSeek models:
deepseek-chat: fast, suitable for everyday dialogue.
deepseek-reasoner: reasoning mode, ideal for complex technical analysis.
Sample controller demonstrates basic chat, reasoning, and code generation using the DeepSeek models.
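One pattern the two models suggest: route a request to deepseek-reasoner only when deep analysis is needed, and default to the cheaper deepseek-chat otherwise. A minimal sketch (the flag and function name are our own invention, not SDK features):

```php
<?php
// Illustrative routing: choose the DeepSeek model per request, so everyday
// dialogue stays on the cheaper model and only flagged requests pay for
// reasoning mode.
function pickDeepSeekModel(bool $needsReasoning): string
{
    return $needsReasoning ? 'deepseek-reasoner' : 'deepseek-chat';
}

// e.g. AI::provider('deepseek')->model(pickDeepSeekModel(true))->chat()->send(...);
```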
6. Complete Configuration File
<?php
return [
// Default provider
'default' => env('AI_DEFAULT_PROVIDER', 'deepseek'),
// Provider configurations
'providers' => [
// Qwen
'qwen' => [
'driver' => 'openai',
'key' => env('QWEN_API_KEY'),
'url' => env('QWEN_BASE_URL', 'https://dashscope.aliyuncs.com/compatible-mode/v1'),
],
// Hunyuan
'hunyuan' => [
'driver' => 'openai',
'key' => env('HUNYUAN_API_KEY'),
'url' => env('HUNYUAN_BASE_URL', 'https://api.hunyuan.cloud.tencent.com/v1'),
],
// DeepSeek
'deepseek' => [
'driver' => 'deepseek',
'key' => env('DEEPSEEK_API_KEY'),
],
// Fallback OpenAI (optional)
'openai' => [
'driver' => 'openai',
'key' => env('OPENAI_API_KEY'),
],
],
// Default model settings
'defaults' => [
'text' => env('AI_DEFAULT_TEXT_MODEL', 'deepseek-chat'),
],
];

7. Unified AI Service Class
<?php
namespace App\Services;
use Laravel\Ai\Facades\AI;
class AiService {
protected array $modelMap = [
'qwen' => ['provider' => 'qwen', 'model' => 'qwen-plus'],
'hunyuan' => ['provider' => 'hunyuan', 'model' => 'hunyuan-turbos-latest'],
'deepseek' => ['provider' => 'deepseek', 'model' => 'deepseek-chat'],
];
/**
* Send a chat request.
* @param string $message User message
* @param string $provider Provider key (qwen/hunyuan/deepseek)
* @param string $system System prompt
* @param array $history Previous messages
* @return string
*/
public function chat(string $message, string $provider = 'deepseek', string $system = 'You are a professional PHP assistant.', array $history = []): string {
$config = $this->modelMap[$provider] ?? $this->modelMap['deepseek'];
$messages = array_merge([
['role' => 'system', 'content' => $system],
], $history, [
['role' => 'user', 'content' => $message],
]);
$response = AI::provider($config['provider'])
->model($config['model'])
->chat()
->send($messages);
return $response->content;
}
/**
* Fallback across providers until one succeeds.
*/
public function chatWithFallback(string $message): string {
foreach (['deepseek', 'qwen', 'hunyuan'] as $provider) {
try {
return $this->chat($message, $provider);
} catch (\Exception $e) {
\Log::warning("AI provider [$provider] failed: " . $e->getMessage());
continue;
}
}
throw new \RuntimeException('All AI providers are unavailable.');
}
}

8. Using the Service in a Controller
<?php
namespace App\Http\Controllers;
use App\Services\AiService;
use Illuminate\Http\Request;
class AiController extends Controller {
public function __construct(protected AiService $aiService) {}
public function chat(Request $request) {
$request->validate([
'message' => 'required|string|max:2000',
'provider' => 'nullable|in:qwen,hunyuan,deepseek',
]);
$reply = $this->aiService->chatWithFallback($request->message);
return response()->json(['reply' => $reply]);
}
}

9. Creating an AI Agent (LaravelAssistant)
<?php
namespace App\Ai\Agents;
use Laravel\Ai\Contracts\Agent;
use Laravel\Ai\Contracts\Conversational;
use Laravel\Ai\Promptable;
use Stringable;
class LaravelAssistant implements Agent, Conversational {
use Promptable;
public string $provider = 'deepseek';
public string $model = 'deepseek-chat';
public function instructions(): Stringable|string {
return <<<PROMPT
You are a professional Laravel assistant. You should:
1. Answer PHP/Laravel technical questions.
2. Provide complete, runnable code examples.
3. Highlight potential security risks.
4. Follow Laravel best practices.
PROMPT;
}
public function messages(): iterable {
return [];
}
}

Usage example:
use App\Ai\Agents\LaravelAssistant;
use Laravel\Ai\Facades\AI;
$agent = app(LaravelAssistant::class);
$response = AI::agent($agent)->prompt('How to implement retry logic for failed Laravel queue jobs?');
echo $response->content;

By leveraging Laravel AI SDK's custom Base URL mechanism, developers can quickly switch to stable, cost-effective domestic LLMs while keeping the same Laravel-centric codebase.
This article has been distilled and summarized from source material, then republished for learning and reference. If you believe it infringes your rights, please contact us and we will review it promptly.
Laravel Tech Community
Specializing in Laravel development, we continuously publish fresh content and grow alongside the elegant, stable Laravel framework.