Using PapiAI to Build an AI Chat with Laravel

From zero to a working chatbot with streaming, tools, and conversation persistence.

In this tutorial, we'll build a real-time AI chat application using Laravel and PapiAI. The finished app will stream responses as they're generated, persist conversation history across page reloads, and give the AI a tool it can use to look up information. We'll go step by step, from composer require to a working chatbot.

What we're building

A browser-based chat interface where users type messages and receive streamed AI responses. The AI has access to a "knowledge base" tool that lets it look up company FAQ answers. Conversations persist in the database so users can continue where they left off.

Step 1: Install the packages

Start with a fresh Laravel application (or an existing one). Install PapiAI's Laravel bridge and the OpenAI provider:

composer require papi-ai/laravel papi-ai/openai

Publish the PapiAI config file:

php artisan vendor:publish --tag=papi-config

Add your API key to .env:

PAPI_PROVIDER=openai
OPENAI_API_KEY=sk-your-key-here

Step 2: Set up conversation storage

We want conversations to persist across requests. Update config/papi.php to use Eloquent storage:

// config/papi.php
'conversation' => [
    'store' => 'eloquent',
],
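With the Eloquent store, each conversation is a single row whose `data` column holds the serialized message history. The real serialization format is internal to PapiAI, but conceptually it is a JSON document along these lines (the field names here are illustrative, not the package's actual schema):

```php
<?php
// Illustrative only: a plausible shape for the `data` JSON column.
// PapiAI defines the real serialization format; do not rely on this layout.

$data = json_encode([
    'system'   => 'You are a helpful customer support assistant.',
    'messages' => [
        ['role' => 'user', 'content' => 'What are your pricing plans?'],
        ['role' => 'assistant', 'content' => 'Our plans start at $9/month.'],
    ],
]);

// On the next request, the store decodes the row back into a Conversation.
$decoded = json_decode($data, true);
echo count($decoded['messages']), "\n"; // 2
```

The point is simply that the whole history round-trips through one JSON column, which is why the migration above only needs `id`, `data`, and a timestamp.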

Create the migration:

php artisan make:migration create_papi_conversations_table

Fill in the generated file:

// database/migrations/xxxx_create_papi_conversations_table.php
public function up(): void
{
    Schema::create('papi_conversations', function (Blueprint $table) {
        $table->string('id')->primary();
        $table->json('data');
        $table->timestamp('updated_at')->nullable();
    });
}

public function down(): void
{
    Schema::dropIfExists('papi_conversations');
}

Then run it:

php artisan migrate

Step 3: Create a knowledge base tool

Let's give the AI a tool it can call to look up answers from a FAQ database. Create a tool class that uses PHP attributes:

// app/AI/Tools/KnowledgeBase.php
namespace App\AI\Tools;

use PapiAI\Core\Attributes\Tool;
use PapiAI\Core\Attributes\Description;

class KnowledgeBase
{
    private array $faqs = [
        'pricing' => 'Our plans start at $9/month for individuals and $29/month for teams.',
        'refund' => 'We offer a 30-day money-back guarantee on all plans.',
        'support' => 'Email support@example.com or use the chat widget. Response time is under 4 hours.',
        'trial' => 'Yes, we offer a 14-day free trial with full access to all features.',
    ];

    #[Tool('Search the knowledge base for answers to customer questions')]
    public function search(
        #[Description('The topic to search for (e.g. pricing, refund, support, trial)')] string $topic,
    ): string {
        $topic = strtolower($topic);

        foreach ($this->faqs as $key => $answer) {
            if (str_contains($topic, $key)) {
                return $answer;
            }
        }

        return 'No information found for that topic. Suggest that the user contact support.';
    }
}
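Before wiring the tool into an agent, it's worth sanity-checking the matching behavior in isolation. The sketch below mirrors the tool's substring logic in a plain function (`searchFaqs` is a hypothetical helper, not part of PapiAI) so you can see that a natural-language question like "What is your pricing?" matches the `pricing` entry:

```php
<?php
// Standalone sketch of the substring matching inside KnowledgeBase::search(),
// runnable without booting Laravel. searchFaqs() mirrors the tool's logic.

$faqs = [
    'pricing' => 'Our plans start at $9/month for individuals and $29/month for teams.',
    'refund'  => 'We offer a 30-day money-back guarantee on all plans.',
];

function searchFaqs(array $faqs, string $topic): string
{
    $topic = strtolower($topic);

    foreach ($faqs as $key => $answer) {
        // Match if the caller's topic mentions a known FAQ key anywhere.
        if (str_contains($topic, $key)) {
            return $answer;
        }
    }

    return 'No information found for that topic.';
}

echo searchFaqs($faqs, 'What is your Pricing?'), "\n"; // matches 'pricing'
echo searchFaqs($faqs, 'shipping times'), "\n";        // no match
```

Note the direction of the check: the user's topic is searched for the FAQ key, not the other way around, which is what lets full sentences match single-word keys.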

Step 4: Build the chat controller

Create a controller that handles both rendering the chat page and processing messages via streaming:

// app/Http/Controllers/ChatController.php
namespace App\Http\Controllers;

use Illuminate\Http\Request;
use PapiAI\Core\Agent;
use PapiAI\Core\Conversation;
use PapiAI\Core\Contracts\ConversationStoreInterface;
use PapiAI\Core\Tool;
use App\AI\Tools\KnowledgeBase;
use Symfony\Component\HttpFoundation\StreamedResponse;

class ChatController extends Controller
{
    public function __construct(
        private ConversationStoreInterface $store,
    ) {}

    public function show(Request $request)
    {
        $conversationId = $request->session()->get('conversation_id', uniqid('chat_'));
        $request->session()->put('conversation_id', $conversationId);

        $conversation = $this->store->load($conversationId);
        $messages = $conversation ? $conversation->getMessages() : [];

        return view('chat', [
            'messages' => $messages,
            'conversationId' => $conversationId,
        ]);
    }

    public function send(Request $request): StreamedResponse
    {
        $request->validate(['message' => 'required|string|max:2000']);

        $conversationId = $request->session()->get('conversation_id', uniqid('chat_'));
        $userMessage = $request->input('message');

        // Load or create conversation
        $conversation = $this->store->load($conversationId) ?? new Conversation();
        $systemPrompt = 'You are a helpful customer support assistant. '
            . 'Use the knowledge base tool to find answers. '
            . 'Be friendly and concise.';

        $conversation->setSystem($systemPrompt);
        $conversation->addUser($userMessage);

        // Build the agent
        $agent = Agent::build()
            ->provider(app('papi'))
            ->model('gpt-4o')
            ->instructions($systemPrompt)
            ->tools(Tool::fromClass(KnowledgeBase::class))
            ->maxTokens(1024)
            ->temperature(0.7)
            ->create();

        // Stream the response
        return response()->stream(function () use ($agent, $conversation, $conversationId, $userMessage) {
            $fullText = '';

            foreach ($agent->stream($userMessage) as $chunk) {
                $fullText .= $chunk->text;
                echo "data: " . json_encode(['text' => $chunk->text]) . "\n\n";
                if (ob_get_level() > 0) {
                    ob_flush(); // only flush the output buffer if one is active
                }
                flush();
            }

            // Save the assistant's response to conversation history
            $conversation->addAssistant($fullText);
            $this->store->save($conversationId, $conversation);

            echo "data: [DONE]\n\n";
            if (ob_get_level() > 0) {
                ob_flush();
            }
            flush();
        }, 200, [
            'Content-Type' => 'text/event-stream',
            'Cache-Control' => 'no-cache',
            'Connection' => 'keep-alive',
            'X-Accel-Buffering' => 'no',
        ]);
    }
}

Step 5: Define the routes

// routes/web.php
use App\Http\Controllers\ChatController;

Route::get('/chat', [ChatController::class, 'show'])->name('chat');
Route::post('/chat/send', [ChatController::class, 'send'])->name('chat.send');

Step 6: Build the chat view

Create a Blade template with a simple chat interface. Because EventSource only supports GET requests, the JavaScript instead POSTs the message with fetch and reads the streamed response body incrementally, rendering tokens as they arrive:

{{-- resources/views/chat.blade.php --}}
<!DOCTYPE html>
<html lang="en">
<head>
    <meta charset="UTF-8">
    <meta name="viewport" content="width=device-width, initial-scale=1.0">
    <meta name="csrf-token" content="{{ csrf_token() }}">
    <title>AI Chat</title>
    <style>
        * { box-sizing: border-box; margin: 0; padding: 0; }
        body { font-family: system-ui, sans-serif; background: #f5f5f5; height: 100vh; display: flex; flex-direction: column; }
        .chat-header { background: #1a1a1a; color: white; padding: 16px 24px; font-weight: 600; }
        .chat-messages { flex: 1; overflow-y: auto; padding: 24px; }
        .message { max-width: 680px; margin: 0 auto 16px; padding: 12px 16px; border-radius: 12px; line-height: 1.6; }
        .message.user { background: #1a1a1a; color: white; margin-left: auto; max-width: 480px; }
        .message.assistant { background: white; border: 1px solid #e0e0e0; }
        .chat-input { border-top: 1px solid #e0e0e0; padding: 16px 24px; background: white; }
        .chat-input form { max-width: 680px; margin: 0 auto; display: flex; gap: 12px; }
        .chat-input input { flex: 1; padding: 12px 16px; border: 1px solid #e0e0e0; border-radius: 8px; font-size: 15px; }
        .chat-input button { padding: 12px 24px; background: #c62828; color: white; border: none; border-radius: 8px; font-weight: 600; cursor: pointer; }
        .chat-input button:hover { background: #8e0000; }
        .chat-input button:disabled { opacity: 0.5; cursor: not-allowed; }
    </style>
</head>
<body>
    <div class="chat-header">AI Support Chat</div>
    <div class="chat-messages" id="messages">
        @foreach ($messages as $message)
            <div class="message {{ $message->role->value }}">{{ $message->getText() }}</div>
        @endforeach
    </div>
    <div class="chat-input">
        <form id="chatForm">
            <input type="text" id="messageInput" placeholder="Type a message..." autocomplete="off">
            <button type="submit" id="sendBtn">Send</button>
        </form>
    </div>

    <script>
    const messagesEl = document.getElementById('messages');
    const form = document.getElementById('chatForm');
    const input = document.getElementById('messageInput');
    const sendBtn = document.getElementById('sendBtn');

    form.addEventListener('submit', async (e) => {
        e.preventDefault();
        const text = input.value.trim();
        if (!text) return;

        // Show user message
        appendMessage('user', text);
        input.value = '';
        sendBtn.disabled = true;

        // Create assistant message placeholder
        const assistantEl = appendMessage('assistant', '');

        try {
            const response = await fetch('{{ route("chat.send") }}', {
                method: 'POST',
                headers: {
                    'Content-Type': 'application/json',
                    'X-CSRF-TOKEN': document.querySelector('meta[name="csrf-token"]').content,
                },
                body: JSON.stringify({ message: text }),
            });

            const reader = response.body.getReader();
            const decoder = new TextDecoder();
            let buffer = '';

            while (true) {
                const { done, value } = await reader.read();
                if (done) break;

                // A network chunk can end mid-event, so buffer partial lines:
                // parse only complete lines and carry the remainder forward.
                buffer += decoder.decode(value, { stream: true });
                const lines = buffer.split('\n');
                buffer = lines.pop();

                for (const line of lines) {
                    if (line.startsWith('data: ') && line !== 'data: [DONE]') {
                        const data = JSON.parse(line.slice(6));
                        assistantEl.textContent += data.text;
                        messagesEl.scrollTop = messagesEl.scrollHeight;
                    }
                }
            }
        } catch (err) {
            assistantEl.textContent = 'Error: Could not reach the AI. Please try again.';
        }

        sendBtn.disabled = false;
        input.focus();
    });

    function appendMessage(role, text) {
        const el = document.createElement('div');
        el.className = `message ${role}`;
        el.textContent = text;
        messagesEl.appendChild(el);
        messagesEl.scrollTop = messagesEl.scrollHeight;
        return el;
    }
    </script>
</body>
</html>

Step 7: Run it

php artisan serve

Open http://localhost:8000/chat in your browser. Type a message like "What are your pricing plans?" and watch the AI stream its response in real time. The AI will use the knowledge base tool to look up the answer before responding.

What's happening under the hood

When the user sends a message, here's the flow:

  1. The controller loads the conversation from Eloquent storage (or creates a new one)
  2. The user's message is added to the conversation history
  3. An Agent is built with the OpenAI provider, the knowledge base tool, and the system prompt
  4. The agent's stream() method sends the prompt to the LLM
  5. As tokens arrive, they're pushed to the browser as server-sent events
  6. If the LLM decides to call the knowledge base tool, PapiAI executes it automatically and feeds the result back
  7. Once complete, the full assistant response is saved to the conversation
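The wire format in step 5 is worth seeing concretely: each chunk travels as a `data: {json}` line terminated by a blank line, with a literal `data: [DONE]` sentinel at the end. The sketch below round-trips one frame using two hypothetical helpers (`encodeSseFrame` mirrors what the controller's `echo` builds, `decodeSseLine` mirrors the browser-side check):

```php
<?php
// Sketch of the server-sent event framing used between controller and browser.

function encodeSseFrame(array $payload): string
{
    // One event: a "data:" line followed by a blank line.
    return 'data: ' . json_encode($payload) . "\n\n";
}

function decodeSseLine(string $line): ?array
{
    $line = trim($line);

    // Skip anything that isn't a data line, and the end-of-stream sentinel,
    // exactly as the browser code does.
    if (!str_starts_with($line, 'data: ') || $line === 'data: [DONE]') {
        return null;
    }

    return json_decode(substr($line, 6), true);
}

$frame = encodeSseFrame(['text' => 'Hello']);
echo $frame;                          // data: {"text":"Hello"}
echo decodeSseLine($frame)['text'];   // Hello
```

Because every frame is self-delimiting, the browser can append `data.text` the moment a complete line arrives, which is what makes the typing effect work.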

Going further

This is a starting point. Here are some ways to extend it:

  - Render assistant responses as Markdown instead of plain text
  - Scope conversations to authenticated users rather than the session
  - Add more tools, such as order lookup or ticket creation
  - Rate-limit the send endpoint to keep API costs in check

The full code for this tutorial is available on GitHub.

PapiAI is open source under the MIT license. Read the documentation to learn more.