
🧠 AI Chat Integration (Laravel + Vue.js)


A real-time chat application powered by the OpenRouter API (LLaMA 3), built with a Laravel backend and a Vue.js frontend.


📸 Demo

Chat UI Screenshot

✅ Features

  • Chat with an AI assistant in real-time
  • Session-based memory for contextual conversations
  • Vue-powered chat UI
  • OpenRouter API integration (LLaMA 3)

🔧 Backend Setup (Laravel)

.env

OPENROUTER_API_KEY=your_api_key_here
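
In a typical Laravel setup the key is then exposed to the application through config/services.php and read with config() rather than env() directly. The entry below is a minimal sketch; the 'openrouter' config key name is an assumption, not necessarily this project's exact wiring.

// config/services.php (sketch — the 'openrouter' key name is an assumption)
'openrouter' => [
    'key' => env('OPENROUTER_API_KEY'),
],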

Routes

// routes/api.php
Route::post('/ai-response', [\App\Http\Controllers\GetAiResponse::class, 'getAiResponse']);
Route::get('/conversation', [\App\Http\Controllers\GetAiResponse::class, 'getConversation']);
Route::post('/conversation/save', [\App\Http\Controllers\GetAiResponse::class, 'saveConversation']);

Controller Logic

The controller handles the following (a sketch is included after the note below):

  • Validating prompt input
  • Maintaining session-based history
  • Sending chat history to OpenRouter API
  • Returning the latest assistant response

💡 Only the latest 10 messages are kept in the session to limit payload size.
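
A minimal sketch of what getAiResponse() could look like is shown below. The OpenRouter endpoint URL, the model slug, the config key, and the session key are assumptions for illustration, not the project's exact code.

// app/Http/Controllers/GetAiResponse.php (illustrative sketch)

namespace App\Http\Controllers;

use Illuminate\Http\Request;
use Illuminate\Support\Facades\Http;

class GetAiResponse extends Controller
{
    public function getAiResponse(Request $request)
    {
        // 1. Validate the prompt input
        $validated = $request->validate(['prompt' => 'required|string']);

        // 2. Append the user message to the session-based history
        $history = session('chat_history', []);
        $history[] = ['role' => 'user', 'content' => $validated['prompt']];

        // 3. Send the chat history to the OpenRouter API
        $response = Http::withToken(config('services.openrouter.key'))
            ->post('https://openrouter.ai/api/v1/chat/completions', [
                'model'    => 'meta-llama/llama-3-8b-instruct', // assumed model slug
                'messages' => $history,
            ]);

        $reply = $response->json('choices.0.message.content');

        // 4. Store the assistant reply and keep only the latest 10 messages
        $history[] = ['role' => 'assistant', 'content' => $reply];
        session(['chat_history' => array_slice($history, -10)]);

        // 5. Return the latest assistant response
        return response()->json(['response' => $reply]);
    }
}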


💬 Frontend Setup (Vue.js)

1. Install Axios

npm install axios
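
Below is a hedged sketch of how a Vue 3 component might send a prompt with Axios. The endpoint path matches the routes above, but the file name, ref names, and component structure are assumptions.

<script setup>
// resources/js/components/Chat.vue (sketch — file name and refs are assumptions)
import axios from 'axios';
import { ref } from 'vue';

const prompt = ref('');
const messages = ref([]);

async function sendPrompt() {
    // Post the prompt to the Laravel backend and append both sides of the exchange
    messages.value.push({ role: 'user', content: prompt.value });
    const { data } = await axios.post('/api/ai-response', { prompt: prompt.value });
    messages.value.push({ role: 'assistant', content: data.response });
    prompt.value = '';
}
</script>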

🚀 Running the App

Laravel

php artisan serve

Vite (Vue)

npm run dev

Then visit http://localhost:8000 (or your configured frontend route).


📡 API Endpoint

POST /api/ai-response

Request Body

{
    "prompt": "Hello, how are you?"
}

Response

{
    "response": "I'm doing great, how can I help you today?"
}
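
For a quick manual test, a curl request along these lines should work (assuming the default /api prefix and php artisan serve listening on port 8000):

curl -X POST http://localhost:8000/api/ai-response \
  -H "Content-Type: application/json" \
  -H "Accept: application/json" \
  -d '{"prompt": "Hello, how are you?"}'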

🧑‍💻 Developer

Made by: Version Control

Stack: Laravel 12 + Vue.js 3 + OpenRouter API

License

This project is open-sourced under the MIT license.

© 2025 Version Control. All rights reserved.
