How to Build Your First AI-Powered Web App with Next.js 15 and Vercel AI SDK
The web development landscape is changing. It's no longer "interface development"; it's "intelligent experience development." And with the advent of Next.js 15, Artificial Intelligence in web applications has never been more "native."
Why Next.js 15 for AI-Enabled Web Apps?
Next.js 15 introduces a range of features that make it a natural choice for building AI-enabled web applications:
- Enhanced Server Components: Reduces the amount of JavaScript code sent to the client-side, resulting in a faster user experience with AI.
- Partial Prerendering (PPR): Allows static content to load in an instant while AI content loads dynamically.
- Refined caching: More explicit caching controls let you decide which AI-generated content gets cached and which stays fresh.
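As a quick illustration of PPR, here's a hedged sketch of the `next.config.ts` flag. PPR is experimental in Next.js 15 and may require a canary release; treat the exact flag value (`'incremental'`) as an assumption to verify against the Next.js docs.

```typescript
// next.config.ts — hedged sketch: enabling experimental Partial Prerendering (PPR).
// The flag name and 'incremental' value may change between Next.js releases.
const nextConfig = {
  experimental: {
    // 'incremental' lets individual routes opt in with `export const experimental_ppr = true`
    ppr: 'incremental',
  },
};

export default nextConfig;
```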
Step 1: Setting Up the Project
Let's start with a new project using Next.js 15. Open your terminal and execute:
```
npx create-next-app@latest my-ai-app --typescript --tailwind --eslint
```

During setup, be sure to select App Router (Recommended). Once setup finishes, change into your project folder:

```
cd my-ai-app
```

Install Dependencies

We'll need the Vercel AI SDK and a provider (we're using OpenAI in this example):

```
npm install ai @ai-sdk/openai lucide-react
```

Step 2: Configure Environment Variables
Next, create a .env.local file in your project's root folder. For this project, you'll need an API key from OpenAI.

```
OPENAI_API_KEY=your_openai_api_key_here
```
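It's worth failing fast if the key is missing rather than getting a cryptic error from the provider at request time. Here's a hedged sketch of a tiny guard; the helper name `requireEnv` is ours, not part of the SDK (which reads `OPENAI_API_KEY` automatically):

```typescript
// Hedged sketch: fail fast with a clear message when a required
// environment variable is missing. `requireEnv` is an illustrative helper.
function requireEnv(name: string): string {
  const value = process.env[name];
  if (!value) {
    throw new Error(`Missing required environment variable: ${name}`);
  }
  return value;
}

// Example: check the key once at server startup.
// const apiKey = requireEnv('OPENAI_API_KEY');
```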
Step 3: Create the AI Route Handler
In Next.js 15, you'll use a Route Handler to handle AI requests on the server, which keeps your API key secure. Create a new file at app/api/chat/route.ts:
```typescript
import { openai } from '@ai-sdk/openai';
import { streamText } from 'ai';

// Allow streaming responses up to 30 seconds
export const maxDuration = 30;

export async function POST(req: Request) {
  const { messages } = await req.json();

  const result = await streamText({
    model: openai('gpt-4o'),
    messages,
  });

  return result.toDataStreamResponse();
}
```
This simple script takes the user's message history and sends it to the GPT-4o model, returning a streaming response that feels "live."
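For reference, the JSON body that the client (built in the next step) POSTs to this route looks roughly like the sketch below. The field names mirror the AI SDK's message shape, but the type names are ours, for illustration only:

```typescript
// Hedged sketch: the approximate request body the chat client sends to /api/chat.
type ChatRole = 'user' | 'assistant' | 'system';

interface ChatMessage {
  role: ChatRole;
  content: string;
}

interface ChatRequestBody {
  messages: ChatMessage[];
}

// Example payload for a single user turn; the route handler
// receives this via `await req.json()`.
const body: ChatRequestBody = {
  messages: [{ role: 'user', content: 'Explain streaming in one sentence.' }],
};
```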
Step 4: Building the AI Chat Interface
Now, let's build a "Glassmorphism" inspired UI in app/page.tsx. We will use the useChat hook from the AI SDK, which handles the UI state and streaming automatically.
```tsx
'use client';

import { useChat } from 'ai/react';
import { Send, Bot, User } from 'lucide-react';

export default function Chat() {
  const { messages, input, handleInputChange, handleSubmit, isLoading } = useChat();

  return (
    <div className="flex flex-col w-full max-w-3xl mx-auto py-10 stretch h-screen">
      <header className="text-center mb-8">
        <h1 className="text-4xl font-bold bg-gradient-to-r from-blue-500 to-purple-600 bg-clip-text text-transparent">
          AI Nexus Chat
        </h1>
        <p className="text-gray-500 mt-2">Next.js 15 + Vercel AI SDK</p>
      </header>

      <div className="flex-1 overflow-y-auto p-6 space-y-4 bg-gray-50/50 dark:bg-zinc-900/50 rounded-xl border border-gray-200 dark:border-zinc-800 backdrop-blur-md">
        {messages.map(m => (
          <div key={m.id} className={`flex ${m.role === 'user' ? 'justify-end' : 'justify-start'}`}>
            <div className={`flex gap-3 max-w-[80%] p-4 rounded-2xl ${
              m.role === 'user'
                ? 'bg-blue-600 text-white shadow-lg'
                : 'bg-white dark:bg-zinc-800 text-zinc-800 dark:text-zinc-200 border border-gray-200 dark:border-zinc-700'
            }`}>
              {m.role === 'user' ? <User size={20} /> : <Bot size={20} />}
              <p className="text-sm leading-relaxed">{m.content}</p>
            </div>
          </div>
        ))}
        {isLoading && <div className="text-xs text-gray-400 animate-pulse">AI is thinking...</div>}
      </div>

      <form onSubmit={handleSubmit} className="mt-6 relative">
        <input
          className="w-full p-4 pr-12 rounded-full border border-gray-300 dark:border-zinc-700 dark:bg-zinc-800 focus:outline-none focus:ring-2 focus:ring-blue-500 transition-all"
          value={input}
          placeholder="Ask anything..."
          onChange={handleInputChange}
        />
        <button
          type="submit"
          className="absolute right-2 top-2 p-2 bg-blue-600 hover:bg-blue-700 text-white rounded-full transition-colors"
        >
          <Send size={20} />
        </button>
      </form>
    </div>
  );
}
```
Step 5: SEO & Performance Optimization
To improve the performance (and SEO) of the app, here are some advanced recommendations:
1. Edge Runtime
For low latency on a global scale, you can run the API route on the Edge runtime. Simply add the following to your route.ts:

```
export const runtime = 'edge';
```
2. Streaming Metadata
Next.js 15 can stream metadata. This is great for SEO because the static parts of the webpage are served to search engines immediately, while the dynamic AI component initializes afterwards.
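As a hedged sketch, the `generateMetadata` export (a real Next.js convention; the strings below are placeholders) is the usual hook for this. Anything awaited inside it can resolve alongside the streamed page:

```typescript
// app/page.tsx (server side) — hedged sketch of Next.js metadata generation.
// The title and description values here are placeholders for illustration.
export async function generateMetadata(): Promise<{ title: string; description: string }> {
  // Static fields resolve immediately; any awaited data here can stream in later.
  return {
    title: 'AI Nexus Chat',
    description: 'A streaming AI chat built with Next.js 15 and the Vercel AI SDK.',
  };
}
```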
3. Responsive Design
The UI code above uses Tailwind CSS to make sure the chat window looks polished on both mobile and desktop.
Conclusion: The Future of AI Coding
Building with Next.js 15 and the Vercel AI SDK is not just about developing a chatbot. It is about learning how to bridge the gap between sophisticated LLMs and intuitive interfaces. With the structure above, you have created an app that is not only fast and scalable but also meets modern web standards.
- Next.js 15 gives the best DX (Developer Experience) for React.
- Vercel AI SDK removes the complexity of streaming and state management.
- Clean UI/UX is what separates a basic script from a professional product.
