
oneapp-onstage

The main consumer application, providing AI-powered chat, document editing, and presentations. Built with Next.js 16, React 19, and Vercel AI SDK v6, it features streaming responses, real-time collaboration, and a microfrontend architecture.

Quick Start

Run the app in 2 minutes:

pnpm --filter oneapp-onstage dev

AI chat, auth, database included. Skip to Quick Start →

Why oneapp-onstage?

Without a shared host application, AI chat setup gets duplicated across projects, authentication configuration is scattered, microfrontend hosting requires manual routing, document editing lacks real-time collaboration, database connections go unpooled, environment validation is inconsistent, and OAuth providers are configured per app.

oneapp-onstage solves this with unified Next.js 16 app hosting AI chat, auth, microfrontends, and collaboration features.

Production-ready with Vercel AI SDK v6, Better Auth, Prisma ORM, Upstash Redis, microfrontend proxy, and streaming responses.

Use cases

  • AI-powered chat — Stream responses with tool calling, context management
  • Collaborative editing — Real-time document and presentation editing
  • Microfrontend host — Route requests to oneapp-backstage admin panel
  • User authentication — OAuth with GitHub, Google, Microsoft
  • File management — Upload to Vercel Blob with signed URLs

How it works

oneapp-onstage is a Next.js 16 App Router application:

// app/page.tsx - AI chat interface
"use client"; // useChat is a client hook, so the page must be a Client Component
import { useChat } from "ai/react";

export default function ChatPage() {
  const { messages, input, handleInputChange, handleSubmit } = useChat({
    api: "/api/chat",
  });

  return (
    <form onSubmit={handleSubmit}>
      {messages.map((msg) => (
        <div key={msg.id}>{msg.content}</div>
      ))}
      <input value={input} onChange={handleInputChange} />
    </form>
  );
}

Uses the Next.js 16 App Router for routing, Vercel AI SDK v6 for streaming, Prisma for database access, Better Auth for authentication, and a proxy layer for microfrontend hosting.

Key features

AI streaming — Real-time responses with Vercel AI SDK v6

Microfrontends — Host oneapp-backstage admin panel

Authentication — OAuth with GitHub, Google, Microsoft

Real-time editing — Collaborative documents and presentations

File uploads — Vercel Blob storage with signed URLs

Database — Prisma ORM with Neon Postgres pooling

Quick Start

1. Install dependencies

pnpm install

2. Configure environment

.env.local
# Database (Neon Postgres)
DATABASE_URL="postgresql://user:pass@host/db?sslmode=require"
DATABASE_URL_UNPOOLED="postgresql://user:pass@host/db?sslmode=require"

# Auth
BETTER_AUTH_SECRET="your-32-char-secret-here"
NEXT_PUBLIC_APP_URL="http://localhost:3500"

# AI Gateway
AI_GATEWAY_API_KEY="your-gateway-key"
AI_GATEWAY_URL="https://gateway.ai.cloudflare.com/..."

# OAuth (at least one required)
GITHUB_CLIENT_ID="your-github-client-id"
GITHUB_CLIENT_SECRET="your-github-client-secret"

3. Run database migrations

pnpm --filter @repo/oneapp-shared prisma:migrate:dev

4. Start development server

pnpm --filter oneapp-onstage dev

Visit http://localhost:3500 to see the app.

Microfrontend mode

Run with all microfrontends (oneapp-backstage admin panel):

pnpm dev:oneapp

Starts host on :3500 and backstage on :3024


Technical Details

For Developers: Technical implementation details

Overview

  • Location — platform/apps/oneapp-onstage
  • Port — 3500
  • Framework — Next.js 16 (App Router, Turbopack)
  • React — 19
  • Styling — Tailwind CSS 4

Tech Stack

Core Framework

  • Next.js 16 - App Router, Turbopack dev server, Server Actions
  • React 19 - Server Components, Suspense, Streaming
  • TypeScript 5 - Full type safety with strict mode

UI & Styling

  • Tailwind CSS 4 - CSS-first configuration with theme variables
  • @repo/uni-ui - Universal UI components (web + native)
  • @xyflow/react - Workflow and diagram visualization
  • Framer Motion - Animations and transitions

Backend Services

  • Prisma ORM - Type-safe database access with Neon Postgres
  • Better Auth - Authentication with OAuth providers
  • Vercel AI SDK v6 - Streaming responses, tool calling
  • Upstash Redis - Caching and rate limiting
  • Vercel Blob - File storage with signed URLs

Development

  • Vitest 4 - Unit and integration testing
  • ESLint 9 - Code quality with flat config
  • Prettier 3 - Code formatting

Project Structure

oneapp-onstage/
├── src/
│   ├── app/                    # Next.js App Router
│   │   ├── (auth)/            # Auth routes (login, signup)
│   │   │   ├── login/
│   │   │   └── signup/
│   │   ├── (dashboard)/       # Protected dashboard routes
│   │   │   ├── chat/
│   │   │   ├── documents/
│   │   │   └── presentations/
│   │   ├── api/               # API routes
│   │   │   ├── auth/          # Auth endpoints
│   │   │   ├── chat/          # Chat streaming
│   │   │   └── upload/        # File uploads
│   │   ├── layout.tsx         # Root layout
│   │   └── page.tsx           # Landing page
│   ├── components/            # React components
│   │   ├── chat/              # Chat UI components
│   │   ├── editor/            # Document editor
│   │   ├── presentations/     # Presentation tools
│   │   └── ui/                # Shared UI components
│   ├── db/                    # Database utilities
│   ├── hooks/                 # Custom React hooks
│   ├── lib/                   # Shared utilities
│   ├── tools/                 # AI tool definitions
│   └── types/                 # TypeScript definitions
├── env/
│   ├── server.ts              # Server environment validation
│   └── client.ts              # Client environment validation
├── public/                    # Static assets
├── tests/                     # Test files
└── next.config.mjs            # Next.js configuration

AI Chat

Streaming Responses

// app/api/chat/route.ts
import { createOpenAICompatible } from "@ai-sdk/openai-compatible";
import { streamText } from "ai";

export async function POST(req: Request) {
  const { messages } = await req.json();

  const model = createOpenAICompatible({
    name: "ai-gateway", // provider name is required; the value here is illustrative
    baseURL: process.env.AI_GATEWAY_URL!,
    apiKey: process.env.AI_GATEWAY_API_KEY!
  });

  const result = streamText({
    model: model("gpt-4"),
    messages
  });

  return result.toTextStreamResponse();
}

Tool Calling

import { streamText, tool } from "ai";
import { z } from "zod";

const result = streamText({
  model,
  messages,
  tools: {
    getWeather: tool({
      description: "Get weather for a location",
      parameters: z.object({
        location: z.string().describe("City name")
      }),
      execute: async ({ location }) => {
        const weather = await fetchWeather(location);
        return { temperature: weather.temp, condition: weather.condition };
      }
    })
  }
});

Chat History

import { prisma } from "@repo/db-prisma";

// Save chat message
await prisma.message.create({
  data: {
    chatId,
    role: "user",
    content: userMessage
  }
});

// Load chat history
const messages = await prisma.message.findMany({
  where: { chatId },
  orderBy: { createdAt: "asc" }
});
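
The rows loaded above must be converted back into the message shape the model call expects, and long chats should be trimmed to fit the context window. A minimal sketch of that mapping (the ChatRow type and the default 50-message cap are illustrative assumptions, not taken from the codebase):

```typescript
// Illustrative row shape; the real Prisma model may differ.
type ChatRow = { role: string; content: string; createdAt: Date };
type ModelMessage = { role: "user" | "assistant" | "system"; content: string };

// Keep only roles the model API accepts, then trim to the most recent
// messages so the prompt stays within the context window.
export function toModelMessages(rows: ChatRow[], maxMessages = 50): ModelMessage[] {
  return rows
    .filter((r): r is ChatRow & { role: ModelMessage["role"] } =>
      r.role === "user" || r.role === "assistant" || r.role === "system"
    )
    .slice(-maxMessages)
    .map((r) => ({ role: r.role, content: r.content }));
}
```

The result can be passed directly as the `messages` argument of a streaming call.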

Authentication

Better Auth Setup

// lib/auth.ts
import { betterAuth } from "better-auth";
import { prismaAdapter } from "better-auth/adapters/prisma";
import { prisma } from "@repo/db-prisma";

export const auth = betterAuth({
  database: prismaAdapter(prisma, {
    provider: "postgresql"
  }),
  emailAndPassword: {
    enabled: true
  },
  socialProviders: {
    github: {
      clientId: process.env.GITHUB_CLIENT_ID!,
      clientSecret: process.env.GITHUB_CLIENT_SECRET!
    },
    google: {
      clientId: process.env.GOOGLE_CLIENT_ID!,
      clientSecret: process.env.GOOGLE_CLIENT_SECRET!
    },
    microsoft: {
      clientId: process.env.MICROSOFT_CLIENT_ID!,
      clientSecret: process.env.MICROSOFT_CLIENT_SECRET!
    }
  }
});

Server-Side Auth

// app/(dashboard)/page.tsx
import { auth } from "@repo/auth/server";
import { headers } from "next/headers";
import { redirect } from "next/navigation";

export default async function DashboardPage() {
  const session = await auth.api.getSession({
    headers: await headers(),
  });

  if (!session) {
    redirect("/login");
  }

  return <Dashboard user={session.user} />;
}

Client-Side Auth

"use client";
import { useSession, signOut } from "@repo/auth/client";

export function UserMenu() {
  const { data: session, isPending } = useSession();

  if (isPending) return <Skeleton />;
  if (!session) return <SignInButton />;

  return (
    <DropdownMenu>
      <DropdownMenuTrigger>{session.user.name}</DropdownMenuTrigger>
      <DropdownMenuContent>
        <DropdownMenuItem onClick={() => signOut()}>
          Sign Out
        </DropdownMenuItem>
      </DropdownMenuContent>
    </DropdownMenu>
  );
}

Database Access

Prisma Client

import { prisma } from "@repo/db-prisma";

// Find users
const users = await prisma.user.findMany({
  where: { isActive: true },
  include: { profile: true },
  orderBy: { createdAt: "desc" }
});

// Create user
const user = await prisma.user.create({
  data: {
    email: "user@example.com",
    name: "John Doe",
    profile: {
      create: {
        bio: "Software Developer"
      }
    }
  }
});

// Update user
await prisma.user.update({
  where: { id: userId },
  data: { name: "Jane Doe" }
});

Database Transactions

await prisma.$transaction(async (tx) => {
  const user = await tx.user.create({
    data: { email: "user@example.com" }
  });

  await tx.profile.create({
    data: {
      userId: user.id,
      bio: "New user"
    }
  });
});

Microfrontend Architecture

Proxy Configuration

// src/proxy.ts
export const microfrontends = {
  "oneapp-backstage": {
    // Routes handled by backstage
    routes: ["/backstage", "/backstage-preview"],
    // Static asset prefix
    assetPrefix: "/oneapp-backstage-assets",
    // Local development port
    localPort: 3024,
    // Production URL
    productionUrl: process.env.BACKSTAGE_URL
  }
};

Proxy Middleware

// middleware.ts
import { NextResponse, type NextRequest } from "next/server";
import { microfrontends } from "./src/proxy";

export function middleware(request: NextRequest) {
  const { pathname } = request.nextUrl;

  // Check if route belongs to a microfrontend
  for (const [name, config] of Object.entries(microfrontends)) {
    if (config.routes.some((route) => pathname.startsWith(route))) {
      const targetUrl =
        process.env.NODE_ENV === "production" ? config.productionUrl : `http://localhost:${config.localPort}`;

      return NextResponse.rewrite(new URL(pathname, targetUrl));
    }
  }

  return NextResponse.next();
}

Running Microfrontends

# Start host app
pnpm --filter oneapp-onstage dev

# Start backstage microfrontend
pnpm --filter oneapp-backstage dev

# Or start all at once
pnpm dev:oneapp

File Uploads

Vercel Blob Storage

// app/api/upload/route.ts
import { put } from "@vercel/blob";

export async function POST(req: Request) {
  const formData = await req.formData();
  const file = formData.get("file") as File;

  const blob = await put(file.name, file, {
    access: "public",
    token: process.env.VERCEL_BLOB_READ_WRITE_TOKEN
  });

  return Response.json({ url: blob.url });
}

Client-Side Upload

"use client";
import { useState } from "react";

export function FileUpload() {
  const [uploading, setUploading] = useState(false);

  const handleUpload = async (e: React.ChangeEvent<HTMLInputElement>) => {
    const file = e.target.files?.[0];
    if (!file) return;

    setUploading(true);
    try {
      const formData = new FormData();
      formData.append("file", file);

      const response = await fetch("/api/upload", {
        method: "POST",
        body: formData,
      });
      if (!response.ok) throw new Error(`Upload failed: ${response.status}`);

      const { url } = await response.json();
      return url;
    } finally {
      // Reset the flag even if the upload throws
      setUploading(false);
    }
  };

  return <input type="file" onChange={handleUpload} disabled={uploading} />;
}

Caching with Redis

Upstash Redis

import { Redis } from "@upstash/redis";

const redis = new Redis({
  url: process.env.UPSTASH_REDIS_REST_URL!,
  token: process.env.UPSTASH_REDIS_REST_TOKEN!
});

// Cache user data
await redis.set(`user:${userId}`, user, { ex: 3600 }); // 1 hour

// Get cached data
const cachedUser = await redis.get(`user:${userId}`);

// Delete cache
await redis.del(`user:${userId}`);
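
The get/set/del primitives above are usually combined into a cache-aside helper: return the cached value when present, otherwise load it, store it with a TTL, and return it. A generic sketch (the KV interface abstracts the subset of the Upstash client used here; all names are illustrative):

```typescript
// Minimal key-value interface matching the subset of the Upstash client used above.
interface KV {
  get<T>(key: string): Promise<T | null>;
  set<T>(key: string, value: T, opts?: { ex?: number }): Promise<unknown>;
}

// Cache-aside: serve from cache when possible, otherwise load, populate, return.
export async function getOrSet<T>(
  kv: KV,
  key: string,
  loader: () => Promise<T>,
  ttlSeconds = 3600
): Promise<T> {
  const cached = await kv.get<T>(key);
  if (cached !== null) return cached;
  const fresh = await loader();
  await kv.set(key, fresh, { ex: ttlSeconds });
  return fresh;
}

// Map-backed implementation for local testing (TTL is ignored).
export class MemoryKV implements KV {
  private store = new Map<string, unknown>();
  async get<T>(key: string): Promise<T | null> {
    return this.store.has(key) ? (this.store.get(key) as T) : null;
  }
  async set<T>(key: string, value: T): Promise<string> {
    this.store.set(key, value);
    return "OK";
  }
}
```

The Upstash `Redis` instance satisfies this shape, so the helper can wrap it without changes.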

Rate Limiting

import { Ratelimit } from "@upstash/ratelimit";
import { Redis } from "@upstash/redis";

const ratelimit = new Ratelimit({
  redis: Redis.fromEnv(),
  limiter: Ratelimit.slidingWindow(10, "10 s") // 10 requests per 10 seconds
});

export async function POST(req: Request) {
  const ip = req.headers.get("x-forwarded-for") ?? "127.0.0.1";

  const { success } = await ratelimit.limit(ip);

  if (!success) {
    return new Response("Too many requests", { status: 429 });
  }

  // Handle request
}
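
To make the sliding-window behavior concrete, here is a minimal in-memory sketch of the algorithm that Ratelimit.slidingWindow approximates (Upstash's real implementation runs atomically in Redis; this is only an illustration):

```typescript
// In-memory sliding-window limiter: allow `limit` requests per `windowMs` per key.
export class SlidingWindow {
  private hits = new Map<string, number[]>();
  constructor(private limit: number, private windowMs: number) {}

  allow(key: string, now = Date.now()): boolean {
    const cutoff = now - this.windowMs;
    // Drop timestamps that have fallen out of the window.
    const recent = (this.hits.get(key) ?? []).filter((t) => t > cutoff);
    if (recent.length >= this.limit) {
      this.hits.set(key, recent);
      return false; // over the limit
    }
    recent.push(now);
    this.hits.set(key, recent);
    return true;
  }
}
```

Unlike a fixed window, requests age out continuously, so a burst at a window boundary cannot double the effective rate.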

Environment Variables

Server Environment

// env/server.ts
import { createEnv } from "@t3-oss/env-nextjs";
import { z } from "zod";

export const env = createEnv({
  server: {
    DATABASE_URL: z.string().url(),
    DATABASE_URL_UNPOOLED: z.string().url().optional(),
    BETTER_AUTH_SECRET: z.string().min(32),
    AI_GATEWAY_API_KEY: z.string().min(1),
    AI_GATEWAY_URL: z.string().url(),
    GITHUB_CLIENT_ID: z.string().optional(),
    GITHUB_CLIENT_SECRET: z.string().optional(),
    VERCEL_BLOB_READ_WRITE_TOKEN: z.string().optional(),
    UPSTASH_REDIS_REST_URL: z.string().url().optional(),
    UPSTASH_REDIS_REST_TOKEN: z.string().optional()
  },
  experimental__runtimeEnv: {}
});

Client Environment

// env/client.ts
import { createEnv } from "@t3-oss/env-nextjs";
import { z } from "zod";

export const env = createEnv({
  client: {
    NEXT_PUBLIC_APP_URL: z.string().url(),
    NEXT_PUBLIC_OBSERVABILITY_DEBUG: z
      .string()
      .optional()
      .transform((v) => v === "true")
  },
  experimental__runtimeEnv: {
    NEXT_PUBLIC_APP_URL: process.env.NEXT_PUBLIC_APP_URL,
    NEXT_PUBLIC_OBSERVABILITY_DEBUG: process.env.NEXT_PUBLIC_OBSERVABILITY_DEBUG
  }
});

API Routes

Chat API

POST   /api/chat           # Stream chat completions
POST   /api/chat/history   # Get chat history
DELETE /api/chat/:id       # Delete conversation

Auth API

POST /api/auth/signin      # Sign in with email/password
POST /api/auth/signup      # Create account
POST /api/auth/signout     # Sign out
GET  /api/auth/session     # Get current session
POST /api/auth/oauth/:provider  # OAuth flow (github, google, microsoft)

File Upload

POST   /api/upload         # Upload file to Vercel Blob
DELETE /api/upload/:id     # Delete uploaded file
GET    /api/upload/:id     # Get file metadata

Documents

GET    /api/documents      # List user documents
POST   /api/documents      # Create document
GET    /api/documents/:id  # Get document
PATCH  /api/documents/:id  # Update document
DELETE /api/documents/:id  # Delete document
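
The document endpoints follow a uniform REST shape, which a small typed helper can make explicit when calling them from the client; a sketch (this helper is illustrative, not part of the app):

```typescript
type DocOp = "list" | "create" | "get" | "update" | "remove";

// Map a document operation to the HTTP method and path listed above.
export function documentsRoute(op: DocOp, id?: string): { method: string; path: string } {
  switch (op) {
    case "list":   return { method: "GET",    path: "/api/documents" };
    case "create": return { method: "POST",   path: "/api/documents" };
    case "get":    return { method: "GET",    path: `/api/documents/${id}` };
    case "update": return { method: "PATCH",  path: `/api/documents/${id}` };
    case "remove": return { method: "DELETE", path: `/api/documents/${id}` };
  }
}
```

For example, `documentsRoute("update", "abc")` yields `PATCH /api/documents/abc`, which can be passed straight to fetch.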

Testing

Unit Tests

import { render, screen } from "@testing-library/react";
import { ChatComponent } from "./chat-component";

describe("ChatComponent", () => {
  it("renders messages", () => {
    const messages = [
      { id: "1", role: "user", content: "Hello" },
      { id: "2", role: "assistant", content: "Hi there!" },
    ];

    render(<ChatComponent messages={messages} />);

    expect(screen.getByText("Hello")).toBeInTheDocument();
    expect(screen.getByText("Hi there!")).toBeInTheDocument();
  });
});

Integration Tests

import { prisma } from "@repo/db-prisma";
import { POST } from "./app/api/chat/route";

describe("Chat API", () => {
  it("streams chat response", async () => {
    const request = new Request("http://localhost/api/chat", {
      method: "POST",
      body: JSON.stringify({
        messages: [{ role: "user", content: "Hello" }]
      })
    });

    const response = await POST(request);
    expect(response.status).toBe(200);
    expect(response.headers.get("content-type")).toContain("text/plain");
  });
});

Run Tests

# All tests
pnpm --filter oneapp-onstage test

# Watch mode
pnpm --filter oneapp-onstage test:watch

# Coverage
pnpm --filter oneapp-onstage test:coverage

Deployment

Vercel

// vercel.json
{
  "buildCommand": "pnpm build",
  "outputDirectory": ".next",
  "framework": "nextjs",
  "installCommand": "pnpm install"
}

Environment Setup

  1. Database - Configure Neon Postgres connection strings
  2. Auth - Add Better Auth secret and OAuth credentials
  3. AI - Configure AI Gateway API key and URL
  4. Storage - Add Vercel Blob token
  5. Cache - Configure Upstash Redis credentials
  6. Observability - Add Sentry DSN

Build Command

# Install dependencies
pnpm install

# Run database migrations
pnpm --filter @repo/oneapp-shared prisma:migrate:deploy

# Build application
pnpm --filter oneapp-onstage build

Performance Optimization

Server Components

// Use Server Components by default for data fetching
export default async function DashboardPage() {
  // Fetched on server, no client bundle
  const data = await prisma.user.findMany();

  return <Dashboard data={data} />;
}

Streaming

import { Suspense } from "react";

export default function Page() {
  return (
    <div>
      <Suspense fallback={<Skeleton />}>
        <SlowComponent />
      </Suspense>
    </div>
  );
}

Caching

// Cache fetch requests
const res = await fetch("https://api.example.com/data", {
  next: { revalidate: 3600 } // Cache for 1 hour
});
const data = await res.json();

// Cache component output
import { unstable_cache } from "next/cache";

const getCachedData = unstable_cache(
  async (id: string) => {
    return await prisma.data.findUnique({ where: { id } });
  },
  ["data-cache"],
  { revalidate: 3600 }
);

Troubleshooting

Database Connection Issues

# Verify connection string
echo $DATABASE_URL

# Test connection
pnpm --filter @repo/oneapp-shared prisma:studio

# Reset database (development only)
pnpm --filter @repo/oneapp-shared prisma:reset

Auth Not Working

# Verify OAuth credentials
echo $GITHUB_CLIENT_ID
echo $GITHUB_CLIENT_SECRET

# Check trusted origins
echo $TRUSTED_ORIGINS

# Verify auth secret length (printf avoids counting the trailing newline)
printf '%s' "$BETTER_AUTH_SECRET" | wc -c  # Should be >= 32

AI Chat Failing

# Check AI Gateway configuration
echo $AI_GATEWAY_URL
echo $AI_GATEWAY_API_KEY

# Test API endpoint
curl -X POST http://localhost:3500/api/chat \
  -H "Content-Type: application/json" \
  -d '{"messages":[{"role":"user","content":"Hello"}]}'
