An MBTI-driven chat and training playground built with Next.js (app router). Users can:
- Browse 16 MBTI personas, each with a base prompt and user-contributed prompt fragments
- Chat with any persona (auth required), with one shared LLM API key
- Train persona behavior by submitting example pairs and adjusting parameters
This repository includes a simple JWT-based auth flow, SQLite persistence, and a theming system ("waibi", the dark mode, vs. "rational", the light mode).
- 16 MBTI personas landing: `app/mbti/page.tsx`
- Persona training detail: `app/mbti/[id]/page.tsx`
- Chat with a persona (model selected client-side): `app/chat/page.tsx` → `components/persona-chat.tsx`
- Persona prompt loading (file-based + user-contributed): `prompts/*.json` + `/api/persona/[code]/prompt`
- Auth (register/login/refresh/me) with Bearer token protection
- SQLite persistence for user-contributed prompts and per-user chat histories
- A single LLM API key for all personas, with user-scoped chat context separation
- SBTI new world board integration: `app/sbti/page.tsx`
- Dual theme system: "waibi" (dark) and "rational" (light)
```bash
npm install
```

No external database service is required; SQLite runs in-process.

```bash
mkdir -p data
```

Create a `.env.local` file in the root directory (see the Environment Variables section below).

```bash
npm run dev
```

Visit: http://localhost:3000
Create a `.env.local` file with the following variables:

```env
# ==== LLM (OpenAI-compatible) ====
# Global upstream endpoint (shared by all personas)
LLM_BASE_URL=https://your-openai-compatible.example/v1/chat/completions
# Shared API key used by all personas
LLM_API_KEY=
# Global default model (can be overridden from the chat UI)
LLM_MODEL=gpt-4o

# ==== SQLite ====
# Optional; defaults to ./data/waibi.sqlite
SQLITE_PATH=./data/waibi.sqlite

# ==== JWT ====
JWT_SECRET=
JWT_REFRESH_SECRET=
JWT_EXPIRES_IN=30d
JWT_REFRESH_EXPIRES_IN=30d

# ==== Persona external business endpoints (optional) ====
# If set, they can be surfaced to the frontend for future per-persona routing.
PERSONA_API_INTJ=
PERSONA_API_INTP=
PERSONA_API_INFJ=
PERSONA_API_INFP=
PERSONA_API_ISTJ=
PERSONA_API_ISFJ=
PERSONA_API_ISTP=
PERSONA_API_ISFP=
PERSONA_API_ENTJ=
PERSONA_API_ENTP=
PERSONA_API_ENFJ=
PERSONA_API_ENFP=
PERSONA_API_ESTJ=
PERSONA_API_ESFJ=
PERSONA_API_ESTP=
PERSONA_API_ESFP=
```

Notes:

- `LLM_BASE_URL` and `LLM_API_KEY` are global and shared by all personas.
- Chat context is isolated by user id on the server side.
- The chat request body may include `model` to override `LLM_MODEL`.
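As a sketch, a client-side call that overrides the model could assemble its request like this. The body field names (`message`, `model`) are assumptions for illustration — verify them against the actual route handler in `app/api/persona/[code]/chat`:

```typescript
// Build the request for POST /api/persona/[code]/chat.
// NOTE: "message" and "model" are assumed body field names; check the
// route handler for the real shape before relying on this.
function buildChatRequest(
  code: string,
  accessToken: string,
  message: string,
  model?: string,
): { url: string; init: { method: string; headers: Record<string, string>; body: string } } {
  return {
    url: `/api/persona/${code}/chat`,
    init: {
      method: "POST",
      headers: {
        "Content-Type": "application/json",
        Authorization: `Bearer ${accessToken}`,
      },
      // `model` is optional: when omitted, the server falls back to LLM_MODEL.
      body: JSON.stringify(model ? { message, model } : { message }),
    },
  };
}
```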
Key directories and files:

- `app/mbti/page.tsx`: MBTI grid landing page
- `app/mbti/[id]/page.tsx`: Training detail for a selected persona (with UI-only progress simulation)
- `components/persona-chat.tsx`: Persona chat component (select persona + select/enter model)
- `app/api/persona/[code]/chat`: Calls the upstream LLM with the persona system prompt + user message, persists the interaction
- `app/api/persona/[code]/prompt`: Returns the base prompt (from file) + contributed prompts
- `model/Interaction.ts`: Per-user, per-persona chat history storage
- `model/UserPrompt.ts`: User-contributed prompt storage
- `prompts/*.json`: Base system prompts per persona
- Register: `POST /api/auth/register`
- Login: `POST /api/auth/login`
- Refresh: `POST /api/auth/refresh`
- Me: `GET /api/auth/me`
Store `accessToken` and `refreshToken` in `localStorage`. The app automatically attaches `Authorization: Bearer <token>` for protected routes.
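A minimal sketch of the attach step (the helper name and its call sites are hypothetical; the app's actual wrapper may differ):

```typescript
// Sketch: merge a Bearer token into a request's headers.
// `withAuth` is a hypothetical helper name for illustration.
function withAuth(
  headers: Record<string, string>,
  token: string | null,
): Record<string, string> {
  if (!token) return headers; // unauthenticated: leave headers unchanged
  return { ...headers, Authorization: `Bearer ${token}` };
}
```

A caller would read the token from `localStorage` (e.g. `localStorage.getItem("accessToken")`) and pass it in, keeping the helper itself free of browser globals.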
- `GET /api/persona/[code]/prompt` → `{ base, contributed, api? }`
- `POST /api/persona/[code]/contribute` (auth required) → submit a prompt fragment `{ text }`
- `GET /api/persona/[code]/history` (auth required) → latest history for the current user
- `POST /api/persona/[code]/chat` (auth required) → `{ reply, interactionId }`
Codes are lowercase persona names (e.g., `intj`, `enfp`, ...).
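The response shapes above can be written down as types. The field names come from this README; the field types (and the type guard) are assumptions for illustration:

```typescript
// Field names from the endpoint list above; types are assumptions.
interface PromptResponse {
  base: string;
  contributed: string[];
  api?: string;
}

interface ChatResponse {
  reply: string;
  interactionId: string;
}

// Narrowing helper for a freshly parsed chat response body.
function isChatResponse(v: unknown): v is ChatResponse {
  return (
    typeof v === "object" &&
    v !== null &&
    typeof (v as ChatResponse).reply === "string" &&
    "interactionId" in v
  );
}
```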
Two theme modes are available:
- waibi: Dark theme
- rational: Light theme
The header includes a mode toggle; components adapt styles based on the `useVibe()` hook.
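As an illustrative sketch of how a component might branch on the vibe — the class names here are hypothetical; the real mapping lives in the components that consume `useVibe()`:

```typescript
type Vibe = "waibi" | "rational";

// Hypothetical class mapping for illustration only; the app's actual
// class names will differ.
function panelClasses(vibe: Vibe): string {
  return vibe === "waibi"
    ? "bg-zinc-950 text-zinc-100" // dark "waibi" mode
    : "bg-white text-zinc-900"; // light "rational" mode
}
```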
- `npm run dev`: Start the development server
- `npm run build`: Build for production
- `npm run start`: Start the production server
- `npm run lint`: Run the Biome linter
- `npm run format`: Format code with Biome
- If you modify prompt files under `prompts/`, restart the dev server if needed.
- Configure `LLM_API_KEY` in `.env.local` and restart.
- The SQLite file location can be changed with `SQLITE_PATH`.
- Node.js >= 24.0.0 is required.
The project includes a multi-stage Dockerfile for optimized production builds:
```bash
docker build -t waibi-web .
docker run -p 3000:3000 --env-file .env.local waibi-web
```

The Dockerfile uses Next.js standalone output for a smaller image size and faster startup.
- Node.js >= 24.0.0
- SQLite (embedded, no external service required)
- OpenAI-compatible LLM API endpoint
MIT