OpenRouter LLM Integration is a Next.js-based web application that leverages the power of Large Language Models (LLMs) through the OpenRouter API. This project demonstrates how to create an interactive chat interface that connects to advanced AI models, providing a seamless and intelligent conversational experience.
- 🤖 Integration with OpenRouter API for access to cutting-edge LLMs
- 💬 Real-time chat interface with AI responses
- 🎨 Sleek and responsive design using Tailwind CSS
- 🔒 Secure handling of API keys and environment variables
- 🚀 Easy deployment with Vercel
- Node.js (v14 or later)
- npm or yarn
- An OpenRouter API key
- Clone the repository:
```bash
git clone https://github.com/your-username/openrouter-llm-integration.git
cd openrouter-llm-integration
```
- Install dependencies:
```bash
npm install
# or
yarn install
```
- Set up environment variables:
  - Copy `.env.example` to `.env.local`
  - Fill in your OpenRouter API key and other required variables
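For reference, a filled-in `.env.local` might look like the snippet below. The exact variable names come from the repository's `.env.example`, so `OPENROUTER_API_KEY` here is illustrative rather than authoritative.

```
# .env.local — never commit this file
OPENROUTER_API_KEY=your-openrouter-api-key
```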
Run the development server:
```bash
npm run dev
# or
yarn dev
```
- Open http://localhost:3000 in your browser to see the application.
- Enter your message in the chat input field.
- Press "Send" or hit Enter to submit your message.
- The AI will process your input and provide a response.
- Continue the conversation as desired.
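Under the hood, this flow typically maps to a Next.js API route that forwards the conversation to OpenRouter's OpenAI-compatible chat completions endpoint. The sketch below is a minimal illustration, not this repository's actual code: the route path, the default model id, and the `buildChatRequest` helper are all assumptions.

```javascript
// Build the request for OpenRouter's OpenAI-compatible
// /chat/completions endpoint. The default model id is illustrative.
function buildChatRequest(messages, model = "openai/gpt-3.5-turbo") {
  return {
    url: "https://openrouter.ai/api/v1/chat/completions",
    options: {
      method: "POST",
      headers: {
        Authorization: `Bearer ${process.env.OPENROUTER_API_KEY}`,
        "Content-Type": "application/json",
      },
      body: JSON.stringify({ model, messages }),
    },
  };
}

// Hypothetical Next.js API route handler (e.g. pages/api/chat.js, where it
// would be the default export). Keeping the call server-side means the API
// key never reaches the browser.
async function handler(req, res) {
  if (req.method !== "POST") return res.status(405).end();
  const { url, options } = buildChatRequest(req.body.messages);
  const upstream = await fetch(url, options);
  const data = await upstream.json();
  // Return just the assistant's reply text to the client.
  res.status(200).json({ reply: data.choices?.[0]?.message?.content ?? "" });
}
```

The chat input would then POST its message history to this route and render the returned `reply`.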
This project is set up for easy deployment on Vercel:
- Push your code to a GitHub repository.
- Connect your GitHub account to Vercel.
- Select the repository and configure your environment variables.
- Deploy!
For more detailed instructions, check out the Next.js deployment documentation.
Contributions, issues, and feature requests are welcome! Feel free to check the issues page.
This project is MIT licensed.
- OpenRouter for providing access to advanced LLMs
- Next.js for the awesome React framework
- Vercel for their excellent hosting platform
Made with ❤️ by Sharon https://myllm.news