A modern React-based chat interface for interacting with AI models through the EigenAI API.
- Chat Interface: Clean, terminal-inspired chat UI with real-time messaging
- Streaming Responses (optional): Toggle token-by-token streaming for faster perceived responses
- API Key Management: Secure client-side storage of API keys with visual indicators
- Responsive Design: Works on desktop and mobile devices
- Error Handling: Graceful error handling for API failures
- Loading States: Visual feedback during API requests
- Message History: Persistent chat history during session
- Seed Control (optional): Configure a numeric seed for deterministic sampling
- Node.js (version 14 or higher)
- npm or yarn
Navigate to the project directory:
cd /Users/scottconner/projects/determinal
Install dependencies:
npm install
Start both the proxy server and React app:
npm run dev
Or start them separately:
# Terminal 1 - Start proxy server
npm run server

# Terminal 2 - Start React app
npm start
Open your browser and navigate to http://localhost:3000
- Set API Key: Click the "API Key" button in the top menu to set your EigenAI API key
- Optional: Enable Streaming: Click the "Streaming" button in the top menu to toggle streaming On/Off
- Optional: Set Seed: Click the "Seed" button to set a numeric seed (same seed + prompt + settings -> same output)
- Start Chatting: Type your message in the input field and press Enter or click Send
- View Responses: AI responses will appear in the chat interface
The application uses a proxy server to avoid CORS issues:
- Frontend: http://localhost:3000 (React app)
- Proxy Server: http://localhost:3002 (Express server)
- Backend API: https://eigenai.eigencloud.xyz/v1/chat/completions
- Model: gpt-oss-120b-f16
- Max Tokens: 120
- Seed: 42
- You can change the seed from the UI. The seed persists in localStorage.
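The settings above map directly onto an OpenAI-style chat completions request body. A minimal sketch of how the app might assemble it (the helper name and defaults are illustrative, not taken from the app's source):

```javascript
// Sketch: build the request body sent through the proxy. Field names
// follow the OpenAI-compatible chat completions format; the defaults
// mirror the configuration listed above (model, max tokens, seed).
function buildChatRequest(messages, { seed = 42, maxTokens = 120, stream = false } = {}) {
  return {
    model: "gpt-oss-120b-f16",
    messages,
    max_tokens: maxTokens,
    seed,   // same seed + prompt + settings -> same output
    stream, // true when the Streaming toggle is enabled
  };
}
```

For example, `buildChatRequest([{ role: "user", content: "Hello" }])` produces a deterministic request with the default seed of 42.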
When streaming is enabled, the app uses a streaming proxy endpoint that forwards Server-Sent Events (SSE):
- Streaming Proxy: POST http://localhost:3002/api/chat/completions/stream
- The server sets stream: true and relays upstream SSE events to the client, which incrementally renders the response.
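On the client, the relayed events can be decoded with a small parser. A sketch, assuming the upstream emits OpenAI-style `data: {...}` lines terminated by `data: [DONE]` (the function name is illustrative):

```javascript
// Sketch: extract text deltas from a chunk of Server-Sent Event lines
// relayed by the streaming proxy. Non-data lines (comments, keep-alives)
// are skipped; parsing stops at the [DONE] sentinel.
function parseSseChunk(chunk) {
  const deltas = [];
  for (const line of chunk.split("\n")) {
    const trimmed = line.trim();
    if (!trimmed.startsWith("data:")) continue;
    const payload = trimmed.slice(5).trim();
    if (payload === "[DONE]") break;
    try {
      const event = JSON.parse(payload);
      const text = event.choices?.[0]?.delta?.content;
      if (text) deltas.push(text);
    } catch {
      // Ignore JSON fragments split across network chunks.
    }
  }
  return deltas.join("");
}
```

In practice the client would append each parsed delta to the current assistant message as chunks arrive.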
The proxy server handles the API calls and forwards them to the EigenAI endpoint with proper CORS headers.
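The actual headers live in server.js; as a representative sketch, the proxy's CORS response headers might look like this (values are typical defaults, not copied from the source):

```javascript
// Sketch: CORS headers the proxy attaches so the React app on
// localhost:3000 can call it. Header values are illustrative.
function corsHeaders(origin = "http://localhost:3000") {
  return {
    "Access-Control-Allow-Origin": origin,
    "Access-Control-Allow-Methods": "POST, OPTIONS",
    "Access-Control-Allow-Headers": "Content-Type, Authorization",
  };
}
```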
src/
├── components/
│ ├── Header.js # Top navigation with API key indicator
│ ├── Header.css
│ ├── Chat.js # Main chat interface
│ ├── Chat.css
│ ├── Message.js # Individual message component
│ ├── Message.css
│ ├── ApiKeyDialog.js # API key input dialog
│ └── ApiKeyDialog.css
├── App.js # Main application component
├── App.css
├── index.js # Application entry point
└── index.css # Global styles
server.js # Express proxy server
package.json # Dependencies and scripts
- API keys are stored in browser localStorage
- No server-side storage of sensitive data
- All API calls originate in the browser; the local Express proxy only forwards them to the EigenAI endpoint
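Client-side key storage reduces to a thin wrapper over localStorage. A sketch with an injectable storage backend for testability (the storage key name is an assumption, not taken from the app's source):

```javascript
// Sketch: client-side API key persistence. The key name below is
// hypothetical; the real app may store it under a different name.
const API_KEY_STORAGE_KEY = "eigenai_api_key";

function saveApiKey(key, storage = window.localStorage) {
  storage.setItem(API_KEY_STORAGE_KEY, key);
}

function loadApiKey(storage = window.localStorage) {
  return storage.getItem(API_KEY_STORAGE_KEY); // null when unset
}

function clearApiKey(storage = window.localStorage) {
  storage.removeItem(API_KEY_STORAGE_KEY);
}
```

Injecting the storage object keeps the helpers usable outside the browser (e.g. in unit tests with a Map-backed stub).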
- npm start - Runs the React app in development mode
- npm run server - Runs the Express proxy server
- npm run dev - Runs both the proxy server and React app concurrently
- npm run build - Builds the app for production
- npm test - Launches the test runner
- npm run eject - Ejects from Create React App (one-way operation)
- Chrome (latest)
- Firefox (latest)
- Safari (latest)
- Edge (latest)