A modern, feature-rich chat interface for Ollama with conversation storage and custom AI settings.
- Dual Interface: Native Electron app + Web browser access
- Conversation Storage: Save, load, and manage chat conversations
- Custom AI Settings: Per-conversation system prompts, creativity controls, and output styles
- Real-time Streaming: Live response streaming with WebSocket support
- Model Management: Download, remove, and test Ollama models
- Cross-Platform: Works on Windows, macOS, and Linux
- Network Sharing: Share web interface across your local network
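On the streaming point above: responses arrive over Socket.IO as incremental chunks that the client assembles. As a hedged illustration only (the event names `chat:chunk` and `chat:done` are assumptions, not the app's actual protocol), a client-side collector might look like:

```javascript
// Hypothetical sketch of consuming a streamed response; the event names
// below are assumptions for illustration, not ezllama's actual wire protocol.
function makeStreamCollector(onDone) {
  let text = "";
  return {
    // Called once per streamed token/chunk.
    onChunk(chunk) { text += chunk; },
    // Called when the server signals end of stream.
    onDone() { onDone(text); },
  };
}

// With a real socket.io-client connection it would be wired up roughly as:
// const socket = io("http://localhost:3000");
// const collector = makeStreamCollector((full) => console.log(full));
// socket.on("chat:chunk", collector.onChunk);
// socket.on("chat:done", collector.onDone);
```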
Screenshots will be added soon
- Clone the repository:

  git clone https://github.com/uxmauro/ezllama.git
  cd ezllama

- Install dependencies:

  npm install

- Start the application:

  npm run dev

- Access the interfaces:
  - Electron App: opens automatically
  - Web Interface: visit http://localhost:3000
Each conversation supports individual settings:
- System Prompt: Custom instructions for AI behavior
- Output Style: Balanced, Creative, Precise, Casual, Professional
- Creativity (Temperature): Control response randomness (0.1-2.0)
- Context Window: Adjust memory size (1K-32K tokens)
- Response Length: Control output length
- Advanced Parameters: Top P, Top K, Repeat Penalty
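Taken together, the settings above form one object per conversation. As a sketch only (the field names and defaults below are assumptions for illustration, not ezllama's actual schema), such an object with the documented ranges enforced might look like:

```javascript
// Hypothetical per-conversation settings object; field names and defaults
// are assumptions, not ezllama's actual schema. Ranges follow the README.
function clamp(value, min, max) {
  return Math.min(Math.max(value, min), max);
}

function makeSettings(overrides = {}) {
  const defaults = {
    systemPrompt: "You are a helpful assistant.",
    outputStyle: "Balanced", // Balanced | Creative | Precise | Casual | Professional
    temperature: 0.7,        // creativity, documented range 0.1-2.0
    contextWindow: 4096,     // tokens, documented range 1K-32K
    maxResponseLength: 1024, // response length control
    topP: 0.9,
    topK: 40,
    repeatPenalty: 1.1,
  };
  const s = { ...defaults, ...overrides };
  // Keep values inside the documented ranges.
  s.temperature = clamp(s.temperature, 0.1, 2.0);
  s.contextWindow = clamp(s.contextWindow, 1024, 32768);
  return s;
}
```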
ezllama/
├── src/
│ ├── main/ # Electron main process
│ │ ├── index.js # Main application logic
│ │ └── conversationStorage.js # File-based storage system
│ ├── preload/ # Electron preload scripts
│ └── renderer/ # Electron renderer (React)
├── web/ # Web client
│ └── src/ # React web application
├── resources/ # Application resources
└── out/ # Build output
- npm run dev - Start development mode (Electron + web server)
- npm run build - Build for production
- npm run build:win - Build for Windows
- npm run build:mac - Build for macOS
- npm run build:linux - Build for Linux
The web server exposes these endpoints:
- GET /api/models - List available models
- POST /api/chat - Send a chat message with custom settings
- GET /api/conversations - List saved conversations
- POST /api/conversations - Save a conversation
- GET /api/conversations/:id - Load a specific conversation
- DELETE /api/conversations/:id - Delete a conversation
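As a client-side sketch of calling the chat endpoint (the request body shape, i.e. `model`, `messages`, plus flattened settings, is an assumption based on the endpoint list, not verified against the server), using Node's built-in `fetch`:

```javascript
// Hypothetical client for POST /api/chat; the payload shape is an assumption
// for illustration, not the server's verified contract.
function buildChatPayload(model, messages, settings = {}) {
  // Pure helper so the payload can be inspected without a running server.
  return JSON.stringify({ model, messages, ...settings });
}

async function sendChat(baseUrl, model, messages, settings = {}) {
  const res = await fetch(`${baseUrl}/api/chat`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: buildChatPayload(model, messages, settings),
  });
  if (!res.ok) throw new Error(`chat request failed: ${res.status}`);
  return res.json();
}
```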
We welcome contributions! Please see our Contributing Guide for details.
- Fork the repository
- Create a feature branch:
git checkout -b feature-name
- Make your changes
- Test thoroughly
- Submit a pull request
This project is licensed under the MIT License - see the LICENSE file for details.
- Ollama for the amazing local AI platform
- Electron for cross-platform desktop apps
- React for the user interface
- Socket.IO for real-time communication
- Issues: GitHub Issues
- Discussions: GitHub Discussions
- Dark/Light theme toggle
- Export conversations to different formats
- Voice input/output
- Multi-language support
Have an idea? Open an issue and let us know!