
EzLlama

A modern, feature-rich chat interface for Ollama with conversation storage and custom AI settings.

🚀 Features

  • Dual Interface: Native Electron app + Web browser access
  • Conversation Storage: Save, load, and manage chat conversations
  • Custom AI Settings: Per-conversation system prompts, creativity controls, and output styles
  • Real-time Streaming: Responses stream token by token over WebSockets (Socket.IO)
  • Model Management: Download, remove, and test Ollama models
  • Cross-Platform: Works on Windows, macOS, and Linux
  • Network Sharing: Share web interface across your local network

📸 Screenshots

Screenshots will be added soon

🛠️ Installation

Prerequisites

  • Node.js and npm installed
  • Ollama installed and running (it serves on http://localhost:11434 by default)

Quick Start

  1. Clone the repository

    git clone https://github.com/uxmauro/ezllama.git
    cd ezllama
  2. Install dependencies

    npm install
  3. Start the application

    npm run dev
  4. Access the interfaces

    • Electron App: Opens automatically
    • Web Interface: Visit http://localhost:3000

🎛️ Custom Settings

Each conversation supports individual settings:

  • System Prompt: Custom instructions for AI behavior
  • Output Style: Balanced, Creative, Precise, Casual, Professional
  • Creativity (Temperature): Control response randomness (0.1-2.0)
  • Context Window: Adjust memory size (1K-32K tokens)
  • Response Length: Control output length
  • Advanced Parameters: Top P, Top K, Repeat Penalty
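
As a rough illustration, the per-conversation settings above map naturally onto the `options` object that Ollama's API accepts (`temperature`, `num_ctx`, `num_predict`, `top_p`, `top_k`, `repeat_penalty` are Ollama's documented parameter names). The input field names and defaults below are illustrative assumptions, not EzLlama's actual schema:

```javascript
// Sketch: translate per-conversation settings into the `options`
// object Ollama accepts on its chat/generate endpoints.
// Input field names (creativity, contextWindow, ...) are hypothetical.
function toOllamaOptions(settings) {
  return {
    temperature: settings.creativity ?? 0.8,     // Creativity slider, 0.1-2.0
    num_ctx: settings.contextWindow ?? 4096,     // Context window, 1K-32K tokens
    num_predict: settings.responseLength ?? -1,  // Response length; -1 = unlimited
    top_p: settings.topP ?? 0.9,                 // Advanced parameters
    top_k: settings.topK ?? 40,
    repeat_penalty: settings.repeatPenalty ?? 1.1,
  };
}

// Example: a high-creativity conversation with a larger context window
const options = toOllamaOptions({ creativity: 1.4, contextWindow: 8192 });
console.log(options.temperature); // 1.4
```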

📁 Project Structure

ezllama/
├── src/
│   ├── main/                 # Electron main process
│   │   ├── index.js         # Main application logic
│   │   └── conversationStorage.js  # File-based storage system
│   ├── preload/             # Electron preload scripts
│   └── renderer/            # Electron renderer (React)
├── web/                     # Web client
│   └── src/                 # React web application
├── resources/               # Application resources
└── out/                     # Build output

🔧 Development

Available Scripts

  • npm run dev - Start development mode (Electron + Web server)
  • npm run build - Build for production
  • npm run build:win - Build for Windows
  • npm run build:mac - Build for macOS
  • npm run build:linux - Build for Linux

API Endpoints

The web server exposes these endpoints:

  • GET /api/models - List available models
  • POST /api/chat - Send chat message with custom settings
  • GET /api/conversations - List saved conversations
  • POST /api/conversations - Save conversation
  • GET /api/conversations/:id - Load specific conversation
  • DELETE /api/conversations/:id - Delete conversation
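
For example, a client might call POST /api/chat like this. The request-body shape (`model`, `messages`, `settings`) is an assumption for illustration, not EzLlama's documented schema:

```javascript
// Sketch: build a request for the web server's chat endpoint.
// The body shape here is hypothetical.
function buildChatRequest(baseUrl, model, messages, settings = {}) {
  return {
    url: `${baseUrl}/api/chat`,
    init: {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify({ model, messages, settings }),
    },
  };
}

// Usage (requires the dev server started with `npm run dev`):
// const { url, init } = buildChatRequest('http://localhost:3000', 'llama3',
//   [{ role: 'user', content: 'Hello!' }], { temperature: 0.7 });
// const res = await fetch(url, init);
```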

🤝 Contributing

We welcome contributions! Please see our Contributing Guide for details.

Development Setup

  1. Fork the repository
  2. Create a feature branch: git checkout -b feature-name
  3. Make your changes
  4. Test thoroughly
  5. Submit a pull request

📄 License

This project is licensed under the MIT License - see the LICENSE file for details.

🙏 Acknowledgments

  • Ollama for the amazing local AI platform
  • Electron for cross-platform desktop apps
  • React for the user interface
  • Socket.IO for real-time communication

📞 Support

Run into a problem? Open an issue on the GitHub repository.

🗺️ Future Ideas

  • Dark/Light theme toggle
  • Export conversations to different formats
  • Voice input/output
  • Multi-language support

Have an idea? Open an issue and let us know!

