Monoscope lets you ingest and explore your logs, traces, and metrics in S3 buckets, and query them in natural language via LLMs. Monoscope also lets you create AI agents that run on an interval to automatically detect anomalies in your logs, metrics, and traces. The most important actions, logs, and insights are sent as reports to your email every day or week.
Website • Discord • Twitter • Changelog • Documentation
🤖 AI Anomaly Detection • 💬 Natural Language Search • ⭐ Star Us • 🤝 Contributing

Monoscope automatically detects anomalies in your logs, metrics, and traces using AI, with no configuration required.
Monoscope is an open-source observability platform that uses artificial intelligence to understand and monitor your systems automatically. Unlike traditional monitoring tools that require extensive configuration and generate overwhelming alerts, Monoscope learns your system's normal behavior and only alerts you when something is genuinely wrong.
- Universal Data Ingestion: Native support for OpenTelemetry means compatibility with 750+ integrations out of the box
- AI-Powered Understanding: Our LLM engine understands context, not just thresholds
- Natural Language Interface: Query your data in plain English
- Cost-Effective Storage: Store years of data affordably with S3-compatible object storage
- Zero Configuration: Start getting insights immediately without complex setup
```bash
# Run with Docker (recommended)
docker run -p 8080:8080 monoscope/monoscope:latest

# Or clone and run locally
git clone https://github.com/monoscope-tech/monoscope.git
cd monoscope
docker-compose up
```

Visit http://localhost:8080 to access Monoscope. Full installation guide →
Monoscope is built on OpenTelemetry, the industry-standard observability framework. This means you get instant compatibility with 750+ integrations including all major languages, frameworks, and infrastructure components.
- Logs: Application logs, system logs, audit trails
- Metrics: Performance counters, business KPIs, custom metrics
- Traces: Distributed request flows, latency tracking, dependency mapping
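All three signal types above arrive over OTLP. As a rough illustration, here is what a single log record posted to an OTLP/HTTP endpoint looks like; the field names follow the public OTLP JSON encoding, but the service name, timestamp, and message are invented for the example:

```python
import json

# Sketch of an OTLP/HTTP log payload (JSON encoding). Field names follow
# the OTLP specification; the concrete values are placeholders.
payload = {
    "resourceLogs": [{
        "resource": {
            "attributes": [
                {"key": "service.name", "value": {"stringValue": "payment-service"}}
            ]
        },
        "scopeLogs": [{
            "logRecords": [{
                "timeUnixNano": "1700000000000000000",
                "severityText": "ERROR",
                "body": {"stringValue": "charge failed: card declined"},
            }]
        }]
    }]
}

# A payload like this would be POSTed to the collector's /v1/logs route.
print(json.dumps(payload)[:60])
```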
```bash
# For Python applications
pip install opentelemetry-api opentelemetry-sdk
export OTEL_EXPORTER_OTLP_ENDPOINT="http://localhost:8080"

# For Node.js applications
npm install @opentelemetry/api @opentelemetry/sdk-node
export OTEL_EXPORTER_OTLP_ENDPOINT="http://localhost:8080"

# For Kubernetes clusters
helm install opentelemetry-collector open-telemetry/opentelemetry-collector \
  --set config.exporters.otlp.endpoint="monoscope:8080"
```
Monoscope automatically correlates logs, metrics, and traces from the same service, giving you a complete picture of your system's behavior. No manual correlation or configuration required.
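As an illustrative sketch (not Monoscope's internal code), the core of cross-signal correlation is grouping signals that share a trace ID, so one request's spans, logs, and metrics land in the same bucket:

```python
from collections import defaultdict

# Toy telemetry records; the field names and values are invented for the
# example. In OpenTelemetry, logs and metrics emitted within a traced
# request can carry the active trace_id, which is what makes this possible.
signals = [
    {"type": "span",   "trace_id": "abc", "name": "POST /charge", "duration_ms": 412},
    {"type": "log",    "trace_id": "abc", "message": "card declined"},
    {"type": "metric", "trace_id": "abc", "name": "http.latency", "value": 412},
    {"type": "log",    "trace_id": "def", "message": "ok"},
]

by_trace = defaultdict(list)
for s in signals:
    by_trace[s["trace_id"]].append(s)

# All three signal types for trace "abc" now sit in one bucket.
print(len(by_trace["abc"]))  # → 3
```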
Monoscope's AI engine continuously learns your system's normal behavior patterns and automatically alerts you to genuine issues:
- Context-Aware Detection: Understands that high CPU during deployments is normal, but high CPU at 3 AM is not
- Seasonal Pattern Recognition: Learns daily, weekly, and monthly patterns in your data
- Cross-Signal Correlation: Detects anomalies by analyzing logs, metrics, and traces together
- Noise Reduction: Reduces alert fatigue by 90% compared to threshold-based monitoring
The AI runs continuously in the background, requiring no configuration or training from you.
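Monoscope's engine is LLM-based, but the underlying idea of comparing an observation against learned normal behavior can be sketched with a toy baseline check (all numbers here are invented):

```python
import statistics

# Hypothetical learned baseline: % CPU at 3 AM over the past eight nights.
baseline_cpu = [22, 25, 24, 23, 26, 24, 25, 23]

def is_anomalous(value, history, threshold=3.0):
    """Flag a value that deviates from the historical mean by more than
    `threshold` standard deviations. A real system would also model
    daily/weekly seasonality rather than a single flat baseline."""
    mean = statistics.mean(history)
    stdev = statistics.stdev(history)
    return abs(value - mean) > threshold * stdev

print(is_anomalous(24, baseline_cpu))  # typical 3 AM load → False
print(is_anomalous(91, baseline_cpu))  # sudden spike → True
```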
Query your observability data using plain English instead of complex query languages:
- "Show me all errors in the payment service in the last hour"
- "What caused the spike in response time yesterday at 3 PM?"
- "Which services are consuming the most memory?"
- "Find all database queries taking longer than 1 second"
Monoscope translates your natural language into optimized queries across logs, metrics, and traces, returning relevant results with explanations.
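Purely as a hypothetical illustration (the record schema below is invented, not Monoscope's), the first query above corresponds to a structured filter over service, severity, and time window:

```python
from datetime import datetime, timedelta, timezone

# "Show me all errors in the payment service in the last hour"
now = datetime(2024, 1, 1, 12, 0, tzinfo=timezone.utc)
logs = [
    {"service": "payment", "level": "ERROR", "ts": now - timedelta(minutes=10)},
    {"service": "payment", "level": "INFO",  "ts": now - timedelta(minutes=5)},
    {"service": "billing", "level": "ERROR", "ts": now - timedelta(minutes=20)},
    {"service": "payment", "level": "ERROR", "ts": now - timedelta(hours=3)},
]

cutoff = now - timedelta(hours=1)
results = [l for l in logs
           if l["service"] == "payment" and l["level"] == "ERROR" and l["ts"] >= cutoff]
print(len(results))  # → 1 (only the 10-minute-old payment error qualifies)
```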
| Feature | Description |
|---|---|
| AI Anomaly Detection | LLM-based engine that understands context and identifies real issues, not just threshold violations |
| Natural Language Search | Search logs and metrics using plain English - no complex query languages required |
| High Performance | Handle millions of events/sec with our custom TimeFusion storage engine |
| Cost-Effective Storage | Store years of data affordably with S3-compatible object storage |
Screenshots: Log Explorer (Main View), Log Explorer (Detailed View), Dashboard Analytics.
Monoscope combines high-performance data ingestion with intelligent AI analysis:
```mermaid
graph LR
    A[Your Apps] -->|Logs/Metrics| B[Ingestion API]
    B --> C[TimeFusion Engine]
    C --> D[S3 Storage]
    C --> E[LLM Pipeline]
    E --> F[Anomaly Detection]
    F --> G[Alerts & Dashboard]
```
- Language: Built in Haskell for reliability and performance
- Storage: S3-compatible object storage for cost-effective retention
- AI Engine: State-of-the-art LLMs for intelligent analysis
- Scale: Horizontally scalable architecture
Traditional monitoring tools require extensive configuration, generate overwhelming alerts, and still miss critical issues. You spend more time managing your monitoring than actually using it.
Monoscope uses AI to understand your system's behavior, automatically detect anomalies, and provide actionable insights - all without complex configuration.
- DevOps Teams reducing alert fatigue by 90%
- SREs catching issues before they impact users
- Engineering Leaders getting visibility across complex systems
- Startups implementing enterprise-grade observability on a budget
- Features
- Getting Started
- Prerequisites
- Installation
- Development Setup
- Testing
- Useful Links
- Contributing
- License
- 🤖 AI-Powered Anomaly Detection: Leverages LLMs to automatically identify and alert on unusual patterns
- ☁️ S3-Compatible Storage: Store logs, metrics, and traces in any S3-compatible object storage
- 🚀 High Performance: Written in Haskell and Rust for reliability and performance
- 📊 Real-Time Analytics: Monitor your systems with minimal latency
- 🔌 Extensible: Easy to integrate with existing monitoring infrastructure
Prerequisites
Before installing Monoscope, ensure you have the following dependencies:
- Haskell: Install via GHCup
- PostgreSQL with TimescaleDB: For time-series data storage
- LLVM: Required for compilation
- Google Cloud SDK: For GCP integration (if using GCP)
- Install Haskell via GHCup
```bash
curl --proto '=https' --tlsv1.2 -sSf https://get-ghcup.haskell.org | sh
```
- Clone the Repository
```bash
git clone https://github.com/monoscope-tech/monoscope.git
cd monoscope
```
- Install System Dependencies
For macOS:
```bash
# Install LLVM
brew install llvm

# Install PostgreSQL with TimescaleDB
brew install postgresql
brew install timescaledb

# Install libpq
brew install libpq
```
For Linux (Ubuntu/Debian):
```bash
# Install LLVM
sudo apt-get install llvm

# Install PostgreSQL and TimescaleDB
# Follow instructions at: https://docs.timescale.com/install/latest/

# Install libpq
sudo apt-get install libpq-dev
```
- Configure Google Cloud (Optional)
If using Google Cloud integration:
```bash
gcloud auth application-default login
```
- Run Monoscope
```bash
stack run
```
Development Setup
- Create a Docker volume for PostgreSQL data:
```bash
docker volume create pgdata
```
- Run TimescaleDB in Docker:
```bash
make timescaledb-docker
```
- Configure pg_cron extension:
Add the following to your PostgreSQL configuration:
```sql
ALTER SYSTEM SET cron.database_name = 'apitoolkit';
ALTER SYSTEM SET shared_preload_libraries = 'pg_cron';
```
Then restart the TimescaleDB Docker container.
Install code formatting and linting tools:
```bash
# Code formatter
brew install ormolu

# Linter
brew install hlint
```
Useful commands:
```bash
# Format code
make fmt

# Run linter
make lint
```
💡 Tip: For better IDE support, compile Haskell Language Server locally to avoid crashes, especially on macOS. See issue #2391.
To build the service worker:
```bash
workbox generateSW workbox-config.js
```
Running Tests
```bash
make test
# OR
stack test --ghc-options=-w
```
Unit tests don't require a database connection and run much faster. They include doctests and pure function tests.
```bash
make test-unit
# OR
stack test apitoolkit-server:unit-tests --ghc-options=-w
```
Run unit tests continuously in watch mode:

```bash
make live-test-unit
# OR
stack test apitoolkit-server:unit-tests --ghc-options=-w --file-watch
```
Run a specific test matching a pattern:

```bash
stack test --test-arguments "--match=SeedingConfig" apitoolkit-server:tests
# OR
stack test --ta "--match=SeedingConfig" apitoolkit-server:tests
```
- 💬 Discord - Chat with users and contributors
- 🐛 Issues - Report bugs or request features
- 🐦 Twitter - Follow for updates
- 📝 Blog - Tutorials and case studies
We welcome contributions to Monoscope! Please feel free to:
- Report bugs and request features via GitHub Issues
- Submit pull requests for bug fixes and new features
- Improve documentation and examples
- Share your use cases and feedback
Before contributing, please read our contributing guidelines and ensure your code passes all tests and linting checks.
Monoscope is open source software. Please see the LICENSE file for details.
- Kubernetes Operator
- Terraform Provider
- Mobile App
- Distributed Tracing Support
- Custom ML Model Training
See our public roadmap for more details.
| Feature | Monoscope | Datadog | Elastic | Prometheus |
|---|---|---|---|---|
| AI Anomaly Detection | ✅ Built-in | ✅ Add-on | ❌ | ❌ |
| Natural Language Search | ✅ | ❌ | ❌ | ❌ |
| Cost-Effective Storage | ✅ S3 | ❌ Proprietary | ❌ | ❌ |
| No-Configuration Alerts | ✅ | ❌ | ❌ | ❌ |
| Open Source | ✅ | ❌ | ✅ | ✅ |