Releases: AsyncFlow-Sim/AsyncFlow
AsyncFlow 0.1.1
🚀 AsyncFlow v0.1.1 — Event Injection & Improved Server Model
✨ New Features
**Event Injection Engine**
- Declarative input schema for network spikes and server outages.
- Strong validation: unique IDs, target type checks, time window ordering.
- Runtime scheduler to apply events deterministically during simulation.
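A declarative event config might look like the following sketch. The field names here are illustrative, not the actual schema; consult the repository's YAML schema docs for the real keys.

```yaml
# Hypothetical event-injection config (field names are illustrative).
events:
  - id: spike-1
    kind: network_spike
    target: edge-client-lb     # must reference an existing edge of the right type
    start_s: 30.0              # windows must satisfy start < end
    end_s: 45.0
  - id: outage-1
    kind: server_outage
    target: server-2           # target type is checked against the event kind
    start_s: 60.0
    end_s: 75.0
```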
**Improved Server Model**
- More accurate handling of CPU blocking, RAM residency, and I/O waits in the event loop.
- Clearer separation of queues and resource contention.
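The distinction the server model captures can be seen in plain `asyncio` (this is an illustration of the modeled behavior, not AsyncFlow code): awaited I/O lets other work overlap, while blocking CPU steps serialize the event loop.

```python
import asyncio
import time

async def io_task(delay: float) -> None:
    # Non-blocking I/O wait: yields control back to the event loop.
    await asyncio.sleep(delay)

async def cpu_task(duration: float) -> None:
    # Blocking CPU step: holds the loop; nothing else runs meanwhile.
    time.sleep(duration)

async def main() -> tuple[float, float]:
    t0 = time.perf_counter()
    await asyncio.gather(io_task(0.2), io_task(0.2))
    io_elapsed = time.perf_counter() - t0   # ~0.2 s: the two waits overlap

    t0 = time.perf_counter()
    await asyncio.gather(cpu_task(0.2), cpu_task(0.2))
    cpu_elapsed = time.perf_counter() - t0  # ~0.4 s: blocking steps serialize
    return io_elapsed, cpu_elapsed

io_elapsed, cpu_elapsed = asyncio.run(main())
print(f"io={io_elapsed:.2f}s cpu={cpu_elapsed:.2f}s")  # e.g. io=0.20s cpu=0.40s
```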
📊 Metrics & Observability
- Events now visible in charts (spike windows, server downtime).
- Extended per-server metrics (ready queue, I/O waits, RAM usage) with event overlays.
🛠 Developer Experience
- Examples updated with event-driven scenarios (spikes + outages).
- Documentation improved (YAML schema, validation rules, usage guides).
🐛 Fixes & Polish
- Input validation improved with Pydantic (safer configs).
- Minor documentation cleanups and consistency fixes.
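The release performs these checks with Pydantic; the plain-Python sketch below only illustrates the kinds of rules described above (unique IDs, time-window ordering), not the library's actual models.

```python
from dataclasses import dataclass

@dataclass
class InjectionEvent:
    # Hypothetical fields mirroring the validation rules described above.
    event_id: str
    start_s: float
    end_s: float

def validate_events(events: list[InjectionEvent]) -> None:
    """Reject duplicate event IDs and inverted time windows."""
    seen: set[str] = set()
    for ev in events:
        if ev.event_id in seen:
            raise ValueError(f"duplicate event id: {ev.event_id}")
        seen.add(ev.event_id)
        if not ev.start_s < ev.end_s:
            raise ValueError(f"{ev.event_id}: start must precede end")

validate_events([InjectionEvent("spike-1", 30.0, 45.0)])  # passes
```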
Alpha version of AsyncFlow
AsyncFlow v0.1.0-alpha — Release Notes (pre-release)
Summary
First public alpha of AsyncFlow, a SimPy-based, event-loop-aware simulator for asynchronous distributed systems.
Model clients, servers, load balancers, and network edges; run reproducible simulations; and analyze latency, throughput, and per-server resource metrics with built-in plots.
Highlights
- Event-loop model: explicit CPU steps (blocking), I/O waits (non-blocking), and RAM residency per request.
- Topology graph: request generator → client → servers; optional load balancer (round-robin) → multi-server fan-out; configurable network edges (stochastic latency).
- Workload: stochastic traffic via simple random-variable configs (Poisson defaults).
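A Poisson arrival process like the default workload can be sketched by sampling i.i.d. exponential inter-arrival gaps (illustrative code, not the library's generator):

```python
import random

def poisson_arrivals(rate_hz: float, horizon_s: float, seed: int = 42) -> list[float]:
    """Sample arrival times of a Poisson process over [0, horizon_s)."""
    rng = random.Random(seed)
    t, arrivals = 0.0, []
    while True:
        t += rng.expovariate(rate_hz)  # exponential gap, mean 1 / rate_hz
        if t >= horizon_s:
            return arrivals
        arrivals.append(t)

times = poisson_arrivals(rate_hz=100.0, horizon_s=10.0)
print(len(times))  # roughly rate_hz * horizon_s = 1000 arrivals
```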
Metrics
- Event: `RqsClock` (end-to-end latencies).
- Sampled: `ready_queue_len`, `event_loop_io_sleep`, `ram_in_use`, `edge_concurrent_connection`.
- Analyzer API (`ResultsAnalyzer`): `get_latency_stats()`, `get_throughput_series()`.
- Plots: `plot_latency_distribution(ax)`, `plot_throughput(ax)`.
- Per-server plots: `plot_single_server_ready_queue(ax, id)`, `plot_single_server_io_queue(ax, id)`, `plot_single_server_ram(ax, id)`.
- Compact dashboard: `plot_base_dashboard(...)`.
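The shape of the latency-stats call can be sketched with a toy stand-in class over synthetic data (the method name matches `ResultsAnalyzer.get_latency_stats()` above; the internals and the returned keys are illustrative, not the real implementation):

```python
import statistics

class ToyAnalyzer:
    """Hand-rolled stand-in for ResultsAnalyzer, over synthetic latencies."""
    def __init__(self, latencies_s: list[float]) -> None:
        self.latencies_s = sorted(latencies_s)

    def get_latency_stats(self) -> dict[str, float]:
        lat = self.latencies_s
        return {
            "mean": statistics.fmean(lat),
            "p50": lat[len(lat) // 2],              # median by index
            "p95": lat[int(0.95 * (len(lat) - 1))],  # nearest-rank percentile
        }

latencies = [0.010, 0.012, 0.015, 0.020, 0.030, 0.045, 0.080]
stats = ToyAnalyzer(latencies).get_latency_stats()
print(stats)
```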
Examples
- YAML quickstart (single server).
- Pythonic builder (single server; load balancer + two servers).
- All figures saved next to the script for easy inspection.
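The load-balancer example dispatches round-robin; the policy itself is just cyclic assignment (an illustration, not the library's internal code):

```python
from itertools import cycle

# Two backend servers, as in the load-balancer example above.
servers = ["server-1", "server-2"]
rr = cycle(servers)  # endless round-robin iterator

assignments = [next(rr) for _ in range(5)]
print(assignments)
# ['server-1', 'server-2', 'server-1', 'server-2', 'server-1']
```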
Tooling & CI
- One-shot setup scripts: `scripts/dev_setup.(sh|ps1)` (installs Poetry, creates `.venv`, installs dev deps, runs ruff/mypy/pytest).
- Convenience scripts: `quality_check`, `run_tests`, `run_sys_tests`.
- CI (GitHub Actions): lint + type-check + tests; system tests gate merges into `main`.
Compatibility
- Python 3.12+ (Linux/macOS/Windows).
- Install from PyPI: `pip install asyncflow-sim`
Known limitations (alpha)
- Simplified network (latency + optional drops; no bandwidth/payload yet).
- One event loop per server (no multi-process/node).
- Linear endpoint pipelines (no branching/fan-out inside an endpoint).
- Stationary workload; very short spikes may be missed if `sample_period_s` is large.
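Why a large `sample_period_s` misses short spikes: sampling only every period skips any transient shorter than the period. A synthetic sketch (illustrative, not simulator code):

```python
def queue_len(t: float) -> float:
    """Synthetic signal: baseline 1, with a 0.2 s spike to 50 at t = 3.1 s."""
    return 50.0 if 3.1 <= t < 3.3 else 1.0

def sample(period_s: float, horizon_s: float = 10.0) -> list[float]:
    # Take one sample at the start of each period.
    n = int(horizon_s / period_s)
    return [queue_len(i * period_s) for i in range(n)]

coarse = sample(period_s=1.0)   # samples at t = 0, 1, 2, ...: spike invisible
fine = sample(period_s=0.05)    # samples land inside the spike window
print(max(coarse), max(fine))   # 1.0 50.0
```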
Roadmap (abridged)
- Monte Carlo engine (CI/CBs)
- Stochastic service times per step
- Expanded component library (cache/DB/APIGW)
- Fault/event injection
- Richer network (bandwidth, retries/timeouts)
- Branching and fan-out
- Basic autoscaling/backpressure
Notes
Pre-release: APIs and plot styles may change before 0.1.0.
Feedback and issues are very welcome—this release is meant to validate the API and UX before the stable cut.
Alpha version of AsyncFlow
The YAML file for the quickstart in the README had an issue; it is now fixed, ensuring a smooth onboarding for interested users.
PyPI release of the alpha version
AsyncFlow 0.1.0a1 — aligned with PyPI
Repository state aligned with the PyPI 0.1.0a1 build.
No functional/runtime changes.
- Packaging: minor `pyproject.toml` metadata/config tidy-up
- CI: main workflow now triggers on push to `main`