
Loom: The Universal Development Agent Platform

Loom weaves feature requests into comprehensive coding prompts through LLM-driven sequential analysis. Named after the art of weaving disparate threads into cohesive fabric, Loom transforms raw ideas into structured, actionable development plans.

Designed for developers who want to automate the software development lifecycle while retaining complete control over the AI models used at every stage. The core philosophy is "Meet Developers Where They Are" - Loom is model-agnostic, allowing you to use any AI model or tool at any step of the process.

🎯 What It Does

Loom orchestrates a sequence of specialized AI agents, each handling one stage of the workflow, starting from a simple feature request.

Example Workflow:

  1. You provide a task: python loom.py "implement JWT user authentication"
  2. Project Analysis Agent: Scans your codebase to understand existing patterns, tech stack, and conventions.
  3. Feature Research Agent: Uses the project context to research the best technical approaches and implementation strategies for the feature.
  4. Prompt Assembly Agent: Synthesizes all the gathered information into a detailed, context-aware coding prompt, ready for an implementation LLM.

The final output is a high-quality, actionable prompt that you can feed into your coding LLM of choice to get consistent and contextually-aware code.
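The three-agent flow above can be sketched as a pipeline of functions passing a shared context forward. This is an illustrative sketch only: the function names, context keys, and stubbed results are hypothetical, not Loom's actual API.

```python
# Hypothetical sketch of Loom's sequential agent flow.
# Each "agent" reads the shared context and adds its own result to it.

def project_analysis_agent(context):
    # In Loom this step scans the codebase; here we stub the result.
    context["analysis"] = {"tech_stack": "Python", "conventions": ["PEP 8"]}
    return context

def feature_research_agent(context):
    # Research builds on the analysis produced by the previous agent.
    stack = context["analysis"]["tech_stack"]
    context["research"] = f"Best practices for the feature in {stack}"
    return context

def prompt_assembly_agent(context):
    # The final agent synthesizes everything into one coding prompt.
    context["prompt"] = (
        f"Task: {context['request']}\n"
        f"Stack: {context['analysis']['tech_stack']}\n"
        f"Research: {context['research']}"
    )
    return context

def run_sequence(request):
    context = {"request": request}
    for agent in (project_analysis_agent,
                  feature_research_agent,
                  prompt_assembly_agent):
        context = agent(context)
    return context["prompt"]

print(run_sequence("implement JWT user authentication"))
```

The key property is that each agent only sees the shared context, so agents can be reordered or swapped without changing the others.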

🚀 Quick Start

1. Installation

# Clone the repository
git clone [repository-url] ~/Loom
cd ~/Loom

# Ensure Python 3.7+ is installed
python --version

# Install dependencies
pip install -r requirements.txt

2. API Key Setup

🔑 Get your free Gemini API key:

  1. Visit Google AI Studio
  2. Click "Create API key"
  3. Copy your API key

🔧 Set up your environment:

# Linux/Mac - add to your shell profile for persistence
export GEMINI_API_KEY="your-api-key-here"
echo 'export GEMINI_API_KEY="your-api-key-here"' >> ~/.bashrc

# Windows Command Prompt
setx GEMINI_API_KEY "your-api-key-here"

# Windows PowerShell
$env:GEMINI_API_KEY="your-api-key-here"
[Environment]::SetEnvironmentVariable("GEMINI_API_KEY", "your-api-key-here", "User")

3. Configuration

# Navigate to your project directory
cd /path/to/your/project

# Copy and configure the project settings
cp ~/Loom/dev-automation.config.json .
# Edit dev-automation.config.json with your project details

# Copy and configure the default template
cp ~/Loom/meta-prompt-template.md .
# Edit meta-prompt-template.md with your project details; this is the default
# template used when step-specific templates are not provided.

# Verify your API key is set
echo $GEMINI_API_KEY  # Should show your API key

4. Run Your First Agent Sequence

# Basic usage - runs the configured agent sequence
python ~/Loom/loom.py "implement user authentication system"

# Validate your configuration and API key
python ~/Loom/loom.py --validate-config

5. Optional: Create Global Command

Windows: Create loom.bat in a directory in your PATH:

@echo off
python "C:\path\to\Loom\loom.py" %*

Mac/Linux: Create a symlink or alias:

# Symlink approach (loom.py needs a shebang line and execute permission)
chmod +x ~/Loom/loom.py
ln -s ~/Loom/loom.py /usr/local/bin/loom

# Or add alias to your shell profile
echo 'alias loom="python ~/Loom/loom.py"' >> ~/.bashrc

📁 Project Structure

~/Loom/
├── loom.py                         # Main entry point
├── agents/                         # Agent directory
│   ├── orchestrator.py             # Agent orchestration engine
│   ├── base_agent.py               # Base class for all agents
│   ├── project_analysis_agent/     # Analyzes codebase structure
│   ├── feature_research_agent/     # Researches implementation approaches
│   ├── prompt_assembly_agent/      # Assembles final coding prompts
│   └── issue_generator/        # Legacy issue generation
├── core/                           # Core system components
│   ├── config_manager.py           # Configuration management
│   ├── llm_manager.py              # LLM provider abstraction
│   └── context_manager.py          # Cross-agent context sharing
├── requirements.txt                # Python dependencies
└── README.md                       # This file

# Per-project files (created in your project directory):
your-project/
├── dev-automation.config.json      # Project configuration
└── generated-issues/               # Output directory
    ├── 20240714_123456_feature.md  # Generated specifications
    └── ...

🛠️ Configuration

Each project gets its own dev-automation.config.json file that configures the agent execution sequence and LLM providers.

Project Settings

{
  "project": {
    "name": "Loom",
    "context": "A flexible system that automates the software development lifecycle",
    "tech_stack": "Python, markdown",
    "architecture": "Open orchestration",
    "target_users": "Developers",
    "constraints": "Model API Differences, Context Management, Output Consistency"
  },
  "agent_execution_order": [
    "project-analysis-agent",
    "feature-research-agent", 
    "prompt-assembly-agent"
  ]
}
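A minimal sketch of how a project config like the one above might be loaded and validated, using only the standard library. The required-field set is an assumption for illustration; it is not Loom's actual config_manager logic.

```python
# Hypothetical config loading/validation sketch -- field names mirror the
# dev-automation.config.json example, but this is not Loom's real code.
import json

REQUIRED_PROJECT_FIELDS = {"name", "context", "tech_stack"}

def load_config(text):
    config = json.loads(text)
    missing = REQUIRED_PROJECT_FIELDS - config.get("project", {}).keys()
    if missing:
        raise ValueError(f"dev-automation.config.json missing: {sorted(missing)}")
    return config

config = load_config('''{
  "project": {"name": "Loom", "context": "...", "tech_stack": "Python"},
  "agent_execution_order": ["project-analysis-agent"]
}''')
print(config["agent_execution_order"])
```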

LLM Provider Settings

{
  "llm_settings": {
    "default_provider": "gemini",
    "model": "gemini-2.0-flash-exp",
    "temperature": 0.6,
    "max_tokens": 8192,
    "output_format": "structured",
    "research_depth": "standard"
  }
}
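The llm_settings block maps onto the Gemini SDK's generation parameters. A sketch of that mapping is below; note the key rename, since the google-generativeai library calls the token limit max_output_tokens rather than max_tokens (verify against the library's documentation before relying on this).

```python
# Sketch of translating Loom's llm_settings into a Gemini generation config.
# The output key names follow google-generativeai conventions; this mapping
# is an assumption, not Loom's actual llm_manager code.

def to_generation_config(llm_settings):
    return {
        "temperature": llm_settings["temperature"],
        "max_output_tokens": llm_settings["max_tokens"],  # note the rename
    }

print(to_generation_config({"temperature": 0.6, "max_tokens": 8192}))
```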

GitHub Integration

{
  "github": {
    "repo_owner": "your-username",
    "repo_name": "your-repo",
    "default_project": "Your-Project-Board-Name",
    "default_labels": ["auto-generated", "needs-review", "enhancement"]
  },
  "automation": {
    "auto_create_issues": true,       # Enable automatic GitHub issue creation
    "auto_assign": false
  }
}

Template Customization

{
  "templates": {
    "ui_feature": "Focus on user experience, responsive design...",
    "api_feature": "Focus on performance, security, scalability...",
    "data_feature": "Focus on data processing, ETL, validation...",
    "perf_feature": "Focus on optimization, caching, performance..."
  }
}
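One plausible way a template category gets chosen is by matching keywords in the feature request. The mapping below is a guess for illustration, not Loom's actual selection logic.

```python
# Hypothetical keyword-based template selection sketch.
TEMPLATES = {
    "ui_feature": "Focus on user experience, responsive design...",
    "api_feature": "Focus on performance, security, scalability...",
}

KEYWORDS = {
    "ui_feature": ("ui", "page", "screen", "design"),
    "api_feature": ("api", "endpoint", "auth", "rest"),
}

def pick_template(request):
    words = request.lower().split()
    for name, keys in KEYWORDS.items():
        if any(k in words for k in keys):
            return TEMPLATES[name]
    return TEMPLATES["api_feature"]  # fall back to a sensible default

print(pick_template("add a rest api endpoint for login"))
```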

🎪 Usage Examples

Basic Agent Execution

# Run the configured agent sequence
python loom.py "implement OAuth2 authentication"
python loom.py "add real-time notifications"
python loom.py "optimize database query performance"

Configuration Validation

# Validate your LLM providers and configuration
python loom.py --validate-config

🤖 Agent System

Loom's power comes from its orchestrated multi-agent architecture. Each agent specializes in a specific aspect of the development workflow:

Agent Types

Project Analysis Agent: Scans your codebase to understand:

  • Existing patterns and conventions
  • Technology stack and dependencies
  • Architecture and file structure
  • Coding standards and practices
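For illustration, the tech-stack detection part of this analysis could be approximated by inspecting file extensions. This toy sketch is not the agent's real implementation, which is LLM-driven.

```python
# Toy codebase-scanning sketch: infer languages from file extensions.
# The extension map is illustrative and far from exhaustive.
from pathlib import Path

EXT_TO_TECH = {".py": "Python", ".ts": "TypeScript", ".go": "Go", ".rs": "Rust"}

def infer_stack(filenames):
    """Return the sorted set of languages suggested by the given file names."""
    found = {EXT_TO_TECH[Path(f).suffix]
             for f in filenames
             if Path(f).suffix in EXT_TO_TECH}
    return sorted(found)

print(infer_stack(["loom.py", "agents/base_agent.py"]))  # → ['Python']
```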

Feature Research Agent: Conducts comprehensive research on:

  • Best practices for the requested feature
  • Implementation approaches and alternatives
  • Integration considerations with existing codebase
  • Potential risks and mitigation strategies

Prompt Assembly Agent: Synthesizes information to create:

  • Context-aware coding prompts
  • Detailed implementation specifications
  • Code examples following project conventions
  • Ready-to-use prompts for any LLM

Agent Orchestration

The AgentOrchestrator manages the execution sequence:

  1. Loads agents dynamically from the agents/ directory
  2. Executes them in the order specified in agent_execution_order
  3. Manages context sharing between agents via ContextManager
  4. Handles LLM provider abstraction through LLMManager

Agents communicate through a shared context, allowing later agents to build upon the work of earlier ones.
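The orchestration loop described above can be sketched as follows. Class and method names here are illustrative stand-ins, not the real AgentOrchestrator, ContextManager, or agent APIs.

```python
# Minimal orchestration sketch: ordered agents sharing one context store.

class ContextStore:
    """Shared context that earlier agents write and later agents read."""
    def __init__(self):
        self._data = {}
    def set(self, key, value):
        self._data[key] = value
    def get(self, key, default=None):
        return self._data.get(key, default)

class EchoAgent:
    """Stand-in agent that just records that it ran."""
    def __init__(self, name):
        self.name = name
    def execute(self, context):
        done = context.get("completed", [])
        context.set("completed", done + [self.name])

class Orchestrator:
    def __init__(self, agents):
        self.agents = agents
    def run(self):
        context = ContextStore()
        for agent in self.agents:  # order mirrors agent_execution_order
            agent.execute(context)
        return context

order = ["project-analysis-agent", "feature-research-agent",
         "prompt-assembly-agent"]
result = Orchestrator([EchoAgent(n) for n in order]).run()
print(result.get("completed"))
```

Because every agent receives the same context object, the sequence in agent_execution_order is the only coupling between agents.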

🔧 GitHub Integration

Automatic Issue Creation

When auto_create_issues is enabled in your config:

# This will automatically create a GitHub issue with full specification
python loom.py "implement user roles and permissions"

Manual Issue Creation

# Generate specification file
python loom.py "implement user roles and permissions"

# Use the generated file with GitHub CLI
gh issue create --body-file generated-issues/YYYYMMDD_HHMMSS_feature.md --label "enhancement"

GitHub CLI Setup

# Install GitHub CLI and authenticate
gh auth login

# Ensure you have project scope for automatic issue creation
gh auth refresh --scopes repo,project

🎨 Multi-Project Workflow

One Loom installation can serve any number of projects; each project keeps its own configuration:

# Project A
cd ~/projects/my-web-app
cp ~/Loom/dev-automation.config.json .  # Copy and customize config
python ~/Loom/loom.py "add user authentication"

# Project B  
cd ~/projects/my-mobile-app
cp ~/Loom/dev-automation.config.json .  # Copy and customize config
python ~/Loom/loom.py "implement offline sync"

# Each project gets its own config and generated-issues folder

🔍 Output Format

Each generated issue includes:

  • Executive Summary: What the feature does and why it matters
  • Codebase Analysis: Integration points and architectural impact
  • Domain Research: User workflows and industry best practices
  • Technical Approach: Implementation strategy with alternatives
  • Implementation Specification: Detailed technical requirements
  • Risk Assessment: Technical and business risks with mitigation
  • Project Details: Effort estimates, dependencies, acceptance criteria
  • GitHub Issue Template: Ready-to-use issue content

🛡️ Dependencies

Required

  • Python 3.7+: Core runtime
  • google-generativeai: Python package for Gemini API access
  • Gemini API Key: Free API key from Google AI Studio

Optional

  • GitHub CLI (gh): For automated issue creation
  • Git: For repository context (auto-detected)

API Key Setup

API key setup is covered in the Quick Start section above: get a key from Google AI Studio, export GEMINI_API_KEY, then verify with python loom.py --validate-config.

🔧 Troubleshooting

Common Issues

"GEMINI_API_KEY environment variable not set"

# Verify your API key is set
echo $GEMINI_API_KEY

# If empty, set it:
export GEMINI_API_KEY="your-api-key-here"

"google-generativeai package not installed"

pip install google-generativeai

"Gemini API call failed"

  • Check your API key is valid at Google AI Studio
  • Verify you haven't exceeded API quotas
  • Ensure you have internet connectivity

Configuration validation fails

# Run validation to see specific issues
python loom.py --validate-config

🚧 Roadmap

  • Multiple LLM Support - Add OpenAI, Claude, and local model providers
  • Template Marketplace - Share and discover project-specific templates
  • Progress Tracking - Monitor implementation progress and outcomes
  • Team Collaboration - Shared configurations and team workflows
  • IDE Integration - VSCode extension for in-editor issue generation
  • CI/CD Integration - Trigger issue generation from repository events

🤝 Contributing

This is a universal system designed to work across any project type. Contributions welcome for:

  • New template categories
  • LLM provider integrations
  • Output format improvements
  • Cross-platform compatibility
  • Documentation and examples

📄 License

MIT License - Use this system for any project, commercial or personal.


🎯 The Goal: Weave feature ideas into well-researched, context-aware coding prompts through intelligent agent orchestration. Transform raw concepts into actionable development plans that understand your codebase, follow your patterns, and integrate seamlessly with your workflow.
