Guile Scheme LLM Integration Toolkit


A powerful Guile Scheme library for integrating Large Language Models into functional programming workflows, emphasizing composability and type safety.

Overview

This toolkit provides idiomatic Scheme interfaces for LLM integration:

  • Composable prompt construction using S-expressions
  • Type-safe API bindings for multiple LLM providers
  • Functional streaming and batch processing
  • Integration with existing Scheme AI frameworks
  • Meta-programming support for code generation
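To illustrate the S-expression prompt idea, here is a minimal sketch of rendering `(role text)` forms into the message alists a chat API expects. The `render-messages` helper is hypothetical, not part of the toolkit's documented API:

```scheme
;; Render a list of (role text) S-expressions into provider-ready
;; message alists (e.g. for serialization to a JSON chat payload).
(define (render-messages prompt)
  (map (lambda (msg)
         `((role . ,(symbol->string (car msg)))
           (content . ,(cadr msg))))
       prompt))

(render-messages
 '((system "You are terse.")
   (user "Define a thunk.")))
;; => (((role . "system") (content . "You are terse."))
;;     ((role . "user") (content . "Define a thunk.")))
```

Because prompts are plain data, they can be built, inspected, and transformed with ordinary list operations before being sent to a provider.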

Intersections with Existing Projects

  • ollama-topic-forge: Provides the foundational LLM integration patterns
  • pseudo-llm-macro: Macro system for LLM-powered code generation
  • aibrainrot-zeddev: Development environment integration examples
  • multi-framework-agent-lab: Agent framework comparison and analysis

Features

Functional LLM Interface

;; Composable prompt construction
(define-prompt (weather-query city)
  `(system "You are a helpful weather assistant.")
  `(user ,(format #f "What's the weather like in ~a?" city)))

;; Streaming responses with functional processing
(llm-stream (ollama "llama2")
            (weather-query "Boston")
            #:on-token (λ (token) (display token))
            #:on-complete (λ (response) (process-weather response)))

Type-Safe API Bindings

;; Provider abstraction with consistent interface
(define-provider openai
  #:api-key (getenv "OPENAI_API_KEY")
  #:model "gpt-4"
  #:max-tokens 1000)

(define-provider ollama  
  #:base-url "http://localhost:11434"
  #:model "llama2")

;; Unified interface across providers
(llm-complete provider prompt #:temperature 0.7)
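One way a provider abstraction like this could be realized is an SRFI-9 record plus accessors; the record shape below is a hedged sketch, and the field names are assumptions rather than the toolkit's actual internals:

```scheme
(use-modules (srfi srfi-9))

;; A provider value carrying the connection details the unified
;; interface needs (hypothetical sketch).
(define-record-type <provider>
  (make-provider name base-url model api-key)
  provider?
  (name provider-name)
  (base-url provider-base-url)
  (model provider-model)
  (api-key provider-api-key))

(define ollama
  (make-provider 'ollama "http://localhost:11434" "llama2" #f))

(provider-model ollama)  ;; => "llama2"
```

A record type gives each provider a uniform shape, so `llm-complete` can dispatch on the record's fields instead of provider-specific argument lists.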

Meta-Programming Integration

;; LLM-powered macro expansion
(define-syntax llm-generate
  (syntax-rules ()
    ((llm-generate description)
     (let ((code (llm-complete ollama-code-model description)))
       (eval-string code)))))

;; Usage: the macro expands at compile time, but the LLM call and
;; eval-string run when the expanded form is evaluated
(llm-generate "Create a function that calculates fibonacci numbers")

Architecture

The toolkit is organized into composable modules:

  • (llm core): Core LLM abstraction layer
  • (llm providers): Provider-specific implementations
  • (llm streaming): Functional streaming interfaces
  • (llm prompts): Prompt construction DSL
  • (llm types): Type definitions and contracts
  • (llm agents): Multi-agent conversation patterns

Provider Support

Currently Supported

  • Ollama: Local model hosting with full feature support
  • OpenAI: GPT-3.5/GPT-4 with streaming and function calling
  • Anthropic: Claude models with conversation management
  • Hugging Face: Transformers library integration

Planned Support

  • Google Gemini: Multimodal capabilities
  • Mistral AI: European AI provider
  • Local Models: Direct transformers.scm integration

Usage Examples

Basic Completion

(use-modules (llm core) (llm providers ollama))

(define response
  (llm-complete (make-ollama #:model "llama2")
                "Explain recursion in Scheme"))

(display response)

Conversation Management

(use-modules (llm conversation))

(define chat (make-conversation))

(conversation-add! chat 'user "Hello, I'm learning Scheme")
(conversation-add! chat 'assistant 
  (llm-complete provider (conversation->prompt chat)))

(conversation-add! chat 'user "Can you explain macros?")
(define response 
  (llm-complete provider (conversation->prompt chat)))
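Presumably `conversation->prompt` flattens the stored history back into the prompt S-expression form. A minimal sketch of that idea, assuming history is a list of `(role . text)` pairs, oldest first (the helper name is hypothetical):

```scheme
;; Flatten a conversation history into the (role text) prompt
;; form used elsewhere in this README.
(define (history->prompt history)
  (map (lambda (turn)
         (list (car turn) (cdr turn)))
       history))

(history->prompt
 '((user . "Hello, I'm learning Scheme")
   (assistant . "Welcome!")))
;; => ((user "Hello, I'm learning Scheme")
;;     (assistant "Welcome!"))
```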

Function Calling

(use-modules (llm functions))

(define-llm-function get-weather
  "Get current weather for a city"
  ((city string? "The city name")))

(define tools (list get-weather))

(llm-complete-with-tools provider
                         "What's the weather in Boston?"
                         tools)
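A `define-llm-function` form like the one above would typically expand into a JSON-ready tool schema. The alist below is a hedged sketch of what that expansion might produce, modeled on common provider tool-calling schemas; it is not the toolkit's documented output:

```scheme
;; Hypothetical tool schema for get-weather, as plain data ready
;; for JSON serialization.
(define get-weather-schema
  '((name . "get-weather")
    (description . "Get current weather for a city")
    (parameters
     . ((type . "object")
        (properties
         . ((city . ((type . "string")
                     (description . "The city name")))))
        (required . ("city"))))))
```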

Quick Start

# Clone and setup
git clone https://github.com/aygp-dr/scheme-llm-toolkit.git
cd scheme-llm-toolkit

# Install Guile dependencies
make install-guile-deps

# Run dependency check
guile3 experiments/000-deps-check/check.scm

# Test JSON functionality
guile3 -L src experiments/001-json-test/test-json.scm

# Set up provider configurations (optional)
cp config/providers.example.scm config/providers.scm
# Edit config/providers.scm with your API keys for external providers

System Requirements

  • Guile 3.0+ (tested on 3.0.10)
  • guile-json (install manually if auto-install fails)
  • curl or wget (for HTTP requests)
  • Optional: Ollama (for local LLM testing)

FreeBSD Installation Notes

# Install dependencies
pkg install guile3 guile-json curl

# The toolkit uses guile3 shebang for FreeBSD compatibility
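For reference, a script using that shebang might start like this (a minimal sketch; Guile treats `#!` ... `!#` as a block comment, so the shebang line is ignored by the reader):

```scheme
#!/usr/bin/env guile3
!#
;; The #! on the first line opens a Guile block comment and the !#
;; above closes it, so the interpreter line never reaches the reader.
(display "guile3 ok\n")
```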

Configuration

;; config/providers.scm
(define-module (config providers))

(define ollama-config
  `((base-url . "http://localhost:11434")
    (models . ("llama2" "codellama" "mistral"))))

(define openai-config
  `((api-key . ,(getenv "OPENAI_API_KEY"))
    (organization . ,(getenv "OPENAI_ORG"))
    (models . ("gpt-4" "gpt-3.5-turbo"))))
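Since the configuration is an ordinary association list, values can be looked up with Guile's built-in `assq-ref` (the `config-ref` wrapper is just illustrative):

```scheme
;; Look up a configuration value by key; assq-ref returns the cdr
;; of the matching pair, or #f if the key is absent.
(define (config-ref config key)
  (assq-ref config key))

(config-ref ollama-config 'base-url)      ;; => "http://localhost:11434"
(car (config-ref ollama-config 'models))  ;; => "llama2"
```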

License

MIT License - Functional LLM integration for the Scheme ecosystem.
