A powerful Guile Scheme library for integrating Large Language Models into functional programming workflows, emphasizing composability and type safety.
This toolkit provides idiomatic Scheme interfaces for LLM integration:
- Composable prompt construction using S-expressions
- Type-safe API bindings for multiple LLM providers
- Functional streaming and batch processing
- Integration with existing Scheme AI frameworks
- Meta-programming support for code generation
- ollama-topic-forge: Provides the foundational LLM integration patterns
- pseudo-llm-macro: Macro system for LLM-powered code generation
- aibrainrot-zeddev: Development environment integration examples
- multi-framework-agent-lab: Agent framework comparison and analysis
;; Composable prompt construction
;; (weather-query takes the city as a parameter so it can be applied below)
(define-prompt (weather-query city)
  `(system "You are a helpful weather assistant.")
  `(user ,(format #f "What's the weather like in ~a?" city)))

;; Streaming responses with functional processing
(llm-stream (ollama "llama2")
            (weather-query "Boston")
            #:on-token (λ (token) (display token))
            #:on-complete (λ (response) (process-weather response)))
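Because the callbacks are ordinary procedures, handlers compose with plain Scheme. A minimal pure-Guile sketch (no toolkit calls; `compose-handlers` is a hypothetical helper defined here, not part of the toolkit):

```scheme
;; Combine several #:on-token handlers into one procedure.
;; Each handler sees every token in order; state lives in closures.
(define (compose-handlers . handlers)
  (lambda (token)
    (for-each (lambda (h) (h token)) handlers)))

;; Example: echo tokens while also accumulating them for later use.
(define collected '())
(define (collect! token) (set! collected (cons token collected)))

(define on-token (compose-handlers display collect!))
(on-token "Hello, ")
(on-token "world")
;; (reverse collected) => ("Hello, " "world")
```

The same composed procedure can then be passed as the `#:on-token` argument above.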
;; Provider abstraction with consistent interface
(define-provider openai
  #:api-key (getenv "OPENAI_API_KEY")
  #:model "gpt-4"
  #:max-tokens 1000)

(define-provider ollama
  #:base-url "http://localhost:11434"
  #:model "llama2")
;; Unified interface across providers
(llm-complete provider prompt #:temperature 0.7)
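Because `llm-complete` takes the provider as an ordinary value, switching backends means passing a different argument. A hedged sketch against the interface above (using the `ollama` and `openai` providers defined earlier):

```scheme
;; The same prompt against two backends; only the provider value changes.
(define prompt "Summarize the Scheme numeric tower in one sentence.")

(define local-answer  (llm-complete ollama prompt #:temperature 0.7))
(define hosted-answer (llm-complete openai prompt #:temperature 0.7))

;; Providers are first-class, so they can be mapped over:
(map (lambda (p) (llm-complete p prompt)) (list ollama openai))
```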
;; LLM-powered macro expansion
;; LLM-powered macro expansion
;; Note: eval-string comes from (ice-9 eval-string). With syntax-rules,
;; the generated code is fetched and evaluated when the expression runs,
;; not at macro-expansion time.
(define-syntax llm-generate
  (syntax-rules ()
    ((llm-generate description)
     (let ((code (llm-complete ollama-code-model description)))
       (eval-string code)))))

;; Usage: generate and evaluate code from a natural-language description
(llm-generate "Create a function that calculates fibonacci numbers")
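For generation at expansion time proper, Guile's procedural macros (`syntax-case`) can consult the model while the compiler runs. A hedged sketch, assuming `llm-complete` and `ollama-code-model` are available to the expander (e.g. loaded via `eval-when`); this is not a confirmed toolkit API:

```scheme
;; Expansion-time code generation: the LLM is consulted while the
;; module is being compiled, and the returned text is read back in
;; and spliced into the expansion as a datum.
(define-syntax llm-generate/static
  (lambda (stx)
    (syntax-case stx ()
      ((_ description)
       (let ((code (llm-complete ollama-code-model
                                 (syntax->datum #'description))))
         (datum->syntax stx (call-with-input-string code read)))))))
```

The trade-off is that compilation now depends on a live model; the runtime `llm-generate` variant above avoids that at the cost of evaluating untrusted code each run.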
The toolkit is organized into composable modules:
- (llm core): Core LLM abstraction layer
- (llm providers): Provider-specific implementations
- (llm streaming): Functional streaming interfaces
- (llm prompts): Prompt construction DSL
- (llm types): Type definitions and contracts
- (llm agents): Multi-agent conversation patterns
- Ollama: Local model hosting with full feature support
- OpenAI: GPT-3.5/GPT-4 with streaming and function calling
- Anthropic: Claude models with conversation management
- Hugging Face: Transformers library integration
- Google Gemini: Multimodal capabilities
- Mistral AI: European AI provider
- Local Models: Direct transformers.scm integration
(use-modules (llm core) (llm providers ollama))

(define response
  (llm-complete (make-ollama #:model "llama2")
                "Explain recursion in Scheme"))

(display response)
(use-modules (llm conversation))

(define chat (make-conversation))
(conversation-add! chat 'user "Hello, I'm learning Scheme")
(conversation-add! chat 'assistant
                   (llm-complete provider (conversation->prompt chat)))
(conversation-add! chat 'user "Can you explain macros?")

(define response
  (llm-complete provider (conversation->prompt chat)))
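One plausible shape for `conversation->prompt` is a fold over the stored turns. This pure-Guile sketch assumes turns are kept as `(role . text)` pairs; it illustrates the idea and is not the toolkit's actual implementation:

```scheme
;; Flatten a list of (role . text) turns into a single prompt string,
;; one "ROLE: text" line per turn.
(define (turns->prompt turns)
  (string-join
   (map (lambda (turn)
          (format #f "~a: ~a"
                  (string-upcase (symbol->string (car turn)))
                  (cdr turn)))
        turns)
   "\n"))

(turns->prompt '((user . "Hello, I'm learning Scheme")
                 (assistant . "Welcome! Where would you like to start?")))
;; => "USER: Hello, I'm learning Scheme\nASSISTANT: Welcome! Where would you like to start?"
```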
(use-modules (llm functions))

(define-llm-function get-weather
  "Get current weather for a city"
  ((city string? "The city name")))

(define tools (list get-weather))
(llm-complete-with-tools provider
                         "What's the weather in Boston?"
                         tools)
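To reach a provider, a declaration like `get-weather` has to be serialized into that provider's tool schema. A hedged sketch of what the alist might look like, following the common OpenAI-style function-calling format (field names are an assumption, not confirmed toolkit output):

```scheme
;; Roughly the structure define-llm-function would need to emit
;; for an OpenAI-style function-calling request, built from the
;; docstring and the typed parameter list above.
(define get-weather-schema
  `((name . "get-weather")
    (description . "Get current weather for a city")
    (parameters
     . ((type . "object")
        (properties . ((city . ((type . "string")
                                (description . "The city name")))))
        (required . ("city"))))))
```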
# Clone and setup
git clone https://github.com/aygp-dr/scheme-llm-toolkit.git
cd scheme-llm-toolkit
# Install Guile dependencies (all core functionality is tested)
make install-guile-deps
# Run dependency check
guile3 experiments/000-deps-check/check.scm
# Test JSON functionality
guile3 -L src experiments/001-json-test/test-json.scm
# Set up provider configurations (optional)
cp config/providers.example.scm config/providers.scm
# Edit config/providers.scm with your API keys for external providers
- Guile 3.0+ (tested on 3.0.10)
- guile-json (install manually if auto-install fails)
- curl or wget (for HTTP requests)
- Optional: Ollama (for local LLM testing)
# Install dependencies
pkg install guile3 guile-json curl
# The toolkit uses guile3 shebang for FreeBSD compatibility
;; config/providers.scm
(define-module (config providers))

(define ollama-config
  `((base-url . "http://localhost:11434")
    (models . ("llama2" "codellama" "mistral"))))

(define openai-config
  `((api-key . ,(getenv "OPENAI_API_KEY"))
    (organization . ,(getenv "OPENAI_ORG"))
    (models . ("gpt-4" "gpt-3.5-turbo"))))
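Since the configs are plain association lists, callers can pull settings out with standard Guile accessors, for example:

```scheme
;; Look up settings with assq-ref (returns #f when a key is absent).
(assq-ref ollama-config 'base-url)      ;; => "http://localhost:11434"
(car (assq-ref ollama-config 'models))  ;; => "llama2"
```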
MIT License - Functional LLM integration for the Scheme ecosystem.