
[Feature]: Support injecting Langfuse prompt with trace metadata during LLM calls #12981

@restato

Description


The Feature

Support fetching prompts from Langfuse and injecting trace metadata (e.g. trace ID, span ID) into the prompt before sending the request to the LLM.

This would allow teams using Langfuse with LiteLLM to automatically enrich prompts with trace context, enabling better observability and end-to-end traceability in multi-service environments.

Motivation, pitch

I’m working on a Langfuse-integrated system where prompts are version-controlled and stored in Langfuse. I’d like to use these prompts during LLM calls made via LiteLLM, while also injecting trace metadata (like trace_id, span_id, user_id) into the prompt for better observability and auditability.

Currently, this requires manual fetching of the prompt from Langfuse and custom logic to inject trace info, which increases boilerplate and risk of inconsistency.

Having native support in LiteLLM to:

  1. Fetch prompts by ID from Langfuse
  2. Optionally inject trace metadata (as a system message or JSON block)
  3. Use the resulting prompt for the LLM call

…would make integration smoother and improve production readiness for Langfuse + LiteLLM stacks.

Proposed interface (example):

prompt = langfuse.get_prompt("prompt_id")
compiled_prompt = prompt.compile(**prompt_variables)

messages = [
    {"role": "system", "content": compiled_prompt},
    {"role": "user", "content": input},
]

response = litellm.completion(
    model="gpt-4",
    messages=messages,
    **prompt.config,        # 👈 model parameters stored alongside the Langfuse prompt
    langfuse_prompt=prompt, # 👈 links the generation to the prompt version in Langfuse
)

LiteLLM is hiring a founding backend engineer, are you interested in joining us and shipping to all our users?

No

Twitter / LinkedIn details

@direcision / https://www.linkedin.com/in/direcision/

Metadata


    Labels

    enhancement (New feature or request)
