Thread tracing

Plan support
  • Starter plan and above

Overview

To transform AI interactions from one-way outputs into a bidirectional learning system, Wren AI provides a granular feedback loop. This system allows users to report specific issues with AI responses and charts, while providing Project Owners with a Thread Tracing dashboard to monitor accuracy and improve model performance over time.

How it works

The feedback system includes two flows to capture the most relevant context:

  1. AI response feedback: Focuses on text-based logic, such as SQL generation, data retrieval, and instruction following.
  2. Chart visualization feedback: Focuses on the accuracy of visual data, chart types, and rendering instructions.

All feedback is aggregated in the Thread tracing portal, helping teams identify patterns in failures and prioritize improvements.
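As a mental model only (the record fields below are hypothetical and not Wren AI's actual schema or API), the aggregation in the portal can be pictured as counting issue tags across thumbs-down feedback:

```python
from collections import Counter

# Hypothetical feedback records; field names are illustrative only.
feedback = [
    {"trace_type": "answer", "helpful": True,  "tags": []},
    {"trace_type": "answer", "helpful": False, "tags": ["SQL generation failed"]},
    {"trace_type": "chart",  "helpful": False, "tags": ["Incorrect chart type"]},
    {"trace_type": "answer", "helpful": False, "tags": ["SQL generation failed"]},
]

def failure_breakdown(items):
    """Count how often each issue tag appears among thumbs-down feedback."""
    counts = Counter()
    for fb in items:
        if not fb["helpful"]:
            counts.update(fb["tags"])
    return counts

print(failure_breakdown(feedback))
# Counter({'SQL generation failed': 2, 'Incorrect chart type': 1})
```

Grouping by tag like this is what lets the dashboard surface the most common failure categories first.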

Send feedback (for all users)

AI response feedback

When interacting with Wren AI, you can help improve the system by submitting feedback:

  1. Rate the response: Below any AI answer or chart, click Thumbs-up (Helpful) or Thumbs-down (Needs improvement).
  2. Select the issue: If you click Thumbs-down, a Provide additional feedback window will appear. Select the relevant tags (for example, “Incorrect chart type”).

Provide additional feedback modal (Answer)

  3. Provide context:
    • Conditional dropdowns: If you select an “Instruction” tag, an additional dropdown appears so you can choose the exact rule the AI missed.
    • Description: Briefly describe the issue to give more context to the team.
  4. Submit: Click Submit feedback.

Instruction dropdown in feedback modal

Example of completed feedback fields

Chart visualization feedback

When you provide feedback on a chart, you’ll see chart-specific issue tags (for example, “Incorrect chart data” or “Chart generation failed”).

Provide additional feedback modal (Chart)

Monitor quality (for Project Owners)

Project Owners can review aggregated feedback in Settings → Project → Thread tracing:

Thread tracing dashboard overview

  • Success metrics: Track Positive rate vs. Negative rate to monitor satisfaction over time.
  • Failure breakdown: Use the bar chart to see which categories (for example, “SQL generation failed” vs. “Incorrect AI summary”) occur most often.
  • Thread overview table: See all threads created in the project. Use the table filters to narrow the list to threads with feedback when you want to focus on issues users have explicitly reported.
  • Advanced filtering:
    • Filter by date range or trace type (Answer vs. Chart).
    • Filter by specific users or groups to compare behavior across teams.
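To make the metrics above concrete, here is a small sketch using hypothetical records (the field names are illustrative, not an actual Wren AI API): the Positive rate is thumbs-up divided by total rated responses, optionally computed over a filtered subset such as Answer traces only.

```python
# Hypothetical feedback records; field names are illustrative only.
feedback = [
    {"trace_type": "answer", "user": "amy", "helpful": True},
    {"trace_type": "answer", "user": "bob", "helpful": False},
    {"trace_type": "chart",  "user": "amy", "helpful": True},
]

def filter_threads(items, trace_type=None, user=None):
    """Mimic the dashboard filters: by trace type (Answer vs. Chart) or user."""
    return [fb for fb in items
            if (trace_type is None or fb["trace_type"] == trace_type)
            and (user is None or fb["user"] == user)]

def positive_rate(items):
    """Positive rate = thumbs-up count / total rated responses."""
    return sum(fb["helpful"] for fb in items) / len(items) if items else 0.0

answers = filter_threads(feedback, trace_type="answer")
print(round(positive_rate(answers), 2))  # 0.5
```

Filtering before computing the rate is what makes comparisons across trace types, users, or groups meaningful.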

Filter by reported members or groups

  • Deep dive: Expand a Thread ID to view Trace details (Trace ID, timestamp, and the Suggested failure reason) for troubleshooting.

Thread table with expanded trace rows

Filter threads by Thumbs-up / Thumbs-down

  • In the Action column, click View to open the full trace details, including issue report, metadata, and related SQL.

Trace details drawer

Tips and notes

  • Be specific: Selecting the most specific instruction (via dropdown) is usually more actionable than choosing “Other”.
  • Data privacy: Feedback is stored securely within the project context and is used to improve accuracy for your project.
  • Continuous improvement: The failure breakdown updates in near real-time, so teams can evaluate changes quickly.