AI support copilot that grounds every answer
Cut median response time in half — without giving up on accuracy.
The problem
Your support team answers the same questions every day. Your docs cover most of them, but agents have to dig through five tabs to find the exact answer. Meanwhile, AI suggestion tools confidently hallucinate features that don't exist.
Why this is hard to build yourself
- Static FAQ tools go stale within weeks of every release.
- Public-LLM autocomplete invents endpoints, parameters, pricing — and your agents have to catch every one.
- Building your own retrieval stack means standing up a crawler, an embedder, a vector store, and a query layer — a multi-quarter project.
How it works with Pellumin
- Send each incoming ticket to Smart Search, scoped to your docs domain.
- Display the cited answer and source cards directly in the agent's inbox.
- Reply in two clicks, or one if the answer is clean enough to send as-is.
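The scoping in the first step can be sketched as a small query builder. This is a hypothetical helper, not part of the Pellumin API: it assumes the `site:` operator restricts results to your docs domain, and that long ticket bodies should be collapsed and truncated before querying.

```python
# Hypothetical helper: build a docs-scoped Smart Search query from a ticket.
# The `site:` scoping and the length cap are assumptions; tune for your setup.
DOCS_DOMAIN = "docs.example.com"
MAX_QUERY_CHARS = 300  # assumed limit, not a documented one

def build_query(ticket_text: str, domain: str = DOCS_DOMAIN) -> str:
    # Collapse whitespace so multi-line ticket bodies become one query string.
    cleaned = " ".join(ticket_text.split())
    return f"site:{domain} {cleaned}"[:MAX_QUERY_CHARS]

print(build_query("How do I rotate\nmy API key?"))
# → site:docs.example.com How do I rotate my API key?
```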
Architecture
1. Inbound webhook (e.g. Zendesk) → your service
2. Your service → Smart Search with `q = site:docs.example.com {ticket}`
3. Render answer + source list in the agent UI
4. Agent reviews, edits if needed, sends
Implementation (Python)
import os, httpx

API = "https://api.pellumin.com"
KEY = os.environ["PELLUMIN_API_KEY"]
DOCS = "docs.example.com"

def support_suggestion(ticket: str) -> dict:
    """Return a draft answer + source citations for a support ticket."""
    r = httpx.get(
        f"{API}/api/summary",
        params={"q": f"site:{DOCS} {ticket}"},
        headers={"Authorization": f"Bearer {KEY}"},
        timeout=20,
    )
    r.raise_for_status()
    return r.json()
suggestion = support_suggestion("How do I rotate my API key?")
# suggestion["answer"] → markdown draft, ready to paste
# suggestion["results"] → docs links to send the user
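For the two-click reply, the agent-facing draft might combine the answer markdown with the cited links. A minimal sketch, assuming each item in `suggestion["results"]` carries `title` and `url` fields; check the real response shape against the API reference.

```python
# Minimal sketch: turn a Smart Search suggestion into an agent-facing draft.
# Assumes each result has "title" and "url" keys; this shape is an assumption.
def render_reply(suggestion: dict) -> str:
    lines = [suggestion["answer"], "", "Sources:"]
    for source in suggestion.get("results", []):
        lines.append(f"- {source['title']}: {source['url']}")
    return "\n".join(lines)

draft = render_reply({
    "answer": "Go to Settings → API keys, then click Rotate.",
    "results": [{"title": "Rotating API keys", "url": "https://docs.example.com/keys"}],
})
print(draft)
```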