Transient Integrations

Chatbot & Workflow Integrations

Setup guides for connecting Transient with AI clients and automation platforms. Each integration is a thin adapter over the same MCP or REST API contract.

Which surface to use

Claude Desktop

MCP (stdio)

Full tool surface. Native MCP support: add the TI server to your config and all tools are immediately available.
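As a sketch, the entry in Claude Desktop's claude_desktop_config.json would look like the following, under the standard mcpServers key. The launch command and package name are assumptions; substitute whatever your TI server distribution documents.

```json
{
  "mcpServers": {
    "transient": {
      "command": "npx",
      "args": ["-y", "@transient/mcp-server"]
    }
  }
}
```

After restarting Claude Desktop, the TI tools appear in the tools list automatically.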

ChatGPT

REST API · MCP (Desktop)

Custom GPT actions via REST API. ChatGPT Desktop supports MCP on the Business plan and in developer mode: enable it in Settings → Developer, add the TI server config, and use ti_create_upload_handoff for local files.
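A Custom GPT action is defined by an OpenAPI schema. A minimal sketch for the /v1/answer endpoint might look like this; the server URL and request/response fields are illustrative assumptions, not the published TI contract.

```yaml
openapi: 3.1.0
info:
  title: Transient Answer API
  version: "1.0"
servers:
  - url: https://api.example.com   # replace with your TI deployment URL
paths:
  /v1/answer:
    post:
      operationId: answer
      summary: Retrieve an evidence-backed answer with citations
      requestBody:
        required: true
        content:
          application/json:
            schema:
              type: object
              properties:
                query:
                  type: string
              required: [query]
      responses:
        "200":
          description: Answer plus citations array
```

Paste the schema into the GPT builder's Actions panel and configure API-key authentication there.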

Perplexity

REST API

Point a custom tool definition at /v1/answer with your API key.
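The request shape a Perplexity custom tool would send can be sketched as below. The base URL, bearer-token header, and body fields are assumptions; check your TI deployment's API reference.

```python
import json

TI_BASE_URL = "https://api.example.com"  # hypothetical deployment URL


def build_answer_request(query: str, api_key: str) -> dict:
    """Assemble the HTTP request for a /v1/answer evidence lookup."""
    return {
        "method": "POST",
        "url": f"{TI_BASE_URL}/v1/answer",
        "headers": {
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        "body": json.dumps({"query": query}),
    }
```

The same shape works from any HTTP client; only the tool-definition wrapper is Perplexity-specific.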

Gemini (Google AI Studio)

REST API

Use a Function Calling tool definition pointing at /v1/answer. Works in both Gemini API and AI Studio custom agents.
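A function declaration for Gemini can be sketched as the dict below, following Gemini's OpenAPI-subset parameter schema. The tool name, description, and parameter names are assumptions about the TI contract; your backend forwards the resulting call to POST /v1/answer.

```python
# Hypothetical function declaration for the /v1/answer endpoint.
ANSWER_TOOL = {
    "name": "ti_answer",
    "description": "Retrieve an evidence-backed answer with citations from Transient.",
    "parameters": {
        "type": "object",
        "properties": {
            "query": {
                "type": "string",
                "description": "The question to answer from indexed evidence.",
            }
        },
        "required": ["query"],
    },
}

# Passed to the model as tools=[{"function_declarations": [ANSWER_TOOL]}].
```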

Grok (xAI)

REST API

Pass /v1/answer as a tool in the xAI API tools array. Grok will invoke it when evidence retrieval is needed.
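The xAI API uses the OpenAI-compatible function-calling shape, so the tools array entry can be sketched as follows. The tool name and parameter schema are assumptions; only the outer {"type": "function", "function": {...}} wrapper is the documented shape.

```python
# Hypothetical tools array entry for the xAI chat completions request.
GROK_TOOLS = [
    {
        "type": "function",
        "function": {
            "name": "ti_answer",
            "description": "Fetch evidence and citations from Transient's /v1/answer endpoint.",
            "parameters": {
                "type": "object",
                "properties": {"query": {"type": "string"}},
                "required": ["query"],
            },
        },
    }
]
```

When Grok emits a tool call, your integration executes the HTTP request and returns the citations payload as the tool result.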

Setup guides

Step-by-step setup instructions and code examples for each integration category.

Common guardrails for all clients

Call ti_workflow_policy once at session start (MCP)

This loads the canonical TI connector rules into the model context, ensuring deterministic upload→ask behaviour across restarts.

Local file paths fail in sandboxed environments

Claude Desktop and local MCP servers can read files from disk directly. In sandboxed clients (ChatGPT Desktop, Claude.ai web), paths like /mnt/data/file.pdf are not accessible: extract the text first or use ti_create_upload_handoff to generate a browser upload URL.
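The branch a connector takes can be sketched as below: read from disk when the path is reachable, otherwise fall back to the browser upload handoff. The ti_create_upload_handoff function here is a local stub standing in for the MCP tool call, and its URL is invented for illustration.

```python
from pathlib import Path


def ti_create_upload_handoff(filename: str) -> str:
    """Stub for the MCP tool; a real client returns a browser upload URL."""
    return f"https://upload.example.com/handoff?file={filename}"  # hypothetical URL


def prepare_file(path_str: str) -> dict:
    """Return either the local file text or an upload URL for the user."""
    path = Path(path_str)
    if path.is_file():
        # Local MCP server: the file is on disk, read it directly.
        return {"mode": "local", "text": path.read_text()}
    # Sandboxed client: the path is not reachable, hand off to the browser.
    return {"mode": "handoff", "upload_url": ti_create_upload_handoff(path.name)}
```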

Enforce citation discipline in your system prompt

Explicitly instruct the model to present only claims traceable to the returned citations array. This prevents probabilistic completion from overriding evidence-first output.

Handle empty citations explicitly

If citations is empty, the correct response is 'No evidence found.' Do not ask the model to synthesise from its own priors when TI returns nothing.
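The empty-citations guard can be sketched as a single check before rendering, so "No evidence found." is surfaced rather than letting the model improvise. The response shape assumed here (an answer string plus a citations array of source objects) follows the citations array described above but is not a published schema.

```python
def render_answer(response: dict) -> str:
    """Return the user-facing text for a TI /v1/answer response."""
    citations = response.get("citations") or []
    if not citations:
        # Never synthesise from the model's priors when TI returns nothing.
        return "No evidence found."
    sources = ", ".join(c["source"] for c in citations)
    return f"{response['answer']} (Sources: {sources})"
```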