Build a Chat Agent with a Graph¶
Create an agent whose graph is a simple `HumanMessageGuardrail` → `Generate` chain, send it a message, and get a streamed reply.
Prerequisites
- A session connected and configured — see Connect and Manage a Session.
- A model listed in `models.json` that is either already `Loaded` or at least `Local`. If not, see Use Your Own Local Model.
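Before building the graph, it can save a round of debugging to confirm that the model name you plan to use is actually known to the server. The minimal sketch below uses `client.list_models()` (also referenced under Common pitfalls); the shape of each returned entry is an assumption — adjust the attribute access to whatever your client version actually returns.

```python
# Pre-flight check (sketch): confirm the model is known to the server.
# Assumption: each entry from list_models() exposes a `name` attribute;
# adapt this line if your client returns plain strings or dicts instead.
MODEL_NAME = "My Local Model"

available = [m.name for m in client.list_models()]
if MODEL_NAME not in available:
    raise RuntimeError(f"{MODEL_NAME!r} not found on the server; available: {available}")
```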
Steps¶
```python
from tryll_client import GraphDescription, NodeType

# `client` is the connected session from Connect and Manage a Session.

# 1. Build the graph.
graph = (
    GraphDescription()
    .add_node("guard", NodeType.HumanMessageGuardrail)
    .add_node("answer", NodeType.Generate)
    .wire("guard", "triggered", "END")        # drop jailbreaks silently
    .wire("guard", "not_triggered", "answer")
    .wire("answer", "default", "END")
    .set_start_node("guard")
    .set_default_model_name("My Local Model")
)

# 2. Create the agent.
agent = client.create_agent(graph)

# 3. Send a message. send_message blocks and returns the full reply.
reply = agent.send_message("Hello!")
print(reply)
```
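Because `send_message` blocks until the full reply is available, wrapping it in a loop is enough for a minimal interactive chat. This is only a sketch on top of the agent created above; it assumes each call to `send_message` runs one complete turn on the same agent.

```python
# Minimal interactive loop (sketch): one blocking turn per line of user input.
# Assumes each send_message call runs a full turn on the agent created above.
while True:
    user_text = input("You: ")
    if not user_text.strip():
        break  # an empty line ends the chat
    print("Agent:", agent.send_message(user_text))
```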
```cpp
#include <iostream>
#include <string_view>

#include <tryll/TryllClient.h>

namespace TC = Tryll::Client;

// 1. Build the graph.
TC::GraphDescription graph;
graph.AddNode("guard", TC::NodeType::HumanMessageGuardrail)
     .AddNode("answer", TC::NodeType::Generate)
     .Wire("guard", "triggered", "END")          // drop jailbreaks silently
     .Wire("guard", "not_triggered", "answer")
     .Wire("answer", "default", "END")
     .SetStartNode("guard")
     .SetDefaultModelName("My Local Model");

// 2. Create the agent.
auto agent = client.CreateAgent(graph);

// 3. Send a message. Stream tokens to stdout as they arrive.
agent.SendText("Hello!",
    [](std::string_view text, bool /*isDelta*/, bool /*isFinal*/)
    { std::cout << text << std::flush; });
```
In Unreal:
- Add a `UTryllAgentComponent` to your actor.
- In the Details panel, edit Graph on the component, or reference a `UTryllWorkflowAsset` you authored in the Content Browser. Wire `guard.triggered` → `END`, `guard.not_triggered` → `answer`, `answer.default` → `END`.
- In Blueprint:
    - Bind On Agent Ready to a handler that calls Send Message.
    - Bind On Answer Text to append each chunk to a UI text block.
    - Bind On Turn Complete to flip the "typing" indicator off.
Verify it worked¶
Server-side, at info log level, one turn produces:
```
[info] Agent 7: turn starting
[info] Node guard: not_triggered
[info] Node answer: default
[info] Agent 7: turn complete (Success, tokens=128)
```
Client-side, Python returns the full reply string from `send_message`, while C++ and Unreal see each streamed chunk arrive in the `SendText` callback / `OnAnswerText` delegate, terminated by a `TurnComplete` with status `Success`.
Common pitfalls¶
- "Model not found" (error
4002) — either the
name is wrong, or the model is not in
models.json. Double-check withclient.list_models(). - "Graph validation failed" (error 3003) — the most common cause is an unwired exit route. Every exit of every node must point somewhere.
- Silent nothing after
SendMessage. Check that the start node is not an always-ENDbranch (e.g., guardrail that matches everything). The server log will show which route was taken.
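To see what an unwired exit route looks like in practice, the sketch below deliberately omits the `answer` → `END` wire before calling `create_agent`. How the failure surfaces client-side is an assumption here — it is written as if `create_agent` raises an exception whose message carries the validation error; your client version may instead return an error object.

```python
from tryll_client import GraphDescription, NodeType

# Deliberately broken graph: the "answer" node's default exit is never wired.
bad_graph = (
    GraphDescription()
    .add_node("guard", NodeType.HumanMessageGuardrail)
    .add_node("answer", NodeType.Generate)
    .wire("guard", "triggered", "END")
    .wire("guard", "not_triggered", "answer")
    # .wire("answer", "default", "END")  <- missing on purpose
    .set_start_node("guard")
    .set_default_model_name("My Local Model")
)

# Assumption: create_agent surfaces "Graph validation failed" (3003) as an exception.
try:
    client.create_agent(bad_graph)
except Exception as exc:
    print("Graph rejected:", exc)
```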