Concept

Global functions

Cross-node functions for HTTP integrations and built-in handlers, available from every node in the conversation.

Overview

Unlike node-scoped functions, global functions are available across all nodes in the flow. They are defined in the flow.global_functions array and come in two types:

  • GlobalHttpFunction (type: "http") — Makes an HTTP call, blocks the conversation, and returns the response to the LLM
  • GlobalBuiltinFunction (type: "builtin") — Maps to an internal platform handler
Runtime flow when the LLM invokes a global function:

LLM triggers global function → check type (http / builtin) → execute handler → return result to LLM → conversation continues in the same node
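The dispatch above can be sketched in Python. This is an illustrative stand-in only — names like `run_global_function`, `BUILTIN_HANDLERS`, and `execute_http_request` are assumptions, not the platform's actual internals:

```python
from datetime import datetime, timezone
from typing import Any, Callable

# Stand-in registry of built-in handlers (assumption for illustration).
BUILTIN_HANDLERS: dict[str, Callable[[dict], Any]] = {
    "get_current_time": lambda args: {"now": datetime.now(timezone.utc).isoformat()},
}

def execute_http_request(config: dict, args: dict) -> dict:
    # Stand-in for the blocking HTTP call; a real runtime would issue
    # the request described by config["url"], config["method"], etc.
    return {"status": "shipped"}

def run_global_function(fn: dict, args: dict) -> Any:
    if fn["type"] == "http":
        # Blocks until the response arrives, then hands it to the LLM.
        return execute_http_request(fn["http_request"], args)
    if fn["type"] == "builtin":
        # Delegates to an internal platform handler by name.
        return BUILTIN_HANDLERS[fn["handler"]](args)
    raise ValueError(f"unknown global function type: {fn['type']}")
```

Either branch returns its result to the LLM, and the conversation stays in the current node.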

Global vs node functions

When to Use Which?

Node functions handle transitions and node-specific outcomes. Global functions handle utility operations needed everywhere — data lookups, agent transfers, time checks.

| Aspect | Node Functions | Global Functions |
| --- | --- | --- |
| Scope | Available only in the defining node | Available in every node |
| Transitions | Can transition via `transition_to` | Do not transition |
| Side Effects | Fire-and-forget hooks | HTTP blocks and returns data to LLM |
| Return Value | No return value | HTTP functions return the API response |
| Use Case | Flow control, outcome recording | Data lookups, agent transfer, utilities |

GlobalHttpFunction (type: "http")

HTTP global functions make an outbound HTTP call and block the conversation until the response is received. The response is returned to the LLM.

| Field | Type | Description |
| --- | --- | --- |
| `type` | `"http"` | Identifies the entry as an HTTP global function |
| `name` | `string` | Function name (LLM tool name) |
| `description` | `string` | Tells the LLM when to call it |
| `properties` | JSON Schema | Parameter definitions |
| `required` | `string[]` | Required parameters |
| `pre_actions` | `List[FlowAction]` | Actions run before the HTTP call |
| `post_actions` | `List[FlowAction]` | Actions run after the HTTP call |
| `expected_fields` | `Dict[string, FieldConfig]` | Field source mapping |
| `http_request` | `HttpRequestConfig` | Full HTTP request configuration |

Blocking Call

Unlike hook HTTP requests (fire-and-forget), global HTTP functions block the conversation until the response arrives. Use pre_actions with a tts_say to keep the customer engaged.
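The blocking behavior maps onto the `timeout` and `max_retries` fields of `http_request`. A minimal sketch of the retry loop — the fixed backoff and exact retry policy here are assumptions for illustration, not the platform's documented behavior:

```python
import time

def call_with_retries(send, max_retries: int = 3, backoff: float = 1.0):
    # `send` is any zero-argument callable that performs the blocking
    # HTTP request; it is retried on failure up to max_retries times.
    last_error = None
    for attempt in range(max_retries + 1):
        try:
            return send()
        except Exception as err:
            last_error = err
            if attempt < max_retries:
                time.sleep(backoff)  # simple fixed backoff (assumption)
    raise last_error
```

Because the whole loop runs before the LLM sees a result, the conversation is silent for its duration — hence the recommendation to speak first via `pre_actions`.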

Order Status Lookup Example

check-order-status.json

```json
{
  "type": "http",
  "name": "check_order_status",
  "description": "Look up the current status of a customer's order when they ask about it",
  "properties": {
    "order_id": { "type": "string", "description": "The order ID provided by the customer" }
  },
  "required": ["order_id"],
  "pre_actions": [
    { "type": "tts_say", "text": "Let me check on that order for you, one moment." }
  ],
  "post_actions": [],
  "expected_fields": {
    "order_id": { "source": "llm" },
    "account_id": { "source": "static", "value": "{account_id}" }
  },
  "http_request": {
    "url": "https://api.example.com/orders/{order_id}/status",
    "method": "GET",
    "headers": { "Authorization": "Bearer {api_secret}" },
    "query_params": { "account": "<<account_id>>" },
    "body": {},
    "timeout": 10,
    "max_retries": 3
  }
}
```
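Note the two placeholder styles in the example: `{order_id}` / `{api_secret}` versus `<<account_id>>`. A rough sketch of how such substitution could work — the distinction between the two styles (template variables versus resolved expected fields) is an assumption inferred from the example, not documented behavior:

```python
import re

def fill_placeholders(template: str, values: dict) -> str:
    # Substitute both {name} and <<name>> placeholders from `values`;
    # unknown placeholders are left untouched. Illustrative only.
    def sub(match):
        key = match.group(1) or match.group(2)
        return str(values.get(key, match.group(0)))
    return re.sub(r"\{(\w+)\}|<<(\w+)>>", sub, template)
```

For instance, `fill_placeholders("https://api.example.com/orders/{order_id}/status", {"order_id": "A1"})` yields the URL with `A1` in the path.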

GlobalBuiltinFunction (type: "builtin")

Built-in functions map to internal platform handlers. No HTTP configuration needed.

| Field | Type | Description |
| --- | --- | --- |
| `type` | `"builtin"` | Identifies the entry as a built-in function |
| `name` | `string` | Function name |
| `description` | `string` | Tells the LLM when to call it |
| `handler` | `string` | Internal handler name |
| `pre_tts_message` | `Optional[string]` | Text spoken before the handler executes |

Available built-in handlers

| Handler | Description | Notes |
| --- | --- | --- |
| `connect_to_live_agent` | Transfers the call to a live human agent | Requires `configurations.transfer_number` (destination) and the lead's `outbound_number_id` (caller number). If either is missing, the function returns a failed status and the AI resumes. See Warm transfer. |
| `end_conversation` | Ends the call gracefully | Useful when the LLM determines the conversation is complete; accepts an optional `reason`. |
| `update_outcome` | Fire-and-forget DB update of the lead's `outcome` field | Requires an `outcome` string argument. Does not wait for completion; the call continues. |
| `get_current_time` | Returns the current date and time to the LLM | Useful for conversations that reference time. |

Additionally, mute_stt and unmute_stt are available as pre/post-action handlers ("type": "function" inside pre_actions / post_actions on a node or global function) — they do not show up as LLM-callable functions but can be triggered from action lists.
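For instance, a hypothetical action list muting speech-to-text around a blocking lookup might look like the fragment below — the exact key used to name the handler inside the action object is an assumption:

```json
{
  "pre_actions": [
    { "type": "tts_say", "text": "One moment while I look that up." },
    { "type": "function", "handler": "mute_stt" }
  ],
  "post_actions": [
    { "type": "function", "handler": "unmute_stt" }
  ]
}
```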

Live Agent Transfer

connect-to-live-agent.json

```json
{
  "type": "builtin",
  "name": "connect_to_live_agent",
  "description": "Transfer the customer to a live human agent when they explicitly request it",
  "handler": "connect_to_live_agent",
  "pre_tts_message": "Sure, let me connect you with a live agent. Please hold for a moment."
}
```

Get Current Time

get-current-time.json

```json
{
  "type": "builtin",
  "name": "get_current_time",
  "description": "Get the current date and time when needed",
  "handler": "get_current_time",
  "pre_tts_message": null
}
```

Placing global functions in a template

global-functions-in-template.json

```json
{
  "flow": {
    "initial_node": "greeting",
    "nodes": ["..."],
    "global_functions": [
      {
        "type": "http",
        "name": "check_order_status",
        "description": "Look up order status when the customer asks",
        "properties": { "order_id": { "type": "string" } },
        "required": ["order_id"],
        "pre_actions": [{ "type": "tts_say", "text": "Let me check that for you." }],
        "post_actions": [],
        "expected_fields": { "order_id": { "source": "llm" } },
        "http_request": {
          "url": "https://api.example.com/orders/<<order_id>>",
          "method": "GET",
          "headers": { "Authorization": "Bearer {api_secret}" },
          "timeout": 10,
          "max_retries": 3
        }
      },
      {
        "type": "builtin",
        "name": "connect_to_live_agent",
        "description": "Transfer to a live agent when requested",
        "handler": "connect_to_live_agent",
        "pre_tts_message": "Connecting you now, please hold."
      }
    ],
    "end_conversation_callbacks": []
  }
}
```
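Because the two types require different fields, a quick sanity check over `flow.global_functions` can catch missing keys before a template is deployed. A minimal sketch based on the field tables above — not an official schema:

```python
from typing import Any

# Required keys per type, taken from the field tables in this page.
HTTP_KEYS = {"type", "name", "description", "properties", "required",
             "pre_actions", "post_actions", "expected_fields", "http_request"}
BUILTIN_KEYS = {"type", "name", "description", "handler"}

def validate_global_function(fn: dict[str, Any]) -> list[str]:
    # Returns a list of missing-field errors for one global_functions entry.
    expected = HTTP_KEYS if fn.get("type") == "http" else BUILTIN_KEYS
    return [f"missing field: {key}" for key in sorted(expected - fn.keys())]
```

Running it over each entry in the template above should return an empty list for well-formed definitions.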

Best practices

Global Function Design

Use global HTTP functions for data lookups needed in any node. Use built-in functions for platform operations. Keep node-specific logic in node functions.

  • Always add pre_actions to HTTP functions — the API call blocks, so a quick tts_say keeps the customer engaged
  • Set reasonable timeouts — use timeout: 10 and max_retries: 3 as defaults
  • Don’t overload global functions — only make functions global if they genuinely need multi-node access
  • Use descriptive descriptions — global functions compete with node functions for LLM attention