Catering Coordinator Agent — OpenAI Agents

The Catering Coordinator is an OpenAI Agents SDK agent that finds catering options by cuisine type and guest count. It's one of seven specialist agents in the Event Planning Team — a multi-agent system where each agent is built with a different framework and coordinates through Dapr pub/sub.


Event Planning Team

| Agent | Framework | Port | Pub/Sub Topics |
| --- | --- | --- | --- |
| Venue Scout | CrewAI | 8001 | `venue.requests` / `venue.results` |
| Catering Coordinator | OpenAI Agents | 8002 | `catering.requests` / `catering.results` |
| Entertainment Planner | Google ADK | 8003 | `entertainment.requests` / `entertainment.results` |
| Budget Analyst | Strands | 8004 | `budget.requests` / `budget.results` |
| Schedule Planner | LangGraph | 8005 | `schedule.requests` / `schedule.results` |
| Invitations Manager | Dapr Agents | 8006 | `events.invitations.requests` |
| Event Coordinator | Dapr Agents | 8007 | Orchestrator |
| Decoration Planner | Pydantic AI | 8008 | `decorations.requests` / `decorations.results` |


Agent Code

The Catering Coordinator uses OpenAI Agents SDK's Agent and @function_tool decorator, wrapped in a DaprWorkflowAgentRunner for durable execution and pub/sub messaging.

agents/openai-agents/main.py

```python
import logging
import os

logging.basicConfig(level=logging.DEBUG)

from agents import Agent, function_tool
from diagrid.agent.openai_agents import DaprWorkflowAgentRunner
from diagrid.agent.core.state import DaprStateStore


@function_tool
def search_catering(cuisine: str, guest_count: int) -> str:
    """Search for catering options by cuisine type and guest count."""
    return (
        f"Found catering options for {guest_count} guests ({cuisine}):\n"
        f"1. Elite Catering Co - ${guest_count * 45}/event, full service\n"
        f"2. Farm Fresh Events - ${guest_count * 35}/event, organic menu\n"
        f"3. Quick Bites Catering - ${guest_count * 25}/event, casual buffet"
    )


agent = Agent(
    name="catering-coordinator",
    instructions=(
        "You are a catering coordinator. When asked to find catering, "
        "use the search_catering tool with the cuisine type and number "
        "of guests. Return the available catering options with pricing. "
        "Always call the tool before responding."
    ),
    tools=[search_catering],
)

# State: persist agent memory across invocations
runner = DaprWorkflowAgentRunner(
    agent=agent,
    state_store=DaprStateStore(store_name="agent-memory"),
)

# PubSub: subscribe for incoming tasks, publish results
runner.serve(
    port=int(os.environ.get("APP_PORT", "8002")),
    pubsub_name="agent-pubsub",
    subscribe_topic="catering.requests",
    publish_topic="catering.results",
)
```

What's happening

  1. @function_tool — OpenAI Agents decorator that registers a Python function as a tool the agent can call
  2. Agent(...) — OpenAI agent with a name and instructions that guide behavior
  3. DaprWorkflowAgentRunner — Wraps the agent in a durable workflow that survives crashes and restarts
  4. DaprStateStore — Persists agent memory to Dapr state management (backed by Redis locally, or any Dapr state store in production)
  5. runner.serve() — Starts an HTTP server that subscribes to catering.requests and publishes results to catering.results via Dapr pub/sub
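The mock tool prices every vendor at a flat per-guest rate (45, 35, and 25 dollars). A standalone sketch of that arithmetic, outside the SDK (`quote_catering` is a hypothetical name for illustration, not part of the agent):

```python
def quote_catering(cuisine: str, guest_count: int) -> dict[str, int]:
    """Total event price per vendor, mirroring search_catering's math."""
    # Per-guest rates taken from the tool above.
    rates = {
        "Elite Catering Co": 45,      # full service
        "Farm Fresh Events": 35,      # organic menu
        "Quick Bites Catering": 25,   # casual buffet
    }
    # Cuisine only labels the result; it does not affect pricing.
    return {f"{vendor} ({cuisine})": rate * guest_count
            for vendor, rate in rates.items()}


print(quote_catering("Italian", 150))
# {'Elite Catering Co (Italian)': 6750, 'Farm Fresh Events (Italian)': 5250,
#  'Quick Bites Catering (Italian)': 3750}
```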

Dapr Components

The agent uses shared Dapr components for pub/sub messaging, state persistence, and LLM access.

Pub/Sub (agent-pubsub)

resources/agent-pubsub.yaml

```yaml
apiVersion: dapr.io/v1alpha1
kind: Component
metadata:
  name: agent-pubsub
spec:
  type: pubsub.redis
  version: v1
  metadata:
  - name: redisHost
    value: localhost:6379
  - name: redisPassword
    value: ""
```

Routes messages between agents. Locally uses Redis; in production, swap for Kafka, RabbitMQ, or any Dapr pub/sub broker.
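As a sketch of that swap, the same component pointed at Kafka instead of Redis — only the `spec` changes, and the agent keeps referencing `agent-pubsub` by name. The broker address is a placeholder; see Dapr's `pubsub.kafka` reference for the full metadata set:

```yaml
apiVersion: dapr.io/v1alpha1
kind: Component
metadata:
  name: agent-pubsub
spec:
  type: pubsub.kafka
  version: v1
  metadata:
  - name: brokers
    value: "kafka-broker:9092"   # placeholder address
  - name: authType
    value: "none"
```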

State Store (agent-memory)

resources/agent-memory.yaml

```yaml
apiVersion: dapr.io/v1alpha1
kind: Component
metadata:
  name: agent-memory
spec:
  type: state.redis
  version: v1
  metadata:
  - name: redisHost
    value: localhost:6379
  - name: redisPassword
    value: ""
  - name: actorStateStore
    value: "false"
```

Persists agent memory and conversation state across invocations.
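The backend is pluggable in the same way: a hedged sketch of the component retargeted at PostgreSQL (connection string is a placeholder; see Dapr's `state.postgresql` reference for supported options):

```yaml
apiVersion: dapr.io/v1alpha1
kind: Component
metadata:
  name: agent-memory
spec:
  type: state.postgresql
  version: v1
  metadata:
  - name: connectionString
    # Placeholder credentials for illustration only
    value: "host=localhost user=postgres password=example port=5432 database=dapr"
```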

LLM Provider

resources/llm-provider.yaml

```yaml
apiVersion: dapr.io/v1alpha1
kind: Component
metadata:
  name: llm-provider
spec:
  type: conversation.openai
  version: v1
  metadata:
  - name: key
    value: "{{OPENAI_API_KEY}}"
  - name: model
    value: gpt-4.1-2025-04-14
```

Provides LLM access through Dapr's conversation API. The API key is injected from the environment.


Run the Agent

1. Clone the quickstart

   ```shell
   git clone https://github.com/diagridio/catalyst-quickstarts.git
   cd catalyst-quickstarts/agents/openai-agents
   ```

2. Set your API key

   ```shell
   export OPENAI_API_KEY="your-key-here"
   ```

3. Install dependencies

   ```shell
   pip install -r requirements.txt
   ```

4. Start the agent

   ```shell
   diagrid dev run -f dev-python-openai.yaml
   ```

5. Test the agent

   Send a catering search request:

   ```shell
   curl -X POST http://localhost:8888/agent/run \
     -H "Content-Type: application/json" \
     -d '{"task": "Find Italian catering for 150 guests"}'
   ```

Run with the Full Team

To run all seven specialist agents together with the orchestrator, see the Event Planning Team overview.


Key Concepts

| Concept | Description |
| --- | --- |
| Durable Workflows | `DaprWorkflowAgentRunner` checkpoints every step, so the agent survives crashes and restarts |
| Pub/Sub Decoupling | Agents communicate through topics, not direct calls. Add or remove agents without code changes. |
| State Persistence | `DaprStateStore` saves agent memory to a pluggable backend (Redis, PostgreSQL, etc.) |
| Portable Infrastructure | Swap message brokers and state stores by changing YAML; agent code stays the same |
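The decoupling concept can be shown without any infrastructure. A minimal in-process sketch of topic-based routing (the pattern Dapr pub/sub provides between agents — this toy broker is illustrative, not the Dapr API): the publisher never references a handler directly, so subscribers can come and go independently.

```python
from collections import defaultdict
from typing import Callable

# topic name -> list of subscribed handlers
subscribers: dict[str, list[Callable[[dict], None]]] = defaultdict(list)

def subscribe(topic: str, handler: Callable[[dict], None]) -> None:
    """Register a handler for a topic (like runner.serve's subscribe_topic)."""
    subscribers[topic].append(handler)

def publish(topic: str, message: dict) -> None:
    """Deliver a message to every handler on the topic."""
    for handler in subscribers[topic]:
        handler(message)

# The catering agent subscribes; the orchestrator publishes by topic name.
received: list[dict] = []
subscribe("catering.requests", received.append)
publish("catering.requests", {"task": "Find Italian catering for 150 guests"})
print(received)  # [{'task': 'Find Italian catering for 150 guests'}]
```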
