Releases: pipecat-ai/pipecat-flows
v1.0.0
Migration guide: https://docs.pipecat.ai/pipecat-flows/migration/migration-1.0
Added
- `ActionConfig` is now exported from the top-level `pipecat_flows` package.
Changed
- **Breaking:** Requires Python >= 3.11 and `pipecat-ai>=1.0.0`.
- ⚠️ All task messages and summary messages now use `"role": "developer"` instead of `"role": "user"`. This correctly distinguishes application instructions from actual user speech. If you have custom flows with `"role": "user"` in `task_messages`, consider updating them to `"role": "developer"`.
- Validation errors in node configuration now raise `FlowError` or `InvalidFunctionError` instead of `ValueError`.
- Bumped dependency versions for security updates: `loguru`, `docstring_parser`, `build`, `pip-tools`, `pre-commit`, `pyright`, `pytest-asyncio`, and `ruff`.
- Examples updated for Pipecat 1.0 patterns and `OpenAIResponsesLLMService` support (set `LLM_PROVIDER=openai_responses`).
Deprecated
- `RESET_WITH_SUMMARY` context strategy is deprecated in favor of Pipecat's native context summarization. A `DeprecationWarning` is now emitted at runtime when the strategy is used. To trigger on-demand summarization during a node transition, push an `LLMSummarizeContextFrame` in a pre-action. See https://docs.pipecat.ai/guides/fundamentals/context-summarization for the full guide. Will be removed in 2.0.0.
Removed
- **Breaking:** Removed the `tts` parameter from `FlowManager.__init__()`, deprecated since v0.0.18. The `tts_say` action uses `TTSSpeakFrame` directly.
- **Breaking:** Removed the `set_node()` method, deprecated since v0.0.18. Use `set_node_from_config()` or consolidated/direct functions instead.
- **Breaking:** Removed `transition_to` and `transition_callback` from `FlowsFunctionSchema` and provider-specific function definitions, deprecated since v0.0.18. Use a consolidated `handler` that returns a tuple `(result, next_node)`, or use direct functions.
- **Breaking:** Removed static flows (the `FlowConfig` type and `flow_config` parameter), deprecated since v0.0.19. Use dynamic flows instead.
- **Breaking:** Removed provider-specific LLM adapters (`OpenAIAdapter`, `AnthropicAdapter`, `GeminiAdapter`, `AWSBedrockAdapter`). A single unified `LLMAdapter` now handles all providers via Pipecat's universal `LLMContext`.
- **Breaking:** Removed `OpenAILLMContext` support. Use Pipecat's universal `LLMContext` exclusively.
- **Breaking:** Removed the provider-specific dict format for function definitions. Use `FlowsFunctionSchema` or direct functions.
- Removed the `__function__:` token handler lookup pattern (it was for static flows).
v0.0.24
Added
- Added `timeout_secs` to `FlowsFunctionSchema` and the `@flows_direct_function` decorator for per-tool function call timeout control, overriding the global `function_call_timeout_secs`.
- Added `role_message` (str) as the preferred field for setting the bot's role/personality. The system instruction is sent via `LLMUpdateSettingsFrame` instead of being included as system messages in the conversation context.
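A minimal sketch of a node config using the new field (shown as a plain dict; the node name and message content are illustrative):

```python
# Sketch: role_message sets the bot's role/personality; task_messages stays
# focused on the node's task.
node_config = {
    "name": "greeting",
    "role_message": "You are a friendly travel agent. Keep answers brief.",
    "task_messages": [
        {
            "role": "system",
            "content": "Greet the user and ask where they want to travel.",
        }
    ],
}
```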
Changed
- Updated the `pipecat-ai` minimum supported version to `0.0.105`.
Deprecated
- `role_messages` is deprecated in favor of `role_message` (str). The old `List[Dict]` format is still supported for backward compatibility but will be removed in 1.0.0.
Fixed
- Fixed a bug where the system instruction was lost during `RESET` and `RESET_WITH_SUMMARY` context strategy transitions when the new node did not re-specify it.
v0.0.23
Added
- Added `cancel_on_interruption` to `FlowsFunctionSchema`s.
- Added a `@flows_direct_function` decorator for attaching metadata to Pipecat direct functions. This allows configuring behavior like `cancel_on_interruption` on the function definition. Example usage:
```python
from pipecat_flows import flows_direct_function, FlowManager


@flows_direct_function(cancel_on_interruption=False)
async def long_running_task(flow_manager: FlowManager, query: str):
    """Perform a task that should not be cancelled on interruption.

    Args:
        query: The query to process.
    """
    # ... implementation
    return {"status": "complete"}, None
```
Non-decorated direct functions use `cancel_on_interruption=False` by default, ensuring all function calls complete even during user interruptions.
Changed
- Changed the `cancel_on_interruption` default from `True` to `False` in both `FlowsFunctionSchema` and `@flows_direct_function`. Function calls now complete even during user interruptions by default, preventing stalled transitions and dropped results.
Fixed
- Fixed an interrupted transition leaving the flow permanently stuck when a user interruption cancelled a function call mid-execution.
v0.0.22
Added
- Added support for a `global_functions` parameter in `FlowManager` initialization. Global functions are available at every node in a flow without needing to be specified in each node's configuration. Supports both `FlowsFunctionSchema` and `FlowsDirectFunction` types.
Changed
- Changed the fallback strategy to `APPEND` in the event that `RESET_WITH_SUMMARY` fails.
- Updated the food ordering examples (food_ordering.py and food_ordering_direct_functions.py) to demonstrate global function usage with a delivery estimate function.
v0.0.21
Added
- Added support for the new Pipecat `LLMSwitcher`, which can be used as a drop-in replacement for `LLMService`s in scenarios where you want to switch LLMs at runtime. There are a couple of prerequisites to using `LLMSwitcher`:
  - You must be using the new universal `LLMContext` and `LLMContextAggregatorPair` (as of Pipecat 0.0.82, supported only by Pipecat's OpenAI and Google LLM implementations, but with more on the way).
  - You must be using "direct" functions or `FlowsFunctionSchema` functions (as opposed to provider-specific formats).
Using `LLMSwitcher` looks like this:

```python
# Create shared context and aggregators for your LLM services
context = LLMContext()
context_aggregator = LLMContextAggregatorPair(context)

# Instantiate your LLM services
llm_openai = OpenAILLMService(api_key=os.getenv("OPENAI_API_KEY"))
llm_google = GoogleLLMService(api_key=os.getenv("GOOGLE_API_KEY"))

# Instantiate a switcher
# (ServiceSwitcherStrategyManual defaults to OpenAI, as it's first in the list)
llm_switcher = LLMSwitcher(
    llms=[llm_openai, llm_google],
    strategy_type=ServiceSwitcherStrategyManual,
)

# Create your pipeline as usual (passing the switcher instead of an LLM)
pipeline = Pipeline(
    [
        transport.input(),
        stt,
        context_aggregator.user(),
        llm_switcher,
        tts,
        transport.output(),
        context_aggregator.assistant(),
    ]
)
task = PipelineTask(pipeline, params=PipelineParams(allow_interruptions=True))

# Initialize your flow manager as usual (passing the switcher instead of an LLM)
flow_manager = FlowManager(
    task=task,
    llm=llm_switcher,
    context_aggregator=context_aggregator,
)

# ...

# Start your flow as usual
@transport.event_handler("on_client_connected")
async def on_client_connected(transport, participant):
    await flow_manager.initialize(create_main_node())

# ...

# Whenever is appropriate, switch LLMs!
await task.queue_frames([ManuallySwitchServiceFrame(service=llm_google)])
```
v0.0.20
Changed
- Added an `@property` for the following `FlowManager` attributes in order to officially make them part of the public API: `state`, `task`, `transport`, and `current_node`.
v0.0.19
Deprecated
- Static flows are now deprecated and will be removed in v1.0.0. Use dynamic flows in their place. The deprecation includes the `flow_config` arg of the `FlowManager` and the `FlowConfig` type.
v0.0.18
Added
- Added a new optional `name` field to `NodeConfig`. When using dynamic flows alongside "consolidated" functions that return a tuple `(result, next_node)`, giving the next node a `name` is helpful for debug logging. If you don't specify a `name`, an automatically-generated UUID is used.
- Added support for providing "consolidated" functions, which are responsible both for doing some work and for specifying the next node to transition to. When using consolidated functions, you don't specify `transition_to` or `transition_callback`. Usage:
```python
# "Consolidated" function
async def do_something(args: FlowArgs) -> tuple[FlowResult, NodeConfig]:
    foo = args["foo"]
    bar = args.get("bar", "")

    # Do some work (optional; this function may be a transition-only function)
    result = await process(foo, bar)

    # Specify next node (optional; this function may be a work-only function)
    # This is either a NodeConfig (for dynamic flows) or a node name (for
    # static flows)
    next_node = create_another_node()

    return result, next_node


def create_a_node() -> NodeConfig:
    return NodeConfig(
        task_messages=[
            # ...
        ],
        functions=[
            FlowsFunctionSchema(
                name="do_something",
                description="Do something interesting.",
                handler=do_something,
                properties={
                    "foo": {
                        "type": "integer",
                        "description": "The foo to do something interesting with.",
                    },
                    "bar": {
                        "type": "string",
                        "description": "The bar to do something interesting with.",
                    },
                },
                required=["foo"],
            )
        ],
    )
```
- Added support for providing "direct" functions, which don't need an accompanying `FlowsFunctionSchema` or function definition dict. Instead, metadata (i.e. `name`, `description`, `properties`, and `required`) is automatically extracted from a combination of the function signature and docstring. Usage:
```python
# "Direct" function
# `flow_manager` must be the first parameter
async def do_something(
    flow_manager: FlowManager, foo: int, bar: str = ""
) -> tuple[FlowResult, NodeConfig]:
    """Do something interesting.

    Args:
        foo (int): The foo to do something interesting with.
        bar (string): The bar to do something interesting with.
    """
    # Do some work (optional; this function may be a transition-only function)
    result = await process(foo, bar)

    # Specify next node (optional; this function may be a work-only function)
    # This is either a NodeConfig (for dynamic flows) or a node name (for
    # static flows)
    next_node = create_another_node()

    return result, next_node


def create_a_node() -> NodeConfig:
    return NodeConfig(
        task_messages=[
            # ...
        ],
        functions=[do_something],
    )
```
Changed
- `functions` are now optional in the `NodeConfig`. Additionally, for AWS Bedrock, Anthropic, and Gemini, you no longer need to provide a no-op function. The LLM adapters now handle this case on your behalf. This allows you to either omit `functions` for nodes, which is common for the end node, or specify an empty function call list, if desired.
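For example, an end node can now omit `functions` entirely. A hedged sketch (the `NodeConfig` is shown as a plain dict, and the `end_conversation` action type is an assumption):

```python
# Sketch: an end node with no functions key (previously AWS Bedrock,
# Anthropic, and Gemini needed a no-op function here).
def create_end_node() -> dict:
    return {
        "task_messages": [
            {"role": "system", "content": "Thank the user and say goodbye."}
        ],
        # no "functions" key needed
        "post_actions": [{"type": "end_conversation"}],
    }
```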
Deprecated
- The `tts` parameter in `FlowManager.__init__()` is now deprecated and will be removed in a future version. The `tts_say` action now pushes a `TTSSpeakFrame`.
- Deprecated `transition_to` and `transition_callback` in favor of "consolidated" `handler`s that return a tuple `(result, next_node)`. Alternatively, you could use "direct" functions and avoid using `FlowsFunctionSchema`s or function definition dicts entirely. See the "Added" section above for more details.
- Deprecated `set_node()` in favor of doing the following for dynamic flows:
  - Prefer "consolidated" or "direct" functions that return a tuple `(result, next_node)` over deprecated `transition_callback`s
  - Pass your initial node to `FlowManager.initialize()`
  - If you really need to set a node explicitly, use `set_node_from_config()`

  In all of these cases, you can provide a `name` in your new node's config for debug logging purposes.
Fixed
- Fixed an issue where `RESET_WITH_SUMMARY` wasn't working for the `GeminiAdapter`. Now, the `GeminiAdapter` uses the `google-genai` package, aligning with the package used by `pipecat-ai`.
- Fixed an issue where if `run_in_parallel=False` was set for the LLM, the bot would trigger N completions for each sequential function call. Now, Flows uses Pipecat's internal function tracking to determine when there are more edge functions to call.
- Overhauled `pre_actions` and `post_actions` timing logic, making their timing more predictable and eliminating some bugs. For example, `tts_say` actions will now always run after the bot response when used in `post_actions`.
v0.0.17
Added
- Added support for `AWSBedrockLLMService` by adding an `AWSBedrockAdapter`.
Changed
- Added `respond_immediately` to `NodeConfig`. Setting it to `False` has the effect of making the bot wait, after the node is activated, for the user to speak before responding. This can be used for the initial node, if you want the user to speak first.
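A minimal sketch of an initial node where the user speaks first (the `NodeConfig` is shown as a plain dict; the message content is illustrative):

```python
# Sketch: the bot stays quiet after this node activates until the user speaks.
def create_initial_node() -> dict:
    return {
        "respond_immediately": False,  # wait for the user to speak first
        "task_messages": [
            {"role": "system", "content": "Answer the caller's first question."}
        ],
    }
```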
- Bumped the minimum required `pipecat-ai` version to 0.0.67 to align with AWS Bedrock additions in Pipecat. This also adds support for `FunctionCallParams`, which were added in 0.0.66.
- Updated to use `FunctionCallParams` as args for the function handler.
- Updated imports to use the new `.stt`, `.llm`, and `.tts` paths.
Other
- Added AWS Bedrock examples for insurance and patient_intake.
- Updated examples to `audio_in_enabled=True` and removed `vad_enabled` and `vad_audio_passthrough` to align with the latest Pipecat `TransportParams`.
v0.0.16
Added
- Added a new "function" action type, which queues a function to run "inline" in the pipeline (i.e. when the pipeline is done with all the work queued before it). This is useful for doing things at the end of the bot's turn. Example usage:
```python
async def after_the_fun_fact(action: dict, flow_manager: FlowManager):
    print("Done telling the user a fun fact.")


def create_node() -> NodeConfig:
    return NodeConfig(
        task_messages=[
            {
                "role": "system",
                "content": "Greet the user and tell them a fun fact.",
            }
        ],
        post_actions=[
            ActionConfig(
                type="function",
                handler=after_the_fun_fact,
            )
        ],
    )
```
- Added support for `OpenAILLMService` subclasses in the adapter system. You can now use any Pipecat LLM service that inherits from `OpenAILLMService`, such as `AzureLLMService`, `GrokLLMService`, `GroqLLMService`, and others, without requiring adapter updates. See the Pipecat docs for supported LLM services.
- Added a new `FlowsFunctionSchema` class, which allows you to specify function calls using a standard schema. This is effectively a subclass of Pipecat's `FunctionSchema`. Example usage:
```python
# Define a function using FlowsFunctionSchema
collect_name = FlowsFunctionSchema(
    name="collect_name",
    description="Record the user's name",
    properties={
        "name": {"type": "string", "description": "The user's name"}
    },
    required=["name"],
    handler=collect_name_handler,
    transition_to="next_node",
)

# Use in node configuration
node_config = {
    "task_messages": [...],
    "functions": [collect_name],
}
```
Changed
- Function handlers can now receive either `FlowArgs` only (legacy style) or both `FlowArgs` and the `FlowManager` instance (modern style). Adding support for the `FlowManager` provides access to conversation state, transport methods, and other flow resources within function handlers. The framework automatically detects which signature you're using and calls handlers appropriately.
Dependencies
- Updated the minimum Pipecat version to 0.0.61 to use `FunctionSchema`, provider-specific adapters, and the latest improvements to context management.
Other
- Updated restaurant_reservation.py and insurance_gemini.py to use `FlowsFunctionSchema`.
- Updated examples to specify a `params` arg for `PipelineTask`, meeting the Pipecat requirement starting 0.0.58.