Releases: pipecat-ai/pipecat-flows

v1.0.0

15 Apr 22:31
119e0af

Migration guide: https://docs.pipecat.ai/pipecat-flows/migration/migration-1.0

Added

  • ActionConfig is now exported from the top-level pipecat_flows package.

Changed

  • Breaking: Requires Python >= 3.11 and pipecat-ai>=1.0.0.

  • ⚠️ All task messages and summary messages now use "role": "developer" instead of "role": "user". This correctly distinguishes application instructions from actual user speech. If you have custom flows with "role": "user" in task_messages, consider updating them to "role": "developer".
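
For illustration, a task message after the migration might look like this (a sketch using a dict-style node config; the message content is hypothetical):

```python
# Sketch: task messages after migrating to 1.0 (dict-style NodeConfig).
# Message content is hypothetical.
node_config = {
    "task_messages": [
        {
            "role": "developer",  # was "role": "user" before 1.0
            "content": "Ask the user what they would like to order.",
        }
    ],
}
```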

  • Validation errors in node configuration now raise FlowError or InvalidFunctionError instead of ValueError.

  • Bumped dependency versions for security updates: loguru, docstring_parser, build, pip-tools, pre-commit, pyright, pytest-asyncio, and ruff.

  • Examples updated for Pipecat 1.0 patterns and OpenAIResponsesLLMService support (set LLM_PROVIDER=openai_responses).

Deprecated

  • RESET_WITH_SUMMARY context strategy is deprecated in favor of Pipecat's native context summarization. A DeprecationWarning is now emitted at runtime when the strategy is used. To trigger on-demand summarization during a node transition, push an LLMSummarizeContextFrame in a pre-action. See https://docs.pipecat.ai/guides/fundamentals/context-summarization for the full guide. Will be removed in 2.0.0.
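
A minimal sketch of the suggested pre-action; the import path for LLMSummarizeContextFrame and its constructor arguments are assumptions, so verify them against your installed Pipecat version:

```python
# Sketch: trigger on-demand summarization in a pre-action during a node
# transition. The import path below is an assumption; check your Pipecat
# version's frame reference.
async def summarize_before_node(action: dict, flow_manager) -> None:
    from pipecat.frames.frames import LLMSummarizeContextFrame  # assumed path

    # Ask Pipecat to summarize the context before the new node's messages run.
    await flow_manager.task.queue_frames([LLMSummarizeContextFrame()])

# Dict-style action config attaching the handler as a "function" pre-action:
pre_actions = [{"type": "function", "handler": summarize_before_node}]
```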

Removed

  • Breaking: Removed the tts parameter from FlowManager.__init__(), deprecated since v0.0.18. The tts_say action uses TTSSpeakFrame directly.

  • Breaking: Removed the set_node() method, deprecated since v0.0.18. Use set_node_from_config() or consolidated/direct functions instead.

  • Breaking: Removed transition_to and transition_callback from FlowsFunctionSchema and provider-specific function definitions, deprecated since v0.0.18. Use a consolidated handler that returns a tuple (result, next_node), or use direct functions.

  • Breaking: Removed static flows (FlowConfig type and flow_config parameter), deprecated since v0.0.19. Use dynamic flows instead.

  • Breaking: Removed provider-specific LLM adapters (OpenAIAdapter, AnthropicAdapter, GeminiAdapter, AWSBedrockAdapter). A single unified LLMAdapter now handles all providers via Pipecat's universal LLMContext.

  • Breaking: Removed OpenAILLMContext support. Use Pipecat's universal LLMContext exclusively.

  • Breaking: Removed provider-specific dict format for function definitions. Use FlowsFunctionSchema or direct functions.

  • Removed the __function__: token handler lookup pattern (used only by static flows).

v0.0.24

20 Mar 21:07
92fa5a9

Added

  • Added timeout_secs to FlowsFunctionSchema and @flows_direct_function decorator for per-tool function call timeout control, overriding the global function_call_timeout_secs.
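
As an illustrative sketch, the keyword arguments below mirror the release note; the tool name, properties, and timeout value are hypothetical, and the commented line shows where the actual FlowsFunctionSchema construction would go:

```python
# Sketch: per-tool function call timeout. These kwargs would be passed to
# FlowsFunctionSchema (tool name and values are hypothetical).
async def lookup_order(args):
    return {"status": "found", "order_id": args["order_id"]}

schema_kwargs = dict(
    name="lookup_order",
    description="Look up an order by ID.",
    properties={"order_id": {"type": "string", "description": "The order ID."}},
    required=["order_id"],
    handler=lookup_order,
    timeout_secs=10.0,  # overrides the global function_call_timeout_secs for this tool
)
# schema = FlowsFunctionSchema(**schema_kwargs)
```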

  • Added role_message (str) as the preferred field for setting the bot's role/personality. The system instruction is sent via LLMUpdateSettingsFrame instead of being included as system messages in the conversation context.
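
For example, a node config using the new field might look like this (a sketch with hypothetical content; the commented line shows the deprecated equivalent):

```python
# Sketch: dict-style NodeConfig using the preferred role_message string.
node_config = {
    "role_message": "You are a friendly ordering assistant.",  # preferred (str)
    # Deprecated List[Dict] equivalent, removed in 1.0.0:
    # "role_messages": [{"role": "system", "content": "You are a friendly ordering assistant."}],
    "task_messages": [{"role": "system", "content": "Greet the user."}],
}
```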

Changed

  • Updated the pipecat-ai minimum supported version to 0.0.105.

Deprecated

  • role_messages is deprecated in favor of role_message (str). The old List[Dict] format is still supported for backward compatibility but will be removed in 1.0.0.

Fixed

  • Fixed a bug where the system instruction was lost during RESET and RESET_WITH_SUMMARY context strategy transitions when the new node did not re-specify it.

v0.0.23

28 Feb 00:41
9ade05c

Added

  • Added cancel_on_interruption to FlowsFunctionSchema.

  • Added @flows_direct_function decorator for attaching metadata to Pipecat direct functions. This allows configuring behavior like cancel_on_interruption on the function definition.

    Example usage:

    from pipecat_flows import flows_direct_function, FlowManager
    
    @flows_direct_function(cancel_on_interruption=False)
    async def long_running_task(flow_manager: FlowManager, query: str):
        """Perform a task that should not be cancelled on interruption.
    
        Args:
            query: The query to process.
        """
        # ... implementation
        return {"status": "complete"}, None

    Non-decorated direct functions use cancel_on_interruption=False by default,
    ensuring all function calls complete even during user interruptions.

Changed

  • Changed cancel_on_interruption default from True to False in both FlowsFunctionSchema and @flows_direct_function. Function calls now complete even during user interruptions by default, preventing stalled transitions and dropped results.

Fixed

  • Fixed interrupted transition leaving flow permanently stuck when a user interruption cancelled a function call mid-execution.

v0.0.22

18 Nov 15:15
ccf025f

Added

  • Added support for global_functions parameter in FlowManager initialization. Global functions are available at every node in a flow without needing to be specified in each node's configuration. Supports both FlowsFunctionSchema and FlowsDirectFunction types.
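
A sketch of the idea, with hypothetical names; the FlowManager wiring is shown commented because it requires a running pipeline:

```python
# Sketch: a direct-function style global function (names are hypothetical).
async def get_delivery_estimate(flow_manager):
    """Estimate delivery time for the current order."""
    return {"estimate_minutes": 30}, None  # (result, next_node); None = no transition

# Passed once at initialization so it is available in every node:
# flow_manager = FlowManager(
#     task=task, llm=llm, context_aggregator=context_aggregator,
#     global_functions=[get_delivery_estimate],
# )
global_functions = [get_delivery_estimate]
```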

Changed

  • Changed the fallback strategy to APPEND in the event that RESET_WITH_SUMMARY fails.

  • Updated food ordering examples (food_ordering.py and food_ordering_direct_functions.py) to demonstrate global function usage with a delivery estimate function.

v0.0.21

17 Sep 16:05
00923ca

Added

  • Added support for the new Pipecat LLMSwitcher, which can be used as a drop-in replacement for an LLMService in scenarios where you want to switch LLMs at runtime.

    There are a couple of prerequisites to using LLMSwitcher:

    • You must be using the new universal LLMContext and LLMContextAggregatorPair (as of Pipecat 0.0.82, supported only by Pipecat's OpenAI and Google LLM implementations, but with more on the way).
    • You must be using "direct" functions or FlowsFunctionSchema functions (as opposed to provider-specific formats).

    Using LLMSwitcher looks like this:

    # Create shared context and aggregators for your LLM services
    context = LLMContext()
    context_aggregator = LLMContextAggregatorPair(context)
    
    # Instantiate your LLM services
    llm_openai = OpenAILLMService(api_key=os.getenv("OPENAI_API_KEY"))
    llm_google = GoogleLLMService(api_key=os.getenv("GOOGLE_API_KEY"))
    
    # Instantiate a switcher
    # (ServiceSwitcherStrategyManual defaults to OpenAI, as it's first in the list)
    llm_switcher = LLMSwitcher(
        llms=[llm_openai, llm_google], strategy_type=ServiceSwitcherStrategyManual
    )
    
    # Create your pipeline as usual (passing the switcher instead of an LLM)
    pipeline = Pipeline(
        [
            transport.input(),
            stt,
            context_aggregator.user(),
            llm_switcher,
            tts,
            transport.output(),
            context_aggregator.assistant(),
        ]
    )
    task = PipelineTask(pipeline, params=PipelineParams(allow_interruptions=True))
    
    # Initialize your flow manager as usual (passing the switcher instead of an LLM)
    flow_manager = FlowManager(
        task=task,
        llm=llm_switcher,
        context_aggregator=context_aggregator,
    )
    
    # ...
    # Start your flow as usual
    @transport.event_handler("on_client_connected")
    async def on_client_connected(transport, participant):
        await flow_manager.initialize(create_main_node())
    
    # ...
    # Whenever is appropriate, switch LLMs!
    await task.queue_frames([ManuallySwitchServiceFrame(service=llm_google)])

v0.0.20

27 Aug 15:55
b8f3cf1

Changed

  • Added @property accessors for the following FlowManager attributes, officially making them part of the public API: state, task, transport, and current_node.

v0.0.19

25 Aug 21:12

Deprecated

  • Static flows are now deprecated and will be removed in v1.0.0. Use dynamic flows in their place. The deprecation covers the flow_config argument of FlowManager and the FlowConfig type.

v0.0.18

27 Jun 16:16
72c5ebf

Added

  • Added a new optional name field to NodeConfig. When using dynamic flows
    alongside "consolidated" functions that return a tuple (result, next node),
    giving the next node a name is helpful for debug logging. If you don't
    specify a name, an automatically-generated UUID is used.
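
A named node might look like this (a sketch; the name and message content are hypothetical):

```python
# Sketch: naming a node for debug logging (dict-style NodeConfig).
node_config = {
    "name": "confirm_order",  # appears in logs instead of a generated UUID
    "task_messages": [{"role": "system", "content": "Confirm the order."}],
}
```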

  • Added support for providing "consolidated" functions, which are responsible
    for both doing some work as well as specifying the next node to transition
    to. When using consolidated functions, you don't specify transition_to or
    transition_callback.

    Usage:

    # "Consolidated" function
    async def do_something(args: FlowArgs) -> tuple[FlowResult, NodeConfig]:
      foo = args["foo"]
      bar = args.get("bar", "")
    
      # Do some work (optional; this function may be a transition-only function)
      result = await process(foo, bar)
    
      # Specify next node (optional; this function may be a work-only function)
      # This is either a NodeConfig (for dynamic flows) or a node name (for
      # static flows)
      next_node = create_another_node()
    
      return result, next_node
    
    def create_a_node() -> NodeConfig:
      return NodeConfig(
          task_messages=[
            # ...
          ],
          functions=[FlowsFunctionSchema(
              name="do_something",
              description="Do something interesting.",
              handler=do_something,
              properties={
                "foo": {
                  "type": "integer",
                  "description": "The foo to do something interesting with."
                },
                "bar": {
                  "type": "string",
                  "description": "The bar to do something interesting with."
                }
              },
              required=["foo"],
          )],
      )
  • Added support for providing "direct" functions, which don't need an
    accompanying FlowsFunctionSchema or function definition dict. Instead,
    metadata (i.e. name, description, properties, and required) are
    automatically extracted from a combination of the function signature and
    docstring.

    Usage:

    # "Direct" function
    # `flow_manager` must be the first parameter
    async def do_something(flow_manager: FlowManager, foo: int, bar: str = "") -> tuple[FlowResult, NodeConfig]:
      """
      Do something interesting.
    
      Args:
        foo (int): The foo to do something interesting with.
        bar (string): The bar to do something interesting with.
      """
    
      # Do some work (optional; this function may be a transition-only function)
      result = await process(foo, bar)
    
      # Specify next node (optional; this function may be a work-only function)
      # This is either a NodeConfig (for dynamic flows) or a node name (for static flows)
      next_node = create_another_node()
    
      return result, next_node
    
    def create_a_node() -> NodeConfig:
      return NodeConfig(
        task_messages=[
          # ...
        ],
        functions=[do_something]
      )

Changed

  • functions are now optional in the NodeConfig. Additionally, for AWS
    Bedrock, Anthropic, and Gemini, you no longer need to provide a no_op
    function. The LLM adapters now handle this case on your behalf. This allows
    you to either omit functions for nodes, which is common for the end node,
    or specify an empty function call list, if desired.
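
As a sketch of an end node under this change (dict-style config; the node name and message are hypothetical, and end_conversation is assumed to be the built-in action type for ending the conversation):

```python
# Sketch: an end node that omits functions entirely (dict-style NodeConfig).
end_node = {
    "name": "end",
    "task_messages": [
        {"role": "system", "content": "Thank the user and say goodbye."}
    ],
    # no "functions" key needed; the adapters handle the empty case
    "post_actions": [{"type": "end_conversation"}],  # assumed built-in action type
}
```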

Deprecated

  • The tts parameter in FlowManager.__init__() is now deprecated and will
    be removed in a future version. The tts_say action now pushes a
    TTSSpeakFrame.

  • Deprecated transition_to and transition_callback in favor of
    "consolidated" handlers that return a tuple (result, next node).
    Alternatively, you could use "direct" functions and avoid using
    FlowsFunctionSchemas or function definition dicts entirely. See the "Added"
    section above for more details.

  • Deprecated set_node() in favor of doing the following for dynamic flows:

    • Prefer "consolidated" or "direct" functions that return a tuple (result,
      next node) over deprecated transition_callbacks
    • Pass your initial node to FlowManager.initialize()
    • If you really need to set a node explicitly, use set_node_from_config()

    In all of these cases, you can provide a name in your new node's config for
    debug logging purposes.

Fixed

  • Fixed an issue where RESET_WITH_SUMMARY wasn't working for the
    GeminiAdapter. Now, the GeminiAdapter uses the google-genai package,
    aligning with the package used by pipecat-ai.

  • Fixed an issue where if run_in_parallel=False was set for the LLM, the bot
    would trigger N completions for each sequential function call. Now, Flows
    uses Pipecat's internal function tracking to determine when there are more
    edge functions to call.

  • Overhauled pre_actions and post_actions timing logic, making their timing
    more predictable and eliminating some bugs. For example, now tts_say
    actions will always run after the bot response, when used in post_actions.

v0.0.17

16 May 15:36
30bf60b

Added

  • Added support for AWSBedrockLLMService by adding an AWSBedrockAdapter.

Changed

  • Added respond_immediately to NodeConfig. Setting it to False has the
    effect of making the bot wait, after the node is activated, for the user to
    speak before responding. This can be used for the initial node, if you want
    the user to speak first.
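
For example, an initial node that waits for the user (a sketch; the message content is hypothetical):

```python
# Sketch: make the bot wait for the user to speak first (dict-style NodeConfig).
initial_node = {
    "task_messages": [
        {"role": "system", "content": "Answer the user's question briefly."}
    ],
    "respond_immediately": False,  # bot stays quiet until the user speaks
}
```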

  • Bumped the minimum required pipecat-ai version to 0.0.67 to align with AWS
    Bedrock additions in Pipecat. This also adds support for FunctionCallParams,
    which was added in 0.0.66.

  • Updated to use FunctionCallParams as args for the function handler.

  • Updated imports to use the new .stt, .llm, and .tts paths.

Other

  • Added AWS Bedrock examples for insurance and patient_intake.

  • Updated examples to use audio_in_enabled=True and removed vad_enabled and
    vad_audio_passthrough to align with the latest Pipecat TransportParams.

v0.0.16

26 Mar 19:10
dec8a96

Added

  • Added a new "function" action type, which queues a function to run "inline"
    in the pipeline (i.e. when the pipeline is done with all the work queued
    before it).

    This is useful for doing things at the end of the bot's turn.

    Example usage:

    async def after_the_fun_fact(action: dict, flow_manager: FlowManager):
      print("Done telling the user a fun fact.")
    
    def create_node() -> NodeConfig:
      return NodeConfig(
        task_messages=[
          {
            "role": "system",
            "content": "Greet the user and tell them a fun fact."
          }
        ],
        post_actions=[
          ActionConfig(
            type="function",
            handler=after_the_fun_fact
          )
        ]
      )
  • Added support for OpenAILLMService subclasses in the adapter system. You
    can now use any Pipecat LLM service that inherits from OpenAILLMService,
    such as AzureLLMService, GrokLLMService, and GroqLLMService, without
    requiring adapter updates. See the Pipecat docs for supported LLM services.

  • Added a new FlowsFunctionSchema class, which allows you to specify function
    calls using a standard schema. This is effectively a subclass of Pipecat's
    FunctionSchema.

Example usage:

# Define a function using FlowsFunctionSchema
collect_name = FlowsFunctionSchema(
    name="collect_name",
    description="Record the user's name",
    properties={
        "name": {"type": "string", "description": "The user's name"}
    },
    required=["name"],
    handler=collect_name_handler,
    transition_to="next_node"
)

# Use in node configuration
node_config = {
    "task_messages": [...],
    "functions": [collect_name]
}

Changed

  • Function handlers can now receive either FlowArgs only (legacy style) or
    both FlowArgs and the FlowManager instance (modern style). Adding support
    for the FlowManager provides access to conversation state, transport
    methods, and other flow resources within function handlers. The framework
    automatically detects which signature you're using and calls handlers
    appropriately.
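
The two signatures can be sketched side by side with a stand-in flow manager (a pure-Python illustration; _StubFlowManager is hypothetical and not part of the library):

```python
import asyncio

# Legacy style: handler receives FlowArgs only.
async def collect_name_legacy(args):
    return {"name": args["name"]}

# _StubFlowManager is a stand-in for illustration only; in a real flow the
# framework passes the actual FlowManager instance.
class _StubFlowManager:
    def __init__(self):
        self.state = {}

# Modern style: handler also receives the FlowManager instance.
async def collect_name_modern(args, flow_manager):
    flow_manager.state["name"] = args["name"]  # share state across nodes
    return {"name": args["name"]}

fm = _StubFlowManager()
legacy_result = asyncio.run(collect_name_legacy({"name": "Ada"}))
modern_result = asyncio.run(collect_name_modern({"name": "Ada"}, fm))
```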

Dependencies

  • Updated the minimum Pipecat version to 0.0.61 to use FunctionSchema,
    provider-specific adapters, and the latest improvements to context management.

Other

  • Updated restaurant_reservation.py and insurance_gemini.py to use
    FlowsFunctionSchema.

  • Updated examples to specify a params arg for PipelineTask, meeting the
    Pipecat requirement starting 0.0.58.