Transform any Chatflow into a node on the canvas to enable cross-Chatflow calls
Author: yzddmr6
Version: 0.0.6
Type: tool
Repository: https://github.com/yzddmr6/chatflow_invoker
Currently, Dify does not support multi-Chatflow orchestration or cross-Chatflow invocation. This means that all business logic must be completed within a single Chatflow canvas, which becomes difficult to maintain as scenarios become more complex.
Although Dify provides a workaround by allowing a Chatflow to be converted into a Workflow and published as a Tool node, this approach comes with its own limitations.
To address these limitations in multi-Chatflow orchestration, I have developed a plugin called Chatflow Invoker, which enables more flexible and efficient application development with Dify.
Chatflow Invoker can turn a Chatflow into a node within another workflow, enabling cross-Chatflow invocation.
It can help you:
Through Dify's reverse-invocation interface, you can call other Chatflows within the current Dify instance.

Input Parameters:
To further expand Dify's flexibility, this plugin supports remote Chatflow calls. You're no longer limited to a single Dify instance; you can freely combine them to implement distributed calls based on your business needs.

Input Parameters:
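Under the hood, such a remote call reduces to Dify's public `POST /v1/chat-messages` API with `response_mode: streaming`. A minimal stdlib sketch of the client side, assuming a placeholder base URL and app API key, with error handling omitted:

```python
import json
import urllib.request


def parse_sse_line(line):
    """Extract the answer delta from one SSE 'data:' line, if any."""
    if not line.startswith("data:"):
        return None
    try:
        event = json.loads(line[len("data:"):].strip())
    except json.JSONDecodeError:
        return None
    if event.get("event") == "message":
        return event.get("answer", "")
    return None


def stream_remote_chatflow(base_url, api_key, query, inputs=None, user="invoker"):
    """Call a remote Dify Chatflow via /v1/chat-messages and yield answer chunks."""
    req = urllib.request.Request(
        f"{base_url}/v1/chat-messages",
        data=json.dumps({
            "query": query,
            "inputs": inputs or {},
            "user": user,
            "response_mode": "streaming",  # request server-sent events
        }).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        for raw in resp:  # iterate SSE lines as they arrive
            chunk = parse_sse_line(raw.decode("utf-8").rstrip("\n"))
            if chunk:
                yield chunk
```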
In large and complex projects, some Agents may be developed on different platforms such as LangChain, LangGraph, OpenAI, Bailian, etc. How can we achieve unified management and invocation in Dify?
This can be achieved through plain HTTP calls or by publishing the Agent as an MCP tool, but both approaches sacrifice streaming output and degrade the user experience.
Therefore, I implemented a universal streaming-interface calling tool. Whether it is a model, Agent, Workflow, or Chatflow, and regardless of which platform it was built on or which SDK it uses, as long as it exposes a streaming output interface it can be integrated into Dify with streaming output fully preserved. Say goodbye to experience gaps and make your Dify applications more powerful and responsive.

Input Parameters:
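One way to make such an interface universal is to let a configurable JSON Path pick the text delta out of each SSE event, whatever its shape. The sketch below illustrates that idea; the dot-separated path syntax is my assumption for illustration, not necessarily the plugin's exact format:

```python
import json


def extract_json_path(obj, path):
    """Walk a dot-separated path (e.g. 'choices.0.delta.content') into parsed JSON."""
    cur = obj
    for key in path.split("."):
        if isinstance(cur, list):
            cur = cur[int(key)]          # numeric segments index into lists
        elif isinstance(cur, dict):
            cur = cur.get(key)
        else:
            return None
        if cur is None:
            return None
    return cur


def delta_from_event(raw_line, json_path):
    """Turn one 'data: {...}' SSE line into a text delta using json_path."""
    if not raw_line.startswith("data:"):
        return None
    payload = raw_line[len("data:"):].strip()
    if payload == "[DONE]":              # OpenAI-style stream terminator
        return None
    return extract_json_path(json.loads(payload), json_path)
```

With this approach, an OpenAI-format stream maps to the path `choices.0.delta.content`, while a Bailian-style stream maps to something like `output.text`; only the path changes per platform.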
Open the URL of the Chatflow to be called and obtain the APP ID from it.
For example: https://dify/app/f011f58c-b1ce-4a9b-89b2-f39fce8466a8/workflow
Here, `f011f58c-b1ce-4a9b-89b2-f39fce8466a8` is the APP ID.
In the Inputs JSON, configure a `user` parameter to be passed in.
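For illustration, the APP ID can also be pulled out of the URL programmatically, and the Inputs JSON is just a mapping from the called Chatflow's input names to values. The `user` field name follows the example above; the value here is a hypothetical placeholder:

```python
import re


def extract_app_id(url):
    """Pull the UUID-shaped APP ID out of a Dify app URL."""
    m = re.search(r"/app/([0-9a-f-]{36})", url)
    return m.group(1) if m else None


url = "https://dify/app/f011f58c-b1ce-4a9b-89b2-f39fce8466a8/workflow"
app_id = extract_app_id(url)

# The Inputs JSON maps the called Chatflow's input names to values.
# In the canvas you would normally reference an upstream variable instead
# of a literal string; "hello" is a stand-in for illustration.
inputs = {"user": "hello"}
```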

In the Answer node, select the tool's streaming output as the result.

In testing, the other Chatflow is invoked successfully, with streaming output intact.

Fill in the URL to call and the API Key to implement remote Dify calls.

Create a simple test (note that memory functionality must be enabled in the called Chatflow).
Set Keep Conversation to True, then conduct multiple conversations.
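Mechanically, keeping the conversation means remembering the `conversation_id` returned by the first call and sending it back on later turns; that is how I read the Keep Conversation option. A small sketch with an injectable transport (the `send` callable is hypothetical, standing in for the actual HTTP call):

```python
class ChatflowSession:
    """Threads conversation_id across calls so the called Chatflow keeps context."""

    def __init__(self, send):
        self.send = send           # callable(payload: dict) -> response dict
        self.conversation_id = ""  # empty means "start a new conversation"

    def chat(self, query):
        payload = {"query": query, "conversation_id": self.conversation_id}
        resp = self.send(payload)
        # Remember the id the server assigned so the next turn shares memory.
        self.conversation_id = resp.get("conversation_id", self.conversation_id)
        return resp.get("answer", "")
```

With Keep Conversation set to False, the equivalent behaviour would be resetting `conversation_id` to `""` before every call, so each turn starts fresh.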

From the second response onward, you can see that the Chatflow retains conversational context.

I simulated an Agent developed with LangGraph and had Cursor help me implement an OpenAI format interface.
Fill in the Agent's API address. Since this demo Agent has no authentication logic, the auth settings need no changes. Save and execute.
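For reference, an OpenAI-compatible streaming endpoint only has to emit `chat.completion.chunk` objects over SSE. A sketch of the chunk formatting such a shim would use, with illustrative field values, which any web framework can then write out of a streaming response:

```python
import json
import time


def openai_chunk(delta_text, model="langgraph-agent", done=False):
    """Format one OpenAI-compatible streaming SSE line (chat.completion.chunk)."""
    if done:
        return "data: [DONE]\n\n"  # stream terminator expected by OpenAI clients
    body = {
        "id": "chatcmpl-demo",  # illustrative id, not a real completion id
        "object": "chat.completion.chunk",
        "created": int(time.time()),
        "model": model,
        "choices": [
            {"index": 0, "delta": {"content": delta_text}, "finish_reason": None}
        ],
    }
    return f"data: {json.dumps(body)}\n\n"
```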

You can see that the LangGraph Agent has been successfully called with streaming output support.

Using Bailian as an example, let's walk through integrating an Agent with a new output format.
I developed a simple application on Bailian that accepts a query and a name parameter.

According to Bailian's documentation, we find the streaming output curl command: https://help.aliyun.com/zh/model-studio/invoke-workflow-application
Construct the corresponding call parameters. Then, from the sample response in the docs, write the JSON Path that points at the streamed text:
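Based on my reading of the linked documentation, the request body and JSON Path for a Bailian workflow app look roughly like the sketch below; the field names are assumptions to verify against your own app and the docs:

```python
# Call parameters for a Bailian (Model Studio) workflow app.
# Field names follow my reading of the linked docs -- verify before use.
payload = {
    "input": {
        "prompt": "hello",                  # the query parameter
        "biz_params": {"name": "yzddmr6"},  # the extra `name` parameter
    },
    "parameters": {"incremental_output": True},  # stream deltas, not full text
}

# A streamed event looks roughly like {"output": {"text": "..."}, ...},
# so the JSON Path pointing at the text delta is:
json_path = "output.text"

# A sample event of that assumed shape, for checking the path:
sample_event = {"output": {"text": "Hi"}, "usage": {}}
```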

The Bailian application is called successfully, with streaming output working end to end.