In natural language processing and automated orchestration, LangGraph shows clear advantages over MetaGPT, particularly in the complexity and scalability of task orchestration. As the technology evolves, LangGraph's orchestration capabilities and highly flexible architecture make it the framework better suited to future needs. This article examines why LangGraph outperforms MetaGPT, analyzing its features, strengths, and growth potential.
1. LangGraph's advantages far exceed MetaGPT's
LangGraph is a more advanced and flexible framework, able to handle more complex tasks and orchestration requirements across a wider range of scenarios. Compared with MetaGPT, LangGraph's rapid updates and stronger orchestration make it the better choice going forward. When handling complex conditional, looping, and sequential tasks, LangGraph not only holds the edge over MetaGPT; its flexibility and extensibility also give it better long-term prospects. In short, LangGraph is well positioned to overtake MetaGPT and become the leading framework in automated orchestration.
2. LangGraph vs. MetaGPT
2.1 Orchestration: LangGraph outperforms MetaGPT
One of LangGraph's biggest strengths is its powerful orchestration. LangGraph handles not only sequential tasks but also loops and conditionals, which matters greatly for complex automation. By analogy with a programming language, LangGraph supports three atomic 'statements' (sequence, loop, and condition), and complex business flows can be composed from these three primitives. MetaGPT can complete sequential tasks, but when facing complex workloads its capabilities are comparatively narrow: it cannot handle the variety of orchestration patterns that LangGraph manages with ease.
2.2 Updates and timeliness: LangGraph stays ahead
MetaGPT, though partially open source, is ultimately aimed at commercial use and updates slowly. LangGraph, by contrast, integrated the MCP and A2A protocols soon after they appeared, keeping pace with the technology and putting it to use in real development promptly. This fast response and timely updating give LangGraph a larger competitive advantage in the market. For example, LangGraph already provides the langchain_mcp_adapters package, which wraps an MCP client as LangChain tools.
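As a sketch of what that looks like: the server command and the `math_server.py` script below are hypothetical assumptions of mine, not from the article, and the langchain_mcp_adapters API may differ between versions:

```python
import asyncio
from langchain_mcp_adapters.client import MultiServerMCPClient

async def main():
    # Each entry configures one MCP server; "math_server.py" is a
    # hypothetical stdio-based MCP server used here for illustration.
    client = MultiServerMCPClient(
        {
            "math": {
                "command": "python",
                "args": ["math_server.py"],
                "transport": "stdio",
            }
        }
    )
    # get_tools() exposes the server's MCP tools as LangChain tools,
    # which can then be passed to create_react_agent or bind_tools.
    tools = await client.get_tools()
    print([t.name for t in tools])

asyncio.run(main())
```

The point is the shape of the integration: MCP tools arrive as ordinary LangChain tools, so the rest of a LangGraph application does not need to know about the protocol.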
2.3 Protocols and room to grow: LangGraph is more extensible
Take the A2A protocol: its official samples include a LangGraph integration, which shows both how readily new technologies embrace LangGraph and how efficiently LangGraph integrates with them. This flexible architecture lets LangGraph scale within complex systems and support larger applications, giving it broader potential than MetaGPT. MetaGPT, by comparison, is limited: it can complete basic task orchestration, but at higher levels of extensibility it falls far short of LangGraph.
3. Talk is cheap, show me the code
Here is an example that uses a tool:
from typing import Annotated
from typing_extensions import TypedDict
import json

from langgraph.graph import StateGraph, START, END
from langgraph.graph.message import add_messages
from langgraph.prebuilt import create_react_agent
from langchain_openai import ChatOpenAI
from langchain_openai.chat_models.base import _convert_message_to_dict
from langchain_core.messages import ToolMessage
from langchain_core.tools import tool

# Any model exposed through an OpenAI-compatible API works here
# (e.g. DeepSeek or your company's own model).
llm = ChatOpenAI(
    model='xxx',
    temperature=0.7,
    max_tokens=5000,
    top_p=0.5,
    api_key='xxxx',
    base_url='xxxx',
    streaming=False)

@tool("multiply_tool", parse_docstring=True)
def multiply(a: int, b: int) -> int:
    """
    Multiply two numbers.

    Args:
        a (int): The first number.
        b (int): The second number.

    Returns:
        int: The result of multiplying `a` and `b`.
    """
    print(f"multiply_tool called with a={a}, b={b}")
    return a * b

tools = [multiply]
llm_with_tools = create_react_agent(
    model=llm.bind_tools(tools, parallel_tool_calls=False),
    tools=tools
)

class State(TypedDict):
    # Messages have the type "list". The `add_messages` function
    # in the annotation defines how this state key should be updated
    # (in this case, it appends messages to the list, rather than overwriting them)
    messages: Annotated[list, add_messages]

def chatbot(state: State):
    messages = state["messages"]
    if isinstance(messages, list):
        messages = {"messages": [{"role": "user", "content": messages[-1].content}]}
    result = llm_with_tools.invoke(messages)
    return {"messages": [_convert_message_to_dict(m) for m in result['messages']]}

def stream_graph_updates(user_input: str):
    for event in graph.stream({"messages": [{"role": "user", "content": user_input}]}):
        for value in event.values():
            print("Assistant:", value["messages"][-1]["content"])

# Conditional routing: go to the tool node when the last AI message
# requests a tool call, otherwise end the run.
def route_tools(
    state: State,
):
    if isinstance(state, list):
        ai_message = state[-1]
    elif messages := state.get("messages", []):
        ai_message = messages[-1]
    else:
        raise ValueError(f"No messages found in input state to tool_edge: {state}")
    if hasattr(ai_message, "tool_calls") and len(ai_message.tool_calls) > 0:
        return "tools"
    return END

class BasicToolNode:
    """A node that runs the tools requested in the last AIMessage."""

    def __init__(self, tools: list) -> None:
        self.tools_by_name = {tool.name: tool for tool in tools}

    def __call__(self, inputs: dict):
        if messages := inputs.get("messages", []):
            message = messages[-1]
        else:
            raise ValueError("No message found in input")
        outputs = []
        for tool_call in message.tool_calls:
            tool_result = self.tools_by_name[tool_call["name"]].invoke(
                tool_call["args"]
            )
            outputs.append(
                ToolMessage(
                    content=json.dumps(tool_result),
                    name=tool_call["name"],
                    tool_call_id=tool_call["id"],
                )
            )
        return {"messages": outputs}

if __name__ == "__main__":
    graph_builder = StateGraph(State)
    # The first argument is the unique node name.
    # The second argument is the function or object that will be called whenever
    # the node is used.
    graph_builder.add_node("chatbot", chatbot)
    graph_builder.add_edge(START, "chatbot")
    tool_node = BasicToolNode(tools=tools)
    graph_builder.add_node("tools", tool_node)
    # `route_tools` returns "tools" if the chatbot asks to use a tool, and END if
    # it can respond directly. This conditional routing defines the main agent loop.
    graph_builder.add_conditional_edges(
        "chatbot",
        route_tools,
        # The following dictionary lets you tell the graph to interpret the
        # condition's outputs as a specific node. It defaults to the identity
        # function, but if you want to route "tools" to a node with a different
        # name, change the value, e.g. "tools": "my_tools".
        {"tools": "tools", END: END},
    )
    # Any time a tool is called, we return to the chatbot to decide the next step
    graph_builder.add_edge("tools", "chatbot")
    graph = graph_builder.compile()

    while True:
        user_input = input("User: ")
        if user_input.lower() in ["quit", "exit", "q"]:
            print("Goodbye!")
            break
        stream_graph_updates(user_input)
This example is adapted from the one on the LangGraph website; I made a few changes to make it more general. The graph has four nodes (counting START and END) and one conditional edge, and this kind of conditional routing is hard to achieve in MetaGPT.
4. Conclusion
With its strong orchestration, quick response to new technologies, and support for complex logic, LangGraph is clearly the more forward-looking framework with greater growth potential. It not only meets today's orchestration needs better; its flexible architecture and extensible features also lay a more solid foundation for future development.
Finally, viewed from a higher level, LangGraph has the basic composable properties of a programming language (sequence, condition, and loop), which gives it strong growth potential. Application-layer frameworks keep appearing, but I believe LangGraph is worth the investment and should be an essential skill for AI application developers.