Description:
Trying to use a hosted Docker neo4j-mcp server through the Python CrewAI library with an OpenAI model results in an error when the Neo4j tools are invoked:
{'error': {'message': "Invalid schema for function 'read_cypher': None is not of type 'object', 'boolean'.", 'type': 'invalid_request_error', 'param': 'tools[1].function.parameters', 'code': 'invalid_function_parameters'}}
The library correctly reads the available mcp tools.
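For context, OpenAI's function-calling API requires each tool's `parameters` field to be a JSON Schema object (or a boolean schema); a `None` value is rejected at request time, which matches the error above. A minimal illustrative sketch (the tool name is taken from the error message; the rest is an assumed shape, not CrewAI or neo4j-mcp internals):

```python
# OpenAI expects "parameters" to be a JSON Schema object (or boolean),
# never None -- hence "None is not of type 'object', 'boolean'".
ACCEPTED_TYPES = (dict, bool)

# What the MCP tool apparently exposes for read_cypher
rejected = {
    "type": "function",
    "function": {"name": "read_cypher", "parameters": None},
}

# Even a tool that takes no arguments needs an explicit empty object schema
accepted = {
    "type": "function",
    "function": {
        "name": "read_cypher",
        "parameters": {"type": "object", "properties": {}},
    },
}

def is_valid_parameters(tool):
    """Check the minimal type constraint OpenAI enforces on 'parameters'."""
    return isinstance(tool["function"]["parameters"], ACCEPTED_TYPES)

print(is_valid_parameters(rejected))  # → False
print(is_valid_parameters(accepted))  # → True
```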
How to Reproduce:
Docker Compose:
services:
  neo4j-mcp:
    image: mcp/neo4j:latest
    container_name: neo4j-mcp
    ports:
      - "8000:8000"
    environment:
      - NEO4J_URI=neo4j+s://demo.neo4jlabs.com:7687
      - NEO4J_DATABASE=companies
      - READ_ONLY="true"
      - MAX_RESULT_ROWS=1000
      - NEO4J_TRANSPORT_MODE=http
      - NEO4J_TELEMETRY=false
      - NEO4J_MCP_HTTP_HOST=0.0.0.0
      - NEO4J_MCP_HTTP_PORT=8000
      - NEO4J_MCP_HTTP_PATH=/mcp/
      - NEO4J_MCP_HTTP_ALLOWED_ORIGINS="*"
      - NEO4J_MCP_HTTP_ALLOWED_HOSTS="*"
      - NEO4J_READ_TIMEOUT="300"
Python Code:
from crewai import LLM, Agent, Task, Crew
from crewai_tools import MCPServerAdapter
from dotenv import load_dotenv
# Load .env file
load_dotenv('./env.env')
token = "Y29tcGFuaWVzOmNvbXBhbmllcw=="
server_parameters = {
    "url": "http://localhost:8000/mcp",
    "transport": "streamable-http",
    "headers": {
        "Authorization": f"Bearer {token}"
    }
}
# Optional logging callbacks from Agents & Tasks
def log_step_callback(output):
    print(
        f"""
        Step completed!
        details: {output.__dict__}
        """
    )
def log_task_callback(output):
    print(
        f"""
        Task completed!
        details: {output.__dict__}
        """
    )
# Convenience for a FastAPI call
def run(prompt: str):
    # Load the MCP tools
    with MCPServerAdapter(server_parameters) as tools:
        print(f"Available tools from MCP server(s): {[tool.name for tool in tools]}")
        # Create an agent with access to the tools
        mcp_agent = Agent(
            role="MCP Tool User",
            goal="Utilize tools from MCP servers.",
            backstory="I can connect to MCP servers and use their tools.",
            tools=tools,
            max_iterations=3,
            step_callback=log_step_callback,  # Optional
            llm=LLM(
                model="openai/gpt-4o",
                api_key="sk-...",  # Redacted; or set OPENAI_API_KEY
                temperature=0.7,
                max_tokens=4000,
            )
        )
        # Create a task referencing the user prompt
        processing_task = Task(
            description="""Process the following prompt about the Neo4j graph database: {prompt}""",
            expected_output="A brief report on the outcome of the command: {prompt}",
            agent=mcp_agent,
            callback=log_task_callback,  # Optional
        )
        # Create the crew
        crew = Crew(agents=[mcp_agent], tasks=[processing_task], verbose=False)
        # Run the crew with the user prompt
        result = crew.kickoff(inputs={"prompt": prompt})
        # Return the final answer
        return {"result": result}
# For running as a script
if __name__ == "__main__":
    cmd = "What are the competitors of BigFix?"
    result = run(cmd)
    print(
        f"""
        Query completed!
        result: {result}
        """
    )
Expected Behavior:
For an MCP server, the expectation is that LLMs and client libraries can use the tools it defines smoothly.
Actual Behavior:
It seems that the tool definitions are not well-suited for consumption by libraries and LLMs: per the error above, the `parameters` schema for `read_cypher` arrives as `None` rather than a JSON object, which OpenAI's API rejects.
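Until the server emits a proper schema, a possible client-side workaround is to backfill an empty object schema before the tool list reaches OpenAI. This is only a sketch over OpenAI-format tool dicts; it assumes you have a point where you can intercept that list, which is not a documented CrewAI hook:

```python
def patch_tool_parameters(tools):
    """Backfill an empty object schema wherever 'parameters' is None/missing."""
    for tool in tools:
        fn = tool.get("function", {})
        if not isinstance(fn.get("parameters"), (dict, bool)):
            # Mirrors the failing case: tools[1].function.parameters was None
            fn["parameters"] = {"type": "object", "properties": {}}
    return tools

# Hypothetical tool list in OpenAI format; only the second entry is broken
tools = [
    {"type": "function",
     "function": {"name": "get_schema",
                  "parameters": {"type": "object", "properties": {}}}},
    {"type": "function",
     "function": {"name": "read_cypher", "parameters": None}},
]
patched = patch_tool_parameters(tools)
print(patched[1]["function"]["parameters"])  # → {'type': 'object', 'properties': {}}
```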
MCP Version:
latest, 1.4.1
OS/Arch:
MacOS/ARM