Add pydantic model parameter conversion support #86
Conversation
```python
    Raises:
        ValidationError: If pydantic validation fails
    """
    from pydantic import BaseModel, TypeAdapter
```
Move this import to the top level.
```python
    """
    import typing

    from pydantic import ValidationError
```
Move both of the above imports to the top level as well.
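A sketch of the suggested placement (the exact module layout is assumed):
```python
# Top of the executor module, replacing the in-function imports shown above
import typing

from pydantic import BaseModel, TypeAdapter, ValidationError
```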
```python
except Exception as e:
    # Log unexpected errors with stack trace but continue processing other parameters
    logger.exception(f"Unexpected error converting parameter '{param_name}': {e}")
    converted_params[param_name] = param_value
```
Can you explain this one?
If a parameter can’t be converted into the expected type, it could also be reported as a validation error here?
Also, logger.exception vs logger.debug?
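For concreteness, a sketch of the alternative being suggested (a hypothetical standalone helper; the real code iterates over parameters inside `_convert_parameters()`):
```python
import logging

from pydantic import TypeAdapter

logger = logging.getLogger(__name__)

def convert_parameter(param_name: str, param_value: object, expected_type: type) -> object:
    try:
        return TypeAdapter(expected_type).validate_python(param_value)
    except Exception as e:
        # Surface the failure to the caller instead of silently passing the
        # unconverted value through; debug-level logging is enough because
        # the error is reported as a validation failure anyway.
        logger.debug("Could not convert parameter '%s': %s", param_name, e)
        raise ValueError(
            f"Parameter '{param_name}' could not be converted to the expected type"
        ) from e
```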
Description
Add pydantic model parameter conversion to the Python executor. Functions that expect pydantic models as parameters can now receive dictionary inputs, which are automatically converted to the appropriate model instances.
This change enables Python endpoints with pydantic model parameters to work correctly with MCP tool calls, which provide data as JSON/dictionaries.
Parameters
The user defines a Python function expecting a specific Pydantic model:
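For example (a hypothetical model and endpoint, not taken from this PR's code):
```python
from pydantic import BaseModel

class User(BaseModel):
    name: str
    age: int

def create_user(user: User) -> dict:
    # The MCP tool call supplies {"name": ..., "age": ...} as a plain dict,
    # while this signature expects a User instance.
    return {"created": user.name, "age": user.age}
```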
The tool itself, however, is defined via a YAML schema, creating a type mismatch between the YAML-generated model and the user's expected model.
The way we do it: YAML Schema → FastMCP Pydantic Model → Dictionary → User's Pydantic Model
1. `_create_pydantic_model_from_schema()` builds the FastMCP Pydantic model from the YAML schema.
2. `TypeValidator.validate_input()` converts the FastMCP Pydantic model to a dictionary using `model_dump()` (Pydantic v2) or `dict()` (Pydantic v1).
3. `PythonExecutor._convert_parameters()` uses Pydantic's `TypeAdapter` to convert the dictionary into the user's expected model, including `Optional[Model]`, `list[Model]`, `dict[str, Model]`, etc. (see the sketch below).
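A simplified sketch of that conversion step (the real `_convert_parameters()` in this PR will differ in signature handling and error reporting):
```python
import inspect
from typing import Any, Callable

from pydantic import TypeAdapter

def convert_parameters(func: Callable[..., Any], params: dict[str, Any]) -> dict[str, Any]:
    """Convert incoming dict values to the annotated parameter types."""
    signature = inspect.signature(func)
    converted: dict[str, Any] = {}
    for name, value in params.items():
        parameter = signature.parameters.get(name)
        if parameter is None or parameter.annotation is inspect.Parameter.empty:
            # Unknown or unannotated parameters pass through unchanged.
            converted[name] = value
            continue
        # TypeAdapter handles plain BaseModel subclasses as well as
        # Optional[Model], list[Model], dict[str, Model], and similar generics.
        converted[name] = TypeAdapter(parameter.annotation).validate_python(value)
    return converted
```
With the hypothetical `create_user` above, `convert_parameters(create_user, {"user": {"name": "Ada", "age": 36}})` would return a dict whose `"user"` value is a `User` instance.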
Return value
User functions return their own Pydantic models, but FastMCP needs to validate and serialize them according to the YAML-defined return schema.
How it's done: User's Pydantic Model → Dictionary → Validation → Serialized Output
1. `TypeConverter.validate_output()` converts the user's Pydantic model to a dictionary, using the `model_dump()` or `dict()` methods to extract the data (sketched below).
2. `TypeConverter.serialize_for_output()` ensures JSON compatibility.
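A minimal sketch of that first step, with the v1/v2 fallback described above (the helper name here is illustrative):
```python
from typing import Any

from pydantic import BaseModel

def model_to_dict(result: Any) -> Any:
    """Turn a returned Pydantic model into a plain dict; pass other values through."""
    if isinstance(result, BaseModel):
        # Pydantic v2 models expose model_dump(); v1 models only provide dict().
        if hasattr(result, "model_dump"):
            return result.model_dump()
        return result.dict()
    return result
```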
SQL
SQL queries follow a simpler path, since DuckDB natively expects dictionary parameters:
YAML Schema → FastMCP Model → Dictionary → DuckDB Parameters
Dictionaries work directly with SQL parameter substitution.
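For illustration, assuming DuckDB's named-parameter syntax, the dumped dictionary can be handed to `execute()` directly (table and column names are made up):
```python
import duckdb

conn = duckdb.connect()
conn.execute("CREATE TABLE users (name TEXT, age INTEGER)")

# The dict produced by dumping the FastMCP model maps straight onto $-named parameters.
params = {"name": "Ada", "age": 36}
conn.execute("INSERT INTO users VALUES ($name, $age)", params)
```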
Changes Made:
- Added `PythonExecutor._convert_parameters()`
- Uses Pydantic's `TypeAdapter` for robust type validation and conversion
- Supports `Optional[Model]`, `list[Model]`, `dict[str, Model]`, etc.
Type of Change
Testing
- `uv run pytest`
- `uv run ruff check .`
- `uv run black --check .`
- `uv run mypy .`
Test Coverage:
- `test_pydantic_model_input`, `test_pydantic_model_output`, `test_pydantic_validation_error` (sketched below)
- Collection type tests (`list[User]`, `dict[str, User]`)
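A rough sketch of the shape of these tests (the real tests exercise the executor; this only illustrates the expected conversion behaviour with assumed model and test names):
```python
import pytest
from pydantic import BaseModel, TypeAdapter, ValidationError

class User(BaseModel):
    name: str
    age: int

def test_pydantic_model_input():
    # A matching dict converts cleanly into the model.
    user = TypeAdapter(User).validate_python({"name": "Ada", "age": 36})
    assert user.age == 36

def test_pydantic_validation_error():
    # A dict that violates the schema raises ValidationError.
    with pytest.raises(ValidationError):
        TypeAdapter(User).validate_python({"name": "Ada", "age": "not a number"})
```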
Security Considerations
Breaking Changes
None. This change is backward compatible; existing Python functions continue to work unchanged.
Additional Notes
Functions that expect pydantic model parameters can now be called with dictionary arguments, which are automatically validated and converted to model instances. This enables better integration between MCP tool calls (which provide JSON data) and Python functions using pydantic models for structured input validation.