Pass context information to Function Call Thread #991
Comments
I opened a similar issue 3 weeks ago. A possible workaround for it is passing a function "instance" in each interaction.
Thank you for highlighting this use case @dolukhanov, @miltonhit. Can you please elaborate on why the function information should be kept in the Thread context rather than, for example, being returned with the chat response metadata as suggested here?
This commit adds support for tool context in various chat options classes across different AI model implementations and enhances function calling capabilities. The tool context allows passing additional contextual information to function callbacks.
- Add toolContext field to chat options classes
- Update builder classes to support setting toolContext
- Enhance FunctionCallback interface to support context-aware function calls
- Update AbstractFunctionCallback to implement BiFunction instead of Function
- Modify FunctionCallbackWrapper to support both Function and BiFunction and to use the new SchemaType location
- Add support for BiFunction in TypeResolverHelper
- Update ChatClient interface and DefaultChatClient implementation to support new function calling methods with Function, BiFunction and FunctionCallback arguments
- Refactor AbstractToolCallSupport to pass tool context to function execution
- Update all affected <Model>ChatOptions with tool context support
- Simplify OpenAiChatClientMultipleFunctionCallsIT test
- Add tests for function calling with tool context
- Add new test cases for function callbacks with context in various integration tests
- Modify existing tests to incorporate new context-aware function calling capabilities
- Add docs on OpenAI function calling
Resolves #864, #1303, #991
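For illustration, a context-aware callback along these lines might look roughly like the sketch below. The ToolContext type and its getContext() accessor follow the commit description and the docs linked in the next comment, but package and method names may differ between milestones; the weather service, the record fields, and the "sessionId" key are made up.

```java
import java.util.function.BiFunction;

import org.springframework.ai.chat.model.ToolContext;

// Hypothetical context-aware function: the second argument is the tool context
// supplied by the caller, e.g. a session or user identifier.
public class WeatherService implements BiFunction<WeatherService.Request, ToolContext, WeatherService.Response> {

    public record Request(String location) {}
    public record Response(double temperatureCelsius) {}

    @Override
    public Response apply(Request request, ToolContext toolContext) {
        // The key name is illustrative; getContext() is assumed to return the
        // map that the caller attached to the request.
        Object sessionId = toolContext.getContext().get("sessionId");
        // ... fetch the weather and persist the result against that session ...
        return new Response(22.0);
    }
}
```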
Hi! Can you check whether #1458 fixes this issue for you? Here are some minimal docs: https://docs.spring.io/spring-ai/reference/api/chat/functions/openai-chat-functions.html#_tool_context_support. See the test here as well: Line 60 in eb2deba.
Please reopen if you think something is not addressed.
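Based on the linked docs, the caller side might look roughly like this sketch. The builder method names (withFunctionCallbacks, withToolContext), the chatModel bean, and the "sessionId" key are assumptions and may differ between versions.

```java
import java.util.List;
import java.util.Map;

import org.springframework.ai.chat.prompt.Prompt;
import org.springframework.ai.model.function.FunctionCallbackWrapper;
import org.springframework.ai.openai.OpenAiChatOptions;

// Register the BiFunction-based callback and attach a tool context map to the
// request options so it reaches the function on the function-call thread.
var options = OpenAiChatOptions.builder()
        .withFunctionCallbacks(List.of(
                FunctionCallbackWrapper.builder(new WeatherService())
                        .withName("getCurrentWeather")
                        .withDescription("Get the current weather for a location")
                        .build()))
        .withToolContext(Map.of("sessionId", sessionId)) // assumed builder method and key
        .build();

var response = chatModel.call(new Prompt("What is the weather in Paris?", options));
```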
Expected Behavior
When using Function Calling with OpenAI and other models, the function gets invoked in a new thread.
The function being called in my application requires context, so it can persist the response data to the correct user / session etc.
My application uses Spring Security - however, I anticipate there could be other requirements to pass context to this thread.
It would be ideal to be able to store the context before the thread is called and then retrieve the context e.g. using:
SecurityContext context = SecurityContextHolder.getContext();
and
SecurityContextHolder.setContext(context);
Perhaps you could register listeners / functions that can be called to pass this information.
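For illustration, that suggestion could look roughly like the following sketch. The wrapper and where it is applied are hypothetical, and it only works if it is applied per request on the caller's thread, so that the captured SecurityContext belongs to the right user.

```java
import java.util.function.Function;

import org.springframework.security.core.context.SecurityContext;
import org.springframework.security.core.context.SecurityContextHolder;

final class SecurityContextPropagation {

    // Capture the SecurityContext on the request thread and restore it around
    // the function invocation, which may run on a different thread.
    static <I, O> Function<I, O> propagate(Function<I, O> delegate) {
        SecurityContext captured = SecurityContextHolder.getContext(); // request thread
        return input -> {
            SecurityContext previous = SecurityContextHolder.getContext();
            SecurityContextHolder.setContext(captured);                // function-call thread
            try {
                return delegate.apply(input);
            } finally {
                SecurityContextHolder.setContext(previous);
            }
        };
    }
}
```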
Current Behavior
The calling thread does not have any context about the session / user that invoked the call to the AI model.
Context
The only workaround is to request that the model pass an identifier to the function via the prompt.
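For example, that workaround could be sketched as follows; the record, the function name, and the prompt wording are illustrative only, and currentUserId is assumed to be resolved on the request thread.

```java
// The identifier becomes part of the function's request schema, and the model
// is instructed to echo it back whenever it calls the function.
public record PersistResultRequest(String userId, String data) {}

String systemText = "When you call persistResult, always set userId to " + currentUserId;
```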