Pass context information to Function Call Thread #991

Closed

dolukhanov opened this issue Jul 1, 2024 · 3 comments

Expected Behavior

When using Function Calling with OpenAI and other models, the function gets invoked in a new thread.

The function being called in my application requires context so that it can persist the response data for the correct user / session.

My application uses Spring Security; however, I anticipate there could be other requirements for passing context to this thread.

It would be ideal to be able to store the context before the thread is started and then retrieve it inside the function, e.g. using:

SecurityContext context = SecurityContextHolder.getContext();

and

SecurityContextHolder.setContext(context);

Perhaps you could register listeners / functions that can be called to pass this information.
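
For illustration, here is a minimal sketch of the capture-and-restore pattern described above, assuming a hypothetical wrapper that the framework (or the application) would apply around the function before it is handed to the function-call thread; only the Spring Security calls are real API.

```java
// Hypothetical wrapper (not existing Spring AI API): captures the caller's
// SecurityContext at construction time and restores it around the function
// invocation on whatever thread the function call runs on.
import java.util.function.Function;

import org.springframework.security.core.context.SecurityContext;
import org.springframework.security.core.context.SecurityContextHolder;

public final class SecurityContextPropagatingFunction<I, O> implements Function<I, O> {

    private final Function<I, O> delegate;
    // Captured on the caller's thread, i.e. before the model triggers the function.
    private final SecurityContext capturedContext = SecurityContextHolder.getContext();

    public SecurityContextPropagatingFunction(Function<I, O> delegate) {
        this.delegate = delegate;
    }

    @Override
    public O apply(I input) {
        SecurityContext previous = SecurityContextHolder.getContext();
        SecurityContextHolder.setContext(capturedContext); // restore on the function-call thread
        try {
            return delegate.apply(input);
        } finally {
            SecurityContextHolder.setContext(previous); // avoid leaking context to pooled threads
        }
    }
}
```

Spring Security's DelegatingSecurityContextRunnable / DelegatingSecurityContextExecutor follow the same pattern for plain Runnable tasks, so a listener hook like the one suggested above could reuse that mechanism.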

Current Behavior

The thread on which the function is invoked does not have any context about the session / user that initiated the call to the AI model.

Context

The only workaround is to ask the model to pass an identifier to the function via the prompt.
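
For example, the workaround roughly amounts to something like the following; all names here are illustrative, not Spring AI API.

```java
// Illustrative only: the identifier is embedded in the prompt and declared as a
// function argument, so the model is expected to echo it back on the tool call.
public class PromptIdentifierWorkaround {

    // The model fills userId from the prompt text when it calls the function.
    record SaveOrderRequest(String userId, String item, int quantity) {}

    public static void main(String[] args) {
        String userId = "user-42"; // would come from the authenticated session

        String userMessage = "Place an order for two coffees. "
                + "The current userId is " + userId
                + "; always pass it as the userId argument of saveOrder.";

        System.out.println(userMessage);
    }
}
```

This relies on the model faithfully copying the identifier and exposes it in the prompt, which is part of why a proper context mechanism is requested above.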


miltonhit commented Jul 6, 2024

I opened a similar issue 3 weeks ago:
#864

A possible workaround is to pass a function "instance" in each interaction; then you can pass the user/session in its constructor.
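
A minimal sketch of that workaround, with all class and field names illustrative: the per-request state is captured in the function instance's constructor, so the function body can use it regardless of which thread it runs on.

```java
// Illustrative per-interaction function instance: the user/session is supplied
// via the constructor rather than via any thread-bound context.
import java.util.function.Function;

public class SaveOrderFunction implements Function<SaveOrderFunction.Request, SaveOrderFunction.Response> {

    public record Request(String item, int quantity) {}
    public record Response(String status) {}

    private final String userId; // captured when the chat request arrives

    public SaveOrderFunction(String userId) {
        this.userId = userId;
    }

    @Override
    public Response apply(Request request) {
        // The model never sees userId, but the function can still use it here.
        return new Response("saved " + request.quantity() + " x " + request.item()
                + " for user " + userId);
    }
}
```

A new SaveOrderFunction(currentUserId) would then be registered for each chat request instead of registering the function once as a singleton bean.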

tzolov added the tool/function calling and enhancement labels Jul 25, 2024
tzolov self-assigned this Jul 25, 2024

tzolov commented Jul 25, 2024

Thank you for highlighting this use case @dolukhanov, @miltonhit.

Can you please elaborate on why the function information should be kept in the thread context rather than, for example, being returned with the chat response metadata as suggested here?

tzolov added a commit to tzolov/spring-ai that referenced this issue Oct 4, 2024
markpollack pushed a commit that referenced this issue Oct 4, 2024
  This commit adds support for tool context in various chat options classes across
  different AI model implementations and enhances function calling capabilities.

  The tool context allows passing additional contextual information to function callbacks.

 - Add toolContext field to chat options classes
 - Update builder classes to support setting toolContext
 - Enhance FunctionCallback interface to support context-aware function calls
 - Update AbstractFunctionCallback to implement BiFunction instead of Function
 - Modify FunctionCallbackWrapper to support both Function and BiFunction and
   to use the new SchemaType location
 - Add support for BiFunction in TypeResolverHelper
 - Update ChatClient interface and DefaultChatClient implementation to support
   new function calling methods with Function, BiFunction and FunctionCallback arguments
 - Refactor AbstractToolCallSupport to pass tool context to function execution
 - Update all affected <Model>ChatOptions with tool context support
 - Simplify OpenAiChatClientMultipleFunctionCallsIT test
 - Add tests for function calling with tool context
 - Add new test cases for function callbacks with context in various integration tests
 - Modify existing tests to incorporate new context-aware function calling capabilities
 - Add docs for OpenAI function calling

 Resolves #864, #1303, #991
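
To illustrate the shape this commit describes on the callback side, here is a sketch of a context-aware function as a BiFunction that receives the tool context alongside the model-generated arguments. The ToolContext import location and its getContext() accessor are assumptions based on the M3-era API and should be checked against the docs linked below; the request/response records are illustrative.

```java
// Sketch of a context-aware callback as a BiFunction, per the commit description.
// ToolContext location and accessor are assumed (verify against the Spring AI docs).
import java.util.Map;
import java.util.function.BiFunction;

import org.springframework.ai.chat.model.ToolContext; // assumed package

public class SaveOrderCallback
        implements BiFunction<SaveOrderCallback.Request, ToolContext, SaveOrderCallback.Response> {

    public record Request(String item, int quantity) {}
    public record Response(String status) {}

    @Override
    public Response apply(Request request, ToolContext toolContext) {
        // Entries supplied via the chat options' toolContext are available here,
        // even though the function may run on a different thread.
        Map<String, Object> ctx = toolContext.getContext(); // accessor name assumed
        String userId = (String) ctx.get("userId");
        return new Response("saved " + request.quantity() + " x " + request.item()
                + " for user " + userId);
    }
}
```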
markpollack added this to the 1.0.0-M3 milestone Oct 4, 2024
markpollack commented

Hi! Can you check whether #1458 fixes this issue for you? Here are some minimal docs: https://docs.spring.io/spring-ai/reference/api/chat/functions/openai-chat-functions.html#_tool_context_support (see the test here as well).

Please reopen if you think something is not addressed.
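
Tying the fix back to the original Spring Security use case, here is a sketch of the calling side, assuming the ChatClient request spec exposes the toolContext(Map) method the linked docs describe; the function name and context keys are illustrative.

```java
// Sketch of supplying the current user to the tool context when calling the model.
// .toolContext(Map) is assumed from the linked docs; "saveOrder" is an illustrative
// function name registered elsewhere in the application.
import java.util.Map;

import org.springframework.ai.chat.client.ChatClient;
import org.springframework.security.core.context.SecurityContextHolder;

public class OrderChatService {

    private final ChatClient chatClient;

    public OrderChatService(ChatClient chatClient) {
        this.chatClient = chatClient;
    }

    public String placeOrder(String userMessage) {
        // Resolve the current user on the caller's thread, where the SecurityContext is bound.
        String userId = SecurityContextHolder.getContext().getAuthentication().getName();

        return chatClient.prompt()
                .user(userMessage)
                .functions("saveOrder")                 // illustrative function name
                .toolContext(Map.of("userId", userId))  // assumed API, see docs link above
                .call()
                .content();
    }
}
```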
