Currently, while chatting, the bot's response arrives all at once. We want the response to be streamed, i.e., delivered in chunks, so the user does not have to wait for the whole response before seeing anything.
For reference, the assignee can look at how ChatGPT streams its responses.
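A minimal sketch of the idea, assuming a Python backend (the issue does not specify the stack, and `stream_reply` is a hypothetical helper): instead of returning the finished reply in one shot, the server yields it in small chunks so the client can render text as it arrives.

```python
from typing import Iterator

def stream_reply(full_reply: str, chunk_size: int = 8) -> Iterator[str]:
    """Yield the bot's reply in small chunks instead of all at once.

    In a real app, the chunks would come from the model as it generates
    tokens; here a precomputed string is sliced purely for illustration.
    """
    for i in range(0, len(full_reply), chunk_size):
        yield full_reply[i:i + chunk_size]

# The client concatenates chunks as they arrive; the final text is identical
# to the non-streamed response, only delivered incrementally.
chunks = list(stream_reply("Hello! How can I help you today?"))
assert "".join(chunks) == "Hello! How can I help you today?"
```

A web framework can wrap such a generator in a chunked HTTP response or Server-Sent Events, which is the transport ChatGPT-style UIs typically use for streaming.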