feat: Add Queued Messages #6135
Conversation
Important
Looks good to me! 👍
Reviewed everything up to a322705 in 2 minutes and 32 seconds. Click for details.
- Reviewed 175 lines of code in 3 files
- Skipped 0 files when reviewing.
- Skipped posting 8 draft comments. View those below.
- Modify your settings and rules to customize what types of comments Ellipsis leaves. And don't forget to react with 👍 or 👎 to teach Ellipsis.
1. web-app/src/containers/ChatInput.tsx:115
- Draft comment: Typo: Consider renaming 'handleSendMesage' to 'handleSendMessage' for clarity.
- Reason this comment was not posted: Comment was not on a location in the diff, so it can't be submitted as a review comment.
2. web-app/src/containers/ChatInput.tsx:378
- Draft comment: Verify that disabling the textarea based on 'queuedMessage' (instead of 'streamingContent') is the intended UX behavior.
- Reason this comment was not posted: Comment did not seem useful. Usefulness confidence = 0% <= threshold of 50%. The comment is asking the PR author to verify their intention regarding a specific UX behavior. This falls under the rule of not asking the author to confirm their intention or ensure the behavior is intended. Therefore, this comment should be removed.
3. web-app/src/containers/ChatInput.tsx:392
- Draft comment: Consider adding comments to clarify the queueing logic in the onKeyDown handler, particularly how messages are queued when streaming is active.
- Reason this comment was not posted: Comment looked like it was already resolved.
4. web-app/src/containers/ChatInput.tsx:64
- Draft comment: The cleanup effects for clearing 'queuedMessage' on thread change and on unmount are similar. Consider consolidating them to reduce redundancy.
- Reason this comment was not posted: Decided after close inspection that this draft comment was likely wrong and/or not actionable (usefulness confidence = 10% vs. threshold = 50%). While the effects do similar things (clear queuedMessage), they serve distinct purposes and have different trigger conditions. Consolidating them would either require adding currentThreadId to the cleanup effect's deps (wrong) or removing the thread change trigger (breaking functionality). The current separation makes the code's intent clearer by keeping distinct behaviors separate. The comment has a point about code duplication: there is repeated code calling setQueuedMessage(null). However, the duplication is justified here because these are two separate concerns that happen to have similar implementations; combining them would muddy the code's intent and potentially introduce bugs. The comment should be deleted: while it identifies similar code, consolidating these effects would make the code less clear and risk breaking the distinct behaviors they implement.
5. web-app/src/hooks/useAppState.ts:16
- Draft comment: Queued message state and setter have been added correctly. No issues detected.
- Reason this comment was not posted: Confidence that changes are required = 0% <= threshold of 50%.
6. web-app/src/hooks/useChat.ts:443
- Draft comment: Queued message processing in the finally block uses a hardcoded 100ms delay; consider defining a constant for the delay and review potential recursion if messages get queued repeatedly.
- Reason this comment was not posted: Decided after close inspection that this draft comment was likely wrong and/or not actionable (usefulness confidence = 20% vs. threshold = 50%). The hardcoded delay is a minor issue: while constants are generally better, this is a simple UI timing value. The recursion concern is more serious, since it could cause infinite loops; however, the code shows that setQueuedMessage(null) is called before the next sendMessage, which should prevent recursion. The comment is speculative rather than pointing out a definite issue. I may be overlooking edge cases where recursion could still occur, and the hardcoded delay could cause issues on slower devices. While those are valid concerns, the code has safeguards against recursion, and the delay is a simple UI timing value that doesn't warrant a constant. The issues raised are speculative rather than demonstrating actual problems. Delete the comment: while it raises interesting points, it's speculative and doesn't identify definite issues that need fixing. The code appears to handle message queueing safely.
7. web-app/src/hooks/useChat.ts:444
- Draft comment: Using useAppState.getState() to access 'queuedMessage' in an async block can potentially lead to race conditions; consider capturing its value earlier if needed.
- Reason this comment was not posted: Decided after close inspection that this draft comment was likely wrong and/or not actionable (usefulness confidence = 10% vs. threshold = 50%). The concern about race conditions is technically valid: the value could change between when it's read and when the setTimeout executes. However, the code already handles this by (1) immediately clearing the queue with setQueuedMessage(null) before the timeout, and (2) capturing the value in currentQueuedMessage. The current implementation appears intentionally designed to avoid race conditions. I could be underestimating the complexity of the state management here; there could be edge cases where multiple queued messages interact in unexpected ways. While edge cases are possible, the code shows a clear pattern of capturing and clearing the value immediately, and the setTimeout is just for UI updates, not for state management purposes. The comment should be deleted because the code already handles the race condition concern appropriately through immediate queue clearing and value capture.
8. web-app/src/containers/ChatInput.tsx:405
- Draft comment: Typo: 'handleSendMesage' appears to be missing an 's'. Consider renaming it to 'handleSendMessage'.
- Reason this comment was not posted: Comment was on unchanged code.
Workflow ID: wflow_eY4sJwMYa6wULiu2
You can customize by changing your verbosity settings, reacting with 👍 or 👎, replying to comments, or adding code review rules.
Hi @hermit46, thank you for the PR. There are a couple of things that need to be considered for this update:
I love this feature. Hopefully we can address the points above.
There are linter issues; please help us fix them too.
@louis-menlo thanks for looking through the PR 😊 I think I understand what you mean.
I'm thinking of minimizing the race condition issues by focusing on per-thread, independent message queues for now, while keeping the design open enough for simultaneous inference in the future. Does this sound about right to you? I should have enough context to proceed with commits for now. Updates to come below :) Do lmk if there's anything you'd add to this.
- Add per-thread streaming content, token speed, and error tracking
- Implement thread-aware message queue system with add/remove/clear operations
- Add thread isolation and cleanup methods
- Maintain backward compatibility with legacy global state
- Fix unused variable warning by including assistant in thread metadata
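For illustration, here is a minimal sketch of what the thread-aware queue slice could look like, assuming a zustand-style store (the `useAppState.getState()` calls discussed elsewhere in this thread suggest zustand). The names follow the commit message above and may differ from the actual implementation:

```ts
// Hypothetical per-thread queue slice; a sketch, not the project's real store.
import { create } from 'zustand'

interface PerThreadState {
  queuedMessages: Record<string, string[]>         // threadId -> FIFO queue
  streamingContent: Record<string, string | null>  // per-thread streaming text
  addToThreadQueue: (threadId: string, message: string) => void
  removeFromThreadQueue: (threadId: string) => string | undefined
  clearThreadQueue: (threadId: string) => void
}

export const useAppState = create<PerThreadState>((set, get) => ({
  queuedMessages: {},
  streamingContent: {},
  addToThreadQueue: (threadId, message) =>
    set((s) => ({
      queuedMessages: {
        ...s.queuedMessages,
        [threadId]: [...(s.queuedMessages[threadId] ?? []), message],
      },
    })),
  removeFromThreadQueue: (threadId) => {
    // Pop the head of this thread's queue (FIFO) and persist the remainder.
    const [next, ...rest] = get().queuedMessages[threadId] ?? []
    set((s) => ({ queuedMessages: { ...s.queuedMessages, [threadId]: rest } }))
    return next
  },
  clearThreadQueue: (threadId) =>
    set((s) => ({ queuedMessages: { ...s.queuedMessages, [threadId]: [] } })),
}))
```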
- Create useStreamingContent, useThreadError, useTokenSpeed hooks
- Add useQueuedMessages and useThreadQueueLength for queue management
- Implement useIsThreadActive and useThreadState for comprehensive thread info
- Provide useActiveThreads to get all threads with active state
- Use shallow comparison for performance optimization in useThreadState
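A sketch of how these convenience hooks could be written as selectors over the store above; it assumes the per-thread slice from the previous sketch and zustand's `useShallow` helper, not the project's exact API:

```ts
// Hypothetical selector hooks; signatures are assumptions based on the commit message.
import { useShallow } from 'zustand/react/shallow'

const EMPTY_QUEUE: string[] = [] // stable fallback to avoid new arrays per render

export const useQueuedMessages = (threadId: string) =>
  useAppState((s) => s.queuedMessages[threadId] ?? EMPTY_QUEUE)

export const useThreadQueueLength = (threadId: string) =>
  useAppState((s) => (s.queuedMessages[threadId] ?? EMPTY_QUEUE).length)

export const useStreamingContent = (threadId: string) =>
  useAppState((s) => s.streamingContent[threadId] ?? null)

export const useIsThreadActive = (threadId: string) =>
  useAppState((s) => Boolean(s.streamingContent[threadId]))

// Combined view for components that need several values at once;
// useShallow keeps re-renders down despite the fresh object each call.
export const useThreadState = (threadId: string) =>
  useAppState(
    useShallow((s) => ({
      queue: s.queuedMessages[threadId] ?? EMPTY_QUEUE,
      streaming: s.streamingContent[threadId] ?? null,
      isActive: Boolean(s.streamingContent[threadId]),
    }))
  )
```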
- Replace global setQueuedMessage with thread-aware removeFromThreadQueue
- Update message processing to handle per-thread message queues
- Maintain existing functionality while supporting multiple queued messages per thread
- Remove dependency on legacy global queue state
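A hedged sketch of the drain step this describes, reusing the store from the first sketch; `streamAssistantReply` is a placeholder, not the project's real streaming call:

```ts
// Sketch of draining a thread's queue after its stream finishes.
async function sendMessage(threadId: string, text: string): Promise<void> {
  try {
    await streamAssistantReply(threadId, text) // placeholder for the real call
  } finally {
    // Dequeue for *this* thread only; other threads' queues stay untouched and
    // FIFO order is preserved by removeFromThreadQueue.
    const next = useAppState.getState().removeFromThreadQueue(threadId)
    if (next !== undefined) {
      // Short delay lets the UI settle before the next request starts.
      setTimeout(() => void sendMessage(threadId, next), 100)
    }
  }
}

// Placeholder so the sketch is self-contained.
async function streamAssistantReply(_threadId: string, _text: string): Promise<void> {}
```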
- Replace single-message queue with per-thread multi-message queue system
- Add thread-aware queue operations (addToThreadQueue, getThreadQueueLength)
- Remove input disabling when messages are queued
- Update queue indicator to show message count instead of single message
- Remove thread change cleanup effects (queues now persist across threads)
- Always allow queuing when AI is responding (no single-message limitation)
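A minimal sketch of the "queue while the AI is responding" input behaviour, assuming the store, hooks, and `sendMessage` drain function from the earlier sketches; this is not the project's actual ChatInput.tsx:

```tsx
// Illustrative input component: queue when the thread is streaming, send otherwise.
import { useState } from 'react'

export function ChatInputSketch({ threadId }: { threadId: string }) {
  const [draft, setDraft] = useState('')
  const isActive = useIsThreadActive(threadId)
  const queueLength = useThreadQueueLength(threadId)
  const addToThreadQueue = useAppState((s) => s.addToThreadQueue)

  const submit = () => {
    const text = draft.trim()
    if (!text) return
    if (isActive) {
      addToThreadQueue(threadId, text) // AI still responding: queue, keep input enabled
    } else {
      void sendMessage(threadId, text) // idle: send immediately
    }
    setDraft('')
  }

  return (
    <div>
      <textarea
        value={draft}
        onChange={(e) => setDraft(e.target.value)}
        onKeyDown={(e) => {
          if (e.key === 'Enter' && !e.shiftKey) {
            e.preventDefault()
            submit()
          }
        }}
      />
      {queueLength > 0 && <span>{queueLength} message(s) queued</span>}
    </div>
  )
}
```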
- Fix formatting and linting issues in test file
- Update test structure to match new multi-message queue implementation
- Ensure tests align with thread-aware queue operations
- Maintain test coverage for existing functionality
- Test multi-message queue per thread functionality
- Verify thread isolation and queue persistence across thread switches
- Test queue management operations (add, remove, clear)
- Validate FIFO processing order and edge case handling
- Test integration with convenience hooks for queue management
- Ensure performance with large queues and rapid operations
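As a rough illustration of the thread-isolation and FIFO cases listed above, assuming the store sketch from earlier and a Jest/Vitest-style runner (`describe`/`it`/`expect` globals):

```ts
// Test sketch, not the PR's actual test file.
describe('per-thread message queue', () => {
  it('keeps queues isolated between threads and preserves FIFO order', () => {
    const { addToThreadQueue, removeFromThreadQueue, clearThreadQueue } =
      useAppState.getState()

    addToThreadQueue('thread-a', 'first')
    addToThreadQueue('thread-a', 'second')
    addToThreadQueue('thread-b', 'other')

    // FIFO within a thread.
    expect(removeFromThreadQueue('thread-a')).toBe('first')
    expect(removeFromThreadQueue('thread-a')).toBe('second')

    // thread-b is untouched by thread-a operations.
    expect(useAppState.getState().queuedMessages['thread-b']).toEqual(['other'])

    clearThreadQueue('thread-b')
    expect(useAppState.getState().queuedMessages['thread-b']).toEqual([])
  })
})
```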
- Test useStreamingContent, useThreadError, useTokenSpeed hooks
- Validate useQueuedMessages and useThreadQueueLength functionality
- Test useIsThreadActive and useThreadState comprehensive state access
- Verify useActiveThreads returns correct active thread list
- Ensure hooks properly integrate with useAppState per-thread system

- Test all new per-thread state management methods
- Validate thread isolation and state separation
- Test streaming content, token speed, and error handling per thread
- Verify queue operations (add, remove, clear, length) work correctly
- Test thread cleanup and bulk operations
- Ensure getAllActiveThreads returns correct active thread list

- Test that legacy global state methods continue to work unchanged
- Verify new per-thread methods don't interfere with existing functionality
- Test state transitions between legacy and new systems
- Ensure backward compatibility is maintained for existing components
- Validate that both systems can coexist during transition period

- Test interaction between useAppState and useThreadState hooks
- Validate thread state convenience hooks work with app state
- Test end-to-end thread management workflows
- Verify proper state synchronization across different hooks
- Ensure consistent behavior when using multiple thread-aware hooks together

- Test that legacy and new methods are properly separated
- Validate method organization and grouping structure
- Test that related functionality is logically grouped
- Ensure clear separation between backward compatibility and new features
- Verify method signatures and behavior match their intended purpose
…mance
- Replace useShallow approach with individual hooks for optimal performance
- Each hook only re-renders when its specific value changes
- Eliminates unnecessary object creation on every render
- Maintains useThreadState for backward compatibility when all properties needed
- Add comprehensive performance tests demonstrating the optimization benefits
- Individual hooks provide stable references and better memory efficiency
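An illustrative contrast for this optimization, assuming the hooks from the earlier sketches:

```tsx
// A component that only shows the queue badge.
function QueueBadge({ threadId }: { threadId: string }) {
  // Individual selector: re-renders only when this thread's queue length changes.
  const queueLength = useThreadQueueLength(threadId)
  return queueLength > 0 ? <span>{queueLength} queued</span> : null
}

// With useThreadState(threadId), the same component would also re-render when
// the thread's streaming content changes, because that hook returns a combined
// object whose fields update together.
```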
- Test object creation frequency between individual hooks and useThreadState
- Verify reference stability and memory efficiency improvements
- Demonstrate when individual hooks provide better performance
- Test real-world usage patterns and frequent re-render scenarios
- Provide performance recommendations for optimal hook usage

- Benchmark individual hooks vs useThreadState performance characteristics
- Test memory efficiency with 1000+ re-render scenarios
- Demonstrate real-world chat component performance patterns
- Show object creation frequency and reference stability differences
- Provide concrete performance recommendations for developers
Summary of edits:
Included: a bunch of tests for the multi-thread queueing system + memoization. The implementation is about 300 LOC.
Note:
Lmk if this sounds about right to you
Here are two videos to show the current status:
✅ Multiple messages queued and able to send in a single thread (for every thread)
Screen.Recording.2025-08-13.at.11.26.12.PM.mov
❌ I haven't quite fixed the state management on this: queued messages are sent to the current thread instead of the thread they're queued in. Will work on this in the next PR/commit.
Screen.Recording.2025-08-13.at.11.28.54.PM.mov
Close in favor of #6220
Describe Your Changes
Screen.Recording.2025-08-12.at.12.28.06.AM.mov
Important
Add message queuing feature to ChatInput.tsx, with state management in useAppState.ts and processing in useChat.ts.
- Messages can be queued in ChatInput.tsx, with a limit of one queued message.
- Queued messages are processed in useChat.ts.
- Adds queuedMessage and setQueuedMessage to useAppState.ts for managing queued messages.
- Disables input in ChatInput.tsx when a message is queued.

This description was created by Ellipsis for a322705. You can customize this summary. It will automatically update as commits are pushed.
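For reference, here is a hedged reconstruction of the single-queued-message flow described in this summary (the a322705 version, before the per-thread rework in later commits). The store shape and `streamAssistantReply` are assumptions, not the project's actual code:

```ts
// Sketch of a single queuedMessage slot drained in useChat's finally block.
import { create } from 'zustand'

interface AppState {
  queuedMessage: string | null
  setQueuedMessage: (message: string | null) => void
}

const useAppState = create<AppState>((set) => ({
  queuedMessage: null,
  setQueuedMessage: (message) => set({ queuedMessage: message }),
}))

async function sendMessage(text: string): Promise<void> {
  try {
    await streamAssistantReply(text) // placeholder for the real streaming call
  } finally {
    // Capture, then clear before re-sending, so the slot cannot be drained
    // twice (the capture-and-clear pattern discussed in draft comments 6 and 7).
    const queued = useAppState.getState().queuedMessage
    if (queued !== null) {
      useAppState.getState().setQueuedMessage(null)
      setTimeout(() => void sendMessage(queued), 100) // short delay to let the UI settle
    }
  }
}

// Placeholder so the sketch is self-contained.
async function streamAssistantReply(_text: string): Promise<void> {}
```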