
Conversation

@devin-ai-integration
Contributor

feat: update feedback API to support agent_execution and chat entities

Summary

Updates the Node.js SDK feedback endpoints to support the new API structure from vlm-lab PR #1060. The feedback system now supports three entity types (request, agent_execution, chat) instead of just requests.

Key Changes:

  • submit method: Now accepts optional requestId, agentExecutionId, or chatId parameters (exactly one required)
  • get method: Added type parameter to specify entity type ("request", "agent_execution", "chat")
  • TypeScript interfaces: Updated to include optional ID fields for all entity types
  • Backward compatibility: Maintained through method overloads - existing code continues to work unchanged
  • Validation: Added strict validation requiring exactly one entity ID in the new submit API

The implementation uses method overloads to support both legacy signatures and new options-based signatures, with runtime type checking to distinguish between them.
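
A minimal sketch of that overload pattern is below. The names `FeedbackSubmitOptions`, the `post` helper, and the `/feedback/submit` path are illustrative assumptions, not the SDK's actual identifiers:

```typescript
// Hypothetical illustration of the overload + runtime-check pattern described above.
interface FeedbackSubmitOptions {
  requestId?: string;
  agentExecutionId?: string;
  chatId?: string;
  response?: Record<string, unknown>;
  notes?: string;
}

class Feedback {
  // New options-based signature
  submit(options: FeedbackSubmitOptions): Promise<unknown>;
  // Legacy signature, kept for backward compatibility
  submit(requestId: string, response?: Record<string, unknown>, notes?: string): Promise<unknown>;
  // Single implementation; a runtime type check distinguishes the two call styles
  submit(
    arg: FeedbackSubmitOptions | string,
    response?: Record<string, unknown>,
    notes?: string,
  ): Promise<unknown> {
    const options: FeedbackSubmitOptions =
      typeof arg === "string" ? { requestId: arg, response, notes } : arg;

    // Strict validation: exactly one entity ID must be provided
    const ids = [options.requestId, options.agentExecutionId, options.chatId].filter(Boolean);
    if (ids.length !== 1) {
      throw new Error("Provide exactly one of requestId, agentExecutionId, or chatId");
    }

    return this.post("/feedback/submit", options);
  }

  // Placeholder for the SDK's HTTP client; path and payload shape are illustrative only.
  private post(path: string, body: unknown): Promise<unknown> {
    return Promise.resolve({ path, body });
  }
}
```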

Review & Testing Checklist for Human

  • Test new API patterns against actual backend - Verify submit({ agentExecutionId: "...", ... }) and get("entity_id", { type: "agent_execution" }) work correctly with the deployed vlm-lab backend (see the call sketch after this list)
  • Verify backward compatibility - Test that existing code using submit(requestId, response, notes) and get(requestId, limit, offset) still works without changes
  • Validate error handling - Confirm that submitting with zero IDs or multiple IDs throws appropriate errors as expected
  • Check API response structure - Ensure the actual API responses match the updated TypeScript interfaces (optional ID fields)
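
For reference when testing, the call shapes described in the checklist look roughly like this; the client construction, IDs, and response fields are illustrative assumptions, only the method signatures come from this PR:

```typescript
// Assumes a client exposing feedback.submit / feedback.get as described in this PR.
const client = new VlmRun({ apiKey: process.env.VLMRUN_API_KEY ?? "" });

// New options-based submit for an agent execution
await client.feedback.submit({
  agentExecutionId: "exec_123",
  response: { rating: "thumbs_up" },
  notes: "Correct extraction",
});

// New get with an explicit entity type
await client.feedback.get("exec_123", { type: "agent_execution" });

// Legacy signatures continue to work unchanged
await client.feedback.submit("req_456", { rating: "thumbs_down" }, "Missed a field");
await client.feedback.get("req_456", 10, 0); // limit, offset

// Invalid: zero (or more than one) entity IDs should throw
await client.feedback.submit({ notes: "no id provided" }); // expected to reject
```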

Notes

  • All 154 unit tests pass, including comprehensive new test coverage for the updated API
  • TypeScript build compiles successfully with no errors
  • The legacy get method now includes a type: "request" parameter in API calls; this requires backend support for the type query parameter (sketched below)
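
For context on that last point, a rough sketch of what the legacy path now sends; the endpoint shape and parameter names are assumptions, not the SDK's verified internals:

```typescript
// Hypothetical query construction for the legacy get(requestId, limit, offset) call.
// The legacy path now always includes type=request, which the backend must accept.
function buildLegacyGetQuery(requestId: string, limit?: number, offset?: number): string {
  const params = new URLSearchParams({ type: "request" });
  if (limit !== undefined) params.set("limit", String(limit));
  if (offset !== undefined) params.set("offset", String(offset));
  return `/feedback/${requestId}?${params.toString()}`;
}

// buildLegacyGetQuery("req_456", 10, 0) -> "/feedback/req_456?type=request&limit=10&offset=0"
```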

Session: https://app.devin.ai/sessions/bc0fe1ce7d1446e8a741cadf67d46474
Requested by: [email protected]
References: vlm-lab PR #1060

- Update submit method to accept optional request_id, agent_execution_id, or chat_id parameters
- Update get method to use type parameter and entity ID instead of just requestId
- Update TypeScript interfaces to reflect new optional ID fields
- Maintain backward compatibility with existing method signatures
- Add comprehensive test coverage for new API functionality
- Ensure exactly one entity ID is provided in new submit API
- Default to 'request' type when using legacy get method signature

Refs: vlm-lab PR #1060
Co-Authored-By: [email protected] <[email protected]>
@devin-ai-integration
Contributor Author

🤖 Devin AI Engineer

I'll be helping with this pull request! Here's what you should know:

✅ I will automatically:

  • Address comments on this PR. Add '(aside)' to your comment to have me ignore it.
  • Look at CI failures and help fix them

Note: I can only respond to comments from users who have write access to this repository.

⚙️ Control Options:

  • Disable automatic comment and CI monitoring

shahrear33 and others added 2 commits September 29, 2025 22:34
- Refactor get and submit methods to utilize options object for better parameter handling
- Simplify logic for determining entity IDs and pagination parameters
- Update tests to reflect new method signatures and ensure comprehensive coverage
- Maintain backward compatibility with existing method signatures

Co-Authored-By: [email protected] <[email protected]>
- Update version in package.json from 1.0.0 to 1.0.1
- Introduce comprehensive integration tests for the feedback API, covering various scenarios for getting and submitting feedback
- Enhance unit tests to handle undefined notes instead of null

Co-Authored-By: VlmRun <[email protected]>
@shahrear33 shahrear33 merged commit 659540a into main Oct 2, 2025
3 checks passed