Implement AI API connection test endpoint #294
Co-authored-by: 239573049 <61819790+239573049@users.noreply.github.com>
Pull Request Overview
This PR implements the missing backend endpoint for testing AI API connections, addressing a gap where the frontend "测试API连接" ("Test API Connection") button had no corresponding backend implementation.
Key changes:
- Added `TestAIApiRequest` and `TestResultResponse` DTOs for structured request/response handling
- Implemented `POST /api/SystemSetting/test/ai` endpoint that validates AI credentials by creating an ephemeral Kernel and sending a test prompt
- Includes comprehensive input validation, error handling, and temporary resource cleanup
Reviewed Changes
Copilot reviewed 2 out of 2 changed files in this pull request and generated 5 comments.
| File | Description |
|---|---|
| src/KoalaWiki/Dto/SystemSettingDto.cs | Added DTOs for AI API test request (endpoint, apiKey, model) and response (success, message, details) |
| src/KoalaWiki/Services/SystemSettingService.cs | Implemented test endpoint with validation, Kernel creation, test prompt execution, and cleanup logic |
```csharp
    gitPath: tempPath,
    model: model,
    isCodeAnalysis: false, // code-analysis plugins are not needed for the test
    files: null
```
The KernelFactory.GetKernel method uses the global OpenAIOptions.ModelProvider setting to determine whether to use OpenAI or AzureOpenAI. This means the test endpoint cannot validate different AI providers - it will always test using the currently configured provider in the system settings.
Recommendation: Add a ModelProvider parameter to TestAIApiRequest DTO and pass it through to the kernel creation, or implement a separate test method that doesn't rely on global configuration. This would allow administrators to test different provider configurations before saving them.
```diff
-    files: null
+    files: null,
+    modelProvider: request.ModelProvider // new parameter: allows testing a different AI provider
```
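A minimal sketch of how the request DTO could carry the provider, following the recommendation above. The three existing property names are taken from the PR description (endpoint, apiKey, optional model), but their exact declarations and the `ModelProvider` addition are assumptions, not confirmed code from this PR:

```csharp
public class TestAIApiRequest
{
    public string Endpoint { get; set; } = string.Empty;
    public string ApiKey { get; set; } = string.Empty;
    public string? Model { get; set; }

    // Hypothetical addition: e.g. "OpenAI" or "AzureOpenAI", so the test
    // does not depend on the globally configured OpenAIOptions.ModelProvider.
    public string? ModelProvider { get; set; }
}
```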
```csharp
    };
}

// create a temporary directory for the test
```
The temporary directory is created before validating the model parameter. If the model validation fails (line 320-327), the directory will be left behind and never cleaned up since the cleanup code is in a nested finally block.
Recommendation: Move the temporary directory creation after all parameter validations have passed (after line 327) to prevent resource leaks.
```diff
-// create a temporary directory for the test
+// create a temporary directory for the test (only after all parameter validation has passed)
```
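The validate-first ordering recommended above could look like the following sketch. The specific validation checks and the `TestResultResponse` property names are assumptions based on the PR description; the actual checks at lines 320-327 of the service may differ:

```csharp
// 1. Validate all inputs first, so failure paths allocate nothing.
if (string.IsNullOrWhiteSpace(request.Endpoint) ||
    string.IsNullOrWhiteSpace(request.ApiKey))
{
    return new TestResultResponse
    {
        Success = false,
        Message = "Endpoint and ApiKey are required"
    };
}

// 2. Only now create the temporary directory, and guarantee cleanup.
var tempPath = Path.Combine(Path.GetTempPath(), Guid.NewGuid().ToString("N"));
Directory.CreateDirectory(tempPath);
try
{
    // ... create the kernel and send the test prompt ...
}
finally
{
    if (Directory.Exists(tempPath))
    {
        Directory.Delete(tempPath, recursive: true);
    }
}
```

With this shape, any early-return validation failure happens before the directory exists, so the nested `finally` is the only cleanup path needed.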
```csharp
{
    Model = model,
    Endpoint = request.Endpoint,
    Response = response.Content
```
[nitpick] Exposing the AI model's response content in the API response could potentially leak sensitive information if the model's response contains unexpected content or reflects back test data in unintended ways.
Recommendation: Consider either:
- Omitting `response.Content` from the `Details` and just confirming the response was received
- Truncating the response to a safe length (e.g., first 100 characters)
- Only including a hash or confirmation that a non-empty response was received
```diff
-    Response = response.Content
+    ResponsePreview = response.Content != null && response.Content.Length > 100
+        ? response.Content.Substring(0, 100) + "..."
+        : response.Content
```
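The third option above (confirming a non-empty response without echoing any content) could be sketched as follows; the `ResponseReceived` and `ResponseLength` field names are illustrative, not part of this PR:

```csharp
// Hypothetical alternative to echoing model output back to the caller:
// only confirm that a non-empty response arrived.
Details = new
{
    Model = model,
    Endpoint = request.Endpoint,
    ResponseReceived = !string.IsNullOrEmpty(response.Content),
    ResponseLength = response.Content?.Length ?? 0
}
```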
```csharp
var kernel = await KernelFactory.GetKernel(
    chatEndpoint: request.Endpoint,
    apiKey: request.ApiKey,
    gitPath: tempPath,
    model: model,
    isCodeAnalysis: false, // code-analysis plugins are not needed for the test
    files: null
);

// send a test message
var chatService = kernel.GetRequiredService<IChatCompletionService>();
var chatHistory = new ChatHistory();
chatHistory.AddUserMessage("Hello");

var response = await chatService.GetChatMessageContentAsync(chatHistory);

logger.LogInformation("AI API连接测试成功");

return new TestResultResponse
{
    Success = true,
    Message = "AI API连接测试成功,模型响应正常",
    Details = new
    {
        Model = model,
        Endpoint = request.Endpoint,
        Response = response.Content
    }
};
```
[nitpick] The kernel instance created for testing is not explicitly disposed. While this follows the existing pattern in the codebase (e.g., ResponsesService.cs line 87-88), the kernel contains HttpClient instances created in KernelFactory.GetKernel.
Recommendation: Consider wrapping the kernel usage in a using statement if Kernel implements IDisposable, or investigate whether the Semantic Kernel framework handles disposal internally. This would ensure proper cleanup of HTTP connections, especially important for a test endpoint that may be called frequently.
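If it turns out that the kernel (or a wrapper around it) does implement `IDisposable`, a defensive pattern-match avoids a hard dependency on that fact. This is a sketch, assuming no confirmed disposal contract in the Semantic Kernel version used here:

```csharp
try
{
    // ... use the kernel for the test prompt ...
}
finally
{
    // Defensive: dispose only if the concrete type supports it.
    // If Kernel does not implement IDisposable, this is a no-op.
    if (kernel is IDisposable disposableKernel)
    {
        disposableKernel.Dispose();
    }
}
```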
```csharp
    gitPath: tempPath,
    model: model,
    isCodeAnalysis: false, // code-analysis plugins are not needed for the test
    files: null
```
[nitpick] The test endpoint passes isCodeAnalysis: false which correctly skips loading code analysis plugins. However, KernelFactory.GetKernel still creates FileTool, AgentTool, and attempts to load MCP services (lines 80-104 in KernelFactory.cs). These plugins and tools are unnecessary for a simple API connection test and add overhead.
Recommendation: For a lighter-weight test, consider either:
- Creating a simplified kernel initialization method specifically for testing that skips all plugin loading
- Using the `kernelBuilderAction` parameter to customize kernel creation without unnecessary plugins
- Documenting that the test validates the full kernel initialization pipeline, which may be intentional
```diff
-    files: null
+    files: null,
+    kernelBuilderAction: builder => { /* no-op: skip plugin/tool registration for test */ }
```
The system settings UI provides a "测试API连接" ("Test API Connection") button, but the backend endpoint `/api/SystemSetting/test/ai` was missing.

Changes

Added DTOs (`src/KoalaWiki/Dto/SystemSettingDto.cs`):
- `TestAIApiRequest` - accepts endpoint, apiKey, optional model
- `TestResultResponse` - returns success/failure with message and details

Implemented endpoint (`src/KoalaWiki/Services/SystemSettingService.cs`):
- `POST /api/SystemSetting/test/ai` - validates AI credentials by creating an ephemeral Kernel and sending a test prompt
- Uses `KernelFactory.GetKernel()` with user-provided credentials

Frontend integration already exists at `AISettingsTab.tsx:74-118` and `admin.service.ts:677-684`.

Warning
Firewall rules blocked me from connecting to one or more addresses.
I tried to connect to the following addresses, but was blocked by firewall rules:
- `nuget.cdn.azure.cn`
  - `dotnet build KoalaWiki.sln` (dns block)
  - `dotnet restore` (dns block)
  - `dotnet build` (dns block)