This repository was archived by the owner on Jun 5, 2025. It is now read-only.

Commit 33492f3

Author: Luke Hinds
Merge pull request #163 from stacklok/fix-llamacpp

Fix llamacpp streaming completion

2 parents: 12dd7c4 + 7ca9c50

File tree

1 file changed (+5, -1 lines):
  • src/codegate/providers/completion


src/codegate/providers/completion/base.py

Lines changed: 5 additions & 1 deletion
@@ -36,6 +36,10 @@ def create_response(self, response: Any) -> Union[JSONResponse, StreamingRespons
         """
         Create a FastAPI response from the completion response.
         """
-        if isinstance(response, Iterator) or inspect.isasyncgen(response):
+        if (
+            isinstance(response, Iterator)
+            or isinstance(response, AsyncIterator)
+            or inspect.isasyncgen(response)
+        ):
             return self._create_streaming_response(response)
         return self._create_json_response(response)
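The added `isinstance(response, AsyncIterator)` check matters because `inspect.isasyncgen` only recognizes async *generator* objects, not arbitrary objects that implement the async-iterator protocol. A minimal sketch illustrating the distinction (the `TokenStream` class below is a hypothetical stand-in for whatever the llama.cpp streaming path returns, not code from this repository):

```python
import inspect
from collections.abc import AsyncIterator

class TokenStream:
    """A hand-rolled async iterator (not an async generator)."""

    def __init__(self, tokens):
        self._it = iter(tokens)

    def __aiter__(self):
        return self

    async def __anext__(self):
        try:
            return next(self._it)
        except StopIteration:
            raise StopAsyncIteration

async def agen():
    yield "token"

stream = TokenStream(["a", "b"])

# inspect.isasyncgen detects only async generator objects...
print(inspect.isasyncgen(agen()))         # True
print(inspect.isasyncgen(stream))         # False
# ...while the isinstance check added in this commit also catches
# custom async iterators like TokenStream.
print(isinstance(stream, AsyncIterator))  # True
```

Before the fix, an object like `TokenStream` would fall through to `_create_json_response` and the stream would never be forwarded; the widened condition routes it to `_create_streaming_response` instead.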

0 commit comments
