
AI assistant streaming responses #10668

Open
QIN2DIM opened this issue Mar 20, 2024 · 2 comments

Comments

@QIN2DIM

QIN2DIM commented Mar 20, 2024

In what scenarios do you need this feature?

I built a middleware to proxy request traffic, exposing an interface that follows the Open API specification. After correctly configuring BASE_URL to point at the deployed endpoint, I found that SiYuan's built-in AI Chat does not support streaming: when the middleware returns a streaming SSE response, SiYuan errors out immediately. It only works when the middleware "returns at once" with a JSONResponse (in which case the UI shows a progress bar).
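
For reference, a minimal sketch of what such a middleware's streaming endpoint could look like, assuming FastAPI/Starlette (which is where `JSONResponse` comes from). The `upstream_stream` helper and the hard-coded tokens are hypothetical placeholders; a real middleware would relay chunks from the provider LLM as they arrive:

```python
import json

from fastapi import FastAPI
from fastapi.responses import StreamingResponse

app = FastAPI()


async def upstream_stream():
    # Hypothetical placeholder: a real middleware would relay tokens
    # from the provider LLM here as they are produced.
    for token in ["Hello", ",", " world"]:
        yield token


@app.post("/v1/chat/completions")
async def chat_completions():
    async def sse():
        async for token in upstream_stream():
            chunk = {"choices": [{"delta": {"content": token}}]}
            # Each SSE event is a "data: <json>\n\n" frame.
            yield f"data: {json.dumps(chunk)}\n\n"
        # OpenAI-style streams end with a literal [DONE] sentinel.
        yield "data: [DONE]\n\n"

    return StreamingResponse(sse(), media_type="text/event-stream")
```

The key difference from the working path described above is `media_type="text/event-stream"` plus incremental `yield`s, instead of buffering everything into one JSONResponse.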

Describe the optimal solution

Support streaming SSE responses: as the provider LLM emits tokens and they pass through the middleware to SiYuan, the text should be rendered directly on the note page, instead of showing a progress bar while waiting for the middleware to receive all the data and return it packaged as a single JSONResponse.
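
A minimal sketch of what consuming such a stream could look like on the client side, using an httpx-based Python snippet purely for illustration (SiYuan's actual client is not written in Python; the function and parameter names here are hypothetical):

```python
import json

import httpx


async def stream_chat(base_url: str, payload: dict):
    """Yield tokens as they arrive instead of waiting for the full body."""
    async with httpx.AsyncClient() as client:
        async with client.stream(
            "POST", f"{base_url}/v1/chat/completions", json=payload
        ) as response:
            async for line in response.aiter_lines():
                if not line.startswith("data: "):
                    continue
                data = line[len("data: "):]
                if data == "[DONE]":
                    break
                chunk = json.loads(data)
                delta = chunk["choices"][0]["delta"]
                if "content" in delta:
                    # Render each token in the note page as it arrives.
                    yield delta["content"]
```

Parsing the `data:` frames line by line like this is what would let the note page update token by token rather than after the whole response is assembled.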

Describe the candidate solution

No response

Other information

No response

@TCOTC
Contributor

TCOTC commented Mar 20, 2024

I hope both streaming output and the ability to interrupt output are supported.

@JHPZ

JHPZ commented Apr 1, 2024

I hope so too.
