Hoping for support for streaming output, as well as the ability to interrupt the output.
I'd like this too.
In what scenarios do you need this feature?
I built my own middleware to proxy request traffic, exposing an OpenAI-API-compatible interface. After correctly configuring BASE_URL to point at the deployed endpoint, I found that SiYuan's built-in AI Chat does not support streaming: when the middleware returns a streaming SSE response, SiYuan immediately errors out. It only works when the middleware returns a complete JSONResponse in one shot (in which case the UI shows a progress bar while waiting).
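
For context, here is a minimal sketch of the buffering mode that currently works, assuming a FastAPI middleware and httpx; the endpoint path, upstream URL, and header handling are illustrative assumptions, not SiYuan's actual requirements:

```python
# Sketch of the non-streaming middleware mode that SiYuan currently accepts:
# buffer the full upstream completion, then return it as one JSONResponse.
from fastapi import FastAPI, Request
from fastapi.responses import JSONResponse
import httpx

app = FastAPI()
UPSTREAM = "https://api.openai.com/v1/chat/completions"  # provider endpoint (assumed)

@app.post("/v1/chat/completions")
async def chat_completions(request: Request):
    payload = await request.json()
    payload["stream"] = False  # force a single, complete upstream response
    async with httpx.AsyncClient(timeout=None) as client:
        upstream = await client.post(
            UPSTREAM,
            json=payload,
            headers={"Authorization": request.headers.get("Authorization", "")},
        )
    # SiYuan shows a progress bar until this single JSON body arrives.
    return JSONResponse(content=upstream.json(), status_code=upstream.status_code)
```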
Describe the optimal solution
Implement streaming SSE responses: tokens emitted by the provider LLM should pass through the middleware to SiYuan and be rendered directly in the note page as they arrive, rather than showing a progress bar while the middleware receives all the data and packages it into a single JSONResponse returned at once.
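
A hedged sketch of the streaming variant being requested, under the same FastAPI/httpx assumptions: the middleware relays the provider's SSE frames to the client as they arrive instead of buffering. This only helps once SiYuan itself accepts `text/event-stream` responses:

```python
# Sketch of the requested streaming mode: relay the provider's SSE chunks
# downstream as they arrive, so tokens can render incrementally.
# Endpoint path and upstream URL are illustrative assumptions.
from fastapi import FastAPI, Request
from fastapi.responses import StreamingResponse
import httpx

app = FastAPI()
UPSTREAM = "https://api.openai.com/v1/chat/completions"  # provider endpoint (assumed)

@app.post("/v1/chat/completions")
async def chat_completions(request: Request):
    payload = await request.json()
    payload["stream"] = True  # ask the provider for token-by-token SSE

    async def relay():
        async with httpx.AsyncClient(timeout=None) as client:
            async with client.stream(
                "POST",
                UPSTREAM,
                json=payload,
                headers={"Authorization": request.headers.get("Authorization", "")},
            ) as upstream:
                # Forward raw SSE bytes ("data: {...}\n\n" frames) unmodified.
                async for chunk in upstream.aiter_bytes():
                    yield chunk

    return StreamingResponse(relay(), media_type="text/event-stream")
```

Because the frames are forwarded unmodified, cancelling the downstream connection also aborts the upstream request, which would cover the "interrupt output" wish above as well.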
Describe the candidate solution
No response
Other information
No response