What happened

After deploying deepseek-r1:8b locally with Ollama and connecting it to AstrBot, sending a message fails with a 502 error.

How to reproduce?

I wrote a startup script:

@echo off
call conda activate talk
start /b python "D:\python_plugins\pythonProject\PycharmProjects\astrbot\AstrBot-master\main.py"
timeout /t 5
chcp 65001
"D:\python_plugins\pythonProject\PycharmProjects\astrbot\NapCat.Shell\NapCatWinBootMain.exe" 203****817
pause

After everything has started, sending /model in the AstrBot dashboard chat window returns "Failed to fetch the model list: Error code: 502". Sending /provider returns:

Loaded LLM providers
ollama_default (deepseek-r1:8b) (currently in use)
Use /provider <index> to switch LLM providers.

Sending 你好 returns:

AstrBot request failed.
Error type: InternalServerError
Error message: Error code: 502

AstrBot version and deployment method

Windows, latest version

Operating system

Windows

Additional information

[16:13:29| INFO] [event_bus.py:21]: [webchat] astrbot/astrbot: 你好
[16:13:33| ERROR] [openai_source.py:190]: An error occurred. Provider config: {'id': 'ollama_default', 'type': 'openai_chat_completion', 'enable': True, 'key': ['ollama'], 'api_base': 'http://127.0.0.1:11434/v1', 'model_config': {'model': 'deepseek-r1:8b'}}
[16:13:33| ERROR] [llm_request.py:137]: Traceback (most recent call last):
  File "D:\python_plugins\pythonProject\PycharmProjects\astrbot\AstrBot-master\astrbot\core\pipeline\process_stage\method\llm_request.py", line 83, in process
    llm_response = await provider.text_chat(**req.dict)  # 请求 LLM
  File "D:\python_plugins\pythonProject\PycharmProjects\astrbot\AstrBot-master\astrbot\core\provider\sources\openai_source.py", line 200, in text_chat
    raise e
  File "D:\python_plugins\pythonProject\PycharmProjects\astrbot\AstrBot-master\astrbot\core\provider\sources\openai_source.py", line 144, in text_chat
    llm_response = await self._query(payloads, func_tool)
  File "D:\python_plugins\pythonProject\PycharmProjects\astrbot\AstrBot-master\astrbot\core\provider\sources\openai_source.py", line 71, in _query
    completion = await self.client.chat.completions.create(
  File "D:\conda\envs\talk\Lib\site-packages\openai\resources\chat\completions\completions.py", line 1927, in create
    return await self._post(
  File "D:\conda\envs\talk\Lib\site-packages\openai\_base_client.py", line 1856, in post
    return await self.request(cast_to, opts, stream=stream, stream_cls=stream_cls)
  File "D:\conda\envs\talk\Lib\site-packages\openai\_base_client.py", line 1550, in request
    return await self._request(
  File "D:\conda\envs\talk\Lib\site-packages\openai\_base_client.py", line 1636, in _request
    return await self._retry_request(
  File "D:\conda\envs\talk\Lib\site-packages\openai\_base_client.py", line 1683, in _retry_request
    return await self._request(
  File "D:\conda\envs\talk\Lib\site-packages\openai\_base_client.py", line 1636, in _request
    return await self._retry_request(
  File "D:\conda\envs\talk\Lib\site-packages\openai\_base_client.py", line 1683, in _retry_request
    return await self._request(
  File "D:\conda\envs\talk\Lib\site-packages\openai\_base_client.py", line 1651, in _request
    raise self._make_status_error_from_response(err.response) from None
openai.InternalServerError: Error code: 502
[16:13:34| INFO] [stage.py:71]: AstrBot -> astrbot/astrbot: AstrBot request failed. Error type: InternalServerError Error message: Error code: 502
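To narrow down where the 502 comes from, here is a minimal diagnostic sketch (added for illustration, not part of the original report) that calls the same local endpoint the provider config above points at, bypassing AstrBot entirely. It assumes ollama serve is running and uses the openai Python package already present in the traceback:

# Standalone check against the local Ollama OpenAI-compatible endpoint.
# base_url, api_key and model are copied from the logged provider config.
from openai import OpenAI

client = OpenAI(base_url="http://127.0.0.1:11434/v1", api_key="ollama")
resp = client.chat.completions.create(
    model="deepseek-r1:8b",
    messages=[{"role": "user", "content": "你好"}],
)
print(resp.choices[0].message.content)

If this standalone call also fails with a 502, the problem sits between the HTTP client and Ollama (for example a system-wide proxy intercepting the request); if it succeeds, the problem is inside AstrBot.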
Have you set up a proxy?
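One quick way to answer that (an illustrative check, not from the original thread) is to print the proxy-related environment variables that httpx, which the openai client uses under the hood, reads by default:

import os

# If any proxy variable is set and NO_PROXY does not cover 127.0.0.1,
# requests to the local Ollama port get routed through the proxy.
for name in ("HTTP_PROXY", "HTTPS_PROXY", "ALL_PROXY", "NO_PROXY",
             "http_proxy", "https_proxy", "all_proxy", "no_proxy"):
    print(f"{name}={os.environ.get(name)}")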
🐛 fix: add no_proxy env vars to support localhost requests, fix 502 error when use ollama #504 (commit f001d38)
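The commit title indicates the fix exempts localhost from any configured proxy by setting no_proxy environment variables. A minimal sketch of that idea (not the actual patch; the exact host list is an assumption), to run before the OpenAI/httpx client is created:

import os

# Keep loopback traffic out of the system proxy so the local Ollama
# endpoint is reached directly. (Illustrative host list.)
loopback = "localhost,127.0.0.1"
os.environ.setdefault("NO_PROXY", loopback)
os.environ.setdefault("no_proxy", loopback)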
Thanks, 波奇酱.
Two days ago I used up my data allowance while pulling models for Ollama through a global proxy; I just turned the proxy off and everything is back to normal.