Virtual Desktop Pet Simulator (虚拟桌宠模拟器)

Chat API: ChatWithOllama
Bug
I've been trying to use this mod, but it keeps telling me Ollama isn't running even though it is. I set the URL to http://localhost:11434/v1/chat/completions in both the ChatGPT API settings and the Ollama mod settings, and it still tells me it can't find the model. I'm using DeepSeek, if that helps.
Honkai, Start [developer] Feb 18 @ 11:06am
For my plugin, you only need to enter http://localhost:11434/. The other setting is for other GPT-based modules.
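A likely explanation for the developer's answer, sketched below under an assumption: if the plugin appends its own API path to whatever base URL you enter (the helper name `build_endpoint` and the appended path are hypothetical, not taken from the mod's source), then pasting the full /v1/chat/completions URL produces a doubled, broken path, which is why only the base URL should be entered.

```python
def build_endpoint(base_url: str, path: str = "v1/chat/completions") -> str:
    """Hypothetical sketch of how a plugin might derive its request URL
    from a user-entered base URL."""
    return base_url.rstrip("/") + "/" + path

# Entering only the base URL yields a valid endpoint:
print(build_endpoint("http://localhost:11434/"))
# -> http://localhost:11434/v1/chat/completions

# Entering the full endpoint URL doubles the path, so the server
# responds as if Ollama (or the model) cannot be found:
print(build_endpoint("http://localhost:11434/v1/chat/completions"))
# -> http://localhost:11434/v1/chat/completions/v1/chat/completions
```

This pattern is common in API client settings: one field expects only the host and port, while a different client (here, the generic ChatGPT API setting) expects the complete endpoint URL.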