Setting the talk interval to "never" doesn't actually stop babies from talking, and that setting resets on every reload anyway, so I'd prefer something in their bio or in the AI instructions. It's annoying because I have two babies, and they keep exhausting the API role-playing helpless babies at each other, or answering everything that happens near them with "coos and gurgles".
If I put something like "Do not generate any dialogue or responses for babies" in the AI instruction field, babies get speech bubbles with error text like `json{"name": "Baby's name","text": "}`, and then they carry on as if they were full-grown adults, responding to other pawns with a generic personality. So I'm not sure what to write, or where to write it, to get the AI to stop generating so much baby spam.
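For what it's worth, the cleaner fix is probably on the mod side: filter baby pawns out before any request is sent, rather than trying to suppress them through the prompt (which is what seems to truncate the JSON reply). A minimal sketch, assuming vanilla RimWorld's `DevelopmentalStage` property from Biotech; `ShouldGenerateDialogue` is a made-up hook name standing in for wherever the mod decides to call the LLM:

```csharp
using RimWorld;
using Verse;

// Hypothetical pre-filter: skip LLM dialogue generation for babies
// entirely instead of asking the model to stay silent.
// "ShouldGenerateDialogue" is an illustrative name, not the mod's
// real API; Pawn.DevelopmentalStage is the vanilla RimWorld property.
public static class BabyDialogueFilter
{
    public static bool ShouldGenerateDialogue(Pawn pawn)
    {
        if (pawn == null)
        {
            return false;
        }
        // Newborns and babies can't hold conversations, so don't
        // spend API calls on them at all.
        return pawn.DevelopmentalStage != DevelopmentalStage.Newborn
            && pawn.DevelopmentalStage != DevelopmentalStage.Baby;
    }
}
```

Gating on the pawn before the request is made would also cover the "coos and gurgles" spam between two babies, since neither side would ever reach the model.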
1. An option to completely disable vanilla interactions is needed, as they are too frequent and illogical.
2. Conversations for prisoners, visitors, and slaves are needed. I mention slaves because, compared to RimTalk, slaves here seem less aware of their own status, and interactions between colonists and slaves about that status are very weak.
3. RimTalk integrates a chat log, so it might be worth considering aligning with that.
I ran both mods side by side with the same prompts, and I still think your work is more in-depth and valuable.
[RiMind] Cannot connect to local LLM server: http://192.168.31.10:11434
That's my computer's local IP address; the service is Ollama. Accessing it via localhost works fine.
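A likely cause, assuming a stock Ollama install: Ollama binds to 127.0.0.1 by default, so it answers on localhost but refuses connections addressed to the machine's LAN IP (192.168.31.10). Making it listen on all interfaces via the documented OLLAMA_HOST variable should fix it:

```sh
# Make Ollama listen on all interfaces instead of only 127.0.0.1,
# keeping the default port 11434. After this, the server should be
# reachable at http://192.168.31.10:11434 as well as localhost.
OLLAMA_HOST=0.0.0.0:11434 ollama serve
```

On Linux installs where Ollama runs as a systemd service, the same variable goes into the service environment (e.g. via `systemctl edit ollama`). Otherwise, simply pointing RiMind at http://localhost:11434 works around it, as noted above.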