RimWorld

RimTalk
351 ratings
Mod, 1.5, 1.6
File Size: 698.636 KB
Posted: Aug 17 @ 8:08pm
Updated: Oct 5 @ 12:24am
83 Change Notes


Description
💬 RimTalk – AI-Powered Dialogue
Bring your colonists to life with RimTalk, a mod that gives pawns real conversations.
Every chat is unique — shaped by personality, mood, and the chaos of RimWorld.

⚙️ How It Works
  • Reads what colonists are thinking & doing 🧠
  • Builds a custom AI prompt ✍️
  • Displays dialogue in chat bubbles 🗨️
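The three steps above can be sketched in a few lines of Python. This is purely illustrative: RimTalk itself is a C# mod, and every name below is invented for the example.

```python
# Hypothetical sketch of the prompt-building step (all names invented;
# RimTalk's actual implementation is C#).

def build_prompt(pawn):
    """Collect a pawn's current state into a text prompt for the AI."""
    context = {
        "name": pawn["name"],
        "mood": pawn["mood"],          # e.g. "content", "on edge"
        "activity": pawn["activity"],  # e.g. "hauling steel"
        "thoughts": pawn["thoughts"],  # recent thoughts/memories
    }
    lines = [f"{k}: {v}" for k, v in context.items()]
    return (
        "You are a RimWorld colonist.\n"
        + "\n".join(lines)
        + "\nReply with one short line of dialogue."
    )

pawn = {"name": "Dina", "mood": "content", "activity": "cooking",
        "thoughts": ["ate a fine meal"]}
prompt = build_prompt(pawn)
print(prompt)
```

The AI's reply to a prompt like this is what ends up in the chat bubble.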

🌟 Features
  • Multiple AI Providers: OpenAI, DeepSeek, OpenRouter, Google AI, custom cloud models, or any OpenAI-compatible local LLM (Ollama, LM Studio).
  • Unique Personalities: Grumpy elder, poetic dreamer, sarcastic doctor. Fully customizable.
  • Context-Aware: Dialogue matches what pawns are doing/thinking.
  • Multi-Language: Colonists can chat in English, 日本語, Français, etc.
  • Easy Setup: Paste an API key and go.
  • Performance-Friendly: Async requests keep FPS smooth.
Originally posted by Player Review:
“It feels like my pawns are finally alive.”

🚀 Getting Started
  1. Grab a free API key: Google AI Studio[aistudio.google.com]
  2. Enable RimTalk in your mod list
  3. Paste the key into RimTalk settings
Advanced Settings let you swap providers or use a local LLM.
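For the local-LLM route, both LM Studio and Ollama expose an OpenAI-compatible chat completions API. Here is a minimal sketch of the request body such a setup expects; the base URLs in the comment are those tools' defaults, and the model name is a placeholder for whatever model you have loaded.

```python
import json

# Request body for an OpenAI-compatible chat endpoint, as served by
# LM Studio (default base URL http://localhost:1234/v1) or
# Ollama (http://localhost:11434/v1). "local-model" is a placeholder.
payload = {
    "model": "local-model",
    "messages": [
        {"role": "system",
         "content": "You are a RimWorld colonist. Reply in one short line."},
        {"role": "user",
         "content": "mood: tense; activity: repairing the wall after a raid"},
    ],
    "max_tokens": 60,
    "temperature": 0.8,
}

# This JSON string would be POSTed to <base_url>/chat/completions with
# the header "Content-Type: application/json".
body = json.dumps(payload)
print(body)
```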

📊 Free API Limits
  • 30 requests/min
  • 15,000 tokens/min
  • 14,400 requests/day
More details: Google API Rate Limits[ai.google.dev]
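Those caps translate directly into sensible AI Cooldown values. A quick back-of-the-envelope check, using only the numbers listed above:

```python
# Free-tier limits from the list above.
requests_per_min = 30
tokens_per_min = 15_000
requests_per_day = 14_400

# Minimum seconds between requests to stay under the per-minute cap.
min_cooldown_s = 60 / requests_per_min

# The daily cap is the tighter long-run limit: averaged over 24 hours
# it allows only 10 requests per minute.
sustained_per_min = requests_per_day / (24 * 60)

# Average token budget per request at the per-minute request cap.
tokens_per_request = tokens_per_min / requests_per_min

print(min_cooldown_s, sustained_per_min, tokens_per_request)
```

So a cooldown of at least 2 seconds keeps you under the burst limit, and around 6 seconds keeps you under the daily limit indefinitely.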

🎛️ Configuration Highlights
  • API Key – personal access key
  • AI Cooldown – adjust chat frequency
  • Suppress Raw Messages – hide unprocessed lines

⚡ Performance & Compatibility
  • Lightweight – tuned for performance; AI runs remotely
  • Works with most social mods (e.g. SpeakUp)
  • Save-Friendly: Add or remove RimTalk mid-game safely

🛠️ Troubleshooting
  • No dialogue? Check your API key.
  • ⏸️ Dialogue pauses? Raise Talk Interval.
  • ⚠️ JSON errors? See the Troubleshooting Guide.

👨‍💻 Credits
Popular Discussions (27)
  • NullReferenceException when pawns consume high-quality food (Mr.Jiang, 7 replies, 15 hours ago)
  • Bug report (AICER, 7 replies, Oct 4 @ 4:14pm)
  • Outdated Dialogue and Suggested Solution (Cosmosteller, 12 replies, Oct 3 @ 7:51am)
357 Comments
KawwaK 4 hours ago 
Is there a version history anywhere?
Sayo Hikawa 14 hours ago 
Okay I reloaded this mod and the issue was gone. Maybe the fix wasn't updated to my game before.
Mr.Jiang 15 hours ago 
@Sayo Hikawa The mod was just updated with fixes for thought tracking. Updating to the latest version should fix this issue.
Sayo Hikawa 15 hours ago 
@Juicy The eating issue is still there, error msg: https://pastebin.com/4hDJb0Pw
Mr.Jiang 19 hours ago 
Hi, just wanted to report a small issue I noticed after the latest RimTalk update this morning. It seems that whenever any pawn—colonists, prisoners, or NPCs—eats a fine or lavish meal, the game sometimes crashes with a NullReferenceException. From what I can see, it comes from PatchMemoryThoughtHandlerTryGainMemory.Postfix, probably because newThought or otherPawn can be null. Adding a simple null check like if (newThought == null || otherPawn == null) return; should prevent the crash. This didn’t happen in the previous version. Thanks for all your hard work!
vogelre Oct 4 @ 12:39pm 
I used a free API key from Google AI Studio and encountered this issue with the default settings. I tried to fix it by adding instructions, saving, and reloading, but all my attempts were ignored. Now I’ve tried using a local model with LM Studio and the rimtalk-mini-v1 model from TheDrummer, and it looks like the issue is gone. Slave-slave and slave-master interactions are now handled correctly.
Juicy  [author] Oct 4 @ 12:35pm 
@Incursion That sounds like an issue with the LLM itself. I tested with Gemma and it worked fine. The pawn’s context info is passed correctly, but your model might just be struggling to follow the instructions. Try switching to a different one and see if it helps.

@vogelre Are you by any chance using a local model? I tested with Gemma and didn’t run into that issue. Try using the default prompt with Gemma and let me know if it still happens.

@Fuyun Could you give an example of the text or situation where this happens? I’m not sure what you mean.
Fuyun Oct 4 @ 11:31am 
@Juicy
One more thing I'd like to mention: the mod seems unable to recognize the text indicating task completion and instead treats it as receiving the task again. Is this an issue with the game's task text itself?
vogelre Oct 4 @ 11:12am 
Slaves are speaking to each other as if they were speaking to their master, calling each other "master" or "sir" and offering to serve or be useful, etc. Is there any way to avoid this? I tried a lot of things in the "AI Instructions" settings, but had no luck.
Incursion Oct 4 @ 10:31am 
Using a local model, is there any way to make the llm use correct pronouns? It usually seems to default to "he" when generating, even if the pawn in question is female. Or is this maybe a problem with the llm itself? Should I look for another? Thanks