Caution
LLM/AI Disclaimer: Large Language Models (LLMs) such as those used by this module do not possess intelligence, reasoning, or true understanding. They generate text by predicting the most likely next word based on patterns in their training data: pattern-matching over vectors, not thinking or comprehension. The quality and relevance of responses depend entirely on the model you use, its training data, and its configuration. Results may vary; at times the output may be irrelevant, nonsensical, or simply not work as expected. This is a fundamental limitation of current AI and LLM technology. Use it with realistic expectations.
This module is also in development, and running a local LLM can bog down your server. Please proceed with this in mind.
Important
To fully disable Playerbots normal chatter and random chatter that might interfere with this module, set the following settings in your playerbots.conf:
- AiPlayerbot.EnableBroadcasts = 0 (disables loot/quest/kill broadcasts)
- AiPlayerbot.RandomBotTalk = 0 (disables random talking in say/yell/general channels)
- AiPlayerbot.RandomBotEmote = 0 (disables random emoting)
- AiPlayerbot.RandomBotSuggestDungeons = 0 (disables dungeon suggestions)
- AiPlayerbot.EnableGreet = 0 (disables greeting when invited)
- AiPlayerbot.GuildFeedback = 0 (disables guild event chatting)
- AiPlayerbot.RandomBotSayWithoutMaster = 0 (disables bots talking without a master)
mod-ollama-chat is an AzerothCore module that enhances the Player Bots module by integrating large language model (LLM) support via the Ollama API. It enables player bots to generate dynamic, in-character chat responses using natural language processing run locally on your machine (or on a remote host). Bots are enriched with personality traits, random chatter triggers, and context-aware replies that mimic the language and lore of World of Warcraft.
- Ollama LLM Integration:
  Bots generate chat responses by querying an external Ollama API endpoint, enabling natural and contextually appropriate in-game dialogue.
- Player Bot Personalities:
  When enabled, each bot is assigned a personality type (e.g., Gamer, Roleplayer, Trickster) that modifies its chat style. Personalities influence prompt generation and result in varied, immersive responses.
- Context-Aware Prompt Generation:
  The module gathers extensive context about both the bot and the interacting player (class, race, role, faction, guild, and more) to generate prompts for the LLM. A comprehensive WoW cheat sheet is appended to every prompt so the LLM replies with accurate lore, terminology, and in-character language spanning Vanilla WoW, The Burning Crusade, and Wrath of the Lich King.
- Random Chatter:
  Bots can periodically initiate random, environment-based chat when a real player is nearby, adding an extra layer of immersion to the game world.
- Chat Memory (Conversation History):
  Bots have configurable short-term chat memory. Recent conversations between each player and bot are stored and included as context in every LLM prompt, giving responses continuity: bots recall the last several lines of chat with each player.
- Blacklist for Playerbot Commands:
  A configurable blacklist prevents bots from responding to chat messages that start with common playerbot command prefixes, so administrative commands are not inadvertently processed. Additional prefixes can be appended via the configuration.
- Asynchronous Response Handling:
  Chat responses are generated on separate threads to avoid blocking the main server loop, keeping server performance smooth.
- Live Configuration & Personality Reload:
  Reload the module's config and personality packs in-game or from the server console, without restarting.
- Event-Based Chatter:
  Player bots comment on key in-game events such as quest completion, rare loot, deaths, PvP kills, leveling up, duels, learning spells, and achievements. Remarks are context-aware, immersive, and personality-driven, making the world feel much more alive.
- Party-Only Bot Responses:
  When enabled, bots only respond to real player messages and events when they are in the same non-raid party. This reduces chat spam while preserving full bot-to-bot communication within parties for immersive group interactions.
- Think Mode Support:
  Bots can leverage LLM models that have reasoning/think modes. Enable internal reasoning for models that support it by setting OllamaChat.ThinkModeEnableForModule = 1 in mod-ollama-chat.conf. When enabled, the API request includes the think flag and the bot omits all thinking output from its final reply (see the example request after this list).
- Live Reload for Personalities and Settings:
  Instantly reload all mod-ollama-chat configuration and personality packs in-game using the .ollama reload command on a GM level account, or use ollama reload from the server console. No server restart required; updates to .conf files or personality packs (.sql files) are applied immediately.
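For reference, a think-enabled request to the Ollama API looks roughly like the sketch below. This illustrates the kind of call the module makes, not its exact payload; the model name is just an example of a reasoning-capable model you would need to have pulled.
curl http://localhost:11434/api/generate -d '{
  "model": "qwen3:1.7b",
  "prompt": "Greetings, adventurer!",
  "think": true,
  "stream": false
}'
With think enabled, reasoning-capable models return their internal deliberation separately from the final answer, which is what allows the module to strip the thinking portion before the bot speaks.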
Important
Cross-Platform Support: This module now uses cpp-httplib (header-only) instead of curl, eliminating compilation issues on Windows and simplifying installation on all platforms.
- Prerequisites:
  - Ensure you have liyunfan1223's AzerothCore (https://github.com/liyunfan1223/azerothcore-wotlk) installation with the Player Bots (https://github.com/liyunfan1223/mod-playerbots) module enabled.
  - The module depends on:
    - fmtlib (https://github.com/fmtlib/fmt) - for string formatting
    - nlohmann/json (https://github.com/nlohmann/json) - for JSON processing (bundled with the module; no installation needed)
    - cpp-httplib (https://github.com/yhirose/cpp-httplib) - header-only HTTP library (included; no installation needed)
  - Ollama LLM support: set up a local instance of the Ollama API server with the model of your choice. More details at https://ollama.com
- Install Dependencies:
  # Windows (vcpkg)
  vcpkg install fmt
  # Debian/Ubuntu
  sudo apt update && sudo apt install libfmt-dev
  # RHEL/CentOS/Fedora
  sudo yum install fmt-devel   # or: dnf install fmt-devel
  # macOS (Homebrew)
  brew install fmt
  # Arch Linux
  sudo pacman -S fmt
- Clone the Module:
  cd /path/to/azerothcore/modules
  git clone https://github.com/DustinHendrickson/mod-ollama-chat.git
- Recompile AzerothCore:
  cd /path/to/azerothcore
  mkdir build && cd build
  cmake ..
  make -j$(nproc)
- Configuration:
  Copy the default configuration file to your server configuration directory and edit it to match your setup (if not already done):
  cp /path/to/azerothcore/modules/mod-ollama-chat/mod-ollama-chat.conf.dist /path/to/azerothcore/etc/config/mod-ollama-chat.conf
- Restart the Server:
  ./worldserver
This module requires a running Ollama server to function. Ollama allows you to run large language models locally on your machine.
Download and install Ollama from ollama.com. It supports Windows, macOS, and Linux.
- Windows/macOS: Download the installer from the website and run it.
- Linux: Follow the installation instructions for your distribution (e.g., curl -fsSL https://ollama.com/install.sh | sh).
Once installed, start the Ollama server:
ollama serve
This will start the server on http://localhost:11434 by default.
If you want to run the Ollama server on a different computer than your AzerothCore server, set the OLLAMA_HOST environment variable to 0.0.0.0 before starting the server:
export OLLAMA_HOST=0.0.0.0
ollama serve
This binds the server to all network interfaces, allowing connections from other machines on your network. Update the OllamaChat.ApiEndpoint in mod-ollama-chat.conf to use the IP address of the machine running Ollama (e.g., http://192.168.1.100:11434).
Warning
Exposing Ollama to the network may pose security risks. Ensure your firewall allows traffic on port 11434 only from trusted networks, and consider additional security measures if exposing to the internet.
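If Ollama runs as a systemd service on Linux (the default when using the install script), the environment variable can be persisted with a unit override instead of exporting it per shell. A sketch, assuming the standard ollama.service unit name:
sudo systemctl edit ollama.service
# In the override file that opens, add:
#   [Service]
#   Environment="OLLAMA_HOST=0.0.0.0"
sudo systemctl daemon-reload
sudo systemctl restart ollama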
Before using the module, pull a model that the bots will use for generating responses. For example, to pull the Llama 3.2 1B model:
ollama pull llama3.2:1b
You can find available models at ollama.com/library. Choose a model that fits your hardware capabilities.
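To confirm the download completed, ollama list prints the models installed locally:
ollama list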
The module connects to the Ollama API via the configuration in mod-ollama-chat.conf. The default endpoint is http://localhost:11434. If your Ollama server is running on a different host or port, update the OllamaChat.ApiEndpoint setting.
To verify that the Ollama server is running and accessible, you can test the API:
curl http://localhost:11434/api/tags
This should return a JSON response listing available models. If you get a connection error, ensure the server is started and the endpoint is correct.
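A healthy response looks roughly like the following (abbreviated; the exact fields vary by Ollama version):
{"models":[{"name":"llama3.2:1b", ...}]}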
For a complete list of all available configuration options with comments and defaults, see mod-ollama-chat.conf.dist, included in this repository.
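As a quick orientation, here are two settings referenced elsewhere in this README; treat this as a sketch and defer to the .dist file for exact value formats (for example, whether the endpoint includes an API path):
# Host/port of your Ollama server
OllamaChat.ApiEndpoint = http://localhost:11434
# Enable think mode for reasoning-capable models
OllamaChat.ThinkModeEnableForModule = 1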
The module provides several in-game text commands for administrators (Game Masters) to manage and monitor the Ollama chat functionality. All commands require SEC_ADMINISTRATOR security level (GM level 3 or higher).
Reloads the module's configuration from mod-ollama-chat.conf without restarting the server. Also reloads personality packs and sentiment data.
- Security Level: SEC_ADMINISTRATOR
- Usage:
  .ollama reload
- Console Equivalent:
  ollama reload
Displays sentiment tracking data between bots and players.
- Security Level: SEC_ADMINISTRATOR
- Usage:
  - .ollama sentiment view - Shows all sentiment data
  - .ollama sentiment view BotName - Shows sentiment data for a specific bot
  - .ollama sentiment view BotName PlayerName - Shows sentiment between a specific bot and player
- Console Equivalent:
ollama sentiment view [bot] [player]
Manually sets the sentiment value between a bot and player (0.0 to 1.0).
- Security Level: SEC_ADMINISTRATOR
- Usage:
  .ollama sentiment set BotName PlayerName 0.8
- Console Equivalent:
ollama sentiment set <bot> <player> <value>
Resets sentiment data to default values.
- Security Level: SEC_ADMINISTRATOR
- Usage:
  - .ollama sentiment reset - Resets all sentiment data
  - .ollama sentiment reset BotName - Resets all sentiment data for a specific bot
  - .ollama sentiment reset BotName PlayerName - Resets sentiment between a specific bot and player
- Console Equivalent:
ollama sentiment reset [bot] [player]
Displays the current personality assigned to a bot.
- Security Level: SEC_ADMINISTRATOR
- Usage:
  .ollama personality get BotName
- Console Equivalent:
ollama personality get <bot>
Manually assigns a personality to a bot.
- Security Level: SEC_ADMINISTRATOR
- Usage:
  .ollama personality set BotName Gamer
- Console Equivalent:
ollama personality set <bot> <personality>
Lists all available personalities and their descriptions.
- Security Level: SEC_ADMINISTRATOR
- Usage:
  .ollama personality list
- Console Equivalent:
ollama personality list
Note
All commands can also be executed from the server console by replacing the leading dot (.) with your console's command prefix (typically none), as shown in the Console Equivalent entries above.
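For example, a quick GM session exercising the documented commands (Thrall and Arthas are placeholder bot/player names):
.ollama personality list
.ollama personality set Thrall Gamer
.ollama personality get Thrall
.ollama sentiment set Thrall Arthas 0.8
.ollama sentiment view Thrall Arthas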
- Chat Filtering and Triggering:
  When a player (or bot) sends a chat message, the module checks the message's type, distance, and whether it starts with any configured blacklist command prefix. If party restrictions are enabled, only bots in the same non-raid party as the real player can respond. Only eligible messages in range and not matching the blacklist will trigger a bot response.
- Bot Selection:
  The system gathers all bots within the relevant distance, determines eligibility based on player/bot reply chance, and caps responses per message using MaxBotsToPick and related settings.
- Prompt Assembly:
  For each reply, a prompt is assembled by combining configurable templates with live in-game context: bot/player class, race, gender, role/spec, faction, guild, level, zone, gold, group, environment info, personality, and, if enabled, recent chat history between that player and the bot.
- LLM Request:
  The prompt is sent to the Ollama API using the configured model and parameters. All LLM requests run asynchronously, so the server never lags or blocks.
- Response Routing:
  Bot responses are routed back through the appropriate in-game chat channel, whether that is say, yell, party, or general.
- Personality Management:
  If RP personalities are enabled, each bot uses its assigned personality template. Personality definitions can be changed on the fly and reloaded live; no server restart required.
- Random & Event-Based Chatter:
  In addition to responding to direct chat, bots occasionally generate random environment-aware lines when real players are nearby, and react to key in-game events (e.g., PvP/PvE kills, loot, deaths, quests, duels, level-ups, achievements, using objects) using context-specific templates and personalities.
- Live Reloading:
  You can hot-reload the module config and personality packs in-game using the .ollama reload GM command or from the server console. All changes take effect immediately without requiring a restart.
- Fully Configurable:
  All settings (reply logic, distances, frequencies, blacklist, prompt templates, chat history, personalities, random/event chatter, LLM parameters, and more) are controlled via mod-ollama-chat.conf and can be adjusted and reloaded live at any time.
mod-ollama-chat supports Personality Packs, which are collections of personality templates that define how bots roleplay and interact in-game.
- To use a Personality Pack, download or create a .sql file named in the format YYYY_MM_DD_personality_pack_NAME.sql.
- Place the .sql file in modules/mod-ollama-chat/data/sql/characters/updates/.
- The module will automatically detect and apply any new Personality Packs when the server starts or updates; no manual SQL import required.
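For example, installing a hypothetical community pack (the filename below is illustrative; only the naming pattern matters):
cp pirates_pack.sql /path/to/azerothcore/modules/mod-ollama-chat/data/sql/characters/updates/2025_06_01_personality_pack_pirates.sql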
Want to create your own pack or download packs made by the community?
Visit the Personality Packs Discussion Board
For detailed logs of bot responses, prompt generation, and LLM interactions, enable the module's debug mode in its configuration settings and review the server logs.
This module is released under the GNU GPL v3 license, consistent with AzerothCore's licensing.
Developed by Dustin Hendrickson
Pull requests, bug reports, and feature suggestions are welcome. Please adhere to AzerothCore's coding standards and guidelines when submitting contributions.