My Contribution at a Glance
- Designed and implemented a custom AI controller system for a companion bot in UE5.
- Created player-issued commands such as Stay, Follow Me, Explore, and Come & Stay.
- Integrated Unreal's AI Perception system so the bot can detect and react to collectible orbs (points).
- Built a custom debug layer to visualize states, perception, and decision-making in real time.
- Integrated Convai to enable context-aware dialogue and action execution based on player input and game state.
- Designed the system to unify UI commands and natural language into a single gameplay interaction layer.
Description
This prototype explores how conversational AI can be integrated into gameplay systems, allowing players to interact with a companion through both commands and natural language. The goal is to move beyond static dialogue systems toward reactive, context-aware interactions driven by player actions and world state.
Additionally, thanks to Convai, the player can use voice to chat with the AI directly; the agent is configured with a base knowledge set that gives it world and mission context.
Key Innovation
A unified command system allows both UI inputs (command wheel) and natural language (voice) to trigger the same AI behaviors. This ensures consistency across interaction methods while enabling more immersive gameplay.
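The core idea can be sketched in plain C++ (names and wheel-slot ordering are illustrative assumptions, not the prototype's actual code): both input paths resolve to the same command enum, so downstream behavior code never knows which method was used.

```cpp
#include <map>
#include <string>

// Unified command set shared by the command wheel and voice input.
enum class BotCommand { Stay, Follow, Explore, ComeAndStay, Unknown };

// UI path: wheel slots map directly to commands (slot order is assumed).
BotCommand FromWheelSlot(int slot) {
    switch (slot) {
        case 0: return BotCommand::Stay;
        case 1: return BotCommand::Follow;
        case 2: return BotCommand::Explore;
        case 3: return BotCommand::ComeAndStay;
        default: return BotCommand::Unknown;
    }
}

// Voice path: transcribed phrases resolve through a keyword table.
BotCommand FromPhrase(const std::string& phrase) {
    static const std::map<std::string, BotCommand> kKeywords = {
        {"stay", BotCommand::Stay},
        {"follow me", BotCommand::Follow},
        {"explore", BotCommand::Explore},
        {"come and stay", BotCommand::ComeAndStay},
    };
    auto it = kKeywords.find(phrase);
    return it != kKeywords.end() ? it->second : BotCommand::Unknown;
}
```

Because both paths converge on one enum, new behaviors only need to be implemented once to be reachable from either input method.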
This system was iteratively tested to evaluate how prompt structure, retrieval quality, and context injection impact response consistency, failure cases, and overall usability.
System Architecture
The system is designed to be modular and extendable, allowing designers to easily define behaviors, prompts, and contextual data without deep technical intervention. It was built around a hybrid architecture combining:
- State Machine: Handles deterministic behaviors like Follow, Stay, and Explore.
- Command Interpreter: Maps player inputs (command wheel or voice) into state transitions.
- Convai Layer: Processes natural language and injects contextual responses and actions.
- Perception System: Feeds environmental data into decision-making.
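The deterministic state layer can be illustrated with a minimal sketch (plain C++, outside Unreal; in the prototype this logic lives in a custom UE5 AIController, and the state and event names here are simplified assumptions):

```cpp
// Deterministic states the companion bot can be in.
enum class BotState { Idle, Following, Staying, Exploring, MovingToPlayer };
enum class Command  { Follow, Stay, Explore, ComeAndStay };

struct BotStateMachine {
    BotState state = BotState::Idle;

    // Command wheel and Convai-triggered actions both call Apply().
    void Apply(Command c) {
        switch (c) {
            case Command::Follow:      state = BotState::Following;      break;
            case Command::Stay:        state = BotState::Staying;        break;
            case Command::Explore:     state = BotState::Exploring;      break;
            // "Come & Stay" is two-phase: move to the player first...
            case Command::ComeAndStay: state = BotState::MovingToPlayer; break;
        }
    }

    // ...then hold position once the bot reaches the player.
    void OnReachedPlayer() {
        if (state == BotState::MovingToPlayer) state = BotState::Staying;
    }
};
```

Keeping transitions in one table like this is what lets the Command Interpreter treat UI and natural-language inputs identically.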
RAG & Prompt Engineering
Problem & Solution
One of the core challenges was ensuring consistency and contextual relevance in AI responses during gameplay. To address this, I designed and implemented a RAG-based system using Convai that dynamically retrieves structured knowledge and injects it into the AI at runtime. This approach allows the system to maintain consistency across interactions while reducing hallucination and adapting responses to real-time gameplay context.
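The injection step can be sketched as follows (the actual retrieval is handled by Convai's Knowledge Bank; this only shows the shape of runtime context assembly, and all names here are hypothetical):

```cpp
#include <string>
#include <vector>

// Live gameplay signals that get injected alongside retrieved knowledge.
struct GameContext {
    int orbsCollected;
    std::string currentArea;
};

// Concatenate retrieved knowledge snippets and current game state into
// the context block sent with each AI request.
std::string BuildContext(const std::vector<std::string>& retrieved,
                         const GameContext& ctx) {
    std::string out = "[WORLD KNOWLEDGE]\n";
    for (const auto& snippet : retrieved)
        out += "- " + snippet + "\n";
    out += "[GAME STATE]\n";
    out += "Area: " + ctx.currentArea + "\n";
    out += "Orbs collected: " + std::to_string(ctx.orbsCollected) + "\n";
    return out;
}
```

Rebuilding this block per request is what keeps responses aligned with the current world state rather than a stale snapshot.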
Narrative and Context
This system was also designed as an exploration into how Large Language Models (LLMs) can be integrated into gameplay systems. The goal was to provide the AI with a structured world context, narrative background, and mission awareness, allowing it to generate meaningful, in-character responses.
The Knowledge Bank enables the AI to reason over structured world data rather than relying on generic responses.
In this prototype, it includes detailed information about the planet Dramar (flora, fauna, environment, and hazards), as well as corporate lore from MadTech Inc., the creators of Botty.
It also integrates hidden directives containing restricted information that the AI must not disclose to the player.
By leveraging this data at runtime, the AI delivers grounded, context-aware responses that reflect the current game world, reducing hallucination and keeping the AI within its narrative constraints.
Personality
Additionally, the system explores dynamic emotional states: the AI's tone and personality evolve with gameplay context, shifting between behaviors such as cautious, aggressive, or supportive depending on player actions and world conditions.
The character stays consistent across gameplay situations while its tone and behavior adapt dynamically.
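A minimal sketch of the idea, assuming a health signal and a threat flag as inputs (the thresholds, signal names, and directive strings are illustrative, not the prototype's actual tuning):

```cpp
// Mood tags that shape the AI's tone at runtime.
enum class Mood { Cautious, Supportive, Aggressive };

// Map gameplay signals to a mood. playerHealth01 is normalized to [0, 1].
Mood SelectMood(float playerHealth01, bool threatNearby) {
    if (threatNearby)          return Mood::Aggressive; // focus on threats
    if (playerHealth01 < 0.5f) return Mood::Supportive; // encourage when hurt
    return Mood::Cautious;                              // default exploration tone
}

// The selected mood becomes a short directive injected with the prompt.
const char* MoodDirective(Mood m) {
    switch (m) {
        case Mood::Aggressive: return "Speak tersely and focus on threats.";
        case Mood::Supportive: return "Be reassuring and offer help.";
        default:               return "Be curious but careful.";
    }
}
```

Injecting a short directive per mood, rather than swapping the whole persona, is one way to shift tone without breaking character consistency.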
Evolving Character & Gameplay Interactions
This approach aims to push beyond static dialogue systems toward AI companions that feel reactive, evolving, and grounded in the game world. From a gameplay perspective, it enables more natural interactions: players can communicate with companions organically instead of relying solely on predefined dialogue trees or rigid commands. As a designer, you can author triggers such as "When you collect an orb, tell the player how many you have so far," or make the AI react to situations by changing its personality, context, and objectives.
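The orb example above can be sketched as a designer-authored trigger: a gameplay event queues an instruction for the AI instead of playing a canned line (all names here are hypothetical):

```cpp
#include <string>

// An instruction handed to the conversational layer when an event fires.
struct NarrationEvent { std::string instruction; };

// Fired by gameplay code when the bot's perception registers a collected orb.
NarrationEvent OnOrbCollected(int totalOrbs) {
    return { "In character, tell the player they have collected " +
             std::to_string(totalOrbs) + " orbs so far." };
}
```

Because the instruction is generated per event, the resulting line is phrased by the AI in its current mood and context rather than replayed verbatim.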
Next Steps
While Convai was used to accelerate prototyping, the long-term goal is to replace it with a custom AI service layer. This will provide full control over prompt orchestration, memory management, and system scalability, enabling deeper experimentation with LLM-driven gameplay systems and tighter integration with real-time game logic.