I’m developing a first-person game in UE4 focussed on conversations with NPCs. It will have about 2,000 lines of dialogue in total.
I already have Blueprint systems in UE4 for NPC dialogue, built as a customised version of the UE4 Dialogue Plugin. This system takes care of subtitles, audio (speech), and body language (using Unreal Skeleton animations from the Marketplace).
What I need is tools/systems for facial animation, including facial expressions to represent emotions, blinking, and lip synching.
Ideally, I’d like to end up with:
- Tools and/or a pipeline that allows me to assign a facial expression to an NPC for each line of dialogue they can speak, e.g. Angry, Confused, Disgusted, Frightened, Happy, Neutral, Sad, Surprised, Suspicious, Worried.
- A pipeline and/or system that allows me to generate or assign lip synching for WAV files.
- A simple way to ensure NPCs blink realistically at all times, unless asleep or dead.
The above would need to be accompanied by training and documentation/step-by-step instructions I can follow so I can do as much of the necessary work as possible myself, i.e. with minimal outsourcing to an animator.
While the system doesn’t need to be comparable to that of “Naughty Dog” games, it will need to look decent in an otherwise photorealistic game released in 2018/2019.
The pipeline and/or system would need to be compatible with the Dialogue Plugin for Unreal Engine 4, which takes care of all other aspects of NPC dialogue (subtitles, body language animation, etc).