NPCs with EYES! | Unity Game Development

Hello, there! We have more to share from the wonderful world of Everbound. This week’s video covers my recent work on an NPC navigation system for the game. I have been working on a way to let the NPCs loose into the world in more of a simulation-style arrangement. That means I want them navigating by sight, rather than relying on pathfinding algorithms like the extremely popular A*. I also like to figure things out for myself rather than pull in something someone else has already solved, simply because I think I come out the other side with a much better understanding of the underlying concepts. My hope is that building all of these systems myself will make it easier for the separate parts of the game to interact with and build on each other: I will know enough about the component parts to mesh them with one another and fit them to my vision for what the game should be.

In this case, I wanted NPCs without that look of omniscience that tends to come with classic pathfinding algorithms. A*, as I understand it, searches outward from the character with full knowledge of the walkable map, guided by an estimate of the remaining distance to the target. Even without all the details, I do know that the goal of a pathfinding algorithm as a whole is to find the most efficient path to an objective. This is not, in my opinion, very realistic for an NPC. People get around with their eyes. We know the general direction we need to travel in, we send ourselves that way, and we navigate around obstacles individually as we come to them, based largely on sight.

For the time being, I have a test NPC in the form of a stretched-out capsule that is permanently tethered to a destination in the form of a little green cube. The first step was to make sure that the capsule knew how to move forward, and that it could face its destination consistently. The first part was fairly easy: I have an empty object at the foot of the capsule that checks whether it overlaps the ground plane. If it does, the capsule is supplied with a force in the direction it is facing. As for facing the objective, that was pretty easy using Unity’s built-in Lerp method.
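To make the ground-check-then-push idea concrete, here is a minimal sketch of how that kind of grounded forward movement might look in Unity. The field names (`groundCheck`, `moveForce`, and so on) are my assumptions, not the actual script from the video:

```csharp
using UnityEngine;

// Hypothetical sketch: push the NPC forward only while a small check
// at its feet overlaps the ground layer.
public class GroundedMover : MonoBehaviour
{
    public Transform groundCheck;   // empty object at the capsule's feet
    public LayerMask groundLayer;   // layer the ground plane lives on
    public float checkRadius = 0.2f;
    public float moveForce = 10f;

    Rigidbody body;

    void Awake() => body = GetComponent<Rigidbody>();

    void FixedUpdate()
    {
        // Only apply force while the foot check overlaps the ground.
        bool grounded = Physics.CheckSphere(groundCheck.position, checkRadius, groundLayer);
        if (grounded)
            body.AddForce(transform.forward * moveForce);
    }
}
```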
At first, I had the script constantly snapping the capsule toward the target. Unfortunately, this resulted in extremely jittery motion: if the target moved even a little, that movement was immediately reflected in the direction of the capsule. Lerp fixes this. It takes the capsule’s current rotation, the target rotation, and a time value as arguments, and smooths the movement out over the specified time. Combine that with a term added to the time value that is inversely related to the distance to the target, and we are left with a capsule that turns very slowly in response to positional changes in a distant target, and very quickly when the target is near, to avoid losing track of it.
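The distance-scaled smoothing above could be sketched something like this. Again, the names and the exact blend formula are assumptions for illustration:

```csharp
using UnityEngine;

// Hypothetical sketch: smoothly face the target, turning faster
// the closer the target gets.
public class SmoothFacer : MonoBehaviour
{
    public Transform target;
    public float baseTurnSpeed = 2f;
    public float proximityBoost = 5f; // extra turn speed at close range

    void Update()
    {
        Vector3 toTarget = target.position - transform.position;
        toTarget.y = 0f; // keep the capsule upright
        if (toTarget.sqrMagnitude < 0.0001f) return;

        Quaternion desired = Quaternion.LookRotation(toTarget);

        // Turn rate grows as distance shrinks: slow response to a far
        // target, quick response to a near one.
        float t = (baseTurnSpeed + proximityBoost / toTarget.magnitude) * Time.deltaTime;
        transform.rotation = Quaternion.Lerp(transform.rotation, desired, t);
    }
}
```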

The pathfinding script casts a ray out of the front of the capsule, with a distance limit beyond which it will not react to an obstacle in its path. For demonstration purposes, I slowed the edge-detection process down in the video, and there is a still below to show what is happening. On detecting an obstacle, the script produces two additional raycasts, each starting 45 degrees out to the side and sweeping around toward the front of the NPC. When a raycast detects an edge, that angle and the distance to it are recorded. The script then takes the nearest edge (in theory) and sets a temporary target just to the side of that location. This all happens in less than a frame.
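The sweep described above might be sketched like this. The angles, step size, and names here are assumptions; this is the shape of the technique, not the actual script:

```csharp
using UnityEngine;

// Hypothetical sketch: on hitting an obstacle dead ahead, sweep rays
// in from 45 degrees on either side toward the forward direction, and
// record the angle and distance at which each side first touches the
// obstacle -- i.e. its edge.
public class EdgeSweep : MonoBehaviour
{
    public float sightRange = 5f;
    public float sweepStep = 5f;   // degrees per sweep increment

    void Update()
    {
        if (!Physics.Raycast(transform.position, transform.forward, sightRange))
            return; // nothing ahead within range

        float leftEdge = FindEdgeAngle(-45f);
        float rightEdge = FindEdgeAngle(45f);
        // ...pick the nearer edge and place a temporary target just past it.
    }

    // Sweep from startAngle toward 0 (straight ahead); the first ray
    // that hits something marks the obstacle's edge on that side.
    float FindEdgeAngle(float startAngle)
    {
        float step = startAngle > 0f ? -sweepStep : sweepStep;
        for (float a = startAngle; Mathf.Abs(a) > 0.01f; a += step)
        {
            Vector3 dir = Quaternion.AngleAxis(a, Vector3.up) * transform.forward;
            if (Physics.Raycast(transform.position, dir, out RaycastHit hit, sightRange))
                return a; // hit.distance would be recorded here too
        }
        return 0f; // no edge found within the sweep
    }
}
```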

Now, I say “in theory” because I did run into an issue. Two, actually. The first is fairly straightforward: I suspect I should be using a different value to set the direction from which the adjustment is calculated. At the moment, if the NPC approaches the obstacle from a weird angle, it may mistake which end is nearest or place the temporary target slightly off. The second issue may be more complicated, as I thought I had resolved it already: the system may not be fully resetting after it finishes navigating around an obstacle. Flagging that moment is pretty easy, as far as I can tell. Avoiding an obstacle requires placing a temporary target, and when that target is reached the avoidance is complete. For some reason, though, the system turns increasingly fast as it hits more obstacles, and the temporary targets start landing in the wrong locations. It gets worse with the number of subsequent obstacles, but not necessarily with time, which tells me I am just missing something that needs to be reset.
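For what it is worth, the kind of reset involved might look like the sketch below, where the easy mistake is clearing the flag but not every modifier touched during avoidance. All names here are hypothetical:

```csharp
using UnityEngine;

// Hypothetical sketch: when the temporary avoidance target is reached,
// clear *all* avoidance state before resuming travel toward the real
// destination.
public class AvoidanceState : MonoBehaviour
{
    public float arriveRadius = 0.5f;

    Vector3 tempTarget;
    bool avoiding;
    float turnSpeedBoost; // anything accumulated while avoiding

    void Update()
    {
        if (avoiding && Vector3.Distance(transform.position, tempTarget) < arriveRadius)
        {
            // Reset every value the avoidance path touched, not just the
            // flag -- leftovers would otherwise compound across obstacles.
            avoiding = false;
            turnSpeedBoost = 0f;
        }
    }
}
```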

While it’s not complete yet, this system of “pathfinding” is feeling just right for the world I am trying to build, and I expect it will produce exactly the behavior I am going for. I am a little worried about how much a system like this will cost in terms of resources, but I suspect the checks I am running aren’t actually much more complicated than the kinds of things happening under any other pathfinding system. Next week we will be showing off more of the sculpting projects and character creation going into the game, and there will be a Bender costume update in a couple of weeks. Thanks for reading, it’s been a blast as always! See you in the next one!
