Developing AI sight and memory

This post is going to be a little different from what I have written so far, as the project presented here is a work in progress. However, it is for exactly that reason that I thought it would be interesting to write about. Instead of presenting a fully functioning prototype, I will share what I have done so far and update the post as work comes along. This will hopefully give you an idea of how to approach an abstract problem: outlining problem definitions and brainstorming the logic which will eventually drive the finished prototype.

Problem definition:

The animals sense the environment via sight alone. As they encounter objects, they store them in memory, which is then used for the animal’s various internal processes. This means that running away from predators or pursuing food is handled entirely inside the animal’s brain, rather than via direct references to objects in the scene.

Such a system requires very robust memory handling, as any mismatch between memories and reality will cause the animals to behave irrationally (or worse, react to objects that are no longer physically present).

Approximation vs certainty:

Animals will handle memories differently, depending on the type of object observed. The primary difference between observed objects is that plants stand still, while other animals move around. That means a previous observation of a plant will remain valid – we can assume the plant is in the same spot where we saw it (if it hasn’t been eaten by someone else). This is a certain assumption.

Animals, however, can move around, which means we are better off making an approximated assumption. An approximated assumption is almost the same as a certain assumption – it contains the same data (location, time) – but since approximated assumptions are not exact, it makes sense to store multiple observations of the same object. These observations can then be used for heatmapping: using point-based influence to determine the likelihood of encountering an animal, based on its previously observed locations.
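
To make the heatmapping idea a bit more concrete, below is a minimal sketch of what a point-based influence query could look like. The linear distance falloff and the falloffRadius parameter are assumptions made purely for illustration – nothing about the scoring has been decided yet.

// Hypothetical sketch: score a candidate position by summing influence from
// previous observations of the same animal. The linear falloff is an assumption;
// any other kernel (e.g. Gaussian) would slot in the same way.
using System.Collections.Generic;
using UnityEngine;

public static class HeatmapSketch
{
    // Relative likelihood of encountering the animal at 'point',
    // based on where it has been observed before.
    public static float Influence(Vector3 point, List<Vector3> observedPositions, float falloffRadius = 10f)
    {
        float score = 0f;
        foreach (Vector3 observed in observedPositions)
        {
            float distance = Vector3.Distance(point, observed);
            // Observations close to the candidate point contribute more.
            score += Mathf.Clamp01(1f - distance / falloffRadius);
        }
        return score;
    }
}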

How do we differentiate between these in the logic? By simply adding a redundancy check before storing. If a new observation is too close to an existing observation of the same object, the old observation will be overwritten. That automatically ensures that objects which do not move will only ever have a single observation stored. The distinction between certain and approximated then boils down to how the animal handles the information in decision making, rather than how it is stored.

Storage and removal:

An observation will be made whenever an object collides with an animal’s field of view – which is represented by an invisible, cone-shaped trigger. For each step that an object exists within this trigger, a raycast is performed from the animal’s eyes to the object, checking whether an unobstructed line of sight connects to it. Unity’s tag system will be used to tell whether an object should be stored to memory or not: the object’s tag is simply checked against a list of strings to verify that it qualifies for storage.
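
As a rough illustration of that sight check, here is a hedged sketch of what the trigger and raycast could look like in Unity. The component name, the eyes transform and the tag list are placeholders invented for the example, and it assumes this component sits on the object carrying the cone-shaped trigger collider.

using System.Collections.Generic;
using UnityEngine;

public class SightSensorSketch : MonoBehaviour
{
    public Transform eyes;   // Point the rays are cast from.
    public List<string> rememberedTags = new List<string> { "Plant", "Animal" };

    // Called every physics step for each collider inside the cone trigger.
    private void OnTriggerStay(Collider other)
    {
        if (!rememberedTags.Contains(other.tag))
            return; // Not something we store to memory.

        Vector3 direction = other.transform.position - eyes.position;
        RaycastHit hit;

        // Line of sight check: the ray must hit the observed object itself,
        // otherwise something is blocking the view.
        if (Physics.Raycast(eyes.position, direction, out hit) && hit.collider == other)
        {
            // Pass the object on for memory storage (see the Memory sketch further down).
            Debug.Log("Observed " + other.name + " at " + other.transform.position);
        }
    }
}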

Observations should be stored as a separate “Memory” class. The Memory class should contain the following:

  • GameObject reference – for comparison and removal
  • bool shouldMap – indicates if the object should be heatmapped
  • Observation[] observations – a list of the Observation class

As mentioned in the last point, there should also be an “Observation” class. This class will contain the following:

  • Vector3 position – where did the observation occur?
  • float decayTime – how long ago was the observation made?

By using these classes, it becomes possible to store multiple observations in memories of objects that move, and just one observation if the object is static. Both are still stored in lists, but redundancy checks before storage will ensure that static objects only appear once.
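
A minimal sketch of the two classes could look like the following. The field names follow the draft above; using a List instead of a raw array, and the constructors, are assumptions made for convenience and may well change.

using System.Collections.Generic;
using UnityEngine;

[System.Serializable]
public class Observation
{
    public Vector3 position;   // Where the observation occurred.
    public float decayTime;    // How long ago the observation was made.

    public Observation(Vector3 position)
    {
        this.position = position;
        this.decayTime = 0f;
    }
}

[System.Serializable]
public class Memory
{
    public GameObject target;   // Reference used for comparison and removal.
    public bool shouldMap;      // Should this object be heatmapped?
    public List<Observation> observations = new List<Observation>();

    public Memory(GameObject target, bool shouldMap)
    {
        this.target = target;
        this.shouldMap = shouldMap;
    }
}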

A redundancy check simply means the animal looks into its memory to see if an existing observation of the object lies within a certain distance threshold of what is currently being observed. If the new observation is too close to an old one, the old one will be deleted.

When it comes to removing memories, the decayTime variable plays an important role. The finite state-machine of the animals has a “heartbeat” tick, a timer indicating how often they will be queried for changes. It seems fitting to increment each observation’s decayTime value on every tick. Should the value go above a threshold, the observation will be erased, and if all of a memory’s observations are erased, the memory itself should be deleted. Additionally, to prevent null reference errors when an observed object has since been deleted, it is probably a good idea to check whether the reference is null and delete the memory if it is. While this arguably gives the animals a slight bit of omniscience, it will prevent errors and is likely not something the player will notice (I will have to test that, eventually).
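
As a sketch of what that cleanup pass could look like, assuming it runs once per heartbeat tick and uses the Memory and Observation classes from above (the threshold value is arbitrary):

using System.Collections.Generic;
using UnityEngine;

public class MemoryDecaySketch
{
    public List<Memory> memories = new List<Memory>();
    public float decayThreshold = 30f;   // Placeholder value.

    // Call this from the state machine's heartbeat tick.
    public void Tick(float tickLength)
    {
        for (int m = memories.Count - 1; m >= 0; m--)
        {
            Memory memory = memories[m];

            // The observed object has been destroyed: forget it entirely.
            if (memory.target == null)
            {
                memories.RemoveAt(m);
                continue;
            }

            // Age every observation and drop the ones past the threshold.
            for (int o = memory.observations.Count - 1; o >= 0; o--)
            {
                memory.observations[o].decayTime += tickLength;
                if (memory.observations[o].decayTime > decayThreshold)
                    memory.observations.RemoveAt(o);
            }

            // No observations left means the memory itself can go.
            if (memory.observations.Count == 0)
                memories.RemoveAt(m);
        }
    }
}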

Additional notes:

Enhanced perception under stress:

While the sight of the animals is limited to a cone, it may be necessary to enhance their perception to a sphere around them while they are executing the “Fleeing” behavior. This would allow herbivores to keep track of their assailants while running away from them. It may be a good idea to implement the enhanced perception functionality in this prototype, since it relates to sight. Access to the finite state-machine can easily be emulated with a boolean indicating whether the animal is fleeing or not.
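
One possible way to emulate that, assuming the cone and the sphere are two separate trigger objects on the animal and using a boolean in place of the real state machine (swapping the trigger’s collider mesh, as described in the pseudocode further down, would work just as well):

using UnityEngine;

public class PerceptionSwitchSketch : MonoBehaviour
{
    public GameObject coneTrigger;    // Normal field of view.
    public GameObject sphereTrigger;  // All-around awareness while fleeing.
    public bool isFleeing;            // Placeholder for the state machine query.

    private void Update()
    {
        // Only one perception volume is active at a time.
        coneTrigger.SetActive(!isFleeing);
        sphereTrigger.SetActive(isFleeing);
    }
}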

Decision making:

It is important to note that the memory system plays a central role in the decision making process for the AI. I say this because the “shouldMap” variable is only assigned but never used in this prototype. Think about its implementation during development, should an “aha-moment” present itself.

That’s it for now! I will keep you posted as things are being developed. However, it’s Friday and stuff is happening, so it may not be until early next week that I can post an update, but we’ll see!

I hope this extra bit of insight proves useful to anyone who has been dabbling with the idea of making AI for video games. It’s a lot less scary when you practice taking a step back, blocking out the details, then – and only then – proceeding with pseudocoding before finally implementing the logic and testing the prototype.

Update 1:

Pseudocode:

With the problem definition out of the way, we can work through the logic required to perform the described operations in a structured manner. Before I make any decisions about what should be separated into which function, I will isolate the individual actions from the problem definition to the best of my ability. It is important to note that while pseudocoding comes close to writing the actual thing, there can always be unforeseen problems with the design. Therefore, consider it a guide rather than a law of how things must be.

List of actions:

Below is a list of actions, which helps determine the logic required for each. Note that this is not a list of individual functions, as some of these may be related closely enough to be merged into the same calls. However, it is still likely that most of these actions will become individual functions:

  • Observe (raycast – line of sight check)
  • Verify observation tag
  • Check if memory of object exists
  • Create new memory (with correct shouldMap indicator depending on tag)
  • Delete existing memory
  • Verify if object in memory still exists
  • Verify if memory has any observations left (call memory deletion if not)
  • Create new observation
  • Delete existing observation
  • Redundancy check observation
  • Increment decay time for observation (also deletes observations past threshold)
  • Turn field-of-view trigger on/off (if the animal sleeps)
  • Change field-of-view trigger mesh (if the animal is stressed)

Pseudocoded actions:

Below are each of the listed actions from above, in their pseudocoded format. This will make a good baseline for the actual code that is to be written for the prototype afterwards. Note that the descriptions do not always match the exact logic in the code (e.g. splitting if-checks across two lines for better readability).

Observe:

If an object collides with the viewcone
    Raycast from animal to object
    If raycast returns the same object as the trigger (unobstructed)
        Pass object reference to tag verification

Verify observation tag:

Check tag of passed object against list of strings
    If the tag matches, submit object to check if memory already exists

Check if memory of object exists:

Iterate through list of memories
If a memory of this object exists
    Perform redundancy check from object location with each observation in memory
    If an existing observation is redundant
        Delete existing observation
    Pass object for new observation creation
Else (memory of this object does not exist, or the list is empty)
    Pass object to new memory creation

Perform redundancy check:

Check distance between two points
If distance is below threshold
    Return true
Else
    Return false
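
Loosely translated into C#, the two blocks above could look something like the sketch below. It assumes the Memory and Observation classes from earlier, and the threshold value is a placeholder.

using System.Collections.Generic;
using UnityEngine;

public class MemoryStoreSketch
{
    public List<Memory> memories = new List<Memory>();
    public float redundancyThreshold = 2f;   // Placeholder distance.

    public void Store(GameObject obj, bool shouldMap)
    {
        Vector3 position = obj.transform.position;

        foreach (Memory memory in memories)
        {
            if (memory.target != obj)
                continue;

            // Memory exists: delete any observation that is redundant with the new one.
            memory.observations.RemoveAll(o => IsRedundant(o.position, position));
            memory.observations.Add(new Observation(position));
            return;
        }

        // No memory of this object yet: create one with its first observation.
        Memory newMemory = new Memory(obj, shouldMap);
        newMemory.observations.Add(new Observation(position));
        memories.Add(newMemory);
    }

    // Redundancy check: two points count as the same spot below the threshold.
    private bool IsRedundant(Vector3 a, Vector3 b)
    {
        return Vector3.Distance(a, b) < redundancyThreshold;
    }
}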

Create new memory:

Instantiate a new memory
Add a reference to the object in the memory
Set the bool to indicate if the memory should be heat mapped
Create a new observation and add it to the list
    Instantiate the new observation by passing object to observation creation

Delete existing memory (deletes via reference):

Iterate through list of existing memories
If memory passed matches memory checked
    Delete the memory

Verify if object in memory still exists:

If object reference in memory is null
    Return false
Else
    Return true

Verify if memory has observations left:

If list of observations in memory is empty
    Return false
Else
    Return true

Create new observation:

Instantiate Observation class
Set position field of observation equal to the transform.position of object passed
Set decay time field of observation to zero
Return the Observation

Delete existing observation (deletes by reference):

Iterate through observations
If passed observation matches observation found
    Delete observation found

Increment decay time for observations (passes for deletion):

Iterate through all existing memories
    Iterate through all observations in memory
        Increment decay time value
        If decay time is above threshold
            Pass observation for deletion

Turn field-of-view trigger on/off (if the animal sleeps):

Via reference to the trigger object attached to the animal
trigger.SetActive(!trigger.activeSelf); //Toggle active state

Change field-of-view trigger mesh:

Via reference to the trigger attached to the animal
Change the mesh used by the collider to the appropriate shape
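
As a sketch of that last action, assuming the field of view is a MeshCollider with a convex trigger mesh prepared for each state (the mesh fields and the method are placeholders):

using UnityEngine;

public class FieldOfViewShapeSketch : MonoBehaviour
{
    public MeshCollider viewTrigger;
    public Mesh coneMesh;      // Normal sight cone.
    public Mesh stressedMesh;  // Wider shape used while stressed.

    public void SetStressed(bool stressed)
    {
        // Swap the collider's mesh to the appropriate shape.
        viewTrigger.sharedMesh = stressed ? stressedMesh : coneMesh;
    }
}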

That’s about it for the pseudocode! It might be a little “ham-fisted” at certain points, as I had to rush to get it done before the weekend. With that said, I will go back and correct any mistakes as development continues, but this should give a pretty clear picture of what the process looks like so far!

Update 2:

Development has taken longer than anticipated, but that is not necessarily a bad thing. It illustrates how time plans should always leave some room for unforeseen problems, as it is impossible to predict what may break when venturing into new territory. Additionally, much of my time was taken up by other things I was required to see to.

I have chosen to create a separate post with the finalized code and notes on what changes were made from the draft written here, instead of extending this one. I will add a link to that post here once it is finished. For now, enjoy this sneak peek from the editor, which showcases how I use debugging tools to visualize pure-data objects (e.g. the memories, which do not physically exist in the scene, but hold data on physical locations in it).

If I do not finish the new post over the weekend, I have set a deadline for myself on Monday, so expect it somewhere within that time.

Update 3:

I finished the whole thing! You can see the result in the following post:
