ALife (BugWorld) V0.60a

An artificial life environment for testing reinforcement learning algorithms.

The ALife project began as one of many experiments in 'Artificial Life': a 2D world combining evolutionary methods with approaches from Reinforcement Learning. More recently the project has been configured as a more typical episode-based, game-like environment, with less emphasis on the evolutionary component and more on the reinforcement-learning side, but it is designed to be flexible and configurable in this respect.


Requirements

ALife/BugWorld should work with Python 2 or Python 3, as long as you have the following libraries installed:

Getting Started

If you have the requirements, then run, for example,

	python ALife.py

You can load a particular map as follows:

	python ALife.py ./dat/maps/map_nw_island.dat 

The following keys are available:

  • 1 - Add a new rock (under the mouse pointer)
  • 2 - Change the position of the 'flag' (to under the mouse pointer)
  • 4 - Add a new 'bug' (under the mouse pointer) agent type/team 1
  • 5 - ... agent type/team 2
  • 6 - ... agent type/team 3
  • 7 - ... agent type/team 4
  • 8 - ... agent type/team 5
  • 9 - ... agent type/team 6

You may select an agent by clicking on it; this displays its info (sensors, energy level, etc.) and lets you take control of it:

  • ↑ - Move the selected bug forward
  • → - Turn the selected bug right
  • ← - Turn the selected bug left

General controls:

  • h - Toggle information and scoreboard
  • g - Toggle graphics (turn animation off for faster iterations, i.e., fast-forward)
  • d - Toggle grid (for debugging)
  • - - More frames per second
  • + - Fewer frames per second

The bugs are animate agents. Input comes in the form of three proximity sensors (one on each antenna, plus the body as a third sensor) of three values each (representing RGB intensity), plus a value for the current energy level and a value giving the angle to the 'flag'; all range between 0 and 1. Output consists of two actions, indicating change in angle and speed.

Input

Under the bugs' 'vision', other bugs of the same team/species appear blue, bugs of other species red, plants green, and rocks and impassable water white. Intensity depends on size, proximity, and whether the object is touching the sensor or not. A tenth input is the current energy level (also between 0 and 1).
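As a rough sketch of this input layout (the names and helper function here are illustrative assumptions, not the actual ALife API): three sensors of three RGB intensities each, plus the energy level and the flag angle, all in [0, 1].

```python
import numpy as np

N_SENSORS = 3        # left antenna, right antenna, body
N_CHANNELS = 3       # R, G, B intensity per sensor
OBS_DIM = N_SENSORS * N_CHANNELS + 2   # + energy level + angle to flag

def make_observation(sensor_rgb, energy, flag_angle):
    """Pack sensor readings into a flat observation vector (hypothetical layout)."""
    obs = np.concatenate([np.asarray(sensor_rgb).ravel(),
                          [energy, flag_angle]])
    assert obs.shape == (OBS_DIM,)
    assert (0.0 <= obs).all() and (obs <= 1.0).all()  # all inputs lie in [0, 1]
    return obs

obs = make_observation([[0.0, 1.0, 0.2],   # left antenna: mostly green (a plant)
                        [0.9, 0.0, 0.0],   # right antenna: red (another species)
                        [0.0, 0.0, 0.0]],  # body: touching nothing
                       energy=0.5, flag_angle=0.25)
print(obs.shape)  # (11,)
```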

This is illustrated in the following examples. Note that the colours become brighter or duller depending on proximity, and mix when more than one object is within a sensor's detection range (shown by the circles). The white bar represents the energy level.


Output

The two-dimensional output is 1) the change in angle, in radians (e.g., $-\pi/4$ for a 45-degree left turn), and 2) the speed, ranging from -10 pixels/step (in reverse) to +10 (moving forward). At speeds above +5, the bugs take flight and do not collide with anything (including rocks, water, and the plants they need to eat).
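A minimal sketch of how such a two-dimensional action could be interpreted (the function and constant names are illustrative assumptions, not the actual ALife code):

```python
import numpy as np

MAX_SPEED = 10.0        # pixels per step, forward or reverse
FLIGHT_THRESHOLD = 5.0  # above this speed the bug flies over obstacles

def interpret_action(d_angle, speed):
    """Clamp a raw (d_angle, speed) action and report whether the bug is airborne."""
    d_angle = float(np.clip(d_angle, -np.pi, np.pi))
    speed = float(np.clip(speed, -MAX_SPEED, MAX_SPEED))
    flying = speed > FLIGHT_THRESHOLD   # no collisions while flying
    return d_angle, speed, flying

print(interpret_action(-np.pi / 4, 7.0))  # 45-degree left turn, airborne
```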

Reward

The reward function can be configured differently to evoke different behaviour from the bugs.
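As a purely illustrative example of one such configuration (this is an assumption, not the scheme shipped with ALife), a simple reward might combine energy gained from eating with an episodic bonus and a death penalty:

```python
def reward(energy_gained, reached_flag, died):
    """Hypothetical reward: eating is good, reaching the flag is better, dying is bad."""
    r = energy_gained      # e.g., from eating a plant this step
    if reached_flag:
        r += 1.0           # episodic bonus for reaching the 'flag'
    if died:
        r -= 1.0           # penalty for running out of energy
    return r
```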

Implementing New Agents

Add the path and class name of your agent to the bugs section in `conf.yml`. The agent should be a class in a similar style to `AIGym`, providing `__init__` and `act` functions of the same style. Examples are given in `agents`.
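A hedged sketch of what such an agent class might look like (the exact constructor and `act` signatures used by ALife may differ; check the examples in `agents`, such as `AIGym`, for the real interface):

```python
import numpy as np

class RandomAgent:
    """Minimal example agent: ignores observations and acts randomly."""

    def __init__(self, obs_space, act_space, **kwargs):
        # obs_space/act_space: dimensions of the input and output vectors
        # (assumed here; the real constructor arguments may differ)
        self.obs_space = obs_space
        self.act_space = act_space

    def act(self, obs, reward, done=False):
        # Return a random (change-in-angle, speed) pair in [-1, 1]
        return np.random.uniform(-1.0, 1.0, size=self.act_space)
```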

Using ALife Agents/Functions

You can install it with, e.g.,

	python3 setup.py develop

Known Bugs

(Software bugs, not the bugs in the environment!)

  • On some systems with python3, there is a long pause after the click when selecting a bug with the mouse pointer.
  • On some systems, python3 uses 100% CPU (where python2 does not).

Related Projects

Some related projects with some nice demos on YouTube: 1, 2, 3.

Notes on Graphics
