# Design
- In general, emergent works by compiling programs into executables, which you then run like any other executable. This is very different from the C++ version of emergent, which was a single monolithic program attempting to have all functionality built in. Instead, the new model is the more prevalent approach of writing more specific code to achieve more specific goals, which is more flexible and allows individuals to be more in control of their own destiny.
- To make your own simulations, start with, e.g., the `examples/ra25/ra25.go` code (or that of a more appropriate example), copy it to your own repository, and edit accordingly.
- The `emergent` repository contains a collection of packages supporting the implementation of biologically-based neural networks. The main package is `emer`, which specifies a minimal abstract interface for a neural network.
- Go uses interfaces to represent abstract collections of functionality (i.e., sets of methods). The `emer` package provides a set of minimal interfaces for each structural level (e.g., `emer.Layer`, etc.), along with concrete "Base" types that implement a lot of shared functionality (e.g., `emer.LayerBase`), which are available via `AsEmer()` from the interface. Each algorithm must implement the interface methods to support the network view, logging, parameter setting, and other shared emergent functionality.
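As a rough sketch of this interface-plus-Base pattern (with hypothetical names and fields, not the actual `emer` API), a minimal Go version might look like this:

```go
package main

import "fmt"

// LayerBase holds shared state that every algorithm-specific layer
// embeds (analogous in spirit to emer.LayerBase; names are illustrative).
type LayerBase struct {
	Name string
}

// Layer is a minimal abstract interface that algorithm-specific
// layers implement (analogous in spirit to emer.Layer).
type Layer interface {
	// AsEmer returns the embedded LayerBase, giving generic code
	// (viewer, logging, params) access to shared state.
	AsEmer() *LayerBase
	// UpdateParams stands in for the methods each algorithm must implement.
	UpdateParams()
}

// RateLayer is a hypothetical algorithm-specific layer.
type RateLayer struct {
	LayerBase
	Act []float32 // algorithm-specific state, accessed directly by algorithm code
}

func (ly *RateLayer) AsEmer() *LayerBase { return &ly.LayerBase }
func (ly *RateLayer) UpdateParams()      {}

func main() {
	var ly Layer = &RateLayer{LayerBase: LayerBase{Name: "Hidden"}}
	// Generic code only touches the interface and the Base:
	fmt.Println(ly.AsEmer().Name)
}
```

The key design point is that the generic emergent machinery (viewer, logging, parameter handling) only ever goes through the interface and the Base, while algorithm code works directly with the concrete type.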
- The `emer` interfaces are designed to support generic access to network state, e.g., for the 3D network viewer, but they specifically avoid anything algorithmic or structural, so that most of the algorithm-specific code uses direct slices and methods that return algorithm-specific types.
- There are three main levels of structure: `Network`, `Layer`, and `Path` (pathway). The network calls methods on its layers, and layers iterate over both the `Neuron` elements (which are purely data structures) and the `Path`s to implement the relevant computations. The `Path` fully manages everything about a pathway of connectivity between two layers, including the full list of `Synapse` elements in the connection. There is no "ConGroup" or "ConState" level as was used in C++, which greatly simplifies many things. The `Layer` also has a set of `Pool` elements, one for each level at which inhibition is computed: there is always one for the layer as a whole, and then optionally one for each sub-pool of units (`Pool` is the new, simpler term for "Unit Group" from C++ emergent).
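The following is a schematic Go sketch of that structural hierarchy; the type and field names are illustrative only and are not the actual emergent data structures:

```go
package main

// Neuron and Synapse are plain data structures with no behavior of their own.
type Neuron struct{ Act, Ge, Gi float32 }
type Synapse struct{ Wt, DWt float32 }

// Pool computes inhibition for the whole layer or for one sub-pool of units.
type Pool struct{ Inhib float32 }

// Path fully manages the connectivity between two layers,
// including the full list of synapses.
type Path struct {
	Send, Recv *Layer
	Syns       []Synapse
}

// Layer owns its neurons and pools, plus the pathways it receives.
type Layer struct {
	Name    string
	Neurons []Neuron
	Pools   []Pool // Pools[0] is the layer-level pool; optional sub-pools follow
	Recv    []*Path
}

// Network calls methods on its layers; layers iterate over their
// neurons and paths to do the actual computation.
type Network struct {
	Layers []*Layer
}

func (nt *Network) Cycle() {
	for _, ly := range nt.Layers {
		ly.Cycle()
	}
}

func (ly *Layer) Cycle() {
	for pi := range ly.Pools {
		_ = pi // compute inhibition per pool here
	}
	for ni := range ly.Neurons {
		_ = ni // update each neuron from its received paths here
	}
}

func main() {}
```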
- Layers have a `Shape` property, using the `tensor.Shape` type (see the Cogent Core `tensor` package), which specifies their n-dimensional (tensor) shape. Standard layers are expected to use a 2D Y*X shape (note that the dimension order is now outer-to-inner, i.e., row major), and a 4D shape then enables `Pool`s ("unit groups") as hypercolumn-like structures within a layer that can have their own local level of inhibition, and are also used extensively for organizing patterns of connectivity.