# Genetic Neural Networks
```go
package main

import (
	"fmt"

	"github.com/lukks/neural-go/v3"
)

func main() {
	xor := neural.NewNeural([]*neural.Layer{
		{Inputs: 2, Units: 16},
		{Units: 16},
		{Units: 1},
	})

	for i := 0; i <= 5000; i++ {
		loss := xor.Learns([][][]float64{
			{{0, 0}, {0}},
			{{1, 0}, {1}},
			{{0, 1}, {1}},
			{{1, 1}, {0}},
		})

		if i%1000 == 0 {
			fmt.Printf("iter %v, loss %f\n", i, loss)
		}
	}

	fmt.Printf("think some values:\n")
	fmt.Printf("0, 0 [0] -> %f\n", xor.Think([]float64{0, 0}))
	fmt.Printf("1, 0 [1] -> %f\n", xor.Think([]float64{1, 0}))
	fmt.Printf("0, 1 [1] -> %f\n", xor.Think([]float64{0, 1}))
	fmt.Printf("1, 1 [0] -> %f\n", xor.Think([]float64{1, 1}))
}
```
```sh
go get github.com/lukks/neural-go/v3
```
Versions are also listed on the releases page. The changes from v2 to v3 were only for Go module versioning.
You can set a range of values for every input and output. That way you work with your own values, while the network internally sees raw activations. Check examples/rgb.go for a usage example.
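The idea behind range mapping can be illustrated standalone: a value in a user-defined range `[lo, hi]` is linearly scaled to the activation range and back. This is a minimal sketch assuming a `[0, 1]` activation range; it is not the library's actual implementation.

```go
package main

import "fmt"

// toActivation linearly maps v from [lo, hi] to the raw range [0, 1].
func toActivation(v, lo, hi float64) float64 {
	return (v - lo) / (hi - lo)
}

// fromActivation maps a raw activation in [0, 1] back to [lo, hi].
func fromActivation(a, lo, hi float64) float64 {
	return lo + a*(hi-lo)
}

func main() {
	// An RGB channel in [0, 255] becomes a raw activation in [0, 1].
	raw := toActivation(191.25, 0, 255)
	fmt.Println(raw)                         // 0.75
	fmt.Println(fromActivation(raw, 0, 255)) // 191.25
}
```

With ranges configured, you keep feeding the network values like `191.25` while the layers only ever see numbers in their activation range.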
You can set different activations, learning rates, momentums, etc. at the layer level:

- Activation: `linear`, `sigmoid` (default), `tanh`, and `relu`
- Learning rate
- Optimizer by momentum
- Loss: for the output layer; only `mse` for now
- Range: for the input and output layers

Check examples/layers.go for a complete example.
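For reference, the four listed activations and the `mse` loss can be sketched as plain functions. This is my own standalone sketch of the standard definitions, not code from the library:

```go
package main

import (
	"fmt"
	"math"
)

// The four activations the library lists, as plain math.
func linear(x float64) float64  { return x }
func sigmoid(x float64) float64 { return 1 / (1 + math.Exp(-x)) }
func tanh(x float64) float64    { return math.Tanh(x) }
func relu(x float64) float64    { return math.Max(0, x) }

// mse is the mean squared error between expected and predicted values.
func mse(want, got []float64) float64 {
	var sum float64
	for i := range want {
		d := want[i] - got[i]
		sum += d * d
	}
	return sum / float64(len(want))
}

func main() {
	fmt.Println(linear(2), sigmoid(0), tanh(0), relu(-3)) // 2 0.5 0 0
	fmt.Println(mse([]float64{1, 0}, []float64{0.9, 0.1})) // approximately 0.01
}
```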
You can clone, mutate, and crossover neurons, layers, and networks. The Evolve method uses these operations internally to make this easy. Check examples/evolve.go; genetics are optional, so you don't always need to use them.
There are several other useful methods: Export, Import, Reset, ToFile, FromFile, etc. Check the documentation for details.
Based on my previous neural-amxx.
- Basic XOR: examples/xor.go
- RGB brightness: examples/rgb.go
- Genetics: examples/evolve.go
- Layer configs: examples/layers.go
- Persistence: examples/persist.go
```sh
go run examples/rgb.go
```
There are no tests yet.
Feedback, ideas, etc. are very welcome, so feel free to open an issue.
Code released under the MIT License.